[QUOTE=Dark-Energy;28057300]Dual GF114 would make a lot more sense, and would allow them to keep costs down while still retaining the performance and overclocking headroom they desire. Using dual GF110 is idiotic IMO, if it's true. It just doesn't make any sense. Designing a dual GPU card is about efficiency, not speed (that's already a given). We know that GF114, or even GF104, is the more efficient core clock for clock, and would suit a dual GPU card much better. Slapping in the high-end GF110 just seems like an act of desperation to keep their "fastest GPU in the world" crown or something. It will have to be massively downclocked, and will probably be extremely expensive to produce due to it being a high-end chip; the cores will have to be binned for lower voltage to stay within the power spec. The only things you lose with GF114 are the extra shaders and texture units, but that is made up for by the cheaper costs and higher core clocks, which would make it the better candidate.
The problem with that is, we shouldn't have ever hit the 300W spec to begin with. Nvidia stated after the 8800 that they would not make another card that consumes more power than G80, and I think AMD implied the same thing. Slowly things have inched upwards; due to competition there were 20W increases here and 20W increases there, and suddenly we have power-hungry, hot cards. Mind you, the X1800 and X1900 ran hot, but consumed a lot less power. It just goes to show how hard companies are pushing to maximize heat dissipation, always coming up with new designs such as vapor chambers, which used to be reserved for dual GPU cards but are now on single cards.[/QUOTE]
I don't really think anyone buying a card like this cares about efficiency or price, to an extent.
It looks pretty. I want to mount one on my wall.
I wish I had money to buy these things
[QUOTE=Takkun10;28057769]I wish I had money to buy these things[/QUOTE]
It's easy if you try! Get a job and save up money... in this case, a LOT of money :v:
[QUOTE=Odellus;28057667]I don't really think anyone buying a card like this cares about efficiency or price, to an extent.[/QUOTE]
But the point is, the faster card at 300W will be the card to buy, and that's entirely dependent on efficiency. It doesn't matter whether the consumer cares; better efficiency means better performance when both competing cards consume the same 300W. If they used GF114 chips instead of GF110, the performance per watt would be better, which would allow a faster (and probably cheaper) card at the same power consumption. All of that makes for a more appealing card and a better selling point, which would result in more sales. Yes, most people buying these cards probably don't give 2 shits about how efficient they actually are, but a more appealing card certainly helps people who are stuck in a dilemma.
[editline]14th February 2011[/editline]
Of course, I'm assuming GF114 would have better performance per watt in a dual GPU configuration; if Nvidia is choosing to use GF110, either I'm wrong or Nvidia is incredibly stupid.
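To put that in numbers: at a fixed 300W board limit, total performance is just performance-per-watt times the power budget, so the more efficient chip wins by definition. A quick back-of-envelope sketch; the perf/W figures here are invented purely for illustration, not real GF110/GF114 numbers:
[code]
# Back-of-envelope: performance under a fixed 300W board power limit.
# The perf-per-watt figures below are made up for illustration only --
# they are NOT real GF110/GF114 measurements.

BOARD_LIMIT_W = 300.0

perf_per_watt = {
    "dual GF110": 1.00,  # big chip, assumed baseline efficiency
    "dual GF114": 1.15,  # smaller chip, assumed ~15% better perf/W
}

for config, ppw in perf_per_watt.items():
    # At the same power cap, total performance = (perf/W) * watts.
    print(f"{config}: ~{ppw * BOARD_LIMIT_W:.0f} perf units at {BOARD_LIMIT_W:.0f}W")
[/code]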
I wonder if that could keep me warm in the winter....
[QUOTE=Dark-Energy;28057300]Dual GF114 would make a lot more sense... Designing a dual GPU card is about efficiency, not speed (that's already a given). [...]
It just goes to show how hard companies are pushing to maximize heat dissipation, always coming up with new designs such as vapor chambers, which used to be reserved for dual GPU cards but are now on single cards.[/QUOTE]
Above:
Yeah, it's about having the single "best" SKU in the "world". Then again, stick two Gainward Phantoms in a system and you'll take up the same space, and SLI scaling at the resolutions this card is aimed at is already 85-90% to begin with, onboard bridge interconnect or not; never mind what a partner with actual chip-picking clout, like EVGA or MSI, could do.
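(For reference, a quick sketch of what that 85-90% scaling figure means, assuming it's the fraction of a single card's performance the second GPU adds; the numbers are purely illustrative:)
[code]
# What 85-90% SLI scaling means, assuming the second GPU adds that
# fraction of a single card's performance (figures purely illustrative).
single_card = 1.0
for scaling in (0.85, 0.90):
    total = single_card * (1 + scaling)
    print(f"two cards at {scaling:.0%} scaling: ~{total:.2f}x one card")
[/code]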
Below:
In nVidia's case, it was inevitable. Their first "elegant" chip was a G80 refresh, and their first from-the-ground-up scalar design won't be seen until 2013. For AMD's part, their design was forward-thinking and economical, but it also has a ceiling, and I don't think the next two iterations will stray far from that paradigm. In both cases, the companies make claims that only ACTUALLY come true, in practice, several generations later.
Sounds great, but I'm waiting for the Kepler cards to be out (their next architecture, out this year). Also, that card should come with a fire extinguisher.
[QUOTE=n0cturni;28057709]It looks pretty. I want to mount one on my wall.[/QUOTE]
You're certain that you have a wall big enough?
I'd love this too, but I'd need to cut a hole in the front of my case and get enough power stations to power it.
That EVGA card looks pretty slick. Oh well, I'm saving up until LGA 2011 and probably Kepler.
nVidia you dog.
Also, I like how EVGA will have a standard custom cooler.
[quote]We had a chance to see Geforce GTX 590 at Cebit. It is real, it runs and it’s coming soon. It is quiet and packs enough power to run Crysis 2 in 3D at very nice frame rates.
The card has two 8-pin PCIe power connectors and runs rather hot, something that doesn't come as a surprise. The card looks slightly different from the EVGA card we saw at CES 2011, but the EVGA card had a custom cooler that will be different from the reference one.
The launch date has not been set, sources confirm, but there is a chance of a March, Q1 2011 launch. It looks like Nvidia wants to see Radeon HD 6990 performance and then decide on the final clocks and specs.
Once this card launches, it will be time for us to start sniffing about possible Q4 2011 28nm part launch dates. [/quote]
[url]http://www.fudzilla.com/graphics/item/21991-we-saw-geforce-gtx-590-in-action[/url]
This is just awesome. I hope it comes with an awesome price tag as well.
Why does the card have to be so long when they could make it smaller? :S
[QUOTE=Carlios;28416834]This is just awesome. I hope it comes with an awesome price tag as well.[/QUOTE]
$1000 probably
I can't keep up with the rapid succession of video cards. I'm stuck in 2009 when it comes to what I view as the best technology out there.
[QUOTE=tomatmann;28416867]Why does the card have to be so long when they could make it smaller? :S[/QUOTE]It's a dual GPU, that's why. And it's probably going to be around $700-800, not $1000.
I've just bought myself a gtx 5700 for $400. FUCK, I want this one now :(
[QUOTE=deaththrea10;28419736]I've just bought myself a gtx 5700 for $400. FUCK, I want this one now :([/QUOTE]
what is a GTX 5700
[QUOTE=ZombieWaffle;28429902]what is a GTX 5700[/QUOTE]
4096 Cuda cores
4600 MHz core clock
32GB Video Memory
418 Double Precision TFLOPS
1400 Watt TDP
Quadruple slot cooler with 4 separate vapor chambers
[QUOTE=ZombieWaffle;28429902]what is a GTX 5700[/QUOTE]
I think he meant the 570
[QUOTE=TankHawk500;28430373]I think he meant the 570[/QUOTE]
he did.
It greatly annoys me when people say "Goddamnit, I just bought this %model% recently" right after seeing the same hardware developer announce new flagship hardware, when %model% isn't even the current flagship device.
Well, it needs to be that long so it can fit that many fans on there. And since it's already made that long, they added more features to it. Those features added heat, thus adding more fans.
[QUOTE=ZombieWaffle;28429902]what is a GTX 5700[/QUOTE]
Well, there's the Nvidia GeForce FX 5700 from 2003. lol
Man, at this rate we're gonna be naming video cards with two-digit numbers soon, maybe with decimals.
[QUOTE=K3inMitl3id;28432001]Well, there's the Nvidia GeForce FX 5700 from 2003. lol
Man, at this rate we're gonna be naming video cards with two-digit numbers soon, maybe with decimals.[/QUOTE]
Or drop the name GeForce altogether
[QUOTE=B!N4RY;28432539]Or drop the name GeForce altogether[/QUOTE]
no
instead GeNorse.
[QUOTE=Odellus;28433549]no[/QUOTE]
Why not? We could call it Droprate or Atmosphere. Or: "bro, just got my new Nvidia 5700 FT Faceplant. It's fuckin' rad!"
Shait, this looks sweet. Anyone know if it will fit with an Asus P6T SE motherboard?