• nVidia GTX 590 rumored to release late February
Been a while since we've seen a dual-GPU card from nVidia. Should be interesting. Also, sorry if it's already been posted; I searched and nothing showed up. [url]http://www.tcmagazine.com/tcm/news/hardware/34774/nvidia-rumored-launch-geforce-gtx-590-dual-gpu-card-february[/url] [img]http://fileconnect.net/sites/default/files/resize/imagecache/tcm-inline-default/images/tcm/inline/evgadual-gpugeforceces201101-575x445.jpg[/img] [img]http://fileconnect.net/sites/default/files/resize/imagecache/tcm-inline-default/images/tcm/inline/evgadual-gpugeforceces201102-575x237.jpg[/img]
Fuck, that's a long card. It could be the new 5970.
And it'll be $800-$1000.
[QUOTE=chipset;28044240]And it'll be $800-$1000.[/QUOTE]Ehh, I'm going to guess $600-700, but I'm leaning toward $699 mostly.
Depends. MSRP? Yeah, about $700. But the going rate? Upward of $850.
[quote]as the tweaked 40nm GF110 GPU would be used on the incoming GeForce GTX 590, allowing the card to have 1024 of CUDA Cores, a 2 x 384-bit memory interface and 3GB of GDDR5 memory. Its frequencies are unknown but it's very likely they won't be higher than those on the single-GF110 GeForce GTX 580 flagship (772/1544/4008 MHz for the GPU/shader/memory).[/quote] Magic everywhere in this bitch
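Quick napkin math on the memory side of those rumored specs, just a sketch assuming the 4008 MHz figure is the effective GDDR5 data rate and each GPU keeps its own 384-bit bus (the numbers come straight from the quoted article, not anything confirmed):

[code]
# Rough theoretical memory bandwidth from the rumored GTX 590 specs.
# Assumptions: 4008 MT/s effective GDDR5 rate and a 384-bit bus per GPU,
# both taken from the quoted article; real clocks may end up lower.

bus_width_bits = 384            # per GPU
effective_rate = 4008e6         # transfers per second (effective)

bytes_per_transfer = bus_width_bits / 8                  # 48 bytes
bandwidth_per_gpu = bytes_per_transfer * effective_rate  # bytes/s

print(f"Per GPU:  {bandwidth_per_gpu / 1e9:.1f} GB/s")      # ~192.4 GB/s
print(f"Combined: {2 * bandwidth_per_gpu / 1e9:.1f} GB/s")  # ~384.8 GB/s
[/code]

So on paper each GF110 would keep roughly the same ~192 GB/s as a stock GTX 580, and that's before any memory downclock to stay inside the power budget.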
Hopefully by early summer, when I build my new PC, this will have been out for a few months and be at a good price (hoping for around £400, but that might be a little optimistic). And finally, 3 DVI ports on one graphics card.
hnnnnnnngggggg
I want to run 3 monitors off one card.
A 6990 rival, I suppose.
Oh cool, a suitable replacement for my GTX 295.
Soo, a real oven in your PC now? Aw shyt. Somebody quadruple-SLI those beasts.
[QUOTE=tratzzz;28048749] Somebody quadruple- SLI those beasts.[/QUOTE] You can't.
[QUOTE=B!N4RY;28048903]You can't.[/QUOTE] Yes you can; why wouldn't you? It's just that two cards = quad SLI.
[QUOTE=Within;28048929]Yes you can. Why wouldn't you? It's just that two cards = quad SLi.[/QUOTE] You don't call that quad SLI.
[QUOTE=B!N4RY;28048971]You don't call that quad SLI.[/QUOTE] Technically it is quad-SLI because the two GPUs are using an integrated SLI bridge. So three bridges in total (two integrated, one external) would make it quad-SLI.
[QUOTE=Wiggles;28049203]Technically it is quad-SLI because the two GPUs are using an integrated SLI bridge. So three bridges in total (two integrated, one external) would make it quad-SLI.[/QUOTE] Yes I know. However, most people only count the physical number of cards to avoid confusion.
[QUOTE=B!N4RY;28049231]Yes I know. However, most people only count the physical number of cards to avoid confusion.[/QUOTE] But quad-SLI sounds cooler.
That, without a doubt, could not fit in my case.
[QUOTE=luck_or_loss;28046190]I want to run 3 monitors off one card.[/QUOTE] You can, though I'm not sure if nVidia cards can. I have a 5850 and run 2x DVI and 1x DisplayPort.
Well, I don't have any frame of reference, so it doesn't really look that big to me...
I wonder how many nuclear power plants you need to run this.
[QUOTE=B!N4RY;28049231]Yes I know. However, most people only count the physical number of cards to avoid confusion.[/QUOTE] But the correct term is Quad-SLI. [editline]14th February 2011[/editline] [QUOTE=Flubadoo;28050519]Well I don't have any reference to it, so it doesn't really look that big to me...[/QUOTE] Look at the slot cover.
Oh shit. Why would you buy that?
My heart just beat a little faster.
[QUOTE=MTMod;28051723]Oh shit. Why would you buy that?[/QUOTE] Wondering that too, 4 GTX 580's would probably outperform 2 of these.
[QUOTE=thf;28051819]Wondering that too, 4 GTX 580's would probably outperform 2 of these.[/QUOTE] I'm pretty confident that buying 4 separate GTX 580's would turn out to be more expensive than buying two 590's.
[QUOTE=BURG;28051865]I'm pretty confident that buying 4 separate GTX 580's would turn out to be more expensive than buying two 590's.[/QUOTE] Sure, but when you're that extreme, you probably have the cash for that anyway.
[QUOTE=thf;28051819]Wondering that too, 4 GTX 580's would probably outperform 2 of these.[/QUOTE] Two 3GB 580s will outperform two of these, let alone three or four 580s. The heat and circuitry bleed-off on these is going to be immense. Buying a sammich card on the current PCIe spec is, frankly, stupid. About this time next year, when the PCIe spec is much more robust and GPUs are on a smaller process with much broader buses, a sammich card will be amazing. Now is not that time.
Dual GF114 would make a lot more sense and would let them keep costs down while still retaining the performance and overclocking headroom they want. Using dual GF110, IMO, is idiotic if it's true; it just doesn't make any sense. Designing a dual-GPU card is about efficiency, not speed (that's already a given). We know that GF114, or even GF104, is the more efficient core clock for clock and would suit a dual-GPU card much better. Slapping the high-end GF110 on it just seems like an act of desperation to maintain their "fastest GPU in the world" claim. It will have to be massively downclocked and will probably be extremely expensive to produce, since it's a high-end chip and the cores will have to be binned for lower voltage to stay within the power spec. The only things you lose are the extra shaders and texture units, but that's made up for by the cheaper cost and higher core clocks, which would make it the better candidate. [QUOTE=27X;28052901]Two 3GB 580s will outperform two of these, let alone three or four 580s. The heat and circuitry bleed-off on these is going to be immense. Buying a sammich card on the current PCIe spec is, frankly, stupid. About this time next year, when the PCIe spec is much more robust and GPUs are on a smaller process with much broader buses, a sammich card will be amazing. Now is not that time.[/QUOTE] The problem with that is we should never have hit the 300W spec to begin with. Nvidia stated after the 8800 that they would not make another card that consumed more power than the G80, and I think AMD implied the same thing. Slowly things have inched upwards; due to competition there were 20W increases here and 20W increases there, and suddenly we have power-hungry, hot cards. Mind you, the X1800 and X1900 ran hot but consumed a lot less power. It just goes to show how desperate companies are to maximize heat dissipation, always coming up with new designs such as vapor chambers, which used to be only on dual-GPU cards but are now on single-GPU cards.
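To put the downclocking point in rough numbers, here's a back-of-the-envelope sketch. The 244W GTX 580 TDP and the 300W PCIe add-in-card ceiling are real figures, but treating power as scaling linearly with core clock at fixed voltage is a deliberate simplification:

[code]
# Back-of-the-envelope: how far two GF110s would have to come down to fit
# the 300W PCIe add-in-card spec. Assumes power scales linearly with core
# clock at fixed voltage, which understates achievable clocks, since
# binned lower-voltage chips save power faster than linearly.

tdp_single_gf110 = 244.0   # W, GTX 580 rated board power
pcie_limit = 300.0         # W, PCIe spec ceiling for one card
gtx580_core_mhz = 772.0    # stock GTX 580 core clock

budget_per_gpu = pcie_limit / 2                    # 150 W each
scale = budget_per_gpu / tdp_single_gf110          # ~0.61
est_core_mhz = gtx580_core_mhz * scale             # ~475 MHz

print(f"Power budget per GPU: {budget_per_gpu:.0f} W")
print(f"Naive clock scale:    {scale:.2f}")
print(f"Implied core clock:   ~{est_core_mhz:.0f} MHz")
[/code]

Binning for lower voltage claws a lot of that back, since dynamic power drops roughly with voltage squared, which is exactly why the chips would have to be cherry-picked; but it still shows why nobody should expect anywhere near 580 clocks out of this thing.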