• NVIDIA GTX 680
[QUOTE=AshMan55;35244613]apparently it's $679 here in aus damnit :([/QUOTE] at least it's not... [I]$680[/I] *cough* but yeah, I feel sad about that price too
I'm still buying one :D
In-depth Tom's Hardware review: [URL="http://www.tomshardware.com/reviews/geforce-gtx-680-review-benchmark,3161.html"]http://www.tomshardware.com/reviews/geforce-gtx-680-review-benchmark,3161.html[/URL]
[QUOTE=Coridan;35244654]That's a horribly incompetent and painfully biased review. I'd wait for a more professional and/or unbiased review. [editline]22nd March 2012[/editline] Like this one: [URL="http://www.pcmag.com/article2/0,2817,2401953,00.asp"]http://www.pcmag.com/article2/0,2817,2401953,00.asp[/URL][/QUOTE]It was the only one up when I was writing the post.
I received word that the GTX 680 is backwards compatible with PCIe 2.0 slots. Is this true... and if so, would it make a difference in performance if you used a 2.0 slot? Just curious!
[QUOTE=Live2becool;35246669]I received word that the GTX 680 is backwards compatible with PCIe 2.0 slots. Is this true... and if so, would it make a difference in performance if you used a 2.0 slot? Just curious![/QUOTE] [url]http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-680/specifications[/url] *GeForce GTX 680 supports PCI Express 3.0. The Intel X79/SNB-E PCI Express 2.0 platform is currently only supported up to 5GT/s (PCIe 2.0) bus speeds, even though some motherboard manufacturers have enabled higher 8GT/s speeds.
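For anyone wondering what those link speeds actually mean, here's a rough back-of-the-envelope of the theoretical one-way bandwidth of a x16 slot. These are just the spec figures (5GT/s with 8b/10b encoding for 2.0, 8GT/s with 128b/130b for 3.0), not measured throughput; the Python is only there to do the arithmetic:

[code]
# Rough theoretical one-way bandwidth of a PCIe x16 link (spec numbers, not measured).
def x16_bandwidth_gb_s(transfer_rate_gt_s, encoding_efficiency, lanes=16):
    # GT/s per lane * encoding efficiency = usable Gbit/s per lane; divide by 8 for GB/s
    return transfer_rate_gt_s * encoding_efficiency * lanes / 8

pcie2 = x16_bandwidth_gb_s(5.0, 8 / 10)      # PCIe 2.0, 8b/10b encoding   -> ~8.0 GB/s
pcie3 = x16_bandwidth_gb_s(8.0, 128 / 130)   # PCIe 3.0, 128b/130b encoding -> ~15.75 GB/s
print(f"PCIe 2.0 x16: ~{pcie2:.1f} GB/s, PCIe 3.0 x16: ~{pcie3:.2f} GB/s")
[/code]

On paper that's roughly double the bandwidth, but a single 680 doesn't come close to saturating even a 2.0 x16 link in games, so in practice you'd likely see little to no difference.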
I heard we should see 4GB cards out around the middle of next month. If that's true, I'll be buying at least two. No way I'm going down to 2GB from 3GB.
[QUOTE=Brt5470;35247679]I heard we should see 4GB cards out around the middle of next month. If that's true, I'll be buying at least two. No way I'm going down to 2GB from 3GB.[/QUOTE] Source?
[QUOTE=Live2becool;35246669]I received word that the GTX 680 is backwards compatible with PCIe 2.0 slots. Is this true... and if so, would it make a difference in performance if you used a 2.0 slot? Just curious![/QUOTE] PCI-E is designed to be backwards compatible. You should be able to use a 680 in a PCI-E 1.0 slot if you wanted to.
[url]http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/8[/url] What was all this '50% better' talk about? All I see is 1-2%.
[QUOTE=Live2becool;35246669]I received word that the GTX 680 is backwards compatible with PCIe 2.0 slots. Is this true... and if so, would it make a difference in performance if you used a 2.0 slot? Just curious![/QUOTE] "Received word"? Huh? PCIe is always backwards compatible. That's like saying you just heard that Blu-ray players magically play DVDs.
[QUOTE=Ehmmett;35241468]I'll take 2, thanks.[/QUOTE] Aw fuck they sold out as soon as I woke up :(
So the GTX 680's hype was once again blown wildly out of proportion; as Tom's Hardware put it, it's about the same as a 7970 in games, give or take a little.
[QUOTE=David Tennant;35248679]So the GTX 680's hype was once again [B]blown wildly out of proportion[/B]; as Tom's Hardware put it, it's about the same as a 7970 in games, give or take a little.[/QUOTE] Not entirely, though. We need to remember that this card was initially meant to be a lower-range offering until Nvidia saw the performance of AMD's 7 series cards. With the architecture Nvidia has developed, there are still a lot of offerings to come from them.
[QUOTE=GreenDolphin;35248880]Not entirely, though. We need to remember that this card was initially meant to be a lower-range offering until Nvidia saw the performance of AMD's 7 series cards. With the architecture Nvidia has developed, there are still a lot of offerings to come from them.[/QUOTE] There were so many people who told me the 680 was going to be double or triple the performance of a 580. Of course this was unrealistic, but there were people who believed it would be the case and probably still do.
I like how that article says it's affordable. Also, I read it was supposed to replace the 560 Ti, so why the fuck is it $500?
I love EVGA [img]https://fbcdn-sphotos-a.akamaihd.net/hphotos-ak-ash4/318168_10150758109305421_39537220420_11917007_533475908_n.jpg[/img]
All of those cards :flashfap:
All going to Cyberpower PC... [editline]22nd March 2012[/editline] [QUOTE=David Tennant;35249033]There were so many people who told me the 680 was going to be double or triple the performance of a 580. Of course this was unrealistic, but there were people who believed it would be the case and probably still do.[/QUOTE] They were telling you that because they're idiots who can't read.
[media]http://www.youtube.com/watch?v=VLCDYVXiL-0&feature=g-u-u&context=G22cb935FUAAAAAAALAA[/media]
[QUOTE=David Tennant;35249033]There were so many people who told me the 680 was going to be double or triple the performance of a 580. Of course this was unrealistic, but there were people who believed it would be the case and probably still do.[/QUOTE] Those people were idiots; no one from anywhere credible said it would have triple the performance. It has triple the processors.
[QUOTE=GreenDolphin;35248880]Not entirely, though. We need to remember that this card was initially meant to be a lower-range offering until Nvidia saw the performance of AMD's 7 series cards. With the architecture Nvidia has developed, there are still a lot of offerings to come from them.[/QUOTE] It doesn't really make sense to call it lower-range and a GF104 replacement when it consumes more power on a smaller die and a smaller process (even though the chip number is still the same, I don't get why Nvidia markets it like that) and costs $500. I guess it's partly $500 because of its performance relative to the 7970, but it's still surprising to see a 294mm² chip priced that high. Honestly, none of that bullshit matters anyway when the only relevant things to consider are power consumption, performance, and price.

Considering Nvidia took a shit on compute performance with the 680, it doesn't surprise me that, by combining the die shrink and shader improvements, they were able to reduce the die size and hence increase efficiency. Also, transistor count tells you pretty much zero about how well a card is going to perform in games, since there are parts of the chip that aren't designed for gaming (GPGPU hardware for CUDA, OpenCL, etc.).

One area where Nvidia now has AMD on the ropes is GPU efficiency; that's an advantage AMD GPUs used to have over Nvidia that doesn't exist anymore. It wouldn't surprise me to see AMD release the 8000 series and scrap compute, or something similar.
Heh, I'd be tempted to sell my current 560 Ti for £120 as I've only had it for two months, then possibly put the funds toward a 680 :v:
Really tempting, but I think I'll stick with my 580s.
Seeing as I don't have the finances to purchase this slice of heaven, I'll be sticking with my 580.
Getting a GTX 680 for $172; it most likely won't show up until mid-to-late April. I love EVGA's step-up program.
Wonder how much a 670 would be then with EVGA. Going to look into it.
[QUOTE=Dark-Energy;35252099]It doesn't really make sense to call it lower-range and a GF104 replacement when it consumes more power on a smaller die and a smaller process (even though the chip number is still the same, I don't get why Nvidia markets it like that) and costs $500. I guess it's partly $500 because of its performance relative to the 7970, but it's still surprising to see a 294mm² chip priced that high. Honestly, none of that bullshit matters anyway when the only relevant things to consider are power consumption, performance, and price.[/QUOTE] Nvidia is pricing it at $500 because they know they can label their midrange card as high-end when it's on par with the AMD 7970. This is an amazing time for Nvidia to profit, but a terrible one for us consumers.
What? $500 is a great price for that card.
[QUOTE=Makol;35254531]What? $500 is a great price for that card.[/QUOTE]this