• NVIDIA announces its GeForce GTX 1060 ($249, 6GB GDDR5, "980-level performance")
    83 replies
If I'm going to invest in a new graphics card, would it be better to go for this or the 1070, given the 1060 is similar to the 980?
I'm still using the GTX 660, but it's starting to have problems with drivers crashing and encoding simply not working. Is it worth the upgrade?
[QUOTE=Smoot;50666962]How do companies like NVIDIA release "revolutionary performance" every year? Do they do R&D to the point of "Okay, this is what we have by this date, release it" or is it just never ending "We've discovered better newer technology!"? What I am asking is, do they limit how much work they do each year so they can have another better system next year and so on? I feel like you can only tweak a technology so much before you reach the limit.[/QUOTE] They (and AMD) basically hit a limit for "revolutionary" performance gains last year with the release of the Fury line-up and the GTX 980 Ti. They literally maxed out what they could do on that process, size-wise. The cards coming out are a combination of a big process-shrink and improvements to the architecture. So now they can "sit back", enlarge the die size (for example, doing a new card with a 400mm^2 die on the same architecture could increase performance by maybe 30% compared to the 1080, which is significant) and tweak the architecture for new generations for improved performance until they hit the size limit of this process. That may be oversimplifying it, but it's the gist of it.
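A rough back-of-the-envelope for that die-size claim (the numbers here are assumptions: the GP104 die in the 1080 is roughly 314mm², and performance never scales perfectly with die area because of clocks, bandwidth, and yields):

```python
# Sketch: if performance scaled roughly linearly with die area,
# a bigger die on the same process would give about this much headroom.
gp104_area_mm2 = 314   # approximate GTX 1080 (GP104) die size
big_die_mm2 = 400      # the hypothetical larger die from the post

speedup = big_die_mm2 / gp104_area_mm2
print(f"~{(speedup - 1) * 100:.0f}% more silicon to work with")
```

That lands in the same ballpark as the ~30% figure above, which is all a back-of-the-envelope like this can tell you.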
[QUOTE=Instant Mix;50666967]If i'm going to make an investment for a new graphics card, would it be better to go for this or the 1070, if the 1060 is similar to the 980.[/QUOTE] Well what kind of card do you have now?
[QUOTE=Smoot;50666962]How do companies like NVIDIA release "revolutionary performance" every year? Do they do R&D to the point of "Okay, this is what we have by this date, release it" or is it just never ending "We've discovered better newer technology!"? What I am asking is, do they limit how much work they do each year so they can have another better system next year and so on? I feel like you can only tweak a technology so much before you reach the limit.[/QUOTE] There's probably a bit of "ehhh, we should probably avoid pushing too hard this year". But I wouldn't say it's artificially limiting what they can do just to show off next year. The technology in these things evolves rapidly; we've seen constant gains in transistor shrinkage for decades now: [t]https://bavneetsingh.files.wordpress.com/2014/08/transistor-sizes-timeline.gif[/t] If, in the best case, they can shrink the manufacturing process every two years, each alternating year can be used instead to improve the architectures themselves. Which is how Intel have been doing it (they can't keep it up; we're hitting the limit of transistor size before weird quantum effects kick in). But there are also points where throwing extra money at one generation just wouldn't pay off if it led to a rush job researching the technology for the release. Better to just keep researching in the background and only draw on the stable results, so you're not spending excessive amounts of cash for minimal improvement.
[QUOTE=hexpunK;50667044]There's probably a bit of "ehhh we should probably avoid pushing too much this year". But I wouldn't say it's to artificially limit what we can do just to show off next year. The technologies in these things evolve rapidly, we have been seeing constant gains in transistor shrinkage for the last few decades now; [t]https://bavneetsingh.files.wordpress.com/2014/08/transistor-sizes-timeline.gif[/t] If, in the best case scenario, this means they can shrink the manufacturing method every two years, each alternating year can be used instead to improve the architectures themselves. Which is how Intel have been doing it [I](they can't keep it up, we're hitting the limit of transistor size without weird quantum effects kicking in)[/I]. But there are also points where throwing the extra money at one generation just wouldn't pay off if it then led to a bit of a rush job to research the technology for the release. Rather just keep researching in the background and only feed off the stable research. So you're not spending excessive amounts of cash on minimal improvement.[/QUOTE] Actually I think they've been dealing with "weird quantum effects" for quite a while now. Maybe some new effects are taking place, but my guess is that the current issues are just much bigger at the smaller sizes. Also, how did I make that double post?
[QUOTE=Squad1993;50664742]Makes me regret getting a 980Ti[/QUOTE] I believe if you bought that recently (like 90 days) and it's an EVGA, you can use their step up program to pay the difference between your card's price and the 1080/1070 price.
[QUOTE=GoDong-DK;50667063]Actually I think they've been dealing with "weird quantum effects" for quite a while now. Maybe some new effects are taking place, but my guess is that the current issues are just much bigger at the smaller sizes. Also, how did I make that double post?[/QUOTE] I'm not entirely sure, but teach me your ways. Yeah, the quantum effects have likely been a slight bugger for a while, but never enough to break an architecture. As we hit sub-10nm, tunnelling is supposed to become a genuine problem, with electrons just straight up ignoring closed transistor gates. Which would cause massive issues unless we slap on more hardware for error correction.
[QUOTE=GoDong-DK;50667063]Actually I think they've been dealing with "weird quantum effects" for quite a while now. Maybe some new effects are taking place, but my guess is that the current issues are just much bigger at the smaller sizes. Also, how did I make that double post?[/QUOTE] They're much worse as you get smaller. To deal with electron tunnelling we moved to the FinFET design, which basically adds more material vertically, rather than the flat planar designs we've built before, to reduce leakage.
I should upgrade my computer. But would this card work with the old 790i Ultra SLI motherboard? I have a 750, but it's been doing wonders lol
[QUOTE=Smoot;50666962]How do companies like NVIDIA release "revolutionary performance" every year? Do they do R&D to the point of "Okay, this is what we have by this date, release it" or is it just never ending "We've discovered better newer technology!"? What I am asking is, do they limit how much work they do each year so they can have another better system next year and so on? I feel like you can only tweak a technology so much before you reach the limit.[/QUOTE] Of course they keep it within fairly strict boundaries. Every technology business does.
[QUOTE=Squad1993;50664742]Makes me regret getting a 980Ti[/QUOTE] It's still a legitimately great card unless you're pushing past 1080p. On benchmarks, the 1080 is basically a 980 Ti with a little more optimization: [url]http://www.roadtovr.com/nvidia-gtx-1080-benchmark-review-performance-head-to-head-against-the-980ti/2/#standard-benchmarks[/url] A whole 10-20 frame difference at 1440p on max.
[QUOTE=Instant Mix;50666967]If i'm going to make an investment for a new graphics card, would it be better to go for this or the 1070, if the 1060 is similar to the 980.[/QUOTE] Generally it's like this:
1060 = 1080p 60fps
1070 = 1440p 60fps, 1080p ~100fps
1080 = 1440p 90fps, 1080p 144Hz
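Pixel counts are a decent first-order explanation for why the tiers above step down the way they do (a rough sketch; real scaling also depends on the game, settings, and CPU):

```python
# Total pixels per frame at each common resolution.
resolutions = {
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    "4K":    3840 * 2160,
}

# 1440p pushes ~78% more pixels than 1080p; 4K pushes 4x as many,
# which is roughly why a card that does 1080p at ~100fps lands
# near 60fps at 1440p.
print(resolutions["1440p"] / resolutions["1080p"])  # 1.777...
print(resolutions["4K"] / resolutions["1080p"])     # 4.0
```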
[QUOTE=pyschomc;50667293]I should upgrade my computer But would this card work with the old 790i ultra sli motherboard ?? I have a 750 but it's been doing wonders lol[/QUOTE] Why wouldn't it? PCIe slots are backwards compatible.
Are there any cards that can run 4K at at least 60fps, if not more?
Time to start saving up to replace my 750Ti.
[QUOTE=Headhumpy;50667549]Are there any cards that can run 4K at at least 60fps, if not more?[/QUOTE] Nope, unless you lower graphics settings significantly for more intensive titles. The 1080 handles QHD (2560x1440) at >60fps in games like Witcher 3, but anything more is unrealistic currently.
[QUOTE=RandomGamer342;50667667]Nope, unless you lower graphics settings significantly for more intensive titles. The 1080 handles QHD(2560x1440) >60fps on games like Witcher 3, but anything more is unrealistic currently.[/QUOTE] The days of 4K gaming are still a bit away, I guess.
oh boyy gonna sli 970's on the cheap 8)
[QUOTE=General J;50667825]oh boyy gonna sli 970's on the cheap 8)[/QUOTE] I wonder how soon the 1060 will cause prices on 970s to drop.
[QUOTE=RandomGamer342;50667667]Nope, unless you lower graphics settings significantly for more intensive titles. The 1080 handles QHD(2560x1440) >60fps on games like Witcher 3, but anything more is unrealistic currently.[/QUOTE] In DigitalFoundry's video, the 1080 pushes ~40FPS at 4K (Max settings, Hairworks disabled) - I don't think they'd necessarily have to drop everything to the floor to get to 60. The 1080 Ti or whatever will probably be able to almost push 4K 60FPS in current titles.
[QUOTE=GoDong-DK;50667910]In DigitalFoundry's video, the 1080 pushes ~40FPS at 4K (Max settings, Hairworks disabled) - I don't think they'd necessarily have to drop everything to the floor to get to 60. The 1080 Ti or whatever will probably be able to almost push 4K 60FPS in current titles.[/QUOTE] Going from 40 to 60 is a 50% increase in performance, and it's not enough to hit 60fps on average; it should also be consistent. I doubt even the rumored Titan P will be able to do that reliably, but I welcome surprises. You're right about not having to drop to low settings to run it at 4K, but I interpreted his question to mean at reasonably high settings too.
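The percentage math there is right, and it's a common trip-up; as a quick sanity check (pure arithmetic, nothing GPU-specific):

```python
# Frame-rate targets are multiplicative: going from 40 to 60 fps
# needs 50% more rendering throughput, not "20 more frames".
current_fps = 40
target_fps = 60

required_gain = (target_fps / current_fps - 1) * 100
print(f"Need {required_gain:.0f}% more performance")  # Need 50% more performance
```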
[QUOTE=pyschomc;50667293]I should upgrade my computer But would this card work with the old 790i ultra sli motherboard ?? I have a 750 but it's been doing wonders lol[/QUOTE] Yes. PCI-E 3.0 doesn't make a difference performance-wise.
[QUOTE=RandomGamer342;50667976]Going to 60 from 40 is a 50% increase in performance, and it's not enough to have 60fps on average, it should also be consistent. I doubt even the rumored Titan P will be able to do that reliably, but i welcome surprises. You're right about not having to drop to low settings to run it at 4k, but i interpreted his question to mean at reasonably high settings too.[/QUOTE] I mean, TW3 isn't exactly a lightweight game graphically either; maybe something like BF4 could be pulled off with only a few compromises.
This Founders Edition bullshit is just so NVIDIA can raise prices while [i]technically[/i] having a lower MSRP. In the end there are no cards at the MSRP, because NVIDIA raised the prices on the reference cards. I'd rather get a $200 480 than a $300 1060; a 10% performance difference between them doesn't justify the 50% price inflation.
[QUOTE=Squad1993;50664742]Makes me regret getting a 980Ti[/QUOTE] At least you're not alone in that.
Quick question: I'm probably going to be picking up this or the RX 480, whichever I find first. As I'm an NVIDIA user right now (and always have been), would all I have to do if I got an RX 480 be wiping my NVIDIA drivers, installing the new graphics card, and downloading the drivers for it?
[QUOTE=GoDong-DK;50667910]In DigitalFoundry's video, the 1080 pushes ~40FPS at 4K (Max settings, Hairworks disabled) - I don't think they'd necessarily have to drop everything to the floor to get to 60. The 1080 Ti or whatever will probably be able to almost push 4K 60FPS in current titles.[/QUOTE] Yeah, I've seen plenty of speculation that 4K 60fps is still a bit off for a single GPU, possibly even beyond the 1080 Ti (or equivalent) for reliable 60fps. Some people think Vega, with its HBM2 memory, will be the ideal card for 4K gaming, using the Fury X's 4K performance as part of the reason why, but there's no sense hyping something without any concrete evidence. Personally I'd just wait and see.
[QUOTE=Ricenchicken;50668810]Quick question: I'm probably going to be picking up this or the rx 480, whichever I find first. As i'm an nvidia user right now(and always have been), would all I have to do if I got an rx480 be wiping my nvidia drivers, installing the new graphics card, and downloading the drivers for it?[/QUOTE] Pretty much. You should run Display Driver Uninstaller first though, just in case. [url]http://www.guru3d.com/files-details/display-driver-uninstaller-download.html[/url]
[QUOTE=Murkrow;50664877]And G-Sync and CUDA, just to name a few. I really hope AMD's push for non proprietary tech helps them out in the long run.[/QUOTE] Me too.