• Nvidia sued for false advertising
[QUOTE=.Lain;47187547]it still has better price to performance than the 980 at stock.[/QUOTE] Yes, it does. It's an x70 card; that's how it works, with the x60 tier generally being the price/performance sweet spot. I was specifically pointing out that "overclocking to be a stock 980" is a stupid comparison when you're comparing two overclockable cards.
[QUOTE=.Lain;47187526]i specifically said stock 980 in clear letters. my point was that for the money the 970 is very powerful. the 980 falls behind in price:performance.[/QUOTE] That doesn't even matter, dude.
[QUOTE=Gum100;47187545]An omission of information that is likely to mislead a customer still counts as "false advertising", even if nothing false is literally stated. With the 970 being advertised with 4.0gb of vram, it's reasonable that consumers would assume it would all be equally functional. Given that we're starting to see games using 3.5gb+ of vram on higher settings, it's also reasonable to assume that customers may have bought it hoping for a degree of future proofing.[/QUOTE] [quote=Anandtech - Ryan Smith]On Shadow of Mordor, performance drops about 24% on GTX 980 and 25% on GTX 970, a 1% difference. On Battlefield 4, the drop is 47% on GTX 980 and 50% on GTX 970, a 3% difference. On CoD: AW, the drop is 41% on GTX 980 and 44% on GTX 970, a 3% difference. As you can see, there is very little change in the performance of the GTX 970 relative to GTX 980 on these games when it is using the 0.5GB segment.[/quote] this card is hardly any less good for future proofing than it was before. the 3.5gb issue isn't a performance issue, it's an advertisement and trust issue at the core of things.
[QUOTE=catbarf;47187143]They were extremely forthcoming with information, apologized for the mistake made by their marketing department, and reached out to hardware experts in the industry to explain their proprietary architecture and show exactly how it works so nobody could be misled. They also helped people get refunds or exchanges through the stores they purchased from. And this is the horrific, terrible, unforgivable act of false advertising they [i]inflicted[/i] upon innocent, unwitting consumers: Next time I hear someone on this forum bitch about how we live in a sue-happy culture I'm just going to shake my head. You sue when a person or company isn't willing to cooperate to make things right for the consumer. What the fuck more do people want from nVidia over this?[/QUOTE] Oh please, if this hadn't been found out they wouldn't be doing any of that. If they had actually come forth about it and set things right BEFORE it was found out, instead of only after everyone found out, then maybe I'd believe your bullshit and not that they're just doing "the right thing" now for PR, which apparently seems to be working.
[QUOTE=.Lain;47187562]this card is hardly any less good for future proofing than it was before. the 3.5gb issue isn't a performance issue, it's an advertisement and trust issue at the core of things.[/QUOTE] Unless I've misunderstood the issues people have been reporting, doesn't the performance drop manifest as micro-stuttering? As in, it doesn't affect the average fps, and as such wouldn't show up in a % performance comparison?
[QUOTE=Rixxz2;47187378]Yeah, [I]after[/I] people found out what they had done. This wasn't their marketing department's fault, it's pretty obvious that they thought nobody would notice it until such a long time had passed that it wouldn't be relevant anymore[/QUOTE] Are you seriously suggesting that nVidia knowingly and deliberately falsified the specifications for a product that thousands of reviewers would be poring over and examining in detail? Let alone specifications that most customers don't even look at. Can you tell me exactly what a ROP does without consulting Wikipedia, and is it really something you carefully consider when selecting a card? Or do you do what everybody else does and consult reviews and benchmarks and use them to pick a card? Because if that's your method, like 99% of the market, then a discrepancy in the paper specs doesn't mean anything. If it comes down to just the principle of the thing, and that I totally understand, then you can get a refund, and from what I've seen they've been very helpful to people who want a refund over this. But to say you want to keep the card [i]and[/i] you expect them to pay you money for what seems like not only a genuine mistake, but a pretty minor one that most people wouldn't notice or care about, seems extremely petty.
[QUOTE=Gum100;47187588]Unless I've misunderstood the issues people have been reporting, doesn't the performance drop manifest as micro-stuttering? As in, it doesn't affect the average fps, and as such wouldn't show up in a % performance comparison?[/QUOTE] no, not really. the card is slower to access anything cached in that section of RAM, which drops the framerate. it is more of an issue in some games than others (it depends on how the dev uses RAM), but regardless it is a much, much smaller performance issue than it was ever initially made out to be
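for anyone curious how people actually caught this, the slow segment shows up in chunked-allocation bandwidth tests (the well-known "Nai's benchmark" worked along these lines). here's a minimal sketch of the same idea, not the actual tool; the chunk size, kernel, and launch dimensions are my own choices:

[code]
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// Sum-read a chunk so the GPU has to pull every element through memory.
__global__ void readChunk(const float *src, float *sink, size_t n) {
    size_t stride = (size_t)gridDim.x * blockDim.x;
    float acc = 0.0f;
    for (size_t j = blockIdx.x * blockDim.x + threadIdx.x; j < n; j += stride)
        acc += src[j];
    if (acc == -1.0f) *sink = acc; // never true; keeps the compiler from eliding the loop
}

int main() {
    const size_t chunkBytes = 128u << 20; // 128 MB per chunk (arbitrary choice)
    const size_t n = chunkBytes / sizeof(float);
    std::vector<float *> chunks;
    float *sink;
    cudaMalloc(&sink, sizeof(float));

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    for (int c = 0;; ++c) {
        float *p;
        if (cudaMalloc(&p, chunkBytes) != cudaSuccess)
            break; // VRAM exhausted, we're done
        chunks.push_back(p);

        cudaEventRecord(start);
        readChunk<<<1024, 256>>>(p, sink, n);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);

        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        printf("chunk %2d (~%4zu MB allocated): %6.1f GB/s\n",
               c, (size_t)(c + 1) * (chunkBytes >> 20), chunkBytes / 1e6 / ms);
    }

    for (float *p : chunks) cudaFree(p);
    cudaFree(sink);
    return 0;
}
[/code]

on a 970, the chunks that land in the final 512mb should report far lower bandwidth than the first 3.5gb, which is the access penalty I'm describing above.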
Just make sure to keep your proof of purchase; you won't be able to get a payout for a good 2-4 years, until the case is settled. [editline]22nd February 2015[/editline] [QUOTE=Rixxz2;47187478]Or he could get an AMD Radeon 290X for about the same price as a 970 with better performance[/QUOTE] That's what my friend did, but at this point it might be better to wait a month or two for the new AMD GPUs
[QUOTE=crazycory65;47187278]I'm about to order a 970 today, should I still do it?[/QUOTE] No, because if you want to get a 970 you should drop an extra $250 on a 980 for that extra 15% performance increase, because the 970 is now an AWFUL card for the money since you have 12% less VRAM :c. But seriously, I would personally get a 970 because hey, it's $330, and you'll very rarely hit the 3.5gb vram mark unless you're playing Dying Light on Ultra or playing at 4K. If you are playing at 4K, though, I do recommend you get a 980 or an R9 290x/390x when it releases. I don't see why there's suddenly such a massive bandwagon of 970 hate when 45 days ago people thought it was one of the best cards out there. Because guess what, nothing has changed with the card. It's not suddenly a GTX 260; it still has an average score of 15800 on 3DMark, and it's still one of the best cards for the money you could get out there. Now, I can see the anger behind being lied to, but the .5GB is rarely used outside of 4K gaming, and the card has such a good price/performance ratio that I'd still be happy to have that card for that price and not a 780 Ti (better performance than a 970 by 1.5%, $700), or god forbid a Titan. And although the Radeon R9 290X is technically a more powerful card (by 3%), I would still get the 970 because of its lower power consumption, and I hear NVIDIA drivers are a lot... simpler? (is that a good word to use?) than AMD drivers. But that's just my 2 cents. If I were you I'd do research on all cards around that spectrum, and then follow your graphics boner to whatever card you think is best for you.
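If you want the price/performance point in actual numbers, here's a quick back-of-the-envelope in code (the prices and the 15% figure are mine from above, not verified benchmarks):

[code]
#include <cstdio>

int main() {
    // Numbers from the post above (the poster's figures, not verified benchmarks).
    const double price970 = 330.0, perf970 = 1.00; // 970 as the performance baseline
    const double price980 = 580.0, perf980 = 1.15; // "+$250 for +15%"

    printf("970: $%.0f per unit of performance\n", price970 / perf970); // ~$330
    printf("980: $%.0f per unit of performance\n", price980 / perf980); // ~$504
    return 0;
}
[/code]

So by these numbers you'd pay roughly 50% more per unit of performance for the 980, which is why the 970 keeps its value proposition even after the VRAM news.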
I actually ordered one and the seller dropped it down his stairs so I got a refund. I think I'll xfire my 270x when the new amd card comes out instead of getting anything new.
[QUOTE=catbarf;47187602]Are you seriously suggesting that nVidia knowingly and deliberately falsified the specifications for a product that thousands of reviewers would be poring over and examining in detail? Let alone specifications that most customers don't even look at. Can you tell me exactly what a ROP does without consulting Wikipedia, and is it really something you carefully [/QUOTE] Yes? It also seems like you answered your own question. Most people don't know what a ROP is, and most people also wouldn't have noticed the VRAM issue. The point still stands: Nvidia lied, and they obviously didn't expect anyone to notice in such a short time
[QUOTE=crazycory65;47187278]I'm about to order a 970 today, should I still do it?[/QUOTE] no, in 2 years when games need all of the vram you'll be angry that you ordered it
Well, I joined the Reddit hardware bandwagon and asked Amazon for compensation. Got 50 GBP from them. If you wanna try your luck: [t]http://oi60.tinypic.com/11lkj0l.jpg[/t]
[QUOTE=Rixxz2;47188548]Yes? It also seems like you answered your own question. Most people don't know what a ROP is, and most people also wouldn't have noticed the VRAM issue. The point still stands: Nvidia lied, and they obviously didn't expect anyone to notice in such a short time[/QUOTE] Why on [I]earth[/I] would they intentionally lie about something that A. won't earn any more sales, since their customers don't know or care what an ROP is, and B. would certainly be noticed by the professionals who analyze hardware for a living? Why is the idea of it being a marketing screw-up not a plausible explanation?
Just for the record, the number of ROPs and the amount of cache are not advertised, so this lawsuit will be thrown out of court. The agreements between reviewers and OEMs are designed so that reviews AREN'T classified as advertising. NVIDIA made a mistake because partial cluster disabling is something brand new, and their marketing department would never have considered such a thing when making up the packets sent to reviewers. The only winners here will be lawyers, even more so than in your typical lawsuit.
[QUOTE=Kaabii;47188853]Just for the record, the number of ROPs and the amount of cache are not advertised, so this lawsuit will be thrown out of court. The agreements between reviewers and OEMs are designed so that reviews AREN'T classified as advertising. NVIDIA made a mistake because partial cluster disabling is something brand new, and their marketing department would never have considered such a thing when making up the packets sent to reviewers. The only winners here will be lawyers, even more so than in your typical lawsuit.[/QUOTE] This, basically. The only reason people even know what the "planned" ROP and cache amounts were is some press kit info from before release. You could argue that all 4gb of ram is implied to have the same performance, but that's not gonna hold up in court, especially when the bandwidth info on the ram is technically still correct.
[QUOTE=GamerKiwi;47187523]Look at the benchmarks; does the 970 perform very well in its price range? (The answer's yes) If you want to not get it on principle, then get an AMD card, or drop the extra cash on the 980, but otherwise, the 970's an excellent card for the price.[/QUOTE] [QUOTE=.Lain;47187562]this card is hardly any less good for future proofing than it was before. the 3.5gb issue isn't a performance issue, it's an advertisement and trust issue at the core of things.[/QUOTE] [b]Thank you.[/b] This post is for those who are considering buying a GTX 970 and are being put off from buying it [i]because[/i] there's a "performance drop". Please note that ANY card that reaches 100% vram usage is going to see a performance drop, because if more resources are needed after that, your actual system ram will be used instead, and that causes an [b]even worse delay[/b] when your video card needs to read that data, which in turn slows your whole system instead of just the GPU. They could have slapped a sticker on it that said "3.5GB" and it'd still be a great card. Nvidia is basically doing your system a favour by giving it an extra 512mb buffer before the card needs to use your system ram. There's no reason to want to disable the last 512mb, because if you did you'd just be using your system ram instead, which would slow things down even more. [b]Yes, Nvidia did fuck up the advertising for this card, but for a 3.5GB card, it's still an amazing deal.[/b] If vram size is not a big issue for you, this card has a great price-to-performance ratio. Seriously, it may only be 3.5gb+512mb, but it's still a damn good card.
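The "system ram is an even worse delay" point is easy to see in numbers, since on-card memory bandwidth is an order of magnitude above what the PCIe bus can deliver. Here's a minimal sketch using the CUDA runtime to compare the two paths; the buffer size and names are my own choices, not a proper benchmark:

[code]
#include <cstdio>
#include <cuda_runtime.h>

// Time a single cudaMemcpy with CUDA events, returning milliseconds.
static float timeCopy(void *dst, const void *src, size_t bytes, cudaMemcpyKind kind) {
    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);
    cudaEventRecord(start);
    cudaMemcpy(dst, src, bytes, kind);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);
    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    return ms;
}

int main() {
    const size_t bytes = 256u << 20; // 256 MB test buffer (arbitrary choice)
    void *devA, *devB, *host;
    cudaMalloc(&devA, bytes);
    cudaMalloc(&devB, bytes);
    cudaMallocHost(&host, bytes); // pinned system RAM: the *best* case for PCIe

    timeCopy(devB, devA, bytes, cudaMemcpyDeviceToDevice); // warm-up, discard

    float onCard = timeCopy(devB, devA, bytes, cudaMemcpyDeviceToDevice);
    float overBus = timeCopy(devA, host, bytes, cudaMemcpyHostToDevice);
    printf("VRAM -> VRAM: %6.1f GB/s\n", bytes / 1e6 / onCard);
    printf("RAM  -> VRAM: %6.1f GB/s\n", bytes / 1e6 / overBus);

    cudaFree(devA);
    cudaFree(devB);
    cudaFreeHost(host);
    return 0;
}
[/code]

Typical results put the on-card copy way above what PCIe can do, which is exactly why spilling into system ram hurts far more than the 970's slower (but still on-card) 512mb segment.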
I bought a GTX 960 a few weeks back and I have been really happy with this card. I have had 4 different AMD cards over the years and each of them had heat and noise problems. And now, taking a look at the R9 series, it's still very much the case: twice the power consumption compared to the 900 series, noise levels that can go as high as 70dB, and I don't even want to think about the heat. My 960 is sitting at 0% fan speed and <40c on desktop use, and when gaming it rarely goes over 60c. I have had enough of struggling with AMD cards; I don't think R9 is the way to go right now and you should wait for the new series.
[QUOTE=rulssi;47190001]I bought a GTX 960 a few weeks back and I have been really happy with this card. I have had 4 different AMD cards over the years and each of them had heat and noise problems. And now, taking a look at the R9 series, it's still very much the case: twice the power consumption compared to the 900 series, noise levels that can go as high as 70dB, and I don't even want to think about the heat. My 960 is sitting at 0% fan speed and <40c on desktop use, and when gaming it rarely goes over 60c. I have had enough of struggling with AMD cards; I don't think R9 is the way to go right now and you should wait for the new series.[/QUOTE] 70dB at what distance? Different sites use different methodology; you can't just pull numbers from wherever. I agree that the Nvidia line-up is stronger right now, but I don't think the R9 series is a [I]bad[/I] choice, it's just a worse one (unless you're in a segment where Nvidia doesn't have a 9** series card out).
[QUOTE=rulssi;47190001]I bought a GTX 960 a few weeks back and I have been really happy with this card. I have had 4 different AMD cards over the years and each of them had heat and noise problems. And now, taking a look at the R9 series, it's still very much the case: twice the power consumption compared to the 900 series, noise levels that can go as high as 70dB, and I don't even want to think about the heat. My 960 is sitting at 0% fan speed and <40c on desktop use, and when gaming it rarely goes over 60c. I have had enough of struggling with AMD cards; I don't think R9 is the way to go right now and you should wait for the new series.[/QUOTE] Got my R9 290 for 200 euros last week; except for the temperature, I'm really happy with it. Let's wait for DX12 and see.
thread like a bowl of retarded piranhas
Y'know, even though the .5GB of slow VRAM probably wouldn't make a difference in what I'm playing, this whole controversy has really put me off of Nvidia. I was planning to get a 970 when I had the money, but now I'm not so sure.
[QUOTE=ief014;47188949][b]Thank you.[/b] This post is for those who are considering buying a GTX 970 and are being put off from buying it [i]because[/i] there's a "performance drop". Please note that ANY card that reaches 100% vram usage is going to see a performance drop, because if more resources are needed after that, your actual system ram will be used instead, and that causes an [b]even worse delay[/b] when your video card needs to read that data, which in turn slows your whole system instead of just the GPU. They could have slapped a sticker on it that said "3.5GB" and it'd still be a great card. Nvidia is basically doing your system a favour by giving it an extra 512mb buffer before the card needs to use your system ram. There's no reason to want to disable the last 512mb, because if you did you'd just be using your system ram instead, which would slow things down even more. [b]Yes, Nvidia did fuck up the advertising for this card, but for a 3.5GB card, it's still an amazing deal.[/b] If vram size is not a big issue for you, this card has a great price-to-performance ratio. Seriously, it may only be 3.5gb+512mb, but it's still a damn good card.[/QUOTE] It's a really big issue [I]if[/I] the driver advertises 4GB [I]and[/I] e.g. UE, with its texture shuffling, tries to use all of it rather than asynchronously loading textures from RAM. (UE games quite often have more assets than fit in VRAM already.) That said, it's quite possible Unreal will take this into account and put a fix into their engine to reduce the problem for future games. The best solution would probably be to have the driver advertise 3.5GB but allow loading more textures onto the card (since the "extra" memory is most likely still going to be faster than normal RAM).
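For context on the "driver advertises 4GB" point: anything that sizes its memory budget from the driver-reported total sees the full 4GB with no hint that part of it is slower. A minimal sketch of that query via the CUDA runtime follows; the 90% budget heuristic is made up for illustration, not how any particular engine actually does it:

[code]
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    size_t freeBytes = 0, totalBytes = 0;
    cudaMemGetInfo(&freeBytes, &totalBytes); // driver-reported VRAM; the slow segment is included

    printf("driver reports: %zu MB total, %zu MB free\n",
           totalBytes >> 20, freeBytes >> 20);

    // A naive budget like "use 90% of whatever the driver reports" lands well
    // inside the 970's slow segment. (The 90% heuristic is made up for illustration.)
    size_t texturePool = (size_t)(totalBytes * 0.9);
    printf("naive texture budget: %zu MB\n", texturePool >> 20);
    return 0;
}
[/code]

On a 970 that naive budget comes out around 3.6GB, already past the 3.5GB fast segment, which is why an engine trusting the reported total can wander into the slow region without ever knowing it exists.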
[QUOTE=ExtReMLapin;47190042]Got my R9 290 for 200 euros last week; except for the temperature, I'm really happy with it. Let's wait for DX12 and see.[/QUOTE] Why are high temperatures a problem? The card is kept there on purpose; they don't spin the fans higher than whatever is required to keep it below 95°C, because there is no reason for them to do so. By default, the fan during high load on a 290X goes up to 20-30%. It will be ~90°C, but the default settings/firmware don't increase the fan speed higher than what is required to keep it there.
I'm on the fence about getting a 970. I have an R9 270X right now. I'm not sure if it would be a worthwhile investment, since I can max out Wolfenstein: The New Order with a few hitches and I'm not exactly looking forward to newer, more demanding games. My main concerns right now are that I'm lacking CUDA (which would speed up Blender renders) and that the drivers for my R9 make it impossible to stream W:TNO, not to mention I'm stuck on old drivers due to hardware skinning being broken in OpenGL.
Pretty sure Blender has support for OpenCL.
[QUOTE=Cold;47190436]Pretty sure Blender has support for OpenCL.[/QUOTE] By default, v2.71 only lets you compute with CUDA and the CPU; if there is a way to fix that, that would be awesome.
[QUOTE=nagachief;47193331]By default, v2.71 only lets you compute with CUDA and the CPU; if there is a way to fix that, that would be awesome.[/QUOTE] It can be enabled if you build Blender yourself. Both CUDA and OpenCL are supported, but only CUDA is enabled in official builds, according to [url=http://wiki.blender.org/index.php/Doc:2.6/Manual/Render/Cycles/GPU_Rendering]this[/url].
[QUOTE=mastersrp;47193392]It can be enabled if you build Blender yourself. Both CUDA and OpenCL are supported, but only CUDA is enabled in official builds, according to [url=http://wiki.blender.org/index.php/Doc:2.6/Manual/Render/Cycles/GPU_Rendering]this[/url].[/QUOTE] How stable is this? Being experimental doesn't inspire much confidence
[QUOTE=itisjuly;47193414]How stable is this? Being experimental doesn't sound too confident[/QUOTE] Probably nothing you should actually do unless you're working on it.