• Nvidia: DirectX 11 is not important
    245 replies, posted
[QUOTE=Cathbadh;17359082]... But the changes that come with DX11 are more than just graphics. As I understand it, it also includes a general-purpose computing API for graphics cards (i.e. using the graphics card for something other than graphics). This would be the equivalent of OpenCL, which is something nvidia and amd have both pledged to support.[/QUOTE] Yeah, DX11 includes DirectCompute, which is like OpenCL but for DirectX (of course), and Nvidia has already hooked it up to CUDA, so any card with CUDA can run DirectCompute shaders. It would be nice to see Nvidia release a card with a tessellator though (like ATI did).
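For anyone wondering what "using the graphics card for something other than graphics" actually looks like, here's a minimal CUDA sketch (the kernel name and sizes are just illustrative, not from any real project); a DirectCompute or OpenCL version has the same shape: copy data to the card, run a kernel across many threads at once, copy the results back:

```cuda
#include <cstdio>

// Each GPU thread squares one element of the array in parallel.
__global__ void square(float *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= data[i];
}

int main() {
    const int n = 8;
    float host[n] = {1, 2, 3, 4, 5, 6, 7, 8};
    float *dev;

    cudaMalloc(&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

    // Launch one block of n threads; one thread per element.
    square<<<1, n>>>(dev, n);

    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev);

    for (int i = 0; i < n; i++) printf("%g ", host[i]);
    printf("\n");
    return 0;
}
```

Squaring eight floats is obviously pointless on a GPU, but the same pattern scales to millions of elements, which is the whole appeal of compute APIs like CUDA, OpenCL, and DirectCompute.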
:sigh: I might move to ATI, nVidia is fucking themselves up the ass with this CUDA and computing shit.
[QUOTE=Zombii;17364916]:sigh: I might move to ATI, nVidia is fucking themselves up the ass with this CUDA and computing shit.[/QUOTE] They're doing worse behind the scenes. They'll disable their stuff if it finds an ATI card in the system. "It's either them or us" BS. Not to mention I heard somewhere else (other than the user in this thread) about how NV paid people not to use DX10.1 or something to try and kill off ATI's advantage. Their piss-poor drivers (Vista/7). The fact they did something with their drivers to cheat in benchmarks.
All you kids are ridiculous, nVidia will probably release two DX11 cards, a high end and low end card, and use those sales to cover the expenses and go to DX12. Seriously, tessellation isn't something to spend hundreds of millions of dollars and thousands of hours of development time/money/resources just to have DX12 come out the year after. "Technology doubles every 2 years, making huge strides that makes all other technology nearly obsolete" "This statement is obsolete as technology makes a major stride every 6 months"
[QUOTE=DeadWeight;17364963]All you kids are ridiculous, nVidia will probably release two DX11 cards, a high end and low end card, and use those sales to cover the expenses and go to DX12. Seriously, tessellation isn't something to spend hundreds of millions of dollars and thousands of hours of development time/money/resources just to have DX12 come out the year after. "Technology doubles every 2 years, making huge strides that makes all other technology nearly obsolete" "This statement is obsolete as technology makes a major stride every 6 months"[/QUOTE] Blind fanboy/loyalist or what?
[QUOTE=Panda X;17364990]Blind fanboy/loyalist or what?[/QUOTE] Sounds like the kind of person that's talked about in the computer illiteracy thread. (Not Panda, the guy he's quoting)
[QUOTE=DeadWeight;17364963]All you kids are ridiculous, nVidia will probably release two DX11 cards, a high end and low end card, and use those sales to cover the expenses and go to DX12. Seriously, tessellation isn't something to spend hundreds of millions of dollars and thousands of hours of development time/money/resources just to have DX12 come out the year after.[/QUOTE] When you wrote this, did you realize that the first sentence contradicts the second, and the second sentence contradicts itself? "nVidia will probably release DX11 cards" -> "nVidia shouldn't bother developing tessellation (and by implication DX11)" and "they shouldn't develop tessellation" -> "they should focus on DX12 (which will include tessellation)"
bad reading all of u [editline]10:59PM[/editline] [quote=forumaster;17365026]sounds like the kind of person that's talked about in the computer illiteracy thread. (not panda, the guy he's quoting)[/quote] yah ma nwhat duz sys32 do you jerk off
[QUOTE=DeadWeight;17365211]bad reading all of u [editline]10:59PM[/editline] yah ma nwhat duz sys32 do you jerk off[/QUOTE] I can honestly say you're a fucking moron. Though you're that special kind of moron that makes me laugh.
[QUOTE=DeadWeight;17364963]All you kids are ridiculous, nVidia will probably release two DX11 cards, a high end and low end card, and use those sales to cover the expenses and go to DX12. Seriously, tessellation isn't something to spend hundreds of millions of dollars and thousands of hours of development time/money/resources just to have DX12 come out the year after. "Technology doubles every 2 years, making huge strides that makes all other technology nearly obsolete" "This statement is obsolete as technology makes a major stride every 6 months"[/QUOTE] If you keep waiting for the next big thing you'll never get there. Same goes for anyone upgrading their computer. Everything will become obsolete, if you keep waiting you're not doing any good.
[QUOTE=Panda X;17364990]Blind fanboy/loyalist or what?[/QUOTE] Okay, I'm going to stop you right there, because you haven't done anything except that either. [editline]12:00AM[/editline] [QUOTE=Panda X;17364961]They're doing worse behind the scenes. They'll disable their stuff if it finds an ATI card in the system. "It's either them or us" BS. Not to mention I heard somewhere else (other than the user in this thread) about how NV paid people not to use DX10.1 or something to try and kill off ATI's advantage. Their piss-poor drivers (Vista/7). The fact they did something with their drivers to cheat in benchmarks.[/QUOTE] Absolute bullshit. I constantly work with both Ati and nVidia as part of my side-job, and none of that is true. nVidia does not disable anything if it detects Ati. I mean, I'm not even quite sure what you mean by disabling if it finds Ati stuff. Not to mention, their drivers are undeniably better than Ati's. With the attitude you have, I highly doubt you have even seen nVidia's drivers of today.
I bet Nvidia will change their tune when they release DX11 GPUs
[QUOTE=Levybreak;17358344]I still don't see any games using exclusively DX10. I don't think developers will suddenly jump past it to DX11.[/QUOTE] There is a DX10 exclusive coming out (Shattered Horizon).
Nvidia drivers undoubtedly have more options and features, but when I got my 4870 I liked the driver's flawless install and effective UI.
[QUOTE=Brt5470;17365772]Nvidia drivers undoubtedly have more options and features, but when I got my 4870 I liked the driver's flawless install and effective UI.[/QUOTE] There's always third party driver extensions to add more options, though. Look at ATI Tray Tools. If you look at it that way, the only thing Nvidia has over ATI is PhysX, and forced SSAO. PhysX is a gimmick, and SSAO can be added to [b]some[/b] games through ENBSeries. You don't lose much software wise.
I used Tray Tools before, and it was decent. But even so it caused some games to crash. Uninstalled it. : /
[QUOTE=thisispain;17365638]Okay, I'm going to stop you right there, because you haven't done anything except that either. [editline]12:00AM[/editline] Absolute bullshit. I constantly work with both Ati and nVidia as part of my side-job, and none of that is true. nVidia does not disable anything if it detects Ati. I mean, I'm not even quite sure what you mean by disabling if it finds Ati stuff. Not to mention, their drivers are undeniably better than Ati's. With the attitude you have, I highly doubt you have even seen nVidia's drivers of today.[/QUOTE] In Windows 7 the new WDDM 1.1 allows multiple cards from multiple manufacturers, but you can't have an nVidia card doing solely PhysX work alongside an ATI card rendering. As for the drivers, I still see a fucking lot of people with the NVLDDMKM issue on GTX cards with the 190.xx drivers. In fact I was speaking to Chris Holmes ([url]http://chris123nt.com[/url]) about him getting a 4870x2. I asked him why and he replied that the nvidia drivers were shit and performing just horribly. He had a GTX280 with the latest drivers. I know I sound like a fanboy, but I only sound so because of all the shit Nvidia's been pulling.
What's with all the ATI fanboyism in this thread? Every post that remotely mentions nvidia is rated dumb or disagree. Give nvidia a chance to get some cards on the market before you even compare the two... instead of riding the "LOLOLO ATI IS BETTER NAOW, FUCK YOU NVIDIA!" bandwagon. And now I await the inevitable dumb and disagrees from angry ATI fans. (Yes I did say nvidia in my post) [B]Edit:[/B] Point proven.
Nvidia's becoming really shitty these days. I mean for fuck's sake, they renamed the 8800s (along with pretty much all their 8 series cards) TWICE and sold them as new products. First the 8 series, then the 9 series, then the GT/GTS/whatever 200 series. They did bring out the GTX 200 series, which is at the top for now, but they couldn't keep up with ATI's prices. Now we have DX11 with compute shaders coming out and Nvidia's saying CUDA is more important. That's fucking retarded because they just want to milk CUDA and PhysX while they still can. And they'll probably be successful with that, because Nvidia has the biggest market share, so most of the gpgpu software will still be CUDA for some time. Yup, over 50% of gamers (I'm not talking about intelligent people who actually know something about graphics cards) will buy new rebranded nvidia shit just because LOLOLOL NVIDIA'S BETTER EVERYBODY USES IT, also manufacturers of prebuilts seem to like nvidia more. We could be having a lot of DX11 compute shader software in a year or so, which would work on ALL DX11 hardware, but it won't happen soon because of nvidia's shitty marketing strategies. Same thing's going to happen to DX11 games: until nvidia supports them we won't get them. I hope they lose a lot of their market share, it would be good for everybody.
I don't see why anyone's getting worked up in a huff about this. That guy isn't the CEO or anything, but rather investor relations (I think). If I'm right, it's relatively silly to talk to an investor relations personnel about the "next big step in DirectX" when said relations personnel works for a major graphics card company/chain/something. He doesn't want to make the company look like hasty fools, and so will play off that "DirectX 11 isn't that big of a deal" to keep interest in the company. Also, they have plenty of time before DirectX 11 comes out. And even then, companies will have to start on development for DirectX 11.
I love how if one is not happy with Nvidia they are labeled an ATi fanboy.
[QUOTE=Soldier32;17370012]I love how if one is not happy with Nvidia they are labeled an ATi fanboy.[/QUOTE] That doesn't happen at all in this thread, what are you on about?
Am I the only one who doesn't give a shit about DX 11 or GL 14.5 until I get a game that is actually a game rather than a technology demo?
[QUOTE=evilking1;17370085]Am I the only one who doesn't give a shit about DX 11 or GL 14.5 until I get a game that is actually a game rather than a technology demo.[/QUOTE] No you aren't, I couldn't care less about DX11 at the moment either. DX9 will continue to be the default for another few years at least. Hurry the fuck up Crytek. :colbert:
NVIDIA must be doing something right if over 65% of the Steam population uses their cards.
[QUOTE=Odellus;17370368]NVIDIA must be doing something right if over 65% of the Steam population uses their cards.[/QUOTE] It's probably mainly customer satisfaction from previous purchases, and the high quality drivers.
Why is everyone complaining? DirectX 11 is a while off yet. Also, OpenGL has better performance than DirectX. It's not used for the same reasons Linux isn't used: it's not funded by billions of dollars.
[QUOTE=Dan The Man;17370584]Why is everyone complaining? DirectX 11 is a while off yet. Also, OpenGL has better performance than DirectX. It's not used for the same reasons Linux isn't used: it's not funded by billions of dollars.[/QUOTE] CoD4 (maybe 5?) uses OpenGL 2.0, one of the last big games to utilize it.
So far as I hear, OpenGL isn't used because it's the API equivalent of DLL hell.
Read the OP guys: [quote=Hara] “[B]DirectX 11 by itself is not going be the defining reason to buy a new GPU[/B]. [B][U]It will be one of the reasons[/U][/B]. This is why Microsoft is in work with the industry to allow more freedom and more creativity in how you build content, which is always good, and the new features in DirectX 11 are going to allow people to do that. [B][U]But that no longer is the only reason, we believe, consumers would want to invest in a GPU,”[/U][/B] explains Mr. Hara.[/quote] He did not say that Nvidia won't have DX11 cards, he's just saying that it's not [B]only[/B] DX11 that is important. What he is saying is that Nvidia won't be focusing solely on DX11.