• Borderlands 2 PC video shows off fancy PhysX effects
So why can't AMD just make their own PhysX and get gaming companies to use it?
[QUOTE=Socram;37272911]So why doesn't AMD have an equivalent that they try to push? I still don't really see the issue. Clearly PhysX has some merits or developers wouldn't want to use it at all. It all comes down to competition, and this is how Nvidia (well one of many..) stays on top.[/QUOTE] Because OpenCL can run on both without some proprietary BS to give artificial incentives to buy one brand over the other? [editline]18th August 2012[/editline] Or some other open standard.
[QUOTE=Zet;37309266]So why can't AMD just make their own PhysX and get gaming companies to use it?[/QUOTE] Because that'd make the problem worse. The whole point is that you should be giving the exact same experience to all of your customers regardless of what hardware they have. If my AMD card could more than handle these effects had they not been done in PhysX, then why were they done in PhysX? This doesn't make PhysX or Nvidia better; it gives them an artificial advantage by making the game more detailed in some way.
I don't really like most of the PhysX stuff because it's like "Oo0o0o0 Physics and cloth", but the cloth especially doesn't fit AT ALL with what the game looks like. It's so smooth and realistic it feels weird.
[QUOTE=CakeMaster7;37309474]Because that'd make the problem worse. The whole point is that you should be giving the exact same experience to all of your customers regardless of what hardware they have, if my AMD card can more than handle these PhysX effects were they not done in PhysX, then why was it done in PhysX? This doesn't make PhysX or Nvidia better, it gives them an artificial advantage by making the game more detailed in some way.[/QUOTE] I'm sure AMD is a loving and caring company that doesn't do something like PhysX because they think everyone should have the same. It's business: if you can give yourself an advantage, you do it. Dunno about you guys, but I hardly think "Oh gee, I better not get this card because it promotes proprietary software/functions".
[QUOTE=acds;37310010]I'm sure AMD is a loving and caring company that doesn't do something like PhysX because they think everyone should have the same. It's business, if you can give yourself an advantage, you do it. Dunno about you guys but I hardly think "Oh gee I better not get this card because it promoted proprietary software/functions".[/QUOTE] psh i don't buy intel processors because a guy whose brother's uncle's father used to work at intel and was a criminal
[QUOTE=JerryK;37307710]it's ok the nvidia users need some sort of compensation for when their computers catch fire[/QUOTE] These jokes haven't applied in over a year.
This reminds me of the bullshit Crytek pulled with a Crysis comparison. [media]http://www.youtube.com/watch?v=fmUCp9ZAgyU[/media]
[QUOTE=acds;37310010]I'm sure AMD is a loving and caring company that doesn't do something like PhysX because they think everyone should have the same. It's business, if you can give yourself an advantage, you do it. Dunno about you guys but I hardly think "Oh gee I better not get this card because it promoted proprietary software/functions".[/QUOTE] I don't see how this is an argument. I'm not complaining about Nvidia here; they'll do what they do. I'm complaining about these game devs who had a choice to give all their customers the same experience but decided not to.
Fragmentation is the root of all evil.
[QUOTE=CakeMaster7;37313552]I don't see how is this an argument I'm not complaining about Nvidia here, they'll do what they do, I'm complaining about these game devs who had a choice to give all their customers the same experience but decided not to.[/QUOTE] It pales in comparison to devs deciding to give timed exclusive dlc.
I had the choice between a 560 and a 6950 and I chose the 6950. PhysX just doesn't appeal to me. It's cool and all, but it's entirely a gimmick. I'd prefer to support AMD, they're not as dirty as NVIDIA are.
I wish AMD hadn't killed off the ATI brand but fuck 'em, it's still ATI to me.
[media]http://www.youtube.com/watch?v=aQSnXhgJ4GM[/media]
[QUOTE=CakeMaster7;37309474]Because that'd make the problem worse. The whole point is that you should be giving the exact same experience to all of your customers regardless of what hardware they have, if my AMD card can more than handle these PhysX effects were they not done in PhysX, then why was it done in PhysX? This doesn't make PhysX or Nvidia better, it gives them an artificial advantage by making the game more detailed in some way.[/QUOTE] That's not how it works in the slightest. Physics computations run on your processor by default; NVIDIA just lets you offload computations to their graphics card because they already have an architecture set up to do that, called "CUDA". [QUOTE=geogzm;37316170]I had the choice between a 560 and a 6950 and I chose the 6950. PhysX just doesn't appeal to me. It's cool and all, but it's entirely a gimmick. I'd prefer to support AMD, they're not as dirty as NVIDIA are.[/QUOTE] Good to see facepunch keeping up its tactic of not reading past the first 3 posts. PhysX literally cannot be a gimmick unless you believe that physics in games is a gimmick - what you are referring to as a "gimmick" is PhysX being able to support additional effects if you have a card that it can offload computations to. The reason it doesn't work on ATI cards is that NVIDIA used their own parallel computing architecture, "CUDA", to do it. I'd like to add that CUDA's been used to create a sort of home supercomputer called a Tesla box, which you can build by chaining together a bunch of really powerful graphics cards - it's been really useful for engineers and scientists because it's within a reasonable price range and very powerful. A ghetto edition of this was done with PS3s before Sony went berserk on everybody.
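To put the offloading point concretely: a physics step is mostly the same small update repeated across thousands of independent objects. Here's a toy sketch of the idea (plain Python, not actual PhysX or CUDA code; the names and constants are made up for illustration):

```python
# Toy sketch of what a physics library computes each frame.
# Hypothetical illustration only - not the PhysX or CUDA API.
# Each particle's update is independent of every other particle's,
# which is exactly why this kind of loop offloads so well to a GPU.

GRAVITY = -9.81      # m/s^2
RESTITUTION = 0.5    # fraction of velocity kept after a bounce (assumed value)

def step_particles(particles, dt):
    """Advance each (height, velocity) pair by dt; bounce off the floor at y=0."""
    out = []
    for y, vy in particles:
        vy += GRAVITY * dt           # integrate acceleration into velocity
        y += vy * dt                 # integrate velocity into position
        if y < 0.0:                  # crude collision so nothing falls through the floor
            y = 0.0
            vy = -vy * RESTITUTION
        out.append((y, vy))
    return out

# On the CPU this loop runs serially; a CUDA/OpenCL kernel would run
# one iteration per GPU thread, thousands at a time.
particles = [(10.0, 0.0), (5.0, -2.0), (0.5, -8.0)]
particles = step_particles(particles, dt=1.0 / 60.0)
```

The per-particle work is trivial; the cost is doing it for tens of thousands of particles per frame, which is where a massively parallel card beats a handful of CPU cores.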
I love the whole concept of PhysX (Fucking love HW acceleration), but Nvidia decided to make CUDA closed to other HW manufacturers, so PhysX was bought and developed on CUDA - not to mention the amount of money they have poured into companies to use CUDA for stuff. I really wish OpenCL and the DX11 DirectCompute shader were used instead; they're open standards and a lot more hardware-agnostic. The only reason CUDA has good documentation is because they have PAID a lot of good developers, and they've been forced to release the documentation for it. Oh also: Havok is developing for OpenCL now - for example, their cloth physics is OpenCL-powered. I'd love to see the rigid body sim also get the HW treatment.
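For a feel of why cloth maps so well onto OpenCL-style compute: cloth is basically a grid of points joined by distance constraints, relaxed a few passes per frame. A toy 1-D version of that relaxation (plain Python, not Havok or OpenCL API; the names and numbers are illustrative):

```python
# Toy 1-D "cloth": a chain of points joined by distance constraints.
# Hypothetical sketch of the pattern only - not Havok or OpenCL code.

REST = 1.0  # rest length between neighbouring points (assumed)

def relax(points, iterations=10):
    """Nudge each neighbouring pair toward the rest length, splitting the
    correction evenly, and keep the first point pinned (like cloth tacked
    to a bar). Each constraint touches only two points, which is the kind
    of data-parallel work OpenCL/DirectCompute kernels are built for."""
    pts = list(points)
    for _ in range(iterations):
        for i in range(len(pts) - 1):
            d = pts[i + 1] - pts[i]
            err = (abs(d) - REST) * 0.5      # half the stretch/compression
            corr = err if d >= 0 else -err
            pts[i] += corr                    # move both endpoints
            pts[i + 1] -= corr
        pts[0] = points[0]                    # re-pin the anchored point
    return pts
```

A real cloth sim runs the same relaxation over a 2-D grid with structural, shear, and bend constraints, but the per-constraint math stays about this simple, which is why it fans out across GPU threads so well.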
[QUOTE=glitchvid;37319177]I love the whole concept of PhysX (Fucking love HW acceleration), but Nvidia decided to make CUDA closed to other HW manufacturers, so PhysX was bought and developed on CUDA, not to mention the amount of money they have poured into companies to use CUDA for stuff; I really wish OpenCL and the DX11 DirectCompute shader was used instead, its open-source and a lot more hardware agnostic. The only reason CUDA has good documentation is because they have PAID a lot of good developers, and they've been forced to release the documentation for it. Oh also: Havok is developing for OpenCL now, for example their cloth physics is OpenCL powered, I'd love to see the rigid body sim also get the HW treatment.[/QUOTE] Uh, what the fuck? "they've been forced to release the documentation for it"? Are you serious? I think what you're trying to say is Havok is adding support for OpenCL to offload computations to the graphics card, but either way what you're saying is ridiculous. A lot of really huge and successful libraries have awful documentation - if you had any programming experience you'd understand this. CUDA may be "closed", but it's only "closed" in the sense that the source code isn't available - NVIDIA aren't the only ones who use it. A lot of really cool stuff has been made with CUDA... incoming video spam. [media]http://www.youtube.com/watch?v=dg8DgKjJQD8[/media] [media]http://www.youtube.com/watch?v=7cjorOe810o[/media] [media]http://www.youtube.com/watch?v=EcM399E9MZs[/media] [media]http://www.youtube.com/watch?v=xw-J6gAMijs[/media]
[QUOTE=Elspin;37318983]Good to see facepunch keeping up it's tactic of not reading past the first 3 posts, PhysX literally cannot be a gimmick unless you believe that physics in games is a gimmick - what you are referring to as a "gimmick" is PhysX being able to support additional effects if you have a card that it can offload computations to. The reason it doesn't work on ATI cards is because they used their parallel computing architecture "CUDA" to do it. I'd like to add to this, that CUDA's been used to create a sort of home super computer called a Tesla box, which you can create by chaining a bunch of really powerful graphics cards - it's been really useful for engineers and scientist because it's within a reasonable price range and very powerful. A ghetto edition of this was done with PS3s before Sony went berserk on everybody.[/QUOTE] I'm sorry, but this PhysX stuff doesn't add anything to the game, and yet it's touted as a feature for people who have Nvidia cards. Just because I buy an AMD card suddenly means I don't get pretty particles and water effects? I get that it's a technological restriction, but think of the principle. Market variety is a good thing, but PC components are meant to do the same thing as other components in the same category. Intel processors don't give you "Supreme Intel physics calculations." I like options in the market, but fragmentation is silly. Choices between graphics cards need to be like the Android market, where you pay for something that does the same thing at a cheaper price, not the iOS vs Android market, where you're forced to choose between two different things.
[QUOTE=Protocol7;37321187]I'm sorry but this PhysX doesn't add anything to the game, and yet it's touted as a feature for people who have Nvidia cards. Just because I buy an AMD card suddenly means I don't get pretty particles and water effects? I get that it's a technological restriction, but think of the principle. Market variety is a good thing, but PC components are meant to do the same thing as components in the same category. Intel processors don't give you "Supreme intel physics calculations." I like options in the market, but fragmentation is silly. Choices between graphics cards needs to be like the Android market in that you pay for something that does the same thing at a cheaper price, not the iOS vs Android market where you're forced between two different things.[/QUOTE] imo it was better when it was a separate card because it didn't restrict my choice of GFX cards. I could have an ATI card or an Nvidia card and still get pretty physics shit.
[QUOTE=chunkymonkey;37321302]imo it was better when it was a separate card[/QUOTE] are you crazy? you think having to pay 200 dollars just to get some arbitrary particle effects was better?
[QUOTE=Elspin;37319326]Uh, what the fuck? "they've been forced to release the documentation for it"? Are you serious? I think what you're trying to say is Havok is adding support for OpenCL to offload computations to the graphics card, but either way what you're saying is ridiculous. A lot of really huge and successful libraries have awful documentation - if you had any programming experience you'd understand this. CUDA may be "closed", but it's only "closed" in the sense that the source code isn't available - NVIDIA aren't the only ones who use it. A lot of really cool stuff has been made with CUDA... incoming video spam. [/QUOTE] I know many wonderful tools and open-source projects lack documentation, a la OpenCL. What I'm saying is: Nvidia has either paid or heavily influenced developers to use CUDA; since most big developers document their stuff fairly well, CUDA has been WELL documented, while OpenCL lacks that. A good example of developers who like to stay agnostic about which APIs they use: DICE. They don't touch much stuff unless it's available on almost all hardware. Also: I'm all about HW acceleration, I just think CUDA is going about it wrong, whilst OpenCL is doing the right thing in being compatible with as much as possible.
[QUOTE=Protocol7;37321187]I'm sorry but this PhysX doesn't add anything to the game, and yet it's touted as a feature for people who have Nvidia cards. Just because I buy an AMD card suddenly means I don't get pretty particles and water effects? I get that it's a technological restriction, but think of the principle. Market variety is a good thing, but PC components are meant to do the same thing as components in the same category. Intel processors don't give you "Supreme intel physics calculations." I like options in the market, but fragmentation is silly. Choices between graphics cards needs to be like the Android market in that you pay for something that does the same thing at a cheaper price, not the iOS vs Android market where you're forced between two different things.[/QUOTE] Did you miss the last 3 posts? I mean, your post was literally debunked in the one you quoted; that's like a new record for not reading. Unless you're actually trying to insinuate that physics adds nothing to a game, your post is just a massive sign saying "did not read lol". PhysX is the entire physics library, not just the optional stuff that mostly sticks to tech demos. [editline]19th August 2012[/editline] [QUOTE=thisispain;37321409]are you crazy? you think having to pay 200 dollars just to get some arbitrary particle effects was better?[/QUOTE] Why just particle effects? There's a lot of really cool things the library could do better if it had a dedicated card to work with. The only reason the particle effects even use the library in the first place is so they don't go through walls. That's a developer's decision to have fancy colliding particles; the actual library is fantastic and runs great, NVIDIA card or not. I don't see why they should stop using one of the best physics libraries I've ever used because it has an optional feature to offload some of the physics calculations to a specific set of graphics cards.
[QUOTE=Elspin;37321448] [B]Why just particle effects? [/B]There's a lot of really cool things the library could do better if it had a dedicated card to work with. The only reason the particle effects even use the library in the first place is so they don't go through walls. That's a developers decision to have fancy colliding particles, the actual library is fantastic and runs great NVIDIA card or not. I don't see why they should stop using one of the best physics libraries I've ever used because it has an optional feature to offload some of the physics calculations to a specific set of graphics cards[/QUOTE] Because that's all companies seem to use the fucking thing for.
[QUOTE=chunkymonkey;37321944]Because that's all companies seem to use the fucking thing for.[/QUOTE] It doesn't have to be though; that'd be like disregarding all of gaming because tonnes of shitty games get produced. Don't get me wrong here, I'm not saying it's a -great- idea, but for the hardcore enthusiast there could be some really neat things done with physics if there were a dedicated card.
[QUOTE=thisispain;37321409]are you crazy? you think having to pay 200 dollars just to get some arbitrary particle effects was better?[/QUOTE] No, but just in terms of not having it affiliated with a graphics card maker. It means you don't have to pick sides. The way it is now, if I want fancy sparks and water that looks like jelly, I need to go with Nvidia, but I don't like Nvidia cards. With it being its own separate thing I can go with whoever I want and still reap the benefits*. I still think it's a silly gimmick (for now). *your definition of "benefits" may vary
[QUOTE=Elspin;37321976]It doesn't have to be though, that'd be like disregarding all of gaming because tonnes of shitty games get produced. Don't get me wrong here, I'm not saying it's a -great- idea, but for the hardcore enthusiast there could be some really neat things done with physics if there was a dedicated card[/QUOTE] But it won't ever be used for anything big, separate card or not. The problem with separate parts like these is that they never get fully used by the gaming industry. It's the same with the EyeToy for the PS2, Kinect for the 360, Move for the PS3, and the Circle Pad thing for the 3DS. As long as it's not something most of the consumer base has, it will never go anywhere, and no one will attempt anything groundbreaking with it, as it would be too much of a gamble. This is why PhysX itself is not that great, regardless of how great it is to work with. Most developers don't want to throw half their player base out the window for things that can really be done on everything else.
[QUOTE=Brt5470;37309676]I don't really like most of the PhysX stuff because it's like "Oo0o0o0 Physics and cloth" But the cloth especially doesn't fit AT ALL into what the game looks like. It's so smooth and realistic it feels wierd.[/QUOTE] I agree. It looks nice, sure, but it is style-breaking and looks out of place. If only it wasn't so blatant.
[QUOTE=WearingNothing;37322124]But it won't ever be used for anything big, separate card or not. The problem with separate parts like these is they never will get fully used by the gaming industry. Its the same with the EyeToy for the PS2, Kinect for the 360, Move for the PS3, Circle Pad thing for 3DS. As long as its not something most of the consumer base has, it will never go anywhere and no one will attempt anything groundbreaking with it as it would be too much a gamble. This is why PhysX itself is not that great, regardless of how great it is to work with. Most developers don't want to throw half their player base out the window for these things that can really be done with everything else.[/QUOTE] PhysX is a great library by any standard; even if you strip it down to just the components that other physics engines also have, it runs great, so that's not an issue. Whether or not you like the idea of hardware acceleration for physics is another thing altogether. As for the game industry failing hard at using unique hardware... I kinda have to concede on that one.
I don't understand what the big deal is. You guys should be mad at AMD for not implementing their own physics system or trying to push an open standard, not mad at NVidia for adding more features to their cards and trying to capture a different subset of the market. It's not like a fully GPU-driven physics model that doesn't infringe on any licenses is easy to create, so why would you expect someone from the Borderlands 2 dev team to spend years of resources developing a solution that's compliant with both card manufacturers when dropping in PhysX support is so fucking easy and gives the game a little extra flair on NVidia cards?
[QUOTE=Downsider;37322492]I don't understand what the big deal is.You guys should be mad at AMD for not implementing their own physics system or trying to push an open-source standard, not mad at NVidia for adding more features to their cards and trying to capture a different subset of the market. [B]It's not like PhysX is easy to implement,[/B] so why would you expect someone from the Borderlands 2 dev team to spend years or resources developing a solution that's compliant for both card manufacturers when [B]dropping in PhysX support is so fucking easy[/B]?[/QUOTE] Umm WHAT?