• Borderlands 2 PC video shows off fancy PhysX effects
    197 replies
[QUOTE=chunkymonkey;37322521]Umm WHAT?[/QUOTE] I meant to say create, as in write from scratch, not implement.
[QUOTE=chunkymonkey;37322521]Umm WHAT?[/QUOTE] I think what he's trying to say is that it's not easy to add support for an entirely different GPU, whereas it's easy to just use an all-around better physics library that also happens to support GPU acceleration for one kind of GPU. Even though I have an ATI GPU in my computer, I still prefer to use PhysX for everything. Think of it this way: say Havok and PhysX are compared side by side, and the results come back something like, for the same simulation (same scenario, same objects, same forces applied, etc.), PhysX uses 15% of the CPU and Havok uses 17% of the CPU doing the exact same task. Would you rather have them use Havok, and have a worse library for everybody, or have them use PhysX and give both a performance boost while NVIDIA cards get an even greater one? I'm not necessarily saying that PhysX performs better in every situation, but at least for me, testing both out, PhysX seemed to run better on my laptop (with an ATI card)
[QUOTE=Elspin;37322713]I think what he's trying to say is it's not easy to add support for an entirely different GPU, whereas it's easy to just use an all around better physics library that also just happens to support GPU acceleration for one kind of GPUs. Even though I have an ATI GPU in my computer I still prefer to use PhysX for everything. Think of it this way, say Havok and PhysX are compared side by side - and say the results come back something like for the same simulation (same scenario, same objects, same forces applied etc) PhysX uses 15% of the CPU and Havok uses 17% of the CPU doing the exact same task. Would you rather have them use Havok - and have a worse library for everybody, or have them use PhysX and give both a performance boost while NVIDIA cards get an even greater one? I'm not necessarily saying that PhysX performs better in every situation, but at least for me testing both out PhysX seemed to run better on my laptop (with an ATI card)[/QUOTE] Look, we're not (or at least I'm not) arguing that PhysX ain't good at doing physics shit. We're arguing that it being proprietary is killing it, and has been for 6 years, relegating it to essentially gimmick status. Devs don't use it for anything really cool because it means alienating customers who don't have it (which, if Steam's hardware survey is anything to go by, is around half of their potential player base). Its performance really doesn't mean jack squat if devs don't use it. When 95% of games released use it instead of the barely 1% that do now, then I'll change my tune, but as it stands that doesn't look to be changing anytime soon.
[QUOTE=chunkymonkey;37322882]Look, we're(or at least I'm not) not arguing that PhysX ain't good at doing physics shit. We're arguing that it being proprietary is killing it and has been for 6 years thus relegating it to essentially gimmick status. Devs don't use it for anything really cool because it means alienating customers who don't have it(which if Steams hardware survey is anything to go by that's around half of their potential player base). It's performance really doesn't mean jack squat if devs don't use it. When 95% of games released use it instead of the barely 1% that do now then I'll change my tune but as it stands that doesn't look to be changing anytime soon.[/QUOTE] You're still missing the point here - if PhysX can do just as good a job as another library, has fantastic documentation, and optionally boosts NVIDIA users' performance, why not use it? EDIT: also, are you actually serious thinking only 1% of games use PhysX? It's the Unreal Engine's physics library [URL]http://en.wikipedia.org/wiki/PhysX#PhysX_in_video_games[/URL]
[QUOTE=Elspin;37322934]You're still missing the point here - if PhysX can do just as good a job as another library, has fantastic documentation, and optionally boosts NVIDIA users performance, why not use it? EDIT: also are you actually serious thinking only 1% of games use PhysX? It's the Unreal Engine's physics library [URL]http://en.wikipedia.org/wiki/PhysX#PhysX_in_video_games[/URL][/QUOTE] And you're disregarding mine. The point you're making is not the one we're making. Obviously lots of devs see no reason to use it, otherwise it'd be in nearly everything. WOW, a whole 38 games!
[QUOTE=chunkymonkey;37323004]Are you're disregarding mine. The point you're making is not the one we're making. Obviously lots of devs see no reason to use it otherwise it'd be in nearly everything. WOW, a whole 38 games![/QUOTE] Those are just the ones in the Wikipedia article - it's used in every Unreal Engine game, every Gamebryo game (see: The Elder Scrolls series, etc.), every Unity game, every Torque game, and the list goes on. Not to mention you're neglecting the fact that not all games have physics engines; some games' physics are simple enough that they don't need a dedicated library to manage them. Honestly, PhysX is one of the most major libraries, what are you doing
[QUOTE=Elspin;37323135]Those are just the ones on the wikipedia article - it's used in every unreal engine game, every gamebyro game (see - the elderscrolls series etc), every unity game, every torque game and the list goes on. Not to mention you're neglecting the fact that not all games have physics engines, some games physics are simple enough that they don't need a dedicated library to manage it. Honestly PhysX is one of the most major libraries, what are you doing[/QUOTE] If I pop in ME3 it ain't gonna look any different with either Nvidia or ATI and it uses the unreal engine, same with Skyrim. They may have the libraries but they don't have PhysX effects and shit. Like if I play Skyrim with an Nvidia card I'm not gonna suddenly have better blood splatter effects or "realistic" water.
[QUOTE=chunkymonkey;37323158]If I pop in ME3 it ain't gonna look any different with either Nvidia or ATI and it use the unreal engine, same with Skyrim. They may have the libraries but they don't have PhysX effects and shit.[/QUOTE] PhysX is not an optional feature for effects, it controls every physical interaction in a game - if you want to try Skyrim without PhysX the best way to do that would be to call up Bethesda headquarters and ask for an engine re-write :v:
Skyrim is Havok you cannoli.
[QUOTE=Elspin;37323196][B]PhysX is not an optional feature for effects,[/B] it controls every physical interaction in a game - if you want to try Skyrim without PhysX the best way to do that would be to call up Bethesda headquarters and ask for an engine re-write :v:[/QUOTE] Could have fooled me because guess what option I have in Cryostasis? Yep, in the options menu I can turn the PhysX shit off. Heck a number of games on that Wikipedia list had the option to turn it off and so will Borderlands 2 I bet.
[QUOTE=thisispain;37323230]Skyrim is Havok you cannoli.[/QUOTE] Weird, they must have hacked apart Gamebryo more than I thought for Skyrim. [QUOTE=chunkymonkey;37323238]Could have fooled me because guess what option I have in Cryostasis? Yep, in the options menu I can turn the PhysX shit off. Heck a number of games on that Wikipedia list had the option to turn it off and so will Borderlands 2 I bet.[/QUOTE] The option isn't to turn off PhysX, unless it literally means you will fall through the floor and objects will do nothing when collided against. I'm guessing what you mean is there's an option to turn off hardware acceleration.
[QUOTE=Elspin;37323254]Weird, they must have hacked apart gamebyro more than I thought for Skyrim. The option isn't to turn off PhysX, unless it literally means you will fall through the floor and objects will do nothing when collided against. I'm guessing what you mean is there's an option to turn off hardware acceleration.[/QUOTE] Dude, even Oblivion used Havok. (Cryostasis) [URL=http://filesmelt.com/][IMG]http://filesmelt.com/dl/cryostasis4.jpg[/IMG][/URL]
[QUOTE=chunkymonkey;37323269][URL=http://filesmelt.com/][IMG]http://filesmelt.com/dl/cryostasis4.jpg[/IMG][/URL][/QUOTE] Lol, exactly as I expected - it doesn't say you can turn PhysX off, it says you can turn off advanced effects if they're lagging you. Vindictus IIRC lets you turn off things like cloth from your cape and armour breaking, but the entire engine still uses PhysX to check for collisions, simulate objects being knocked about, etc.
What Elspin is trying to get across is that PhysX is, at its core, just another physics engine that can be used, much like Havok. However, many people associate PhysX with the flashy accelerated particle effects such as in Borderlands 2. It is these that are limited to those with Nvidia hardware. The basic rigid body engine is generally run on the CPU and works with any GPU (or, indeed, without any GPU at all). The list in that wikipedia article is of games supporting those extra PhysX effects, not of all games using the PhysX engine, which would be much too long to list. The text at the top of that section puts it well: [QUOTE=Wikipedia Article] PhysX technology is used by the game engines Unreal Engine 3, Unity 3D, Gamebryo, Vision, Instinct, Diesel, Torque, Hero and BigWorld. As one of the handful of major physics engines, it is used in many games, such as Bulletstorm, Need for Speed: Shift, Castlevania: Lords of Shadow, Mafia II, Alice: Madness Returns, Batman: Arkham City. Most of these games use the CPU to process the physics simulations. Video games with optional support for hardware accelerated PhysX, often with additional effects such as tearable cloth, dynamic smoke or simulated particle debris, include: <list of games> [/QUOTE] Barring a few exceptions, most every game made with those engines uses the PhysX engine to run physics calculations (and as we all know UE3 is very widely used). So, the PhysX engine itself has achieved a very large market penetration, while the GPU-accelerated effects are limited to Nvidia cards and select titles. Also, just to consolidate this into one post, in response to post 107's Carmack video, it seems that his answer was aimed at the Ageia PPU, which was additional hardware that most would never purchase, while Nvidia's CUDA implementation of PhysX brings a far larger fraction of users to the table.
His arguments are still somewhat relevant, as even though users no longer have to buy additional hardware, the usage of GPU-accelerated PhysX effects will still remain limited due to the exclusion of AMD users.
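The CPU-vs-GPU split BMCHa describes can be sketched roughly as follows. This is a purely illustrative Python sketch, not the actual PhysX API; all names here (`step_rigid_body`, `frame`, `gpu_physics_available`) are hypothetical, and it only shows the pattern: the core rigid body simulation runs on the CPU on any hardware, while the cosmetic accelerated effects are an optional extra.

```python
# Illustrative sketch (NOT the real PhysX API): core rigid-body physics is
# plain CPU math that runs on any machine; only the optional cosmetic
# effects depend on hardware acceleration being present.

GRAVITY = -9.81  # m/s^2

def step_rigid_body(pos, vel, dt):
    """Core simulation step: semi-implicit Euler on the CPU."""
    vel = vel + GRAVITY * dt
    pos = pos + vel * dt
    if pos < 0.0:                 # crude ground collision with damping
        pos, vel = 0.0, -vel * 0.5
    return pos, vel

def frame(state, dt, gpu_physics_available):
    """One game frame: the core step always runs; eye candy is gated."""
    pos, vel = step_rigid_body(*state, dt)
    effects = []
    if gpu_physics_available:
        # Extra effects (debris, cloth, smoke): this is what an in-game
        # "PhysX" toggle actually switches off, not the engine itself.
        effects.append("particle_debris")
    return (pos, vel), effects

state = (10.0, 0.0)  # a ball 10 m up, at rest
state, fx_on = frame(state, 1 / 60, gpu_physics_available=True)
state, fx_off = frame(state, 1 / 60, gpu_physics_available=False)
print(fx_on, fx_off)
```

Either way the ball falls; only the list of extra effects changes, which is why "turning PhysX off" in a menu never makes you fall through the floor.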
[QUOTE=BMCHa;37323516]What Elspin is trying to get across is that PhysX is, at its core, just another physics engine that can be used, much like Havok. However, many people associate PhysX with the flashy accelerated particle effects such as in Borderlands 2. It is these that are limited to those with Nvidia hardware. The basic rigid body engine is generally run on the CPU and works with any GPU (or, indeed, without any GPU at all). The list in that wikipedia article is of games supporting those extra PhysX effects, not of all games using the PhysX engine, which would be much too long to list. The text at the top of that section puts it well: Barring a few exceptions, most every game made with those engines uses the PhysX engine to run physics calculations (and as we all know UE3 is very widely used). So, the PhysX engine itself has achieved a very large market penetration, while the GPU-accelerated effects are limited to Nvidia cards and select titles. Also, just to consolidate this into one post, in response to post 107's Carmack video, it seems that his answer was aimed at the Ageia PPU, which was additional hardware that most would never purchase, while Nvidia's CUDA implementation of PhysX brings a far larger fraction of users to the table. His arguments are still somewhat relevant, as even though users no longer have to buy additional hardware, the usage of GPU-accelerated PhysX effects will still remain limited due to the exclusion of AMD users.[/QUOTE] For Havok, isn't it optimized for Intel CPUs?
[QUOTE=digigamer17;37323525]For Havok, isn't it optimized for Intel CPUs?[/QUOTE] Apparently they started developing a competing technology called Havok FX that worked with both NVIDIA and ATI cards, but when Intel acquired Havok they cancelled it.
[QUOTE=Elspin;37323542]Apparently they started developed a competing technology called Havok FX that worked with both NVIDIA and ATI cards, but when Intel acquired Havok they cancelled it.[/QUOTE] Ah, right. But do the Havok devs still say that it runs best on Intel, or is that a gimmick?
[QUOTE=digigamer17;37323562]Ah, right. But the havok devs still say that it runs best on Intel or is that a gimmick?[/QUOTE] I imagine what they mean by that is that in cases where a physics algorithm could be designed around function A or function B, they chose the one that typically runs faster on an Intel processor. How much of an advantage that actually gives them I can't honestly say; it'd be interesting to see some benchmarks (although I don't think I've had a processor that isn't Intel anyway)
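The "function A or function B per CPU" idea Elspin speculates about is a standard pattern: detect a CPU property once at startup and install the implementation expected to be fastest there. A minimal hypothetical sketch in Python (a real engine would branch on CPUID feature flags, not a vendor string, and the "tuned" path here is just a stand-in that returns the same results):

```python
# Hypothetical sketch of startup-time CPU dispatch: pick between two
# interchangeable implementations once, then call through one name.
import platform

def dot_scalar(a, b):
    """Baseline implementation: correct everywhere."""
    return sum(x * y for x, y in zip(a, b))

def dot_tuned(a, b):
    """Stand-in for a vendor-tuned path (e.g. SIMD-friendly code).
    Same results as the baseline, potentially faster on the target CPU."""
    total = 0.0
    for x, y in zip(a, b):
        total += x * y
    return total

def select_dot():
    # Decision made once at startup; the vendor-string check is a crude
    # placeholder for real feature detection.
    vendor = platform.processor().lower()
    return dot_tuned if "intel" in vendor else dot_scalar

dot = select_dot()
print(dot([1.0, 2.0, 3.0], [4.0, 5.0, 6.0]))  # 32.0 with either path
```

Since both paths must produce identical results, the only thing at stake is speed, which is why "optimized for Intel" claims are hard to evaluate without benchmarks.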
[QUOTE=Elspin;37323542]Apparently they started developed a competing technology called Havok FX that worked with both NVIDIA and ATI cards, but when Intel acquired Havok they cancelled it.[/QUOTE] Intel is interested in keeping GPU/hardware acceleration at bay, or at least making it proprietary as shit.
[QUOTE=Elspin;37321976]It doesn't have to be though, that'd be like disregarding all of gaming because tonnes of shitty games get produced. Don't get me wrong here, I'm not saying it's a -great- idea, but for the hardcore enthusiast there could be some really neat things done with physics if there was a dedicated card[/QUOTE] And so why doesn't Nvidia just make PhysX accelerator cards akin to the Ageia days? They get money, their software stays proprietary, and people still get PhysX when they want it, regardless of whether they own an nVidia card or not. It's still just a gimmick. Like, whoop de do, I can shoot holes in fabric and when I shoot slag it piles up! GAME CHANGING. That stuff doesn't need to be locked to GeForce cards.
[QUOTE=Protocol7;37326095]And so why doesn't Nvidia just make PhysX accelerator cards akin to the Ageia days? They get money, their software stays proprietary, and people still get PhysX when they want it, regardless of if they own an nVidia card or not It's still just a gimmick. Like, whoop de do, I can shoot holes in fabric and when I shoot slag it piles up! GAME CHANGING. That stuff doesn't need to be locked to GeForce cards.[/QUOTE] Would you as a company have your staff spend their time working to get your own physics engine, working on another companies hardware? Ati cards where originally sopported by nvidia until one ati employee had a big rant and pissed them off.
[QUOTE=alien_guy;37327169]Would you as a company have your staff spend their time working to get your own physics engine, working on another companies hardware? Ati cards where originally sopported by nvidia until one ati employee had a big rant and pissed them off.[/QUOTE] Not to deflect or anything, but is English your first language? I'm not entirely sure what point you're making. PhysX *can* be a game-changer, but having it supported on *only* nVidia cards is arbitrary when we can go back to PhysX accelerators. They shouldn't be expensive (~$100 sounds fair) and should work with any system. Of course, nVidia has well established their CUDA system, so it's not going to happen, but still. I mean, I like PhysX, don't get me wrong, but until it can persist across all systems at a reasonable price point I'm just going to call it a gimmick.
[QUOTE=Protocol7;37328351]Not to deflect or anything, but is English your first language? I'm not entirely sure what point you're making. PhysX *can* be a game-changer, but having it supported on *only* nVidia cards is arbitrary when we can go back to PhysX accelerators. They shouldn't be expensive (~$100 sounds fair) and should be able to be used with any systems. Of course nVidia has well established their CUDA system so it's not going to happen, but still. I mean, I like PhysX, don't get me wrong, but until it can persist across all systems at a reasonable price point I'm just going to call it a gimmick.[/QUOTE] I don't see why AMD couldn't license the tech, in which case it's their fault, but Nvidia can't make PhysX all open source 'cause it would be the most retarded business decision ever.
I'm not saying open source PhysX, but going back to the accelerator model lets them still control everything they need to while retaining market domination in that segment
[QUOTE=alien_guy;37327169] Ati cards where originally sopported by nvidia until one ati employee had a big rant and pissed them off.[/QUOTE] I'm gonna need a source on that.
[QUOTE=Protocol7;37328351]I mean, I like PhysX, don't get me wrong, but until it can persist across all systems at a reasonable price point I'm just going to call it a gimmick.[/QUOTE] Why would you call a physics engine a gimmick... that's like calling the wheels on a car a gimmick :v: I mean, I can understand calling the optional features they add to some games when you have an nvidia card a gimmick, but unless you're guilty of not reading the thread, this is the 3rd or 4th time we've covered that referring to those extra features as "PhysX" would be like referring to the sparks metrocop batons make in Half-Life as "the Source engine".
I have yet to see a full game implement more than those extra features that wasn't CellFactor (and even then that game was terrible.) And PhysX is nowhere near as necessary as the wheels on a car. What the fuck do you think we were doing before PhysX? [editline]20th August 2012[/editline] For the record I don't mean PhysX as a whole, but PhysX is currently a gimmick because how it's currently supported by hardware means it will never see widespread use. And that leads to lazy, stupid and useless implementations like we're seeing in BL2 (and they have the nerve to call it a feature.)
[QUOTE=Protocol7;37334657]I have yet to see a full game implement more than those extra features that wasn't CellFactor (and even then that game was terrible.) And PhysX is nowhere near as necessary as the wheels on a car. What the fuck do you think we were doing before PhysX? [editline]20th August 2012[/editline] For the record I don't mean PhysX as a whole, but PhysX is currently a gimmick because how it's currently supported by hardware means it will never see widespread use. And that leads to lazy, stupid and useless implementations like we're seeing in BL2 (and they have the nerve to call it a feature.)[/QUOTE] It's honestly like you've never even read a thread before in your life... PhysX [b]is already in widespread use[/b]. It's one of the top physics engines, probably second only to havok. Only some of the games that use PhysX support hardware acceleration but either way, you're straight up wrong. As for it not being as necessary as the wheels of a car, it's a fucking physics engine - physics literally being the binding laws of the universe, at least in the games it's used in it [b]is as necessary as the wheels of a car[/b]. What did we do before PhysX? We used other physics engines or had physics simple enough to not need a dedicated library, duh?
I'M NOT TALKING ABOUT THE ENTIRE FUCKING PHYSICS ENGINE FOR THE THIRD FUCKING TIME. For someone who bitches about people not reading threads you seem to have plenty of your own comprehension problems.
[QUOTE=Protocol7;37335872]I'M NOT TALKING ABOUT THE ENTIRE FUCKING PHYSICS ENGINE FOR THE THIRD FUCKING TIME. For someone who bitches about people not reading threads you seem to have plenty of your own comprehension problems.[/QUOTE] The massive irony here is I've already explained to you that referring to just a single part of PhysX as "PhysX" is like referring to metrocop baton sparks as the source engine. Not like you read any of the thread though, even though it's [b]on this page[/b].