[quote]"DICE has a partnership with AMD specifically for Battlefield 4 on PC to showcase and optimize the game for AMD hardware."[/quote]
That is such an unbelievable amount of bullshit I almost vomited. Why tell a different story than AMD, when AMD told us exactly why you did it?
[quote]"It makes sense that game developers would focus on AMD hardware with AMD hardware being the backbone of the next console generation."[/quote]
AMD tells us the truth; EA/DICE want to make sure their community (which on PC is mostly on nVidia cards) still buys the game, so they're doing a little damage control.
Don't buy Battlefield 4.
This sucks like any other GPU partnership, but it probably won't affect me; I'm going to be waiting for the "BF4 Premium Edition" anyway, so any driver kinks should be ironed out on the Nvidia side of things by then. Not being an idiot and spending nearly £60 on one game again.
[QUOTE=BANNED USER;41115680]
Don't buy Battlefield 4.[/QUOTE]
Don't be so dramatic.
Hasn't the same thing basically been going on with Nvidia for the last 5+ years, though?
[QUOTE=Cushie;41116088]Hasn't the same thing basically been going on with Nvidia for the last 5+ years, though?[/QUOTE]
Yes, in the sense that Nvidia does sponsor devs, but in practice it simply works with them to help optimize their games in general, and provides them with Nvidia hardware to test on.
DICE can go fuck themselves. They have no reason to be signing this deal. A game should be ready for release on all hardware, and I shouldn't be punished for buying the hardware I want. It's not going to make me swap out my Nvidia GTX 780 for some Radeon crap.
I'll just wait for the game to come out and see how things are, best thing to do.
Holy fuck, cry harder; there's not going to be much difference between AMD and nVidia hardware either way.
Holy shit, who cares. Sure, maybe the reasons are bullshit, but it's not like it will run badly on nvidia cards; high end cards will still run it maxed out fine, nvidia or amd.
Companies have been doing this for years.
Valve - HL2/EP1/EP2 - "Runs best on ATI"
Crytek - Crysis - "Runs best with nvidia"
GSC - STALKER - "Runs best on AMD"
etc.
You don't have to act like this is a big deal, it really doesn't change much.
[QUOTE=Bloodshot12;41116265]Holy shit, who cares. Sure, maybe the reasons are bullshit, but it's not like it will run badly on nvidia cards; high end cards will still run it maxed out fine, nvidia or amd.
Companies have been doing this for years.
Valve - HL2/EP1/EP2 - "Runs best on ATI"
Crytek - Crysis - "Runs best with nvidia"
GSC - STALKER - "Runs best on AMD"
etc.
You don't have to act like this is a big deal, it really doesn't change much.[/QUOTE]
Valve at the time had an extensive relationship with ATI. Lots of promotions, lots of advertisements; their website even had Half-Life 2 screenshots on it. Buy a Radeon, get the Lost Coast tech demo and CSS. The game was optimized nicely at launch, and it definitely was no Frostbite engine.
Crytek I don't know the specifics of, but that promotion was unnecessary; they had all the money in the world.
GSC definitely needed that money. Small-time studio making an AAA game that had been delayed a good dozen times and overhauled twice.
I think Nvidia is worse with this. God damn Physx
[QUOTE=Cushie;41116088]Hasn't the same thing basically been going on with Nvidia for the last 5+ years, though?[/QUOTE]
10 years actually
Why are people throwing such a fit about this? Far Cry 3 was "AMD optimized". These partnerships have been around for years so they can sell gfx card / game bundles, and I haven't heard anyone complain about that.
It's a marketing gimmick.
[QUOTE=Clavus;41118132]Why are people throwing such a fit about this? Far Cry 3 was "AMD optimized". These partnerships have been around for years so they can sell gfx card / game bundles, and I haven't heard anyone complain about that.
It's a marketing gimmick.[/QUOTE]
The problem is it does not allow [b]OPTIMIZATION[/b] for Nvidia hardware so for all we know, it could come out and we'll be getting 10FPS because EA/DICE wanted to make a quick buck.
[QUOTE=Alex_DeLarge;41118276]The problem is it does not allow [b]OPTIMIZATION[/b] for Nvidia hardware so for all we know, it could come out and we'll be getting 10FPS because EA/DICE wanted to make a quick buck.[/QUOTE]
Stop being such a reactionary baby. You too Banned User.
No company worth their salt would totally block out a section of users (a pretty fucking big one too) for a tiny increase in funding and help with testing. The partnership with AMD will hopefully mean the game works well on AMD cards at launch, AMD will be able to make very specific driver-level optimisations before nVidia, and DICE can find out more easily what AMD cards can put up with. This does not mean nVidia are being ignored, or that the game won't run at all. It will run perfectly fine; the way modern graphics programming works, it's pretty fucking abstracted from driver-level shit.
DICE cannot make the game run worse on nVidia cards without intentionally tampering with the graphics API they are using to force you onto AMD hardware. Firstly, that's not profitable in the fucking slightest, and secondly it's awful for PR. No company would do it without the most extreme of reasons, end of discussion. Go home, do your homework and stop being fucking idiots about this game. Fuck.
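To put it in concrete terms, here's a rough sketch (not anything from DICE, just assuming GLFW and an OpenGL driver are installed) of how little a game actually sees of the vendor: it asks the abstract API for a context, and the most it ever learns is a couple of strings it can log. Everything vendor-specific happens below that line, inside the driver.
[code]
// Rough sketch, not real game code: the application only ever talks to the abstract API.
// Assumes GLFW is installed (it pulls in the system OpenGL headers by default).
#include <GLFW/glfw3.h>
#include <cstdio>

int main() {
    if (!glfwInit()) return 1;

    // There is no "give me an AMD context" or "give me an nVidia context" knob here.
    GLFWwindow* window = glfwCreateWindow(640, 480, "vendor test", nullptr, nullptr);
    if (!window) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(window);

    // The most the game ever learns about the hardware is a couple of strings to log.
    std::printf("Vendor:   %s\n", (const char*)glGetString(GL_VENDOR));
    std::printf("Renderer: %s\n", (const char*)glGetString(GL_RENDERER));

    // Every actual draw call goes through the same abstract functions on either vendor;
    // the driver underneath does the hardware-specific work.
    glfwDestroyWindow(window);
    glfwTerminate();
    return 0;
}
[/code]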
[QUOTE=hexpunK;41118403]Stop being such a reactionary baby. You too Banned User.
No company worth their salt would totally block out a section of users (a pretty fucking big one too) for a tiny increase in funding and help with testing. The partnership with AMD will hopefully mean the game works well on AMD cards at launch, AMD will be able to make very specific driver-level optimisations before nVidia, and DICE can find out more easily what AMD cards can put up with. This does not mean nVidia are being ignored, or that the game won't run at all. It will run perfectly fine; the way modern graphics programming works, it's pretty fucking abstracted from driver-level shit.
DICE cannot make the game run worse on nVidia cards without intentionally tampering with the graphics API they are using to force you onto AMD hardware. Firstly, that's not profitable in the fucking slightest, and secondly it's awful for PR. No company would do it without the most extreme of reasons, end of discussion. Go home, do your homework and stop being fucking idiots about this game. Fuck.[/QUOTE]
I develop games for a living. Don't tell me to do my research when you don't know what you're talking about yourself. The way it works is you optimize for AMD (usually what workstations use), then you optimize for Nvidia, then you optimize for Intel/AMD APUs, and then after release you find out about a bunch of weird hardware configuration problems and fix them.
If this contract is written the way it's stated above, it means they're not able to hand the game over to Nvidia prior to release to improve driver performance. That's a huge part of any major AAA title.
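To give a rough idea of what the per-vendor part looks like in practice, something like the sketch below is usually where it starts: read the adapter's vendor ID and pick tuned defaults from there. This is just an illustration on Windows/DXGI; the struct, the function names and the numbers are made up, not BF4's or anyone's actual code.
[code]
// Rough Windows/DXGI sketch of per-vendor tuning. The PCI vendor IDs are real,
// but RenderDefaults/PickDefaults and the values are hypothetical, purely for illustration.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

struct RenderDefaults { int shadowMapSize; bool useVendorFastPath; };

RenderDefaults PickDefaults() {
    RenderDefaults d = { 2048, false };                 // safe fallback for unknown GPUs
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return d;

    IDXGIAdapter1* adapter = nullptr;
    if (factory->EnumAdapters1(0, &adapter) == S_OK) {  // primary adapter
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        switch (desc.VendorId) {
            case 0x1002: d = { 4096, true  }; break;    // AMD    - the sponsor's path gets tuned first
            case 0x10DE: d = { 4096, false }; break;    // NVIDIA - tuned next, plus later driver updates
            case 0x8086: d = { 1024, false }; break;    // Intel  - IGPs/APUs get the low preset
        }
        adapter->Release();
    }
    factory->Release();
    return d;
}

int main() {
    RenderDefaults d = PickDefaults();
    std::printf("shadow map: %d, vendor fast path: %d\n", d.shadowMapSize, (int)d.useVendorFastPath);
    return 0;
}
[/code]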
The only difference is that you'll see
[media]http://www.youtube.com/watch?v=MvnTNSsIHdk[/media]
and not
[media]http://www.youtube.com/watch?v=J-6EFBlybD8[/media]
[editline]20th June 2013[/editline]
[QUOTE=Alex_DeLarge;41118276]The problem is it does not allow [b]OPTIMIZATION[/b] for Nvidia hardware so for all we know, it could come out and we'll be getting 10FPS because EA/DICE wanted to make a quick buck.[/QUOTE]
AMD: Frostbite 3 partnership won't prevent our competition from optimizing pre-release.
[QUOTE=BANNED USER;41115680]That is such an unbelievable amount of bullshit I almost vomited. Why tell a different story than AMD, when AMD told us exactly why you did it?
AMD tells us the truth; EA/DICE want to make sure their community (which on PC is mostly on nVidia cards) still buys the game, so they're doing a little damage control.
Don't buy Battlefield 4.[/QUOTE]
nvidia does this all the fucking time, chill out
you still shouldn't buy battlefield 4, but that's just because it's gonna be a shit game
[QUOTE=Alex_DeLarge;41118519]I develop games for a living. Don't tell me to do my research when you don't know what you're talking about yourself. The way it works is you optimize for AMD (usually what workstations use), then you optimize for Nvidia, then you optimize for Intel/AMD APUs, and then after release you find out about a bunch of weird hardware configuration problems and fix them.
If this contract is written the way it's stated above, it means they're not able to hand the game over to Nvidia prior to release to improve driver performance. That's a huge part of any major AAA title.[/QUOTE]
Uhhh... pretty sure you optimise for whoever is paying your bills, man. Of course you then ensure that other vendors get optimisations and can perform driver-level optimisation, but the guys who are funding you kinda take priority over whatever system you think you have.
The game is going to run just fine on nVidia GPUs with or without launch-day optimisations; graphics APIs are abstract enough to allow for that, and believing otherwise is kinda dumb. Yeah, there are going to be hiccups until nVidia can get all the problems their drivers have sorted, but that is to be expected. DICE are still going to allow nVidia to optimise to some extent, and ensure their game works, considering they would lose more than AMD are giving if the game doesn't work on nVidia GPUs.
Again, stop being a reactionary baby and think shit through logically. If this is how your thought process works, you probably shouldn't be developing games.
[QUOTE=Alex_DeLarge;41116417]Valve at the time had an extensive relationship with ATI cards. Lots of promotions, lots of advertisements, their website even had screenshots of Half Life on them. Buy Radeon get Lost Coast tech demo and CSS. The game was optimized nicely at launch and it definitely was no Frostbite engine.
Crytek I don't know the specifics of but that promotion was unnecessary. They had all the money in the world.
GSC definitely needed that money. Small time studio making a AAA game that had been delayed a good dozen times and overhauled twice.[/QUOTE]
valve had that partnership because at the time of hl2 the flagship nvidia cards (the GeForce FX series) were utter shit at running the DX9 shader model that hl2 used afaik
[QUOTE=Generic.Monk;41118858]you still shouldn't buy battlefield 4, but that's just because it's gonna be a shit game[/QUOTE]
So, you're from the future?
Mind, I don't have high expectations for BF4 either, but I think I'll try the beta (demo) before calling it a shit game.
it's bound by its nature to be boring as sin, it's a modern warfare shooter
[QUOTE=Alex_DeLarge;41118519][B]I develop games for a living[/B][/QUOTE]
Doesn't make your false information any less false
[QUOTE=Bloodshot12;41120401]Doesn't make your false information any less false[/QUOTE]
Okay well I'll take my own word for it, you know, since I've done it before, over some guy named Bloodshot12 on the Facepunch forums.
Nvidia users need to check their privilege. They get PhysX and 100 times more ~~~~~OPTIMIZED (not marketing scheme, i know because i develop games)~~~~~ games, while all AMD users get is the "powered by amd" when their game starts.
[QUOTE=Alex_DeLarge;41120455]Okay well I'll take my word for it, you know, since I've done it before over some guy named Bloodshot12 on Facepunch forums.[/QUOTE]
lol
[editline]20th June 2013[/editline]
[QUOTE=milkandcooki;41120467]Nvidia users need to check their privilege. They get PhysX and 100 times more ~~~~~OPTIMIZED (not marketing scheme, i know because i develop games)~~~~~ games, while all AMD users get is the "powered by amd" when their game starts.[/QUOTE]
lol
[QUOTE=milkandcooki;41120467]Nvidia users need to check their privilege. They get PhysX and 100 times more ~~~~~OPTIMIZED (not marketing scheme, i know because i develop games)~~~~~ games, while all AMD users get is the "powered by amd" when their game starts.[/QUOTE]
Everyone gets PhysX, it's a decent library.
The hardware acceleration of it is something else though.
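For what it's worth, here's roughly what the plain CPU side looks like - a sketch based on the PhysX 3.x SDK (assuming you have the headers/libs), not any particular game's code. Nothing in it needs an NVIDIA card; the GPU-accelerated particle/cloth/destruction stuff is a separate CUDA layer bolted on top of this.
[code]
// Rough sketch of a CPU-only PhysX 3.x setup (assumes the PhysX SDK is installed).
// This path runs on any CPU, whatever GPU is in the box; hardware acceleration is a
// separate, NVIDIA/CUDA-only extra on top of it.
#include <PxPhysicsAPI.h>
using namespace physx;

int main() {
    static PxDefaultAllocator allocator;
    static PxDefaultErrorCallback errorCallback;

    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, allocator, errorCallback);
    PxPhysics* physics = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);   // two worker threads, all on the CPU
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    // Step the simulation for a second of game time - identical on AMD, Intel or NVIDIA boxes.
    for (int i = 0; i < 60; ++i) {
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);
    }

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}
[/code]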
[QUOTE=Brt5470;41121666]Everyone gets PhysX, it's a decent library.
The hardware acceleration of it is something else though.[/QUOTE]
i thought he was joking
[QUOTE=Brt5470;41121666]Everyone gets PhysX, it's a decent library.
The hardware acceleration of it is something else though.[/QUOTE]
Pretty sure physx is the universally accepted shorthand for 'nvidia's gimmicky particle effects'
[QUOTE=Generic Monk;41121932]Pretty sure physx is the universally accepted shorthand for 'nvidia's gimmicky particle effects'[/QUOTE]
It's quite a lot more than that; developers just don't use the other features. Take a look at a game with PhysX used correctly:
[video=youtube;w0xRJt8rcmY]http://www.youtube.com/watch?v=w0xRJt8rcmY[/video]
It still shouldn't be exclusive to Nvidia cards though.