• EA's Johan Andersson wants Frostbite games to require Windows 10 and DirectX 12 as a minimum by 2016
[url]http://www.pcgamesn.com/eas-johan-andersson-wants-frostbite-games-to-require-windows-10-and-directx-12-as-a-minimum-by-2016[/url]
[quote]“[It’s] likely a bit aggressive, but [has] major benefits” explained Andersson.[/quote] At least he's very upfront about it. The article suggests it's something he's going to push for, while the tweet sounds like something he's still contemplating. [img]http://www.7proxies.pw/a/arxq[/img] This wasn't mentioned, and probably should be.
A game developer wanting to have more powerful tools available isn't a bad thing.
[QUOTE=Agoat;47488701]A game developer wanting to have more powerful tools available isn't a bad thing.[/QUOTE]As a poor consumer with a card that does not support DX12, it is bad for me.
Well I don't plan on playing EA games any time soon, regardless of my computer capabilities anyway.
Hopefully with graphics cards becoming cheaper etc, people will upgrade their hardware when purchasing or upgrading to Windows 10. As a developer, it's basically a dream to have DirectX 12 and Windows 10 as a minimum. Imagine: no backwards compatibility.
Doesn't seem like a customer-friendly move at all, and could backfire should the Vulkan gamble pay off.
[QUOTE=itisjuly;47488772]As a poor consumer with a card that does not support DX12, it is bad for me.[/QUOTE] To be honest it seems a bit unclear if he means card or runtime support. Considering win10 will be a free upgrade for both 7 and 8, I can see that kind of reasoning. Not to mention if your card will be unable to run dx12 in some way, it might not actually be able to run stuff in 2016.
Windows 10 will be free, and getting a nice graphics card isn't too hard if you have some extra cash. If it makes for a more "next-gen" experience, I'm all for it.
[QUOTE=~Kiwi~v2;47488791]I don't see why this a bad thing. We already know the huge benefits DX12 has.[/QUOTE] I don't. And this isn't the first time I've tried to find said benefits. The key benefit they are marketing seems to be that multiple cores can send data to your GPU. This has a huge potential to [U]increase performance in badly programmed games.[/U] Your CPU shouldn't need to send a whole lot of information to your GPU, and your GPU's total processing power should be a bottleneck, if your game is optimized. There are certain exceptions for some unique games, but these games would have to be very unique. Basically, if your bottleneck was occurring in CPU->GPU transfer, you fucked up bad. There is also the reduced CPU overhead benefits. [URL="http://blogs.msdn.com/b/directx/archive/2014/03/20/directx-12.aspx"]This page here[/URL] explains that CPU time for this case was reduced from 6 ms to 3 ms. This does not mean that you will get double the number of frames. There is no implication whatsoever that it will increase your framerate. The actual result is your GPU receiving data 3 milliseconds earlier, which, with vsync on and on OpenGL4/DX11, can mean your frame would have reached your eyes up to 15 ms later than it should have. With vsync off, it would have only reached your eyes 3 milliseconds later. [U]The framerate you experience will not change.[/U] Basically, CPU performance doesn't matter. [sp]My posts on performance and graphics almost always get showered in boxes and no one steps up to the plate to explain why they think I'm wrong. If this is you, explain yourself.[/sp]
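The vsync reasoning above can be put in numbers. Here's a rough sketch of it in Python (my own toy model with made-up GPU times, not anything from the MSDN article beyond the 6 ms → 3 ms figure): with vsync on, a frame is only shown at the next refresh boundary, so shaving CPU time off submission either changes nothing or pulls the frame forward by a whole refresh interval, depending on whether the saving crosses a boundary.

```python
# Toy model of vsync presentation timing. Assumes a 60 Hz display and
# that CPU submission and GPU rendering happen back to back for a frame.
import math

REFRESH_MS = 1000 / 60  # ~16.67 ms per refresh at 60 Hz

def present_time(cpu_ms, gpu_ms, vsync=True):
    """Time from frame start until the frame is displayed, in ms."""
    done = cpu_ms + gpu_ms  # CPU submission + GPU render, serialized
    if not vsync:
        return done  # shown as soon as it's finished
    # With vsync, the frame waits for the next refresh boundary.
    return math.ceil(done / REFRESH_MS) * REFRESH_MS

# CPU time drops from 6 ms to 3 ms (the MSDN figure); assume 12 ms of
# GPU work, so the slower path just misses the 16.67 ms vsync deadline.
slow = present_time(6, 12)  # 18 ms of work -> shown at the 2nd refresh
fast = present_time(3, 12)  # 15 ms of work -> makes the 1st refresh
print(slow - fast)          # the 3 ms saving became a full refresh (~16.7 ms)
```

Without vsync the difference is just the raw 3 ms, which is the "reached your eyes 3 milliseconds later" case in the post above.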
[QUOTE=wraithcat;47488986] Not to mention if your card will be unable to run dx12 in some way, it might not actually be able to run stuff in 2016.[/QUOTE]I think this is false. I doubt they will be making games in 2016 so advanced that hardware from 2014 won't be able to handle them at least on low. It has never happened before and I doubt it will in 2016.
I'm okay with this.
[QUOTE=ThePanther;47489152]Windows 10 will be free, and getting a nice graphics card isn't too hard if you have some extra cash. If it makes for a more "next-gen" experience, I'm all for it.[/QUOTE] What if Windows 10 ends up being shit, though? Or at least shit for games? Who knows what'll happen; it could. Also, if anyone replies to me with the le witty "BAD OS GOOD OS WINDOWS CYCLE", eat a bag of dicks
Well, people who bought a GPU recently are already DX12 compatible, and those who do not own a DX12 card are most likely going to upgrade before 2016, so it's a non-issue imo
[QUOTE=maxolina;47489664]Well people who bought a GPU recently are already DX12 compatible, and those who do not own a DX12 card are most likely going to upgrade before 2016 so it's a non issue imo[/QUOTE] I take it cards that are 3+ years old won't support it? That would make a lot more sense then. You can't let the past hold the future back. At the same time, old/low-end specs aren't necessarily a bad thing to own, but they do slow graphics progress down sometimes.
From someone with minimal tech experience: what's the significance of the video the article posted comparing 11 and 12?
I really don't mind that much. PC gaming to me is always about upgrading and changing up your hardware every few years to have the best experience; otherwise it isn't worth it, imo. They could also just add support for Vulkan for players on older systems. They added Mantle, which was another render API, so it isn't that far out.
[QUOTE=J!NX;47490000]i take it cards that are 3+ years old won't support it that would make a lot more sense then you can't let the past hold the future back at the same time, old/low end specs aren't necessarily a bad thing to own, but they do slow graphics progress down sometimes.[/QUOTE] At least on the Nvidia side it's Fermi+. I haven't looked at AMD, but that's the GTX 400 series, i.e. most cards in the last 5 years. That's not unreasonable, I don't think. DX12 offers developers a lot more power, much like Vulkan does, reducing overhead considerably and improving CPU throughput, so that your application is actually GPU bound, not GPU driver bound as is often the case now. If anyone is really interested, [url]http://www.gdcvault.com/play/1022018/[/url] is the GDC Vulkan talk, free to view. It's well worth a watch. Vulkan isn't DX12, however both are very much based on Mantle and offer very similar benefits. Of course the talk does cover changes to the shader pipeline too (SPIR-V), and that doesn't really apply to DX12. But the layers concept, the ability to better use the CPU, command buffers etc. are all shared with DX12.
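The "GPU bound, not GPU driver bound" point above is worth spelling out, since it's also the answer to the "CPU performance doesn't matter" argument earlier in the thread. A toy model in Python (my own numbers, nothing from the GDC talk): with the CPU and GPU pipelined across frames, frame time is gated by whichever side is slower, so cutting driver overhead only buys you frames when you were CPU/driver bound to begin with.

```python
# Toy model: CPU game logic + driver overhead run in parallel with the
# GPU rendering the previous frame, so the steady-state frame time is
# the max of the two sides, not their sum.
def frame_ms(cpu_game, cpu_driver, gpu):
    """Steady-state frame time in ms for a pipelined CPU/GPU."""
    return max(cpu_game + cpu_driver, gpu)

# GPU-bound case: halving driver overhead changes nothing.
gpu_bound      = frame_ms(cpu_game=4, cpu_driver=6, gpu=20)  # 20 ms
gpu_bound_low  = frame_ms(cpu_game=4, cpu_driver=1, gpu=20)  # still 20 ms

# CPU/driver-bound case (think big RTS or MMO with many draw calls):
# the same reduction is a real framerate win.
cpu_bound      = frame_ms(cpu_game=8, cpu_driver=10, gpu=12)  # 18 ms, ~55 fps
cpu_bound_low  = frame_ms(cpu_game=8, cpu_driver=2,  gpu=12)  # 12 ms, ~83 fps
print(gpu_bound, gpu_bound_low, cpu_bound, cpu_bound_low)
```

Which bucket a given game falls into is exactly what the thread is arguing about; the model just shows that both outcomes are consistent with the same overhead reduction.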
I dunno if my ATI R270X will have DX12 compatibility...
[QUOTE=Swilly;47497309]I dunno if my ATI R270X will have DX12 compatibility...[/QUOTE] Yes it will.
This is how you alienate customers and make the community hate DICE even more. Not everybody can afford a Ferrari of a computer.
[QUOTE=BenJammin';47497430]This is how you alienate customers and make the community hate Dice even worse. Not everybody can afford a Ferrari of a computer.[/QUOTE] People still think high-end PCs cost $10,000? Fuck me
Most people already support DX12 and don't even know it.
This better not affect Star Wars Battlefront and Mass Effect.
[QUOTE=willtheoct;47489365]I don't. And this isn't the first time I've tried to find said benefits. The key benefit they are marketing seems to be that multiple cores can send data to your GPU. This has a huge potential to [U]increase performance in badly programmed games.[/U] Your CPU shouldn't need to send a whole lot of information to your GPU, and your GPU's total processing power should be a bottleneck, if your game is optimized. There are certain exceptions for some unique games, but these games would have to be very unique. Basically, if your bottleneck was occurring in CPU->GPU transfer, you fucked up bad. There is also the reduced CPU overhead benefits. [URL="http://blogs.msdn.com/b/directx/archive/2014/03/20/directx-12.aspx"]This page here[/URL] explains that CPU time for this case was reduced from 6 ms to 3 ms. This does not mean that you will get double the number of frames. There is no implication whatsoever that it will increase your framerate. The actual result is your GPU receiving data 3 milliseconds earlier, which, with vsync on and on OpenGL4/DX11, can mean your frame would have reached your eyes up to 15 ms later than it should have. With vsync off, it would have only reached your eyes 3 milliseconds later. [U]The framerate you experience will not change.[/U] Basically, CPU performance doesn't matter. [sp]My posts on performance and graphics almost always get showered in boxes and no one steps up to the plate to explain why they think I'm wrong. If this is you, explain yourself.[/sp][/QUOTE] Saying very unique is a bit of a stretch. Many, many large scale RTS's and massive MMO's are definitely CPU bottlenecked, regardless of good or bad coding.
2016 is a bit early methinks. I mean, the tech has barely hit the market yet.
[QUOTE=itisjuly;47488772]As a poor consumer with a card that does not support DX12, it is bad for me.[/QUOTE] The Fermi series cards are compatible with DX12. The Fermi series is the GTX 400 series, guys. If you don't plan on upgrading from that by 2016, you won't be playing many games made in 2016.
For god's sake, guys, what is so hard to understand? If you want better graphics, you need a better computer. Too bad if you can't afford one; technology will just leave you behind.
[QUOTE=Kazumi;47498041]2016 is a bit early methinks. I mean, the tech has barely hit the market yet.[/QUOTE] I think it's a fantastic move, but I think they are pushing it a little too fast. 2017 would be better.
Mm, yes, let's force PC gamers who may not be able to afford it to buy a new video card, a new copy of Windows, then go through all the faff and bother of reinstalling Windows solely to play the next Battlefield game. Let's totally not consider that maybe they can't afford to drop that kind of dosh on their rigs! Let's also give two huge middle fingers to collegiate PC gamers on 2-3 year old lappies who, due to the spatial restrictions of dorm life, can't actually own a desktop and therefore have to spend upwards of four figures on a new laptop that has a DX12 graphics chip in it! Progress is great and all, but this is just stupid. They're going to lose sales if they actually make it [i]require[/i] W10 and DX12. Recommend? Sure. Build it around that and provide support for W7/DX11/DX9 after the fact? Okay. But forcing everyone to W10/DX12 like this would be a middle finger to PC gamers who through no fault of their own don't have pocketbooks deep enough to upgrade. Hell, the only reason I have a DX12 video card is because of tax return money (and it isn't even a high-end one, just a GTX 960); had I not had some spare dosh from that I'd still be on DX11.