• DirectX 12 will know how to use your integrated GPU to boost your high-end one
    32 replies
[url]http://www.pcgamesn.com/directx-12-will-know-how-to-use-your-integrated-gpu-to-boost-your-high-end-one[/url]
Poor man's SLI? Sounds neat, especially for laptops
If that's possible, does that mean an Nvidia + AMD "SLI" setup is possible?
[QUOTE=C4rnage;47671322]if thats possible, that means a Nvidia with AMD "SLi" is possible?[/QUOTE] I assume it's the exact same thing.
Well, my integrated runs Flash like ass, so I wouldn't expect much unless you have an expensive-ass motherboard.
Given that a lot of computers come with Intel iGPUs, this could be a pretty good deal for people who don't build their own PCs or can't afford multiple cards for SLI.
It would be great if I could use an older GPU from a different generation, say my old AMD 5850 with my 7950.
[QUOTE=Saxon;47672068]It would be great if I could use an older GPU from a different generation say my old AMD 5850 with my 7950[/QUOTE] I presume it will work with all DX12-enabled cards?
Is there a list of already-released GPUs that will support it?
I'm curious to see how well it performs with older cards, like a GTX 680. I'd imagine the ~10% increase you get with the Titan X grows to something like 30-40%, albeit at a much lower framerate of course. You could get some nice improvements on older cards.
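That intuition can be sketched as a back-of-the-envelope calculation: a fixed iGPU contribution is a bigger *relative* boost for a slower card. All throughput and efficiency numbers below are invented for illustration, not real benchmarks.

```python
# Back-of-the-envelope: the same iGPU contribution is a bigger relative
# boost for a weaker discrete card. All GFLOPS figures are made up.

IGPU_GFLOPS = 400.0  # hypothetical integrated-GPU throughput

def relative_boost(dgpu_gflops, efficiency=0.5):
    """Percent speedup if the iGPU's extra work scales at some efficiency."""
    return 100.0 * (IGPU_GFLOPS * efficiency) / dgpu_gflops

titan_x = relative_boost(6100.0)   # high-end card: small relative gain
gtx_680 = relative_boost(3090.0)   # older card: same iGPU helps roughly 2x more

print(f"Titan X-class boost: ~{titan_x:.0f}%")
print(f"GTX 680-class boost: ~{gtx_680:.0f}%")
```

The absolute framerate on the older card is still lower, of course; only the percentage gain is larger.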
[QUOTE] 358 Jezcentral 4 Hours ago There are also whispers that you may be able to stack VRAM, which sounds like witchcraft to me.[/QUOTE] think of the uses
[QUOTE=~Kiwi~v2;47672811]nvidia gpus are fermi, kepler and maxwell amd gpus are any of them with GCN architecture intel gpus are unconfirmed[/QUOTE] What about my poor Tesla 285?
[QUOTE=J!NX;47672832]think of the uses[/QUOTE] Well, odds are this isn't doing traditional SLI with alternating frames, since every frame rendered by the iGPU would take far longer and wreck the pacing, so VRAM stacking is definitely a possibility.
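The pacing problem with alternate-frame rendering (AFR) across mismatched GPUs is easy to see with a toy model. The frame times here are invented purely for illustration.

```python
# Why AFR across mismatched GPUs feels bad: frames alternate between
# devices, so frame *pacing* swings wildly even if the average looks OK.
# Frame times below are invented for illustration.

DGPU_MS = 10.0  # hypothetical discrete-GPU frame time
IGPU_MS = 40.0  # hypothetical integrated-GPU frame time

def afr_frame_times(n_frames):
    """Even frames on the dGPU, odd frames on the iGPU."""
    return [DGPU_MS if i % 2 == 0 else IGPU_MS for i in range(n_frames)]

times = afr_frame_times(6)
avg = sum(times) / len(times)   # the average looks fine...
worst = max(times)              # ...but every other frame spikes 4x
print(f"avg {avg:.0f} ms, worst {worst:.0f} ms -> visible stutter")
```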
[QUOTE=~Kiwi~v2;47672841]if only Skyrim was DX12. Then we could seriously go overkill with texture mods :v: [editline]7th May 2015[/editline] pffft[/QUOTE] 4x Nvidia Titans with every texture at 4K, on a 4K monitor
Merge...
I doubt DX12 will automagically offload work to secondary GPUs. It'll probably just make it reasonably easy for developers to do, which would mean games would have to be built to take advantage of the feature, and how well it works would depend on the game.
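To illustrate the "developers must do it themselves" point: under explicit multi-adapter, the application decides how to split a frame between devices. A minimal sketch, assuming a hypothetical split-frame scheme where scanlines are divided proportionally to each GPU's throughput (all numbers invented):

```python
# Sketch: the app, not the API, splits a frame across GPUs. Here we
# divide a frame's scanlines proportionally to each GPU's throughput.

def split_scanlines(height, throughputs):
    """Return (start, end) scanline ranges, one per GPU, sized by throughput."""
    total = sum(throughputs)
    ranges, start = [], 0
    for i, t in enumerate(throughputs):
        # last GPU takes whatever remains so rows always sum to `height`
        rows = height - start if i == len(throughputs) - 1 else round(height * t / total)
        ranges.append((start, start + rows))
        start += rows
    return ranges

# hypothetical numbers: a dGPU ~6x faster than the iGPU, 1080-row frame
print(split_scanlines(1080, [6000.0, 1000.0]))
```

Each game would have to implement (and tune) something like this itself, which is why results would vary per title.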
Does that mean I can use my integrated to make up for the fact that my third monitor isn't supported by my GTX 560 Ti?
I was actually tempted to pick up a setup like that. With Qnix monitors at $250 now, it'd only cost me $500 to get set up, and I could sell my ultrawide to make up most of that.
[QUOTE=Snickerdoodle;47672885]Does that mean I can use my integrated to make up for the fact that my third monitor isn't supported by my GTX 560 Ti?[/QUOTE] We'll have to see how the new graphics APIs will change driver design, but I'm not sure the GPU manufacturers will be very forthcoming there. [editline]6th May 2015[/editline] A GPU that old getting such an update is unlikely, anyway.
[QUOTE=DrTaxi;47672930]We'll have to see how the new graphics APIs will change driver design, but I'm not sure the GPU manufacturers will be very forthcoming there. [editline]6th May 2015[/editline] A GPU that old getting such an update is unlikely, anyway.[/QUOTE] I mean his GPU is going to support DX12.
If this means I might be able to enable my Intel iGPU and bump my anti-aliasing up a notch thanks to the extra FPS, that would make me really happy.
Would my integrated GPU need to support DX12 as well? Because if so, there's not a chance in hell my old motherboard will be going to 12.
AMD GPU + Nvidia GPU working together and pooling VRAM when :V
Iris should work pretty well with this; it's already a fairly powerful GPU by itself.
[QUOTE=huntingrifle;47672798]Is there a list of already released GPU's that will support it?[/QUOTE] GTX 400 series and up; AMD HD 7000, HD 8000, and Rx 200 series.
Whoa, a DirectX update that actually matters to me. I'm glad Windows 10 isn't shaping up to look like shit, so I can actually use it.
Will it work with both the integrated GPU in the CPU and the one on the motherboard together?
[QUOTE=DrTaxi;47672930]We'll have to see how the new graphics APIs will change driver design, but I'm not sure the GPU manufacturers will be very forthcoming there. [editline]6th May 2015[/editline] A GPU that old getting such an update is unlikely, anyway.[/QUOTE] With Win10, MS is taking the WDDM to 2.0 instead of just another 1.x version, so some big changes could happen.
[QUOTE=J!NX;47672832]think of the uses[/QUOTE] Graphics card engineers are some sort of modern-day wizards, I swear.
[QUOTE=DrTaxi;47672930]A GPU that old getting such an update is unlikely, anyway.[/QUOTE] This is why I'm so pleased with my current setup. My GPU is an ATi Radeon 4670 with 1 GB of VRAM, yet it still receives updates, albeit unofficially, thanks to the open-source Radeon drivers. As for this move, the important question is whether it will work with non-DX12 integrated graphics; if not, it's probably not that big a deal anyway. It would be great to use integrated cards for extra performance or to offload certain intensive tasks, but given the way DX12 is designed, I have to wonder whether application developers will need to support this explicitly, or whether DX12 can somehow offload "parts" of the workload to the slower GPU on its own. So many questions, because this could go really well or incredibly fucking badly. I hope for the former, but we'll see.