• Intel reveals yet another new CPU socket, but this time it has DX11.1 support in its integrated graphics
    44 replies, posted
[table] [tr] [td][QUOTE]Some time in 2013, Intel will launch its new processor architecture, codenamed "Haswell", which will go on to succeed "Ivy Bridge". More than a year away from its market entry, Haswell has already been exhaustively documented, but not many have gotten into the details of its embedded graphics processor. That is, until now. A new internal slide sourced by DonanimHaber details the integrated GPU (iGPU), and it appears Intel has solid plans for home users. To begin with, Haswell's iGPU will be DirectX 11.1 compliant, which means it will take advantage of API optimizations that improve performance for typical desktop usage scenarios. Apart from support for a new DirectCompute architecture, it will also support OpenCL 1.2, which speeds up certain GPGPU-optimized applications. More importantly, the iGPU will be designed around a new stereoscopic 3D standard called Auto-Stereoscopic 3D (AS3D), which will take the likes of Blu-ray 3D acceleration, stereo 3D photos, etc., to the masses. Currently, it takes at least an entry-level GeForce or Radeon GPU for acceptable performance with stereo 3D. Some time in 2013, Intel will launch its new processor architecture, codenamed "Haswell", which will go on to succeed "Ivy Bridge". More than a year away from its market entry, Haswell has already been exhaustively documented, but not many have gotten into the details of its embedded graphics processor. That is, until now. A new internal slide sourced by DonanimHaber details the integrated GPU (iGPU), and it appears Intel has solid plans for home users. To begin with, Haswell's iGPU will be DirectX 11.1 compliant, which means it will take advantage of API optimizations that improve performance for typical desktop usage scenarios. Apart from support for a new DirectCompute architecture, it will also support OpenCL 1.2, which speeds up certain GPGPU-optimized applications.
More importantly, the iGPU will be designed around a new stereoscopic 3D standard called Auto-Stereoscopic 3D (AS3D), which will take the likes of Blu-ray 3D acceleration, stereo 3D photos, etc., to the masses. Currently, it takes at least an entry-level GeForce or Radeon GPU for acceptable performance with stereo 3D.[/QUOTE][/td] [td][IMG]http://www.techpowerup.com/img/12-02-13/90a.jpg[/IMG][/td] [/tr] [/table] [url]http://www.techpowerup.com/160431/Intel-Haswell-Packs-DirectX-11.1-Graphics.html[/url]
I heard you the first time! But seriously, I'll be looking forward to this. And is there a real reason for this to use VGA? Wouldn't it be better to just use DVI?
It's nice to see integrated GPUs becoming more of a norm.
But can it run Crysis?
[QUOTE=certified;34685835]But can it run Crysis?[/QUOTE] Probably, though I'm not too sure about a playable framerate.
But can it run windows 98?
New socket? Or new chipset? Feels rather stupid to release another socket at this point, especially when today's low-budget CPUs already have integrated GPUs.
GPGPU on a processor? Wasn't the whole objective of it to spare CPU time and allow extreme multithreading? WTF, Intel?
[QUOTE=Sexy Eskimo;34686311]New socket? Or new chipset? Feels rather stupid to release another socket at this point, especially when today's low-budget CPUs already have integrated GPUs.[/QUOTE] New socket. They do this since most of the PC market isn't people upgrading their machines; the number of people with customized desktop towers is very slim.
[QUOTE=JohnnyOnFlame;34686354]GPGPU on a processor? Wasn't the whole objective of it to spare CPU time and allow extreme multithreading? WTF, Intel?[/QUOTE] It isn't a merged CPU/GPU; they're just on the same die. GPU compute gives you more FLOPS per transistor, so even if you aren't using it as a graphics card it's a good choice, assuming developers start using OpenCL more widely.
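For anyone wondering what "GPU compute" actually looks like: here's a toy Python sketch of the data-parallel model OpenCL exposes. This is not the real OpenCL API (`kernel` and `enqueue` are made-up names for illustration); the point is that one tiny kernel runs once per element with no dependencies between elements, which is exactly the shape of work a GPU's hundreds of simple cores are good at.

```python
def kernel(global_id, a, b, out):
    # One "work-item": combines a single pair of elements.
    out[global_id] = a[global_id] * b[global_id]

def enqueue(kernel, global_size, *args):
    # On a GPU, these iterations would run concurrently across many cores;
    # here we just loop sequentially to show the semantics.
    for gid in range(global_size):
        kernel(gid, *args)

a = [1.0, 2.0, 3.0, 4.0]
b = [10.0, 20.0, 30.0, 40.0]
out = [0.0] * 4
enqueue(kernel, 4, a, b, out)
print(out)  # [10.0, 40.0, 90.0, 160.0]
```

Real OpenCL wraps the same idea in device/context/queue setup and kernels written in OpenCL C, but the per-element structure is the same.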
Well, passing over the current generation yet again. Still stuck with my shitty C2D socket; hopefully there's a decently priced processor/integrated-graphics combo with this new set.
In other news: AMD retaliates with its new 34-core processor! Each core clocked at a lightning-fast 0.0000023MHz
in other words, upgrade to ivy bridge and you get ripped off a few months later
[QUOTE=DiBBs27;34686542]In other news: AMD retaliates with its new 34-core processor! Each core clocked at a lightning-fast 0.0000023MHz[/QUOTE] If you're going to take a stab at insulting AMD, at least make it intelligent. Having 34 cores clocked at that speed only gives you 0.0000782MHz. Not to mention having 34 cores is stupid when computers work in powers of 2.
As nice as progress can be, this "new socket every so often" thing Intel does is a pain in the ass for upgrading, and it's kind of why I like AMD; upgrade paths are quite flexible. Though sticking to one socket for too long and trying to keep everything backwards compatible does have some negative impacts. A better IGP is always good though; mobile devices can make much more use of graphically intensive applications, with less power draw than a dedicated GPU. I think the HD 3000 (the Sandy Bridge IGP) is actually quite powerful for what it is.
[QUOTE=Stick it in her pooper;34686616]in other words, upgrade to ivy bridge and you get ripped off a few months later[/QUOTE] Uh, no. Haswell is an entire year later. Intel isn't deviating from its tick-tock strategy. [QUOTE=hexpunK;34686654]As nice as progress can be, this "new socket every so often" thing Intel does is a pain in the ass for upgrading, and it's kind of why I like AMD; upgrade paths are quite flexible. Though sticking to one socket for too long and trying to keep everything backwards compatible does have some negative impacts. A better IGP is always good though; mobile devices can make much more use of graphically intensive applications, with less power draw than a dedicated GPU. I think the HD 3000 (the Sandy Bridge IGP) is actually quite powerful for what it is.[/QUOTE] Yeah, and look where sticking to one socket got AMD. Intel switches sockets because it's necessary due to changes in architecture, like when they integrated the north bridge onto the CPU and everyone complained. It's not a choice; you have to create a new socket for these new architectures.
[QUOTE=reedbo;34686649]If you're going to take a stab at insulting AMD, at least make it intelligent. Having 34 cores clocked at that speed only gives you 0.0000782MHz. Not to mention having 34 cores is stupid when computers work in powers of 2.[/QUOTE] That, uhhh, isn't how it works. You don't multiply the number of cores by the frequency. It just means you have 34 cores running abysmally slowly, so even the most parallel of applications isn't going to compute shit.
[QUOTE=DiBBs27;34686542]In other news: AMD retaliates with its new 34 core processor! Each core clocked at lightening fast 0.0000023MHz[/QUOTE] Except for the fact that AMD is dropping out of the processor market, last I checked.
[QUOTE=Killerelf12;34686680]Except for the fact that AMD is dropping out of the processor market, last I checked.[/QUOTE] Not exactly. They're just shifting more focus towards mobile and portable devices. That's a problem for Intel because now they'll have ARM and AMD to compete with.
[QUOTE=hexpunK;34686671]That, uhhh. Isn't how it works. You don't multiply the number of cores by the frequency. It just means you have 34 cores running abysmally slow, so even the most parallel of applications isn't going to compute shit.[/QUOTE] That's the point. You've only got at max 0.0000782Mhz of processing power even if something could utilize all 34 cores. It's a stupid insult and a more stupid argument.
-snip- He fixed it.
[QUOTE=hexpunK;34686671]That, uhhh, isn't how it works. You don't multiply the number of cores by the frequency. It just means you have 34 cores running abysmally slowly, so even the most parallel of applications isn't going to compute shit.[/QUOTE] Yup. Also, I was exaggerating in pretty much every way. I'm making fun of AMD's Bulldozer and their unstoppable logic of "more cores = better performance".
[QUOTE=garrynohome;34686659]Yeah, and look where sticking to one socket got AMD. Intel switches sockets because it's necessary due to changes in architecture, like when they integrated the north bridge onto the CPU and everyone complained. It's not a choice; you have to create a new socket for these new architectures.[/QUOTE] I don't think it was sticking to AM3+ that did Bulldozer damage; it was just bad design choices in general. AMD went for massive numbers of cores without actually making the cores themselves worthwhile for applications that lack parallelism. For applications that can utilize the cores, Bulldozer does work, but nowhere near as well as it should. AM2/3 were amazing sockets, with a massive range of processors and performance levels; it certainly isn't the socket that killed Bulldozer, just AMD dropping the ball in general.
[QUOTE=Scrimp;34686180]But can it run windows 98?[/QUOTE] My ARM-powered mobile phone can manage Win98; it's no huge feat.
Bulldozer is simply an awful architecture, nothing special about it. Intel could have designed a much more efficient CPU on AM3+, probably just as good as Sandy Bridge.
[QUOTE=Killerelf12;34686680]Except for the fact that AMD is dropping out of the processor market, last I checked.[/QUOTE] AMD are ahead of Intel in regards to their APUs.
[QUOTE=FlubberNugget;34687497]AMD are ahead of Intel in regards to their APUs.[/QUOTE] For now. The problem is that Intel has much more money to throw at R&D should it have issues competing with AMD in the APU market. [editline]14th February 2012[/editline] Oh, and speaking of APUs and changing CPU sockets: FM1 is being replaced with FM2 for their new APU chips.
[QUOTE=paul simon;34685534] And is there a real reason for this to use VGA? Wouldn't it be better to just use DVI?[/QUOTE] Like floppies, it's going to take a while for VGA to be totally phased out. Also, it isn't hard to keep VGA and DVI in one connector. We have been doing that for over ten years.
[QUOTE=certified;34685835]But can it run Crysis?[/QUOTE] I'm never going to see a computer that can run Crysis in my lifetime.
My mobile Core i3 (Sandy Bridge) can run MW3/BF3/Crysis; the last two just get like 10 FPS on the lowest settings, but I could fly jets in BF3 without problems.