Intel's Ivy Bridge chips launch using '3D transistors'
[url]http://www.bbc.com/news/technology-17785464[/url]
[quote]
Intel is launching its Ivy Bridge family of processors - the first to feature what it describes as a "3D transistor".
The American firm says the innovation allows it to offer more computational power while using less energy.
The initial release includes 13 quad-core processors, most of which will be targeted at desktop computers.
Further dual core processors, suitable for ultrabooks - thin laptops - will be announced "later this spring".
Intel and PC manufacturers expect the release to drive a wave of new sales.
"The momentum around the system design is pretty astonishing," Intel's PC business chief, Kirk Skaugen, who is spearheading the launch, told the BBC.
"There are more than 300 mobile products in development and more than 270 different desktops, many of which are all-in-one designs.
"This is the world's first 22 nanometre product and we'll be delivering about 20% more processor performance using 20% less average power."
The firm has already built three factories to fabricate the new chips and a fourth will come online later this year.
"This is Intel's fastest ramp ever," Mr Skaugen added.
"There will be 50% more supply than we had early in the product cycle of our last generation, Sandy Bridge, a year ago. And we're still constrained based on the amount of demand we're seeing in the marketplace."
[b]Low Power[/b]
The fact that Intel's new transistors - the on/off switches at the heart of its chips - are more power-efficient could be crucial to its future success.
To date it has been largely shut out of the smartphone and tablet markets, where devices are most commonly powered by chips based on designs by Britain's Arm Holdings.
Arm now threatens to encroach on Intel's core market with the release of Windows 8 later this year.
Microsoft has decided to let one variant of its operating system work on Arm's architecture, paving the way for manufacturers to build laptops targeted at users who prioritise battery life over processing speeds.
[b]Tri-gate transistors[/b]
Intel hopes a new transistor technology, in development for 11 years, will help it challenge Arm's reputation for energy efficiency.
Bell Labs created the first transistor in 1947, and it was about a quarter of the size of an American penny.
Since then, engineers have radically shrunk them in size - so there are now more than one billion fitted inside a single processor.
Moore's law - named after Intel's co-founder Gordon Moore - stated that the number of transistors that could be placed on an integrated circuit should double roughly every two years without a big leap in cost.
However, transistors had become so small that there were fears they would become unreliable if they were shrunk much further.
"A lot of people had thought that Moore's law was coming to an end," said Mr Skaugen.
"What Intel has been able to do is instead of just shrinking the transistor in two dimensions, we have been able to create a three-dimensional transistor for the first time.
"For the user, that means the benefits of better performance and energy use will continue for as far as Intel sees on the road map."
[b]Graphics Gains[/b]
Mr Skaugen said that those who use the integrated GPU (graphics processing unit) on the chips, rather than a separate graphics card, would see some of the biggest gains.
He said the processing speed had been significantly boosted since Sandy Bridge, meaning devices would be capable of handling high-definition video conferences and the 4K resolution offered by top-end video cameras.
The GPU's transcoding rate also benefits from the upgrade, allowing users to recode video more quickly if they want to send clips via email or put them on a smartphone.
The chips also offer new hardware-based security facilities as well as built-in USB 3.0 support. This should make it cheaper for manufacturers to offer the standard which allows quicker data transfers to hard disks, cameras and other peripherals.
[b]Chip Challenge[/b]
It all poses quite a challenge to Intel's main competitor in the PC processor market - Advanced Micro Devices.
AMD plans to reduce the amount of power its upcoming Piledriver chips consume by using "resonant clock mesh technology" - a new process which recycles the energy used by the processor. However, full details about how it will work and a release date are yet to be announced.
One industry analyst told the BBC that Intel was expected to retain its lead.
"AMD did briefly nudge ahead of Intel in the consumer space in the early 2000s at the time of Windows XP, but since then Intel has been putting in double shifts to break away from its rival," said Chris Green, principal technology analyst at the consultants Davies Murphy Group Europe.
"Intel is making leaps ahead using proven technology, while AMD is trying to use drawing board stuff. So there's less certainty AMD will succeed, and PC manufacturers may not want to adopt its technology in any volume, at least initially."
As advanced as Ivy Bridge sounds, the one thing it is not is future-proof. Intel has already begun to discuss its successor, dubbed Haswell.
"We are targeting 20 times better battery life on standby - always on, always connected," Mr Skaugen said about the update, due for release in 2013.
"So you can get all your files and emails downloaded onto your PC while it's in your bag, and still get more than 10 days of standby and all-day battery life."
[/quote]
I can't wait to see how long the Fiona's battery will last with an Ivy Bridge. I hope they don't mess it up; the idea of a relatively low-cost tablet with a great processor running Windows is awesome. Not sure how I feel about the handles, though.
title made me think they had already launched, don't scare me like that
[QUOTE]
Mr Skaugen said that those who use the integrated GPU (graphics processing unit) on the chips, rather than a separate graphics card, would see some of the biggest gains.
He said the processing speed had been significantly boosted since Sandy Bridge, meaning devices would be capable of handling high-definition video conferences and the 4K resolution offered by top-end video cameras.
The GPU's transcoding rate also benefits from the upgrade, allowing users to recode video more quickly if they want to send clips via email or put them on a smartphone.
The chips also offer new hardware-based security facilities as well as built-in USB 3.0 support. This should make it cheaper for manufacturers to offer the standard which allows quicker data transfers to hard disks, cameras and other peripherals.[/QUOTE]
So in other words, they're still shit.
[QUOTE=jakeabbott96;35681908]So in other words, they're still shit.[/QUOTE]
Apparently you don't know how to read.
Cheers!
I'm still rather pissed off that for a $330 chip (the 3770K), around 40% of the die is taken up by the integrated GPU. Anybody who would purchase the top-end chip (below their enthusiast chips) more than likely has a dedicated GPU anyways. I understand that the GPU trickles down to the lower-end chips depending on the yield, but come on - I feel like I'm paying 40% more for something I have absolutely no use for other than the faster encoding (which I never even use, unless they could integrate it into live streaming programs).
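If the encoder ever does become useful to you, it is at least easy to drive from a script. A rough sketch of the idea - assuming an ffmpeg build compiled with Quick Sync (h264_qsv) support; the file names are made up purely for illustration:
[code]
# Rough sketch: hand a clip to the on-die Quick Sync encoder via ffmpeg.
# Assumes an ffmpeg build with the h264_qsv encoder enabled; "clip.mp4"
# and "clip_phone.mp4" are made-up file names for illustration.
import subprocess

def qsv_encode(src, dst, bitrate="2M"):
    """Re-encode src to H.264 using the chip's hardware encoder."""
    subprocess.check_call([
        "ffmpeg", "-y",
        "-i", src,             # input clip
        "-c:v", "h264_qsv",    # Quick Sync hardware H.264 encoder
        "-b:v", bitrate,       # target video bitrate
        "-c:a", "copy",        # leave the audio stream untouched
        dst,
    ])

if __name__ == "__main__":
    qsv_encode("clip.mp4", "clip_phone.mp4")
[/code]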
[QUOTE=rodent-man;35684112]I'm still rather pissed off that for a $330 chip (the 3770K), around 40% of the die is taken up by the integrated GPU. Anybody who would purchase the top-end chip (below their enthusiast chips) more than likely has a dedicated GPU anyways. I understand that the GPU trickles down to the lower-end chips depending on the yield, but come on - I feel like I'm paying 40% more for something I have absolutely no use for other than the faster encoding (which I never even use, unless they could integrate it into live streaming programs).[/QUOTE]
I don't mind good integrated graphics at all. But I hope that some programs come out that actually utilize it to take a load off the GPU and still get stuff done. I have a pretty good GPU with a lot of memory, and a multi-core CPU, but every program uses the first 250MB of GPU memory, and the first CPU core.
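On the "first CPU core" point, nothing stops a program from spreading work across every core - it just has to ask. A toy sketch using Python's standard multiprocessing module; the workload is made up purely for illustration:
[code]
# Toy sketch: farm a CPU-bound job out to every core instead of pinning
# everything to core 0. The work() function is a made-up stand-in for
# whatever the program actually does.
import multiprocessing

def work(n):
    # Dummy CPU-bound task: sum of squares up to n.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    chunks = [2000000] * multiprocessing.cpu_count()
    pool = multiprocessing.Pool()        # one worker process per core by default
    results = pool.map(work, chunks)     # each chunk runs on its own worker
    pool.close()
    pool.join()
    print("%d chunks done" % len(results))
[/code]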
[QUOTE=DarkendSky;35684182]I don't mind good integrated graphics at all. But I hope that some programs come out that actually utilize it to take a load off the GPU and still get stuff done. I have a pretty good GPU with a lot of memory, and a multi-core CPU, but every program uses the first 250MB of GPU memory, and the first CPU core.[/QUOTE]
As it currently stands, you can't even utilize the integrated GPU (at all) if you have a dedicated GPU installed and in use, unless you use a third-party program called Lucid Virtu, which, to my understanding, streams the frames from the dedicated GPU through the integrated GPU. This used to penalize your overall graphics performance by around 7-9%, but I've heard that has been fixed in the latest version bundled with some Z77 boards. Basically, unless you're actually using the chip for graphics, it's utterly useless without Lucid Virtu (which may still result in a minor performance decrease). That's why I'm stressing this point so much.
[editline]23rd April 2012[/editline]
[QUOTE=DarkendSky;35684182]I don't mind good integrated graphics at all. But I hope that some programs come out that actually utilize it to take a load off the GPU and still get stuff done. I have a pretty good GPU with a lot of memory, and a multi-core CPU, but every program uses the first 250MB of GPU memory, and the first CPU core.[/QUOTE]
Out of curiosity, how have you been able to monitor your graphics memory usage? I've tried to do so in the past but couldn't find a simple, trustworthy program.
[QUOTE=jakeabbott96;35681908]So in other words, they're still shit.[/QUOTE]
Apparently, cheaper, better, and faster are bad things in your world.
I'm still not getting this 3D transistor thing, did they ever explain what that actually means?
[QUOTE=POLOPOZOZO;35688461]I'm still not getting this 3D transistor thing, did they ever explain what that actually means?[/QUOTE]
I think the gate wraps around a raised silicon fin on three sides instead of lying flat on the channel, which gives better control over leakage - hence the "tri-gate" name.
I knew I should have waited before buying a new computer. Hopefully I'll be able to get something like this soon enough.
[QUOTE=T3hGamerDK;35688801]I knew I should have waited before buying a new computer. Hopefully I'll be able to get something like this soon enough.[/QUOTE]
Are you building it yourself?
[QUOTE=rodent-man;35684243]Out of curiousity, how have you been able to monitor your graphics memory usage? I'd tried to do so in the past but couldn't really find a simple trust-worthy program.[/QUOTE]
Almost anything based on RivaTuner has some capability to monitor VRAM usage. I've been using MSI Afterburner, largely because it came bundled with my card and because it's an actually useful frontend for RivaTuner. I don't know if it can log what it monitors, but it can certainly monitor live and feed the readings to an OSD overlay (there's a rough logging sketch at the end of this post if you want something scriptable).
Also, why are you complaining that the high-end chip has advantages in areas like encoding if you don't actually use that power? Buying the high-end models of most CPUs is overkill for anything like gaming, and buying one to "future-proof" yourself is an equally bad investment given how quickly technology changes today.
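If Afterburner's overlay isn't enough and you actually want logs, one option is to poll the driver's own command-line tool from a script. A minimal sketch - assuming an NVIDIA card and a driver recent enough that nvidia-smi supports the --query-gpu flags:
[code]
# Minimal sketch: log VRAM usage once a second by polling nvidia-smi.
# Assumes an NVIDIA card and an nvidia-smi new enough to support --query-gpu.
import subprocess
import time

def vram_usage():
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=memory.used,memory.total",
        "--format=csv,noheader,nounits",
    ])
    used, total = out.decode().strip().split(", ")
    return int(used), int(total)

if __name__ == "__main__":
    while True:
        used, total = vram_usage()
        print("%d / %d MB VRAM in use" % (used, total))
        time.sleep(1)
[/code]
Not as pretty as Afterburner, but it dumps to stdout, so you can redirect it to a file and keep a history.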