[QUOTE=DainBramageStudios;40607951]do you understand complex numbers?[/QUOTE]
square root of negatives and all that jargon?
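e.g. in Python, as a quick sanity check of the square-root-of-negatives bit (just the standard cmath module, and relevant here because quantum amplitudes are complex numbers):
[code]
import cmath

print(cmath.sqrt(-1))   # 1j -- the imaginary unit i
print((1j) ** 2)        # (-1+0j), i.e. i * i = -1
print(abs(3 + 4j))      # 5.0 -- the magnitude; in quantum mechanics,
                        # probabilities come from the squared magnitudes
                        # of complex amplitudes like these
[/code]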
[QUOTE=certified;40614173]Better question: Can it run Crysis slow enough for someone to comprehend what is going on? I mean, hell, your average gaming-tier PC has to run older games like Deus Ex with a framerate cap set in order for the game not to run unplayably fast (Yes, that's a real thing), now imagine playing Deus Ex with uncapped framerate on a quantum computer.
But anyway, I'm off to go hyperinflate bitcoin.[/QUOTE]
Only a retard would make game logic depend on your framerate, so no. That's not how it works.
[QUOTE=EvacX;40614236]Only a retard would make game logic depend on your framerate, so no. That's not how it works.[/QUOTE]
really old low-tech games did this
crysis is neither
[editline]12th May 2013[/editline]
actually plenty of games still do - they just cap the framerate
[editline]12th May 2013[/editline]
and crysis isn't a quantum game so the whole point is null
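for anyone wondering what "don't tie logic to framerate" actually looks like, here's a toy Python sketch (names and numbers made up for the demo):
[code]
import time

# Frame-dependent (the old mistake): position += SPEED once per frame,
# so twice the framerate literally means twice the game speed.
# Frame-independent: scale each update by real elapsed time instead.
position, speed = 0.0, 5.0          # made-up units for the demo
last = time.perf_counter()
for _ in range(100):                # stand-in for the render loop
    now = time.perf_counter()
    dt = now - last                 # real seconds since the last frame
    last = now
    position += speed * dt          # same in-game speed at any framerate
[/code]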
So they will be made with niobium. There are very few niobium mines in the world, and I'm studying at one of them, but it's good news, because they will expand a lot if quantum computers become common.
[QUOTE=codemaster85;40605805]But can it run crysis [sp] im so sorry [/sp][/QUOTE]
Machine-God have mercy upon the hardware of a computer when the sun casts its light on a massively multiplayer game with photorealistic graphics, the artificial intelligence of Arma 2, life-like physics modeling, unlimited players, a map the size of Western Europe, a player-friendly UI, and gameplay that supports tight-knit teamwork.
[sp]Battlegrounds Europe if it had proper funding[/sp]
[QUOTE=RoboChimp;40614148]Finally, I was hoping for something like this, I mean how long does silicon have left before they reach a brick wall. I assume if they hadn't developed this, in 20 years Intel would be telling us how great a .0001 performance increase was over the last chip.[/QUOTE]
Well, nobody's really sure how far silicon can go. Early predictions for when it would hit a wall have already been broken, so it's really going to only stop when we get to single-atom transistors. And even then, we can keep improving performance by increasing verticality. Or even just through decreasing manufacturing costs - we could have made chips with as many transistors as current 22nm chips as early as 1975 or so, it just would have cost a ludicrous amount of money and wouldn't have scaled as well to high clock speeds. Even if we hit a limit at 5nm, we'll keep improving performance per dollar simply by decreasing costs, building bigger chips to pack more transistors in.
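Back-of-envelope on that 1975 claim, assuming density scales roughly as the inverse square of the feature size (processes back then were around 3 µm, and this ignores yield, which only makes it worse):
[code]
old_node, new_node = 3000, 22            # feature sizes in nm (~1975 vs 22nm)
area_ratio = (old_node / new_node) ** 2  # density ~ 1 / (feature size)^2
print(f"~{area_ratio:,.0f}x the die area for the same transistor count")
# ~18,595x -- hence "a ludicrous amount of money"
[/code]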
Well, of course, a quantum computer is like a normal computer, but much better at certain kinds of parallel work.
I am awful at explaining things, aren't I?
[quote]The D-Wave machine turned out to be around 3600 times faster than the best conventional algorithm.[/quote]
Outdated in 3...2...1
[QUOTE=gman003-main;40622571]Well, nobody's really sure how far silicon can go. Early predictions for when it would hit a wall have already been broken, so it's really going to only stop when we get to single-atom transistors. And even then, we can keep improving performance by increasing verticality. Or even just through decreasing manufacturing costs - we could have made chips with as many transistors as current 22nm chips as early as 1975 or so, it just would have cost a ludicrous amount of money and wouldn't have scaled as well to high clock speeds. Even if we hit a limit at 5nm, we'll keep improving performance per dollar simply by decreasing costs, building bigger chips to pack more transistors in.[/QUOTE]Then what? Cube CPUs?
[QUOTE=RoboChimp;40626894]Then what? Cube CPUs?[/QUOTE]
Probably at some point, yes. But really, even if we do hit a limit in shrinking die processes, there's still "increase die size". That will increase power consumption, yes, but roughly linearly with performance, not with the cube of frequency the way clock speed increases do (pushing the clock means raising the voltage too).
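The rough numbers, assuming the standard dynamic-power approximation P = C·V²·f and that voltage has to scale up with frequency (a common approximation, not a law):
[code]
# Doubling die area ~doubles capacitance C (and, ideally, throughput):
# power goes up ~2x. Doubling frequency also forces voltage up, so
# power goes up ~8x for the same 2x performance.
C, V, f = 1.0, 1.0, 1.0                      # normalized baseline
base = C * V**2 * f
bigger_die = (2 * C) * V**2 * f              # 2x area  -> 2x power
faster_clk = C * (2 * V)**2 * (2 * f)        # 2x clock -> 8x power
print(bigger_die / base, faster_clk / base)  # 2.0 8.0
[/code]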
[QUOTE=gman003-main;40622571]Well, nobody's really sure how far silicon can go. Early predictions for when it would hit a wall have already been broken, so it's really going to only stop when we get to single-atom transistors. And even then, we can keep improving performance by increasing verticality. Or even just through decreasing manufacturing costs - we could have made chips with as many transistors as current 22nm chips as early as 1975 or so, it just would have cost a ludicrous amount of money and wouldn't have scaled as well to high clock speeds. Even if we hit a limit at 5nm, we'll keep improving performance per dollar simply by decreasing costs, building bigger chips to pack more transistors in.[/QUOTE]
I think everyone pretty much knows that once we hit the limit of shrinking feature sizes, we'll pull a complete U-turn and start going back the way we came by getting larger, but with the smaller tech, of course.
Personally, I think we'll likely be able to push under the 5nm limit and manage something like 3nm before it actually becomes a problem, but there are a lot of years to go before that happens for the commercial market.
Either way, having someone look at an alternative never really hurts if something can come out of it, which on the basis of this tech [I]might[/I] be worthwhile, but I'm still not convinced D-Wave is doing real quantum computing on the scale they present; it sounds so pseudo in places.
[QUOTE=DainBramageStudios;40607867](before people start getting ideas about "stuff doesn't happen unless you observe it" this is a very handwavey and somewhat inaccurate account of quantum physics.)[/QUOTE]Isn't observation in the physics sense basically when one thing interacts with another, usually a photon? At least it's my understanding that when observation is mentioned, they mean bouncing a photon, an electron, or the like off the object.
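Loose numerical illustration of what I mean, assuming the standard Born rule (probability = squared magnitude of the complex amplitude); a toy Python sketch, not real physics:
[code]
import random

# Equal superposition of outcomes 0 and 1; amplitudes are complex.
state = [complex(1, 0) / 2**0.5, complex(0, 1) / 2**0.5]
probs = [abs(a) ** 2 for a in state]          # Born rule -> [0.5, 0.5]

outcome = random.choices([0, 1], weights=probs)[0]
state = [1 + 0j if i == outcome else 0j for i in range(2)]  # "collapse"
print(outcome, state)
[/code]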
Couldn't you make some sort of small quantum thing you could add to your computer for calculating and such? Something the size of a graphics card that wouldn't be your main computer, just a small component for calculating.
[QUOTE=GussGriswold;40628039]Couldn't you make some sort of small quantum thing you could add to your computer for calculating and such? Something the size of a graphics card that wouldn't be your main computer, just a small component for calculating.[/QUOTE]
Potentially, yes, but from what I can tell, quantum computing requires cryogenic temperatures to stay stable, which means specialized refrigeration hardware.
This shit is not getting into desktops for decades. I'd almost say "never", but things *do* advance. It will appear first in dedicated rack-sized servers, only later spreading to commodity servers, and *maybe* one day to desktops.
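For flavor, here's roughly what the "annealer as a coprocessor" workflow looks like, sketched with D-Wave's dimod Python library (which came along years later; its classical ExactSolver stands in here for the cryogenic hardware):
[code]
import dimod

# Tiny QUBO: minimize -x0 - x1 + 2*x0*x1 (i.e. pick exactly one of two).
# The CPU formulates the problem; only the solve step would be offloaded.
Q = {(0, 0): -1, (1, 1): -1, (0, 1): 2}

# ExactSolver brute-forces this on the CPU; with real hardware you'd swap
# in a sampler that talks to the chilled chip over the network instead.
sampleset = dimod.ExactSolver().sample_qubo(Q)
print(sampleset.first.sample, sampleset.first.energy)  # e.g. {0: 1, 1: 0} -1.0
[/code]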
[QUOTE=gman003-main;40628273]Potentially, yes, but from what I can tell, quantum computing requires cryogenic temperatures to stay stable, which means specialized refrigeration hardware.
This shit is not getting into desktops for decades. I'd almost say "never", but things *do* advance. It will appear first in dedicated rack-sized servers, only later spreading to commodity servers, and *maybe* one day to desktops.[/QUOTE]
NASA didn't think their technology would be used by the public. Now it's not even advanced enough for our phones.
[QUOTE=Wingz;40629077]NASA didn't think their technology would be used by the public. Now it's not even advanced enough for our phones.[/QUOTE]
Yes, which is why I'm saying "eventually". But it's definitely a long, long way out, and I really can't see it ever being the *primary* processor, since they're going to be much slower at traditional processing tasks for at least the foreseeable future.