• US takes supercomputer top spots
    16 replies
[url]http://www.bbc.co.uk/news/technology-20272810#sa-ns_mchannel=rss&ns_source=PublicRSS20-sa[/url]
Fuck yeah, Murrica
That supercomputer is ugly as tits. It's Just a Bunch Of Racks.
710 TB of system memory. Jesus Christ.
[quote]Titan leapfrogged the previous champion IBM's Sequoia - which is used to carry out simulations to help extend the life of nuclear weapons - thanks to its mix of central processing unit (CPU) and graphics processing unit (GPU) technologies.[/quote] What a waste
lol Germany's super computer is called [B]Ju[/B]queen.
[QUOTE=lonefirewarrior;38423371]lol Germany's super computer is called [B]Ju[/B]queen.[/QUOTE] cause of all the money it took to build :v:
[QUOTE=mobrockers2;38423289]What a waste[/QUOTE] It carries out non-military research too you know.
You know what the amazing thing is? That entire room of power will one day be on your motherboard, in an area the size of a quarter, for $150
[QUOTE=TheTalon;38431038]You know what the amazing thing is? That entire Room of power will one day be on your motherboard in an area the size of a quarter for $150[/QUOTE] I see no reason to disagree; getting everything that small will be the easy part. Keeping it cool will be the challenge, although I'm sure by the time that happens every computer will be very efficiently cooled by some means.
[QUOTE=assassin_Raptor;38431149]I see no reason to disagree, getting everything that small will be the easy part. Keeping it cool will be the challenge, although I am sure by the time that happens every computer will be very efficiently cooled by some means.[/QUOTE] I've heard we're pretty much at the point where you can't go much smaller with current transistor designs (at ~16nm and below, even the small number of ions in the air can practically knock transistors apart). I'm next to certain the next major jump forward will be dropping the CPU entirely; it will be (and currently is being) replaced by the GPU. I do believe we'll get smaller technologies [I]eventually[/I], it'll just be a while.
[QUOTE=CakeMaster7;38429005]It carries out non-military research too you know.[/QUOTE] And? Does that magically mean it's not a waste that it's being used to research how to extend the lifetime of nukes instead?
[QUOTE=Em See;38433051]I've heard we're pretty much at the point where you can't go much smaller with current transistor designs (~16nm and less means just the small amount of ions in the air can practically knock transistors apart). I'm next to certain the next major jump forward will be the dropping of CPU's entirely, which will (and currently are) be replaced with the GPU entirely, though I do believe we will get smaller technologies [I]eventually[/I]. It'll just be a while.[/QUOTE] A GPU is a bunch of CPUs. If the GPU becomes the only PU, then the GPU is the CPU. A GPU cannot do many of the things a CPU can do, but a many-core CPU would match the performance of a GPU. PU PU. [QUOTE=MIPS;38423232]That supercomputer is ugly as tits. It's Just a Bunch Of Racks.[/QUOTE] Isn't that what every supercomputer is these days? [QUOTE=mobrockers2;38433060]And? Does that magically mean it's not a waste that it's being used to research how to extend the lifetime of nukes instead?[/QUOTE] Wouldn't it be more of a waste to build new nukes instead of extending the lifetime of the current ones?
[QUOTE=Catdaemon;38433744]A GPU is a bunch of CPUs. If the GPU becomes the only PU then the GPU is the CPU. A GPU cannot do many of the things a CPU can do, but a many-core CPU would match the performance of a GPU. PU PU. Isn't that what every supercomputer is these days? Wouldn't it be more of a waste to build new nukes instead of extending the life time of the current ones?[/QUOTE] True, but they're not building new nukes, they're dismantling them. I think.
Too bad the vast majority of practically useful computations are not very parallelizable.
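That's Amdahl's law in a nutshell: if only a fraction p of a job can run in parallel, then n processors cap your speedup at 1/((1-p) + p/n), so the serial part dominates no matter how many racks you throw at it. A quick sketch (the node count is Titan's roughly 18,688 compute nodes; the fractions are made up for illustration):

```python
# Amdahl's law: overall speedup from n processors when only a
# fraction p of the work is parallelizable.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Even at Titan scale (~18,688 nodes), the serial fraction dominates:
for p in (0.50, 0.95, 0.999):
    print(f"p = {p}: {amdahl_speedup(p, 18688):.1f}x speedup")
```

With half the work serial you get about a 2x speedup out of nearly 19,000 nodes, which is why supercomputers only shine on embarrassingly parallel workloads like physics simulations.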
Can it run crysis on high detail?
[QUOTE=mdeceiver79;38434268]Can it run crysis on high detail?[/QUOTE] no