• AMD Announces Industry's First "Supercomputing" Server Graphics Card with 12GB Memory
    36 replies, posted
[url]http://www.amd.com/us/press-releases/Pages/amd-announces-server-graphics-card-2013nov14.aspx?cmpid=social14543734[/url] [QUOTE] (NYSE: AMD) today announced the new AMD FirePro™ S10000 12GB Edition graphics card, designed for big data high-performance computing (HPC) workloads for single precision and double precision performance. With full support for PCI Express® 3.0 and optimized for use with the OpenCL™ compute programming language, the AMD FirePro S10000 12GB Edition GPU features ECC memory plus DirectGMA support, allowing developers working with large models and assemblies to take advantage of the massively parallel processing capabilities of AMD GPUs based on the latest AMD Graphics Core Next (GCN) architecture. The AMD FirePro S10000 12GB Edition GPU is slated for availability in Spring 2014.

The AMD FirePro S10000 12GB Edition graphics card is a compelling solution for a variety of scenarios:

Compute/Visualization Server: finance, oil exploration, aeronautics and automotive, design and engineering, geophysics, life sciences, medicine and defense

Double precision: genetic sequencing, computational fluid dynamics, structural mechanics, numerical analytics, reservoir simulation, automated reasoning and weather forecasting

Single precision: seismic processing, molecular dynamics, satellite imaging, explicit crash test simulation, video enhancement, signal processing, video transcoding, digital rendering and medical imaging

Ultra High-end Workstation (requiring GPU compute and 3D graphics performance): oil and gas, computer aided engineering[/QUOTE]
But can it run Ryse? [highlight](User was banned for this post ("Meme shit" - SteveUK))[/highlight]
Might be useful for some stuff, but a lot of stuff is CUDA only unfortunately so the market is a lot more limited than it could be. Not that a 10,000 dollar card isn't already a niche market or anything. :v:
[QUOTE=Paul McCartney;42901378]But can it play Ryse?[/QUOTE] no, a graphics card by itself probably can't play a game developed for the xbox one.
We are still using CPU clusters in university, wish we could change.
[QUOTE=Killuah;42901416]We are still using CPU clusters in university, wish we could change.[/QUOTE] You might not be able to at all, depending on what it is that you are doing. Some stuff really doesn't work on GPUs.
So wait - is this AMD's answer to [URL="http://www.nvidia.com/object/product-quadroplex-7000-us.html"]this[/URL] then?
[QUOTE=Paul McCartney;42901378]But can it run Ryse?[/QUOTE] It probably doesn't have a video output at all.
Yay for cloud computing... gone are the days of the powerful workstation. Now let's all have a stupid central system. Sigh.
[QUOTE=DogGunn;42901567]Yay for cloud computing... gone are the days of the powerful workstation. Now let's all have a stupid central system. Sigh.[/QUOTE] It's like going back to the good ol' mainframe days. Except now it's greedy corporations in the pocket of various spying agencies managing them, instead of old men with grey beards. Soon it'll go full circle.
[QUOTE=FacepunchUser;42901346](NYSE: AMD) today announced the new AMD FirePro™ S10000 12GB Edition graphics card, designed for [B]big data[/B] high-performance computing (HPC) workloads for single precision and double precision performance[/QUOTE] Who first thought of this stupid term?
So wait, is this aimed at the bitcoin miners market?
[QUOTE=Anthracite;42901728]So wait, is this aimed at the bitcoin miners market?[/QUOTE] No...... it's aimed at businesses who actually need a powerful GPU to crunch numbers. Like the businesses listed in the OP.
has anyone made a crysis joke yet or am I still good to go
[QUOTE=snookypookums;42901485]So wait - is this AMD's answer to [URL="http://www.nvidia.com/object/product-quadroplex-7000-us.html"]this[/URL] then?[/QUOTE] Not to my layman's perspective. This doesn't look like it's designed primarily for rendering tasks.
[IMG]http://www.techpowerup.com/gpudb/images/802.jpg[/IMG] Jesus fuck.
I wonder if AMD started mining bitcoins after it quit the race with Intel.
[QUOTE=Smug Bastard;42902200][IMG]http://www.techpowerup.com/gpudb/images/802.jpg[/IMG] Jesus fuck.[/QUOTE] [IMG]http://i.imgur.com/vt93yOS.jpg[/IMG]
Is this AMD's answer to a Tesla? Are they that confident? [QUOTE=Smug Bastard;42902200][IMG]http://www.techpowerup.com/gpudb/images/802.jpg[/IMG] Jesus fuck.[/QUOTE]Looks almost exactly like a 7990: [IMG]http://www.amd.com/PublishingImages/Public/Photograph_ProductShots/PNG/AMD-Radeon-HD-7990-360W.png[/IMG]
[QUOTE=RoboChimp;42902309]Is this AMD's answer to a Tesla? Are they that confident? Looks almost exactly like a 7990: [IMG]http://www.amd.com/PublishingImages/Public/Photograph_ProductShots/PNG/AMD-Radeon-HD-7990-360W.png[/IMG][/QUOTE] That's most likely what it is, although with a tweaked core, and the rear plate's ports have been deleted in favor of exhaust vents.
[QUOTE=onebit;42902322]Does that even mean anything?[/QUOTE] Yes. It means its performance is high on both single- and double-precision number types.
[QUOTE=onebit;42902322]Does that even mean anything?[/QUOTE] Single-precision floating-point numbers are 32 bits - 1 sign bit, 8 exponent bits, and 23 fraction bits (a 24-bit significand, counting the implicit leading bit). This works out to about seven decimal digits of precision (being floating-point, it can cover a much wider range than just seven digits, but those are the only ones that will be accurate). Double-precision floating-point numbers are 64 bits - 1 sign, 11 exponent, and 52 fraction bits (a 53-bit significand). This works out to about sixteen decimal digits of accuracy. For marketing reasons, gaming chips normally have their double-precision capabilities handicapped. For instance, most Kepler GPUs run single-precision calculations 24 times faster than double-precision. Those Kepler GPUs that are designed and sold for compute (Quadro and Tesla series) run single-precision at the same rate (adjusted for cores and clocks), but run double-precision eight times faster (1/3rd the speed of single-precision). AMD does the same thing - the ratios might be a bit different, but it's the same general idea. There is also a half-precision floating-point format - 16 bits wide: 1 sign, 5 exponent, and 10 fraction bits (an 11-bit significand), which works out to 3-4 decimal digits of precision. This is mainly used for rendering HDR graphics in games, not for anything requiring much accuracy.
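You can see the single vs. double precision gap for yourself without any GPU at all - this is just a toy sketch using Python's standard `struct` module to round-trip a value through an IEEE 754 32-bit float (Python's native float is already double precision):

```python
import struct

# Python's float is IEEE 754 double precision (~16 accurate digits).
pi_ish = 3.14159265358979323846

# Pack it into a 32-bit single-precision float and unpack it again,
# which rounds it to the nearest representable binary32 value.
as_single = struct.unpack('<f', struct.pack('<f', pi_ish))[0]

print(f"double: {pi_ish:.17f}")     # ~16 digits survive
print(f"single: {as_single:.17f}")  # only ~7 digits survive
```

The single-precision copy diverges from the original after about the seventh significant digit, which is exactly the handicap that matters for the double-precision workloads (CFD, structural mechanics, etc.) in the press release.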
How appropriate that Nvidia announces their 12GB Tesla K40. [url]http://www.anandtech.com/show/7521/nvidia-launches-tesla-k40[/url]
[QUOTE=onebit;42902322]Does that even mean anything?[/QUOTE] nope they just made up terms dude
[QUOTE=Zephyrs;42901448]You might not be able to at all depending on what it is that you are doing. Some stuff really doesn't work on GPUs.[/QUOTE] It's intensive linear algebra for seismology and general inversion problems; it would work wonders on a GPU.
[QUOTE=Zephyrs;42901390]Might be useful for some stuff, but a lot of stuff is CUDA only unfortunately so the market is a lot more limited than it could be. Not that a 10,000 dollar card isn't already a niche market or anything. :v:[/QUOTE] A lot of CUDA only software is starting to transition to OpenCL.
Is it powered by petrol?
I feel as if the Transputer is coming of age.
[QUOTE=DogGunn;42901732]No...... it's aimed at businesses who actually need a powerful GPU to crunch numbers. Like the businesses listed in the OP.[/QUOTE] And academia.
[QUOTE=Anthracite;42901728]So wait, is this aimed at the bitcoin miners market?[/QUOTE] Graphics cards do linear algebra (Operations on matrices). Just about any problem that involves simulating the real world or doing the same math on a billion inputs at once can be expressed in terms of linear algebra, for which GPUs provide efficient hardware algorithms. So if you need to simulate a tornado while rendering the starship from Avatar in 1080p and mining Litecoins (Not bitcoins), you'd use this.
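The "same math on a billion inputs" pattern looks something like this - a rough sketch using NumPy on the CPU, since the vectorized style is the same one GPU compute libraries expose:

```python
import numpy as np

# One expression, a million evaluations: the data-parallel pattern
# that GPUs accelerate. Each output element is independent of the
# others, so the hardware can compute them all simultaneously.
x = np.linspace(0.0, 1.0, 1_000_000)
y = 3.0 * x * x + 2.0 * x + 1.0

print(y[0], y[-1])
```

Anything you can phrase this way - no loops, no branching per element, just arrays in and arrays out - is a candidate for a card like this.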