• Nvidia announces GTX Titan Z, costs $2999
[QUOTE=CrimsonChin;44352577]You use a CPU to render, though. There are only 3 or 4 programs with GPU rendering, and I think GPU rendering doesn't have all the features of CPU rendering.[/QUOTE] This is exactly why I have five Titans on one of the numerous render machines at my 3DVis business. [QUOTE=J!NX;44352731]Titan is for people with more money than brains[/QUOTE] If they're gamers, hell yeah.
Titan is for people with more money than brains
[QUOTE=dai;44352728]this is exactly why I have five titans on a single render machine[/QUOTE] My two GTX 680s are crying tears :[
[QUOTE=CrimsonChin;44352577]You use a CPU to render, though. There are only 3 or 4 programs with GPU rendering, and I think GPU rendering doesn't have all the features of CPU rendering.[/QUOTE] Well, not necessarily. A GPU render is generally not as fully sampled as a CPU render, because GPU rendering is mostly used to preview how a scene will look before starting the long CPU render; the better the GPU, the faster and better that preview, which reduces the chance of missed errors. As for having all the features of the CPU renderer, that depends on the specific program, but most GPU rendering I've done or seen includes the normal rendering processes unless they incur a large time cost (which would make on-the-go rendering sort of pointless). Don't get me wrong, CPU rendering is still vastly more important in most cases, but GPU rendering is seriously getting to the point where calling it insignificant is very incorrect. Hell, most bigger studios use GPU rendering in some form, be it an in-house engine or a professional toolset, and I bet any of them would say it's very beneficial to quickly render a preview as well as render the scene on the go.
[QUOTE=J!NX;44352731]Titan is for people with more money than brains[/QUOTE] Path tracing benchmarks: [t]http://www.systemagnostic.com/wp-uploads/2013/06/chart.png[/t]
[QUOTE=hypno-toad;44352741]My two GTX 680s are crying tears :[[/QUOTE] 580s running in SLI will actually outdo a Titan in a number of cases, which is intriguing. Really, dual 580s are probably the most your average gamer needs right now, provided you're not a rich bastard who wants to run a 4K screen.
I want to upgrade from my 590 because my power bill is crying tears
[QUOTE=tarkata14;44352748]Well, not necessarily. A GPU render is generally not as fully sampled as a CPU render, because GPU rendering is mostly used to preview how a scene will look before starting the long CPU render; the better the GPU, the faster and better that preview, which reduces the chance of missed errors. As for having all the features of the CPU renderer, that depends on the specific program, but most GPU rendering I've done or seen includes the normal rendering processes unless they incur a large time cost (which would make on-the-go rendering sort of pointless). Don't get me wrong, CPU rendering is still vastly more important in most cases, but GPU rendering is seriously getting to the point where calling it insignificant is very incorrect. Hell, most bigger studios use GPU rendering in some form, be it an in-house engine or a professional toolset, and I bet any of them would say it's very beneficial to quickly render a preview as well as render the scene on the go.[/QUOTE] Well, it also depends on what type of rendering. Is it CG, or something like Sony Vegas or After Effects? I've tried many renderers for CG, and most don't even have a GPU rendering feature.
[QUOTE=TheNerdPest14;44351625]What is the most powerful graphics card available? One that you could play ArmA 2 Warfare on maximum scale with and receive no lag whatsoever.[/QUOTE] A GPU probably won't help with Warfare all that much, since most of the FPS loss is due to script lag and AI, which are handled by the CPU anyway.
[QUOTE=milktree;44351537]People will buy 4 of these in sli.[/QUOTE] jesus christ are they trying to build the sun
[QUOTE=MatheusMCardoso;44351532]Imagine using this to mine crypto currencies.[/QUOTE] I'm doing this when I finish building my new PC.
If only Nvidia's GPUs were any good at mining cryptocurrency.
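For context on the mining posts above, here's a toy sketch of the proof-of-work inner loop that GPU miners of the era brute-forced: double SHA-256 over a block header plus a nonce, repeated until the hash falls below a difficulty target. The header bytes and target below are made up for illustration; this is not any real mining software.

```python
import hashlib

def mine(header: bytes, target: int, max_nonce: int = 1_000_000):
    """Grind nonces until double-SHA256(header + nonce) is below target."""
    for nonce in range(max_nonce):
        data = header + nonce.to_bytes(4, "little")
        digest = hashlib.sha256(hashlib.sha256(data).digest()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest.hex()  # found a "valid block"
    return None  # no valid nonce in range

# Hypothetical header and an easy target (top 16 bits must be zero):
result = mine(b"example-block-header", 1 << 240)
print(result)
```

Every nonce is an independent trial, which is why the workload parallelizes so well onto thousands of GPU shader cores — and why AMD's integer-heavy architectures of the time outpaced Nvidia's at it.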
[QUOTE=hypno-toad;44352719]I don't. I much prefer being able to path trace images in 15 seconds as opposed to 3 minutes. The speed difference in unbiased rendering applications for GPUs is absolutely staggering. If you're into biased renderers then CPU reigns king, but the film industry seems to be headed towards unbiased renderers, so unless big changes get made to CPU architecture, it's likely that GPUs will eventually take over. Arnold render is a bit of an exception, but it's likely Arnold will also switch over to having full GPU support in a year or two.[/QUOTE] Can you tell me the names of some good renderers that do GPU rendering with all features and are compatible with OpenCL for AMD GPUs? I'm looking for one to replace V-Ray. The ones I've looked at that impress me seem to only work with that Nvidia thing I can't remember the name of.
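To illustrate the unbiased-renderer point in the quote above: an unbiased estimator just averages independent random samples, so its error shrinks as 1/sqrt(N) and every sample can run in parallel, which is exactly the workload GPUs excel at. A minimal sketch of that convergence behavior (estimating pi by Monte Carlo, standing in for per-pixel radiance samples; nothing here comes from an actual renderer):

```python
import random

def monte_carlo_pi(n_samples: int, seed: int = 0) -> float:
    """Unbiased Monte Carlo estimate of pi.

    Each sample is independent, like the per-pixel radiance samples
    in an unbiased path tracer, so the work maps naturally onto
    thousands of GPU threads.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:  # point lands inside the quarter circle
            hits += 1
    return 4.0 * hits / n_samples

# Noise drops roughly as 1/sqrt(N): quadrupling samples halves the error.
print(monte_carlo_pi(1_000))
print(monte_carlo_pi(100_000))
```

The same averaging structure is why a GPU can turn a 3-minute path trace into a 15-second one: there is no sequential dependency between samples to get in the way.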
[QUOTE=SebiWarrior;44351073]Good luck selling that Nvidia.[/QUOTE] [quote]The GeForce GTX Titan Z[b] isn’t specifically aimed at gamers [/b]and is also available for developers and content creators who can now use the power of dual GK110 cores to develop rich content[/quote]
[QUOTE=dai;44352728]this is exactly why I have five titans on one of the numerous render machines at my 3DVis business[/QUOTE] Lighten up, I didn't say 100%, but most renderers don't have a GPU feature. What renderer do you use?
[QUOTE=Covalency;44351546]Probably 4 times over. This thing is made to benchmark and outperform consumer-level cards. What you see in fancy bullshot videos at E3 for PC games, usually these are the cards backing it up to give you a glimpse. Then it goes through several phases of being honed down and optimized for the consumer level.[/QUOTE] If you think that the GTX Titan Z could run [I]only[/I] 4 copies of Crysis at once, you're in for one hell of a surprise.
[QUOTE=GrizzlyBear;44351130]The sad thing is, I know someone who will probably buy it anyway. His response to Rome II at 30 fps was buying the current Titan and a whole new computer, then he found out he would have gotten 60 fps anyway if he had waited for patches.[/QUOTE] Hah, do we have the same friend? Almost exactly the same thing happened to mine.
[QUOTE=CrimsonChin;44352970]Lighten up, I didn't say 100%, but [B]most renderers don't have a GPU feature.[/B] What renderer do you use?[/QUOTE] Loads of renderers work on the GPU: Cycles, LuxRender, Octane, V-Ray, Indigo, iray, etc.
[QUOTE=CrimsonChin;44352951]Can you tell me the names of some good renderers that do GPU rendering with all features and are compatible with OpenCL for AMD GPUs? I'm looking for one to replace V-Ray.[/QUOTE] [URL="http://furryball.aaa-studio.eu/"]Furryball[/URL] doesn't seem to use CUDA or OpenCL and has some sort of proprietary system. [URL="http://www.luxrender.net/en_GB/index"]Luxrender[/URL] currently has hybrid path tracing support for OpenCL devices. But frankly, OpenCL is going nowhere fast in regards to raytracing renderers. For CUDA programs, [URL="http://render.otoy.com/"]Octane[/URL] and [URL="http://wiki.blender.org/index.php/Doc:2.6/Manual/Render/Cycles"]Cycles[/URL] are the best dedicated GPU renderers. Octane has no limitations; Blender Cycles can't do anything absorption- or scattering-related (that means no SSS shaders and no volumetric materials in GPU rendering mode).
[QUOTE=hypno-toad;44353067][URL="http://furryball.aaa-studio.eu/"]Furryball[/URL] doesn't seem to use CUDA or OpenCL and has some sort of proprietary system. [URL="http://www.luxrender.net/en_GB/index"]Luxrender[/URL] currently has hybrid path tracing support for OpenCL devices. But frankly, OpenCL is going nowhere fast in regards to raytracing renderers. For CUDA programs, [URL="http://render.otoy.com/"]Octane[/URL] and [URL="http://wiki.blender.org/index.php/Doc:2.6/Manual/Render/Cycles"]Cycles[/URL] are the best dedicated GPU renderers. Octane has no limitations; Blender Cycles can't do anything absorption- or scattering-related (that means no SSS shaders and no volumetric materials in GPU rendering mode).[/QUOTE] It's a shame, really, that OpenCL isn't being used, since AMD GPUs are massively faster at this sort of stuff. [editline]25th March 2014[/editline] [QUOTE=alien_guy;44353058]Loads of renderers work on the GPU: Cycles, LuxRender, Octane, V-Ray, Indigo, iray, etc.[/QUOTE] Still, considering there are a buttload of renderers, I don't think 8 is much. Plus, I don't know about the others, but V-Ray can't use many important features in its GPU render mode.
[QUOTE=GoDong-DK;44351769]6GB per GPU. 12GB total.[/QUOTE] SLI doesn't combine the VRAM between two GPUs. If it's 6GB per chip, that means it'll be 6GB effective even if there's 12GB total.
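A toy sketch of the point above: under SLI's alternate-frame rendering, each GPU renders whole frames independently, so every card must hold its own full copy of the textures and geometry. The function below just encodes that arithmetic, using the Titan Z's advertised figures:

```python
def sli_vram(per_gpu_gb: float, n_gpus: int) -> dict:
    """SLI mirrors scene data across cards, so effective VRAM is one
    card's worth, even though marketing adds the cards together."""
    return {
        "advertised_gb": per_gpu_gb * n_gpus,  # what the box says
        "effective_gb": per_gpu_gb,            # what a scene can actually use
    }

print(sli_vram(6, 2))  # Titan Z: 12GB advertised, 6GB effective
```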
[QUOTE=Hat-Wearing Man;44351283]Can it run Crysis[/QUOTE] can it run gta iv decently is the real question
I fail to see why one would use more than 6GB. Unless you're trying to build the matrix.
[QUOTE=Zukriuchen;44353159]can it run gta iv decently is the real question[/QUOTE] Isn't GTA IV limited by the CPU, not the GPU?
[QUOTE=CrimsonChin;44353103]It's a shame, really, that OpenCL isn't being used, since AMD GPUs are massively faster at this sort of stuff.[/QUOTE] Not really. It's a complex issue that both the Octane and Cycles programmers have run into: [URL]http://blender.stackexchange.com/questions/452/what-is-the-technical-reason-that-blender-cannot-use-opencl-on-amd-graphics-card[/URL] I'd like to see OpenCL and AMD resolve the issue, though, because Nvidia having a functional monopoly on GPU rendering is not good.
[QUOTE=hypno-toad;44353067] For CUDA programs, [URL="http://render.otoy.com/"]Octane[/URL] and [URL="http://wiki.blender.org/index.php/Doc:2.6/Manual/Render/Cycles"]Cycles[/URL] are the best dedicated GPU renderers. Octane has no limitations; Blender Cycles can't do anything absorption- or scattering-related (that means no SSS shaders and no volumetric materials in GPU rendering mode).[/QUOTE] I use Octane, it's fantastic. I render on a 3x680 rig, so a single GPU with 6x the VRAM plus 1K more CUDA cores is only good news for me.
I haven't read the details, but is there some advantage to getting the dragon titan z before the line of cards that run DX12?
We're kinda getting to the point where the GPU is less of a peripheral and more like the computer itself is a peripheral to the GPU.
[QUOTE=Egon Spengler;44351107]I'd imagine that you would turn this thing on, and every single light on your street would immediately flicker.[/QUOTE] more like every house on the block loses power and your house lights up like a las vegas casino.
[QUOTE=God of Ashes;44351231]more like it'd burn the whole block down[/QUOTE] Nitrogen Cooling Required Sold Separately