• Nvidia announces GTX Titan Z, costs $2999
    110 replies
[QUOTE=Hat-Wearing Man;44351283]Can it run Crysis[/QUOTE] Probably four times over. This thing is made to benchmark and outperform consumer-level cards. The fancy bullshot videos you see at E3 for PC games usually have cards like this backing them up to give you a glimpse; then the game goes through several phases of being honed down and optimized for consumer-level hardware.
Nvidia has to keep the crown.
[QUOTE=woolio1;44351104]I almost wonder if you couldn't match the performance of this with 4-way SLI? I mean, four 780s will run you about half this, and that'd be pretty beastly. I'm not entirely sure what the purpose of this card actually is...[/QUOTE] SLI does not increase your amount of VRAM, so if you need 12 GB, then you'll want this card.
[QUOTE=milktree;44351537]People will buy 4 of these in sli.[/QUOTE] Seeing as it's a dual-GPU card, that's not exactly possible. :v:
What is the most powerful graphics card available? One that you could play ArmA 2 Warfare on maximum scale with and receive no lag whatsoever.
[QUOTE=Hat-Wearing Man;44351283]Can it run Crysis[/QUOTE] no
[QUOTE=Del91;44351612]Seeing as it's a Dual gpu card, that's not exactly possible. :v:[/QUOTE] Well you could use those 4 of them to warm your county if you so desire. New this winter, GTX Titan Z Portable Oven/Heater/Hair Dryer
[QUOTE=Smug Bastard;44351213]I think it's a bit misleading to call it a GTX Titan if it's for developers, why not call it a Quadro card like their other dev cards instead?[/QUOTE] EDIT: What I meant was that Quadro cards aren't meant for gaming.
[QUOTE=Thomo_UK;44351709]Well you could use those 4 of them to warm your county if you so desire. New this winter, GTX Titan Z Portable Oven/Heater/Hair Dryer[/QUOTE] My GTX 590 does in fact make my room quite toasty unless I crack the window.
[QUOTE=AGMadsAG;44351608]SLI does not increase your amount of VRAM, so if you need 12 GB, then you'll want this card.[/QUOTE] Except this only has 6 GB of usable VRAM. As much as a normal Titan.
Oh gee 12GB of VRAM. Finally, GTA IV will run at max!
[QUOTE=GameDev;44351518]12GB of VRAM? christ[/QUOTE] 6GB per GPU. 12GB total.
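The memory accounting people are arguing about above can be sketched in a few lines. This is just an illustration, assuming SLI's usual alternate-frame rendering, where each GPU holds a full copy of every texture and buffer, so memory is mirrored rather than pooled; the function name is made up for the example:

```python
# Sketch: advertised vs. usable VRAM on a dual-GPU card under SLI.
# Under alternate-frame rendering, assets are mirrored on every GPU,
# so a game only ever sees one GPU's worth of memory.
def usable_vram_gb(per_gpu_gb, num_gpus, mirrored=True):
    """Return the VRAM a game can actually use, in GB."""
    if mirrored:
        return per_gpu_gb          # mirrored copies don't add up
    return per_gpu_gb * num_gpus   # what the box advertises

# Titan Z: two GPUs with 6 GB each.
print(usable_vram_gb(6, 2))                   # usable: 6
print(usable_vram_gb(6, 2, mirrored=False))   # advertised: 12
```

So both posters are right: the card carries 12 GB total, but a game running on it sees 6 GB, the same as a single Titan.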
Still runs Arma at 20fps.
[QUOTE=patq911;44351195]TB is going to buy 4.[/QUOTE] gonna run an fov of 595
[QUOTE=AGMadsAG;44351608]SLI does not increase your amount of VRAM, so if you need 12 GB, then you'll want this card.[/QUOTE] Well, who would need 12 GB of VRAM?
So basically this graphics card is for some gaming use and mostly developer work?
Could do with one of these for my workstation. Goddamn.
The most expensive I've seen was about $4,500, from one of those doctor computers.
you plebs wouldn't know TRUE GPU POWER [t]http://i.imgur.com/BHlJAWP.png[/t] [t]http://i.imgur.com/0FrD5Fg.jpg[/t]
This MIGHT offset my electrical heating costs... might. Also, my 4GB GTX 680 runs everything at max at 2K resolution, so I'll pass on these until the generation where I actually need them.
And TB will buy two and wonder why games are not optimized around it.
[img]http://s16.postimg.org/c0y00jj51/lol2.jpg[/img]
Jesus christ, $3,000 for a graphics card!?
[QUOTE=GrizzlyBear;44351130]The sad thing is, I know someone who will probably buy it anyway. His response to Rome II at 30 fps was buying the current Titan and a whole new computer, then he found out he would have gotten 60 fps anyway if he had waited for patches.[/QUOTE] I had a friend who bought a computer, top-of-the-line stuff for mid-2013. Then he saw the Titan, sold his new rig to me for $400, and bought a brand new rig and a Titan.
Perfect for true hardcore pc gamers to buy and then play WoW and Minecraft with it.
I'm guessing this is more for people doing some serious rendering
You use a CPU to render, though. There are only three or four programs with GPU rendering, and I think in some cases GPU rendering doesn't have all the features of CPU rendering (such as displacement mapping in V-Ray).
Interested to see the benchmarks on such an expensive card, wonder how many extra fps it'll get over a 780 ti?
[QUOTE=Xmeagol;44351716]those are not meant for playing games of developing for games[/QUOTE] excellent
[QUOTE=CrimsonChin;44352577]You use a CPU to render though.[/QUOTE] I don't. I much prefer being able to path trace images in 15 seconds as opposed to 3 minutes. The speed difference in unbiased rendering applications on GPUs is absolutely staggering. If you're into biased renderers then the CPU reigns king, but the film industry seems to be headed towards unbiased renderers, so unless big changes get made to CPU architecture, it's likely that GPUs will eventually take over. Arnold is a bit of an exception, but it's likely Arnold will also get full GPU support in a year or two.
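For what it's worth, the speedup from the times quoted above (an anecdote from the poster, not a benchmark) works out like this:

```python
# Rough speedup arithmetic for the quoted path-tracing times:
# ~3 minutes on CPU vs. ~15 seconds on GPU for the same image.
cpu_seconds = 3 * 60
gpu_seconds = 15
speedup = cpu_seconds / gpu_seconds
print(speedup)  # 12.0
```

A 12x difference per iteration is roughly what makes unbiased GPU renderers feel interactive where CPU ones don't.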