Oh look a video: [U]GTX 480 Unigine and 3D Vision Surround Demo (GF100)
[/U]
[media]http://www.youtube.com/watch?v=vpdPSZB8A8E&feature=player_embedded#[/media]
-snip-
When did the price of 5970 get raised to $700?
Also I think if the GTX 480 is priced at $700 they should compare it to the 5970 instead.
Lol @ that test. Let's see them actually turn anti-aliasing on, then compare the fps differences.
[img]http://i.imgur.com/9ROAO.jpg[/img]
I was waiting to buy a computer until I knew Nvidia's next move; seeing this, I think I'll buy an ATI for sure... It's like Nvidia thinks we are stupid... they show up six months later with that mediocre device...
^ what
[editline]09:15PM[/editline]
it looks ok to me as long as they price it right
I forgot to say that they are comparing their device with the 4XXX gamma; maybe they should compare their NEW product with the already existing 5XXX gamma, the latest from ATI...
It's really similar to the 4XXX, so I think it's worse than the 5700.
I'm the first to defend Nvidia and like their products; in fact I was looking forward to hearing about Fermi, but this time I have to be objective: they've failed.
[QUOTE=Rigal;20564537]I forgot to say that they are comparing their device with the 4XXX gamma; maybe they should compare their NEW product with the already existing 5XXX gamma, the latest from ATI...
It's really similar to the 4XXX, so I think it's worse than the 5700.
I'm the first to defend Nvidia and like their products; in fact I was looking forward to hearing about Fermi, but this time I have to be objective: they've failed.[/QUOTE]
what the fuck are you talking about
WOW what a nice box design!!!!!!
[QUOTE=DOG-GY;20565287]WOW what a nice box design!!!!!![/QUOTE]
it's fake
[QUOTE=OCELOT323;20565495]it's fake[/QUOTE]
thanks for telling me
[QUOTE=Rigal;20564537]I forgot to say that they are comparing their device with the 4XXX gamma; maybe they should compare their NEW product with the already existing 5XXX gamma, the latest from ATI...
It's really similar to the 4XXX, so I think it's worse than the 5700.
I'm the first to defend Nvidia and like their products; in fact I was looking forward to hearing about Fermi, but this time I have to be objective: they've failed.[/QUOTE]
whats a gamma
what are you talking about
[QUOTE=Robbazking;20564516][img]http://resources.vr-zone.com//uploads/8543/2.jpg[/img]
If this is true, I bet the GTX 480 ain't much better.
But if it's fake, well... then you can buy my own GTX 480!
[img]http://pici.se/pictures/quAvgYmXC.jpg[/img]
Look at that shit, fucking awesomely designed box.[/QUOTE]
Very nice, how much?
[QUOTE=Rigal;20563950][img]http://i.imgur.com/9ROAO.jpg[/img][/QUOTE]
Saw this earlier and did a lot of thinking about it.
I think that benchmark is meaningless.
The only time you see a significant performance difference is when the dragon is rendered. Now, the dragon has so many fucking polygons (I've hit F2 and actually seen it, it's fucking unbelievable), it's at the extreme level, and because of this it's nothing to do with the 480; it's the 5870 that struggles. Why? Because I seriously doubt we will see that many polygons being rendered in an actual real-world game that uses tessellation. That benchmark is the absolute best-case scenario for Nvidia. Too bad we'll never see that significant performance difference in a real game.

Now, if you look, sometimes the cards are on par. Tessellation is still enabled, but there aren't as many polygons on the screen. So, most likely, in tessellation-enabled games we'll see something in between, maybe a 10-20 percent difference. What is worrisome for Nvidia is that they were actually on par at a few points, because there weren't nearly as many polys being rendered. This leads me to believe that if tessellation was disabled and they tested it, or they tested a regular game, the performance would be the same.

Also, no AA or AF was used in that benchmark. Lemme guess: they probably tried AA and AF and found the performance was very close, oh geez. And one more thing, that benchmark was made by Nvidia, so they could easily have biased it. I say this because I'm suspicious of them; they've done it before, and they've also done that driver hack thing where performance was increased but image quality was reduced. They are creatures of habit.

Not to sound like a fanboy in the last paragraph, but I just hate some of the shit they try to pull off hoping that no one like me notices it.
[editline]12:32AM[/editline]
Also, look at what Robbazking posted with that Crysis benchmark. It's done with a 470, but take a look at the Unigine Heaven benchmark. It's faster there, but 5-10 fps slower in games, so I would probably be right saying the 480 is only on par in a game like Crysis.
We'll have to wait and see.
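For anyone curious about the polygon blow-up being described: each level of uniform tessellation typically splits one triangle into four, so triangle counts grow geometrically with the tessellation level. A rough back-of-the-envelope sketch in Python (the base mesh size is made up, just to illustrate the scaling, not actual Heaven benchmark data):

```python
# Each uniform tessellation level splits one triangle into four,
# so the triangle count grows as base * 4**level.
def tessellated_triangles(base_triangles: int, level: int) -> int:
    return base_triangles * (4 ** level)

base = 10_000  # hypothetical base mesh size, not real Heaven data
for level in range(5):
    print(level, tessellated_triangles(base, level))
# level 3 alone already means 640,000 triangles from a 10k mesh
```

This is why a scene like the dragon at "extreme" tessellation is such an outlier compared to typical game workloads.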
[QUOTE=Dark-Energy;20568563]Saw this earlier and did a lot of thinking about it.
I think that benchmark is meaningless.
The only time you see a significant performance difference is when the dragon is rendered. Now, the dragon has so many fucking polygons (I've hit F2 and actually seen it, it's fucking unbelievable), it's at the extreme level, and because of this it's nothing to do with the 480; it's the 5870 that struggles. Why? Because I seriously doubt we will see that many polygons being rendered in an actual real-world game that uses tessellation. That benchmark is the absolute best-case scenario for Nvidia. Too bad we'll never see that significant performance difference in a real game.

Now, if you look, sometimes the cards are on par. Tessellation is still enabled, but there aren't as many polygons on the screen. So, most likely, in tessellation-enabled games we'll see something in between, maybe a 10-20 percent difference. What is worrisome for Nvidia is that they were actually on par at a few points, because there weren't nearly as many polys being rendered. This leads me to believe that if tessellation was disabled and they tested it, or they tested a regular game, the performance would be the same.

Also, no AA or AF was used in that benchmark. Lemme guess: they probably tried AA and AF and found the performance was very close, oh geez. And one more thing, that benchmark was made by Nvidia, so they could easily have biased it. I say this because I'm suspicious of them; they've done it before, and they've also done that driver hack thing where performance was increased but image quality was reduced. They are creatures of habit.

Not to sound like a fanboy in the last paragraph, but I just hate some of the shit they try to pull off hoping that no one like me notices it.
[editline]12:32AM[/editline]
Also, look at what Robbazking posted with that Crysis benchmark. It's done with a 470, but take a look at the Unigine Heaven benchmark. It's faster there, but 5-10 fps slower in games, so I would probably be right saying the 480 is only on par in a game like Crysis.
We'll have to wait and see.[/QUOTE]
faaaaaaaaaaanboooooooooooy
[QUOTE=OCELOT323;20565560]whats a gamma
what are you talking about[/QUOTE]
I made a mistake with a false friend from my language... "gama"... I meant range of products, I don't know if you can understand it.
So what 3 monitors on 2 cards?
ATI's Eyefinity costs a shit load.
[QUOTE=OCELOT323;20560848][URL]http://www.sabrepc.com/p-174-nvidia-geforce-gtx-480-2gb-gddr5-pci-express-x16.aspx[/URL]
heres a preorder
lol 700 dollars[/QUOTE]
It's actually $600, and I'm quite sure that has been proven fake; didn't Nvidia themselves say that pre-orders are [B]not[/B] available yet? Besides, if it was real I'd expect Nvidia to make a lot more of a buzz about it.
Also, I won't trust any benchmark until some reputable (and neutral) source backs it up (so far it's either the second coming of Christ or barely on par with the 5870).
[editline]12:38PM[/editline]
Also, the three monitors on two cards are for 3D.
Not that I care, because I couldn't stand the borders of the screens cutting through anyway.
[QUOTE=Rigal;20569478]I made a mistake with a false friend from my language... "gama"... I meant range of products, I don't know if you can understand it.[/QUOTE]
Ah, I believe the word you want is "series".
I think this is pretty much what Charlie has been screaming about, no? Also, someone with a 5870 / 5850 should do up a graph running Heaven with Nvidia's settings. It'll be fun.
[QUOTE=ShaRose;20570792]I think this is pretty much what Charlie[/QUOTE]
and then I remembered where I've seen this idiot before
[IMG]http://i46.tinypic.com/2ignbxf.jpg[/IMG]
More [URL="http://www.pcgameshardware.com/aid,705745/Geforce-GTX-480-Fermi-graphics-card-pictured-at-Cebit-Update-More-pictures-retail-box-and-power-consumption/News/&menu=browser&article_id=705745"]here[/URL].
[QUOTE=Blackcomb;20569131]faaaaaaaaaaanboooooooooooy[/QUOTE]
lol, no
No one buys a high-end GPU and doesn't run AA. Yeah, it's a benchmark, but how they chose to put out this info is rather suspicious. It doesn't show confidence in their product. The way I see it, if there wasn't anything to hide, they'd have done a real-world comparison.
I thought tessellation was supposed to, among other things, make AA irrelevant. At least that's what I heard. I know it creates higher-quality images/models/objects/what have you, but can someone just clear up the AA thing?
[QUOTE=ShaRose;20570792]I think this is pretty much what Charlie has been screaming about, no? Also, someone with a 5870 / 5850 should do up a graph running Heaven with Nvidia's settings. It'll be fun.[/QUOTE]
:downs:
[editline]03:29PM[/editline]
[QUOTE=Spartan8907;20574654]I thought tessellation was supposed to, among other things, make AA irrelevant. At least that's what I heard. I know it creates higher-quality images/models/objects/what have you, but can someone just clear up the AA thing?[/QUOTE]
it has nothing to do with anti aliasing
[editline]03:30PM[/editline]
[QUOTE=Dark-Energy;20574652]lol, no
No one buys a high-end GPU and doesn't run AA. Yeah, it's a benchmark, but how they chose to put out this info is rather suspicious. It doesn't show confidence in their product. The way I see it, if there wasn't anything to hide, they'd have done a real-world comparison.[/QUOTE]
They ran the cards at the same settings, and they performed around the same for the most part, except in areas of heavy tessellation usage, where the 480 shone. What are you going on about?
[editline]03:30PM[/editline]
[QUOTE=acds;20569511]It's actually $600, and I'm quite sure that has been proven fake; didn't Nvidia themselves say that pre-orders are [B]not[/B] available yet? Besides, if it was real I'd expect Nvidia to make a lot more of a buzz about it.
Also, I won't trust any benchmark until some reputable (and neutral) source backs it up (so far it's either the second coming of Christ or barely on par with the 5870).
[editline]12:38PM[/editline]
Also, the three monitors on two cards are for 3D.
Not that I care, because I couldn't stand the borders of the screens cutting through anyway.[/QUOTE]
No one knows the price of any of the Fermi cards yet.
Awesome, can't wait for the new gpu.
[QUOTE=Blackcomb;20575721]
it has nothing to do with anti aliasing
[/QUOTE]
That's not what I meant; I know that. But if everything has more polygons, then that means fewer jaggies, and fewer jaggies means less of a need for AA. Is that how it works or not?
Not quite the powerhouse they made it out to be, it seems.
Have they optimized Unigine yet? I remember a major gripe about previous benchmarks like these was that it was poorly optimized.
[QUOTE=Spartan8907;20579504]That's not what I meant; I know that. But if everything has more polygons, then that means fewer jaggies, and fewer jaggies means less of a need for AA. Is that how it works or not?[/QUOTE]
AA is a technique for minimizing the distortion artifacts (known as aliasing) that appear when a high-resolution signal is represented at a lower resolution. In short, polygon count has nothing to do with AA.
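To make that concrete: the brute-force form of AA (supersampling) just renders at a higher resolution and averages the extra samples back down, which smooths the jaggies regardless of how many polygons produced them. A minimal sketch in Python, assuming a grayscale image stored as nested lists (toy data, not a real renderer):

```python
# 2x2 supersampling resolve: average each 2x2 block of a high-res
# grayscale "render" down to one output pixel, smoothing hard edges.
def downsample_2x(image):
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block = (image[y][x] + image[y][x + 1] +
                     image[y + 1][x] + image[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out

# A hard black/white edge turns into an intermediate gray after filtering.
hi_res = [[0, 0, 255, 255],
          [0, 0, 255, 255],
          [0, 255, 255, 255],
          [0, 255, 255, 255]]
print(downsample_2x(hi_res))  # -> [[0.0, 255.0], [127.5, 255.0]]
```

The 127.5 pixel is the "softened" edge; that averaging is what AA does, and it works the same whether the edge came from ten polygons or ten million.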
[QUOTE=reapaninja;20573262]and then I remembered where I've seen this idiot before[/QUOTE]
[QUOTE=Blackcomb;20575721]:downs:[/QUOTE]
Nice job, guys, but Charlie WAS right on this one.
[QUOTE=http://www.semiaccurate.com/2010/02/20/semiaccurate-gets-some-gtx480-scores/]There is one bright spot, and it is a very bright spot indeed. No, not the thermal cap of the chip, but the tessellation performance in Heaven. [B]On that synthetic benchmark, the numbers were more than twice as fast as the Cypress HD5870, and will likely beat a dual chip Hemlock HD5970.[/B] (As cypress uses dedicated tessellation, while fermi uses modified shaders) The sources said that this lead was most definitely not reflected in any game or test they ran, it was only in tessellation limited situations where the shaders don't need to be used for 'real work'. (Because they can only do one thing at a time, tessellate or shader work.)[/QUOTE]
Perhaps you should stop acting like idiots and admit that he was right.