[QUOTE=BmB;18109022]No they are not. Unlike individual games, which each have quirks that may favor or punish certain features, a dedicated benchmark like 3DMark is designed to stress the hardware as uniformly as possible. I'd trust 3DMark before a Crysis bench. :P[/QUOTE]
Right, care to explain the little fiasco with PhysX, then?
[QUOTE=xZippy;18073585]There's still the people that brag about getting 30 or higher fps. 30fps is shit to me. I hate seeing anything below 50fps.[/QUOTE]
The human eye cannot distinguish between 30fps and 50fps.
[QUOTE=DainBramageStudios;18188556]The human eye cannot distinguish between 30fps and 50fps.[/QUOTE]
Get out.
As long as there is [b]some[/b] motion blur, I forgot to add.
[editline]08:33PM[/editline]
Stop breaking my automerge, it ruins the context.
Which is why stop motion looks all jerky whereas normal live action looks smooth, even though both are recorded at either 24, 25, or 30 fps.
[quote=dainbramagestudios;18188556]the human eye cannot distinguish between 30fps and 50fps.[/quote]
[b]Wrong![/b]
[QUOTE='Odellus[v2];18188571']Get out.[/QUOTE]
Movies are recorded and played back at 24 or 30 fps. The human eye can notice a difference up to 60-something fps (under ideal conditions); after that it looks the same. Although the more FPS you have, the more wiggle room you have.
[QUOTE=xZippy;18073585]There's still the people that brag about getting 30 or higher fps. 30fps is shit to me. I hate seeing anything below 50fps.[/QUOTE]
The human eye can't perceive any difference between 30FPS and anything higher in a game. If you have 30, and it's consistently above 30, then it would look the same to you as a steady 120FPS.
[B]Edit:[/B]
I should definitely have read the second page before posting this. I got ninja'd quite badly.
[QUOTE=Risonhighmer;18188644]The human eye can't perceive any difference between 30FPS and anything higher in a game. If you have 30, and it's consistently above 30, then it would look the same to you as a steady 120FPS.[/QUOTE]
Wrong. It all depends on different factors.
Read up on it here:
[url]http://www.100fps.com/how_many_frames_can_humans_see.htm[/url]
[QUOTE=Risonhighmer;18188644]The human eye can't perceive any difference between 30FPS and anything higher in a game.[/QUOTE]
No, not really. I can't believe you gave me a dumb rating for that.
Next you're going to tell me that you can see the individual frames on a television programme because they are only at 30 fps.
Or 25 if you're on PAL.
(In fairness it's actually at 60i or 50i because of interlacing)
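If the interlacing bit is unclear, here's a rough sketch of the idea in Python (purely illustrative, not how any real decoder works): each full frame is woven from two half-height fields, which is why 60 fields per second works out to roughly 30 full frames per second.
[code]
# illustrative sketch: weaving two interlaced fields into one frame

def weave(even_field, odd_field):
    """Interleave two half-height fields (lists of scanlines)
    into one full frame. even_field holds lines 0, 2, 4...;
    odd_field holds lines 1, 3, 5..."""
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)
        frame.append(odd_line)
    return frame

even = ["scanline %d" % i for i in range(0, 480, 2)]
odd = ["scanline %d" % i for i in range(1, 480, 2)]

full_frame = weave(even, odd)
print(len(full_frame))   # 480 lines: two 240-line fields woven together
# 60 fields/sec, woven pairwise, is why "60i" is roughly 30 full fps
[/code]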
Kinda strange to compare FPS with television and playing a game.
GTA4 is my new benchmark, Crysis was defeated long ago.
[QUOTE=xZippy;18189109]Kinda strange to compare FPS with television and playing a game.[/QUOTE]
Both are moving images.
Both are made of pixels changing rapidly each second.
The only difference is that the former is interactive.
[QUOTE=SomeGuest;18189166]GTA4 is my new benchmark, Crysis was defeated long ago.[/QUOTE]
GTA4 has extremely shit optimization, and it isn't GPU-heavy.
[QUOTE=DainBramageStudios;18188902]Next you're going to tell me that you can see the individual frames on a television programme because they are only at 30 fps.
Or 25 if you're on PAL.
(In fairness it's actually at 60i or 50i because of interlacing)[/QUOTE]
I'm going to repost this link because you are making flawed arguments. Frames are not the only factor.
[url]http://www.100fps.com/how_many_frames_can_humans_see.htm[/url]
[QUOTE=DainBramageStudios;18189231]Both are moving images.
Both are made of pixels changing rapidly each second.
The only difference is that the former is interactive.[/QUOTE]
Completely wrong. FPS in games means so much more. First of all, FPS in games doesn't simply deal with displaying images at a set rate; each frame also has game logic and rendering. Frames in a game don't all display at the same rate either; it constantly changes. FPS is an average of the frames displayed within that second: you could have 2 frames in a second, but that doesn't mean they landed at exactly 0.5 seconds and 1.0 seconds. Game logic works on time, and rendered images can be independent of that time. A higher FPS will help up to the point where it goes unnoticed. You just can't compare them.
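To put rough numbers on the pacing point (a toy sketch with made-up timestamps, nothing from a real game):
[code]
# toy sketch: "FPS" is an average and says nothing about pacing.
# both traces below contain 4 frames in one second, but the second
# one stutters badly. timestamps are invented for illustration.

even_pacing   = [0.25, 0.50, 0.75, 1.00]  # frame completion times (s)
uneven_pacing = [0.05, 0.10, 0.15, 1.00]  # same frame count, bad pacing

def average_fps(timestamps, window=1.0):
    return len(timestamps) / window

def worst_frame_gap(timestamps):
    gaps = [b - a for a, b in zip([0.0] + timestamps[:-1], timestamps)]
    return max(gaps)

for trace in (even_pacing, uneven_pacing):
    print(average_fps(trace), "fps average,",
          round(worst_frame_gap(trace) * 1000), "ms worst gap")
# both report 4.0 fps, but the worst gap is 250 ms vs 850 ms
[/code]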
[QUOTE=Pj The Dj;18189902]Each frame also has game logic and rendering.[/QUOTE]
I don't really know what you're trying to say with this (I'm not being sarcastic, just elaborate).
[QUOTE=DainBramageStudios;18190439]I don't really know what you're trying to say with this (I'm not being sarcastic, just elaborate).[/QUOTE]
Each frame has to contain the information of the game being calculated. It isn't like a movie, where each frame is set in stone and will flow in the direction it was recorded. Games allow you to change what you see: each time you press a key, an action occurs, and the game has to compute that and present it to you on screen. This is basically the gist of what he is saying.
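In rough code terms, something like this (a minimal sketch; the three helpers are hypothetical stand-ins, not any real engine's API):
[code]
import time

# minimal game loop sketch: every frame, poll input, advance the
# simulation, then draw. unlike film, each frame's contents depend
# on what the player just did.

def read_input():
    return []          # stub: pretend no keys were pressed

def update(events, dt):
    pass               # stub: advance physics/AI by dt seconds

def render():
    pass               # stub: draw the current game state

def game_loop(run_seconds=1.0):
    start = previous = time.perf_counter()
    frames = 0
    while time.perf_counter() - start < run_seconds:
        now = time.perf_counter()
        dt, previous = now - previous, now
        update(read_input(), dt)   # input changes what gets simulated...
        render()                   # ...so no frame is "set in stone"
        frames += 1
    print(frames, "frames rendered in", run_seconds, "second(s)")

game_loop()
[/code]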
[QUOTE=Quantuam VTX;18189293]GTA4 has extremely shit optimization, and it isn't GPU-heavy.[/QUOTE]
Yes but that's my point, the optimization is so shit that if you can run it, you can run anything.
Crysis is easily playable on today's powerhouse machines at 1920x1200 Enthusiast settings.
And since I run GTA4 at a constant 50 FPS at 1920x1200 Max, I think it's safe to say that my computer is fine to take on any game currently out there.
They usually use games that put pressure on the GPU and other PC components, but they do use other games too.
[QUOTE=Edthefirst;18190799]Each frame has to contain the information of the game being calculated. It isn't like a movie, where each frame is set in stone and will flow in the direction it was recorded. Games allow you to change what you see: each time you press a key, an action occurs, and the game has to compute that and present it to you on screen. This is basically the gist of what he is saying.[/QUOTE]
Well, that doesn't make any sense in the first place, because all the GPU does is draw the frame from the information the CPU feeds it.
[QUOTE=Edthefirst;18190799]Each frame has to contain the information of the game being calculated. It isn't like a movie, where each frame is set in stone and will flow in the direction it was recorded. Games allow you to change what you see: each time you press a key, an action occurs, and the game has to compute that and present it to you on screen. This is basically the gist of what he is saying.[/QUOTE]
But ... that still doesn't correspond to the argument here, in that I believe that 30 frames per second will flow smoothly, regardless of how it was made. It could have been pre-determined by filming it (as in television) or calculated in realtime (as in video games), it will produce the same result.
[editline]10:45PM[/editline]
Though I will admit that my entire argument simply stems from my (admittedly anecdotal) evidence that I personally find 30fps perfectly playable.
[QUOTE=DainBramageStudios;18188556]The human eye cannot distinguish between 30fps and 50fps.[/QUOTE]
a) the human eye can tell the difference, something's up with your vision if you cannot
b) the human eye does not see in framerate and thus
c) cannot be measured in framerate
[editline]06:01PM[/editline]
[QUOTE=DainBramageStudios;18188902]Next you're going to tell me that you can see the individual frames on a television programme because they are only at 30 fps.
Or 25 if you're on PAL.
(In fairness it's actually at 60i or 50i because of interlacing)[/QUOTE]
that's an entirely different field. you CAN see the difference; however, if a game has enough motion blurring and very little movement, the chances of noticing it are slimmer.
it's a combination of your brain filling in the gaps, you not trying to see it, post-processing (of the media), and the technological capabilities of displays.
which is how film/tv is pulled off. for the most part, nothing moves around enough to really see that it's only approx 30FPS, but when something is moving around enough, the display area is big enough, and no motion blur was added, you should be able to see it. it's very common, mind you, for films to be shot in such a way that such motion is somewhat blurred, and in modern times that blur is often added when it comes to CGI-related elements. when motion blur is in place, your brain fills in the blanks. in fact, the brain does this with a lot of things it sees; it's sort of the brain's own method of what's referred to in graphics as inbetweening, or simply tweening.
graphic example:
[img]http://upload.wikimedia.org/wikipedia/commons/5/53/Tweening.gif[/img]
but does that mean you shouldn't be able to notice a lack of framerate? heavens no, just that it makes it slightly less noticeable.
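here's tweening in code form, if that helps (a toy sketch, same idea as the gif above):
[code]
# toy sketch of inbetweening: generate the frames between two
# keyframes by linear interpolation, the same gap-filling the
# brain (loosely speaking) does when motion blur smooths things

def lerp(a, b, t):
    # linearly interpolate between a and b, with t in [0, 1]
    return a + (b - a) * t

def tween(start, end, inbetweens):
    steps = inbetweens + 1
    return [lerp(start, end, i / steps) for i in range(1, steps)]

# keyframe at x=0, next keyframe at x=100, three inbetweens:
print(tween(0.0, 100.0, 3))   # [25.0, 50.0, 75.0]
[/code]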
there's also the fact that tvs up until the last 3 or 4 years were never meant to display higher than ~30/25 frames, and only while interlaced. the tvs, in layman's terms, were and are slow, such that the picture sort of stays up on the display longer than it would on a tv on the market right now or on most monitors from the last 10 years, LCD or CRT (though LCDs were a notch too slow until about 6 years ago). the refresh rates on older sets were, simply put, terrible by today's standards. and it's because of this that you can't REALLY tell that it's only 23.976/29.97 frames per second, or that cartoons are (usually) only about 12 frames per second.
there's a lot more explaining i could do, but it mostly just goes in circles and takes forever. the main reason for one not being able to distinguish between 30FPS and 50FPS is motion blur. that said, even with motion blur, the difference is detectable, more so if you're trying to detect it.
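to put the motion blur point in code terms (a crude toy with made-up numbers, not a real renderer):
[code]
# crude sketch of why motion blur hides low framerates: instead of
# one sharp sample per frame, average several sub-frame samples,
# like a camera shutter staying open for part of the frame time

def position(t):
    return 300.0 * t   # an object moving 300 px/sec

def sharp_sample(frame, fps):
    return position(frame / fps)          # one instantaneous sample

def blurred_sample(frame, fps, samples=8):
    t0 = frame / fps                      # average across the exposure
    pts = [position(t0 + i / (samples * fps)) for i in range(samples)]
    return sum(pts) / samples

for f in range(3):
    print(f, sharp_sample(f, 30), round(blurred_sample(f, 30), 2))
# sharp frames jump 0 -> 10 -> 20 px; the blurred values smear
# across each gap, so the eye sees a streak instead of discrete steps
[/code]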
[QUOTE=SomeGuest;18190920]Yes but that's my point, the optimization is so shit that if you can run it, you can run anything.
Crysis is easily playable on today's powerhouse machines at 1920x1200 Enthusiast settings.
And since I run GTA4 at a constant 50 FPS at 1920x1200 Max, I think it's safe to say that my computer is fine to take on any game currently out there.[/QUOTE]
Safe? I'd say you'd be safe running any game maxed for another year or more, maybe.
also, somewhat irrelevant as octopi are not humans and humans are not octopi, but a researcher performed tests on many octopi a year or two ago. the test? video.
she showed them standard resolution (about 640x480 or 800x600, i gather) 30FPS video of different things, such as prey, predators, and things not even found in the ocean. none of the octopi reacted.
she then showed them high-resolution (not specified; i assume higher than 1080 lines of height, though) 60FPS video on a high dot-pitch screen, showing basically the same exact things (just higher res and framerate). all, or at least nearly all, of the octopi reacted.
[editline]06:07PM[/editline]
[QUOTE=lum1naire;18194819]Safe? I'd say you'd be safe running any game maxed for another year or more, maybe.[/QUOTE]
with the exact hardware he has now, a year is about it i'd assume. though it depends on which tech he decides to invest in, for example, maybe he wants a 3D display, which basically halves the framerate.
I hope it's only a year; I'd rather have technology advance than stand still. Hopefully nothing becomes too CPU-intensive and I can just upgrade my GPU.
the way things are starting to support CUDA/Stream, with GPUs handling both graphics and work previously done on CPUs (which are far slower at massively parallel floating-point work), it seems as though games will eventually join the ranks. there's already Havok/PhysX being done right on video cards (actually i'm not sure if the Havok-on-GPU thing is out and functional yet, but it's the same idea as PhysX), and GPUs sporting HDMI outputs also tend to do 5.1 audio, some 7.1. i think it's only a matter of time before the majority of PC games go back to being more GPU-intensive than CPU-intensive. it's constantly bouncing back and forth between the two.
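for a taste of what physics-on-GPU looks like from the programming side, here's a sketch using CuPy, a NumPy-like GPU array library (my choice of library, purely illustrative; it assumes a CUDA-capable card and CuPy installed):
[code]
import numpy as np
import cupy as cp   # NumPy-like arrays that live in GPU memory

def step_particles(pos, vel, dt, gravity=-9.81):
    # one Euler integration step for every particle at once;
    # the GPU runs these elementwise ops massively in parallel
    vel[:, 1] += gravity * dt
    pos += vel * dt
    return pos, vel

# a million particles: upload once, then integrate on the GPU
pos = cp.asarray(np.random.rand(1_000_000, 3).astype(np.float32))
vel = cp.zeros_like(pos)

for _ in range(60):                 # one simulated second at 60 steps
    pos, vel = step_particles(pos, vel, 1.0 / 60.0)

print(cp.asnumpy(pos[:3]))          # copy a few results back to the CPU
[/code]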
[QUOTE=dumdydum;18073930]
I'd rather use 3DMark Vantage as a benchmark though.[/QUOTE]
This. A million times. That's the first thing I start up after buying a new graphics card.
[QUOTE=dumdydum;18073930]I'd rather use 3DMark Vantage as a benchmark though.[/QUOTE]
True dat. I have an AMD Phenom II X4 925 @ 2.8GHz, SLI 2x9800GTX+, and 4GB of Corsair 800MHz 4-4-4-12 RAM, and I can only run some of the 3D presentations at 30fps, even though the graphics don't look that demanding or even that good.
[QUOTE=G-Strogg;18199957]This. A million times. That's the first thing I start up after buying a new graphics card.[/QUOTE]
first thing I do is play Crysis to feel the difference, then benchmark it to see it
[editline]02:20PM[/editline]
and then of course i try heaps of other games and stuff