• What would be the max "FPS" an eye can see?
    58 replies, posted
[QUOTE=Nerts;24160773]Eyes do not work that way, it's like asking what percentage hungry you are.[/QUOTE] About 12 percent.
[QUOTE]Frames Per Second

Frames Per Second, often called Frame rate, or simply FPS, is the most commonly used term in gaming, and yet one of the more complex and often misunderstood ones. Most people generally know that higher FPS is better, but beyond that there are a lot of misconceptions that need to be clarified. Understanding framerate is the key to understanding the rest of the settings in this section, so do not skip it.

A computer image is made up of lots of small dots called Pixels, the number of which at any time depends on your chosen resolution - for more details see the Resolution section of this guide. When viewed from a reasonable distance, your brain can take all these dots, put them together and perceive them as a single image. This single still image is called a Frame, and as we've seen in the Graphics Process section, a 3D game pretty much turns your entire system into a frame factory, with your graphics card constantly producing new frames every second.

However, these frames are only still images, like photographs. For a scene to appear fluid and animated on a computer screen, it has to be made up of lots of different still frames shown rapidly one after the other. This happens to be the same way that movies and TV work: a rapid slideshow of still images, each one slightly different from the last, to achieve the appearance of natural motion. The only reason this method works is that your eyes have a property called Persistence of Vision - they retain an image for a tiny fraction of a second even after it has disappeared. This mechanism is absolutely critical to the way in which TV, movies and computer displays work.

This is where we reach our first misconception: FPS and Refresh Rate are two separate and independent things. It is possible to have 20FPS on a screen with an 85Hz refresh rate, or have 100FPS on a screen with a 60Hz refresh rate.
FPS is the rate at which your graphics card is producing new frames; Refresh Rate is the rate at which your monitor is refreshing whatever is displayed on your screen. We discuss this further in the Refresh Rate and Response Time sections of this guide.

Measuring Framerate

FPS is not something which is easy to judge with the naked eye. Fortunately, it is quite easy to actually measure how many frames per second your game is running at. Some games have a built-in FPS counter which you can activate, but the easiest method of enabling an FPS counter in any game is to use the free FRAPS utility. This utility does not impact framerate in any significant way, and is accurate in all games. Install it, launch it and then launch a 3D game - a yellow FPS counter will appear in the corner of your screen indicating how many FPS is currently being produced by your graphics card.

Once people become aware of their actual FPS, the most commonly asked question is: "Is my FPS high enough?", to which you may hear such varied replies as: "Any more than 30FPS is a waste because the human eye can't see any more than that", or "You need at least 30FPS for the game to be playable". These sorts of arbitrary comments show the confusion that abounds regarding FPS, and I'll attempt to clarify them below.

Minimum FPS

There is definitely a 'minimum' FPS - a point at which, if the FPS in a game becomes low enough, your eyes and brain will begin to notice that an animated image on the screen no longer looks smooth; it becomes quite apparent that it is a series of still images being displayed in sequence, like a slideshow. However, the exact minimum number of frames required to keep things smooth in a game is not a set scientific value. It will vary from person to person, and importantly it also varies by game type. In my experience, the baseline for acceptably smooth graphics is around 25 frames per second, if rendered consistently without stuttering or dipping lower.
But this is only a subjective starting point, and not a definitive answer. There are several important factors to consider when talking about minimum acceptable framerate:

# Variability of FPS: If the FPS in a game constantly fluctuates, particularly if it dips into the low teens at any point, then this is usually a scenario where it's safe to say you are not getting 'enough' FPS. To maintain the illusion of smoothness, FPS needs to be held relatively stable such that at its lowest point it still seems reasonably smooth, whatever that level may be. For example, gaming consoles typically use a locked framerate cap of 30FPS, meaning some console games can only draw a maximum of 30 frames per second. This may sound quite low for a maximum, however because the minimum FPS also stays very close to this framerate cap most of the time, and hence variability of FPS is quite low, 30FPS is sufficient for the impression of smooth animation. Unfortunately most PC games do not allow you to lock FPS at a particular rate.

# Game Type: For games which have a lot of fast motion and hectic action, particularly deathmatch-style first person shooters like Team Fortress 2, Counter-Strike or the Battlefield series, higher minimum framerates may be required (e.g. a minimum of 40FPS) to make things feel sufficiently smooth and responsive. This is in part because the large amount of fast action means that the content of each individual frame is noticeably different from the previous one, so low FPS is much more noticeable in such games. Conversely, games which are much slower in pace such as RTS or RPG games can get away with lower minimum framerates. As an extreme example, if you were staring at a wall in a game, you could go down to 5 or even 1 FPS and not notice any difference. So the speed of gameplay affects the perception of smoothness and hence alters the minimum FPS requirement.
# Control Lag: Unlike a movie or TV, you actually control the camera in a game, typically the view through your character's eyes. The lower your FPS, the more 'laggy' on-screen reactions to your mouse and keyboard input will feel, and this also adds to the perception of certain framerates being inadequate. Typically at around 15 FPS or less, virtually every game exhibits noticeable mouse/control lag regardless of any sensitivity settings you alter, and this in turn makes the framerate feel entirely inadequate regardless of what your eyes perceive. See the Vertical Sync and Triple Buffering sections for some ways of possibly fixing this issue, but the most common fix is to change your settings so as to raise your FPS until there is no noticeable lag. Control lag due to low FPS is simply what happens when your graphics card struggles to keep up and the resulting frames fall slightly out of sync with your input commands. The more the game relies on responsiveness, the more noticeable this lag will become.

The key point to take away from this discussion is that there is no set scientific value for minimum FPS. Your eyes and brain don't have a precise trigger point at which, say, 23 FPS appears stuttery but 24 FPS is smooth. It's more complex than that, and all the factors above need to be considered. If a game looks and feels smooth to you, you are getting 'enough' FPS as a minimum, but don't expect other people to agree with or experience the exact same thing - it is in large part subjective.

Maximum FPS

The concept that there is a maximum possible FPS beyond which the human eye can't distinguish any real difference is not entirely accurate. For more details, see this article and this article among the many which refute this claim.
In particular, the common claim that "The human eye can't see more than 24 (or 25 or 30 or 60) FPS" is completely false, and partly stems from the misconception that TV or movie FPS is the same as PC game FPS, and partly perhaps from a need to justify lower framerates. [IMG]http://www.tweakguides.com/images/GGDSG_14as.jpg[/IMG][IMG]http://www.tweakguides.com/images/GGDSG_14bs.jpg[/IMG]

It's true that movies and TV only use around 24, 25 or 30 FPS, depending on which part of the world you're in. But there are three important differences between movies, TV and PC games:

1. Movies and TV use Motion Blur, so that if at any time you freeze a movie scene on your DVD player for example, a large part of the scene may consist of blurred objects. Furthermore, the images in a movie or on TV do not have crisp detailed outlines. In a PC game on the other hand, if you take a screenshot or pause the game at any time, you will notice that everything is usually extremely sharp and distinct regardless of how fast it was moving when the shot was taken. Take a look at the screenshot comparison above: on the left is a fast motion shot of an alien from the movie Alien vs. Predator, on the right a fast motion shot of an alien from the old game Alien vs. Predator 2. Thus 24 often-blurred frames from a movie wind up looking much smoother to the human eye than 24 or even 30 distinct frames from a fast-moving PC game. So why can't games use motion blur? In fact, most recent games have started incorporating blur effects. This can definitely help to reduce the visible impact of lower framerates, but aside from the fact that not all games have motion blur, the next point addresses why this doesn't always work. Even with motion blur, the graphics in PC games may still have very sharp outlines which only settings like Antialiasing can smooth out, but ironically this usually comes at the expense of further lowering FPS.

2.
Control responsiveness steps in again to further differentiate between a movie and a game. In a movie or TV show, the viewpoint is not under your control; it is typically a static or smoothly panning camera. In a game however, your control over the viewpoint means that in a rapidly moving game at 24 or even 30FPS you will notice the general choppiness due to a lack of responsiveness. The variability of control responsiveness based on variable framerate also helps highlight the next point below.

3. PC games do not have a rock-solid unchanging framerate, while TV and movies do. While some games have a framerate cap of 30 or 60 FPS, very few if any PC games can be locked down to consistently show exactly 24 or 30 FPS - their FPS will vary, sometimes significantly. Movies and TV on the other hand always show exactly the same number of frames per second and do not vary one bit. Therefore the variability in framerate in games also works to exaggerate the impact of lower framerates, making them more noticeable. In Crysis for example, if you walk out of an indoor area which has 60 FPS into an outdoor area with 25 FPS, you will notice the difference, partly due to a change in control responsiveness, and partly because your eyes detect the relative change in framerate.

One way to demonstrate that the human eye can actually detect differences above 30FPS is to use a small program called FPS Compare (11KB) by Andreas Gustafsson (used here with his permission). To use it, simply extract and launch the FPSCompare.exe file. Make sure to read the instructions in the Readme.txt file, and note that this utility is still in beta form. You may need to force VSync to Off in your graphics card's control panel for it to work properly, but if it doesn't work properly for you, you can try the more basic version of it from here: FPS Compare (old) (106KB).
FPS Compare shows the same scene rendered side by side in a split-screen arrangement, but each side is running at a different frame rate. When launching the new FPS Compare program, I recommend pressing F2 to change the scene to one more familiar to gaming. Now by staring at the middle of your screen, you should be able to detect that the portion on the left (at ~60FPS) appears smoother than the portion on the right (at ~30FPS). Even if the difference is not major to your eyes, many people do notice that there is at least some difference - which refutes the claim that human eyes cannot notice differences in smoothness at an FPS over 30. As the articles I link to further above discuss, testing has shown that human beings can regularly distinguish differences of one frame in 200 every second.

There is no actual theoretical limit on how many frames the eye can distinguish. In the natural world, human eyes don't take in motion as a series of still frames; they take in a constant stream of analog movement data. In particular, we are quick to notice dramatic contrasts, no matter how brief. If there is a gap or brief fluctuation in the flow of visual data, then our eyes and consequently our brain can actually pick this up if it's relevant, even if it's subconscious - it all depends on the context of the data streaming in.

So to come back to the question of how many FPS is enough: in my experience, and for most practical purposes, a framerate of around 60 FPS is completely sufficient as a maximum FPS. Even 25 or 30FPS can be totally sufficient in slow or medium-paced games - particularly if the game has motion blur, softer edges, and does not display significant variability or stuttering. If there's one thing that would be perfect to have in any game, it would be a method of maintaining a fixed framerate. Unfortunately this is not practically possible on many systems because of the different types of hardware used.
You can enable VSync to cap the maximum framerate and hence reduce FPS variability, but this may also reduce performance - see the Vertical Synchronization section of this guide. [/QUOTE] Source: [url]http://www.tweakguides.com/Graphics_5.html[/url] The whole guide is a good read.
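The "fixed framerate" the guide wishes for is essentially what an in-game frame limiter does: after rendering each frame, sleep out whatever remains of that frame's time budget. A minimal sketch of the idea, my own illustration rather than anything from the guide, with `render_frame` as a stand-in for real rendering work:

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame at 30 FPS

def render_frame():
    time.sleep(0.005)  # stand-in for real rendering work (~5 ms)

def run_capped(num_frames=30):
    """Render num_frames, sleeping out the rest of each frame's budget."""
    start = time.perf_counter()
    for _ in range(num_frames):
        frame_start = time.perf_counter()
        render_frame()
        leftover = FRAME_BUDGET - (time.perf_counter() - frame_start)
        if leftover > 0:
            time.sleep(leftover)  # cap: don't start the next frame early
    elapsed = time.perf_counter() - start
    return num_frames / elapsed  # effective FPS

fps = run_capped()
print(f"capped at ~{fps:.0f} FPS")
```

Because `time.sleep` can only overshoot, the effective rate lands at or slightly below the target, which is also why real limiters (and VSync) keep variability low rather than boosting performance.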
Read [url=http://www.100fps.com/how_many_frames_can_humans_see.htm]this[/url].
My fps rate right now is 2, damn life is laggy.
if it's 75, then my computer is better then my eyesight.
I don't think your eye works in straight fps. You have tons of little photosensitive cells that each see at a certain rate. I am pretty sure they each have their own "fps" and they all fire off at different times.
[QUOTE=Yoces;24156653]I couldn't find a better word then FPS, really. So say I'm playing a random game at 100FPS, then another game at 300FPS and a third game at 60FPS. I see absolutely NO difference. So how many "FPS" can the human eye see in?[/QUOTE] yours see 25, mine see 80~90 for most games it doesn't matter, but when I play quake3 arena at 75fps and then at 100 fps I feel the difference also, my CRT refreshes at 100hz
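For what it's worth, part of what this poster may be feeling between 75 and 100 FPS is the change in per-frame delay, which also bounds how stale your mouse input can be when a frame appears. A trivial calculation (my own sketch, not from the thread):

```python
def frame_time_ms(fps):
    """Milliseconds between consecutive frames at a given frame rate."""
    return 1000.0 / fps

print(frame_time_ms(75))   # ~13.3 ms between frames
print(frame_time_ms(100))  # 10.0 ms between frames
# The gap shrinks by ~3.3 ms, which can read as snappier mouse response.
```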
[QUOTE=slippp22;24158213]Well, out here in the country and near water I get a nice solid 72 FPS, but when im in a city or crowded area my eyes get a little bit laggy and go down to about 20-30 FPS[/QUOTE] you should get a second frontal lobe for crossfire. that's if your cranium supports it. upgrading your cerebellum works too.
60.
[QUOTE=xxncxx;24161126]So much fucking dense idiots. He didn't say the eyes see in FPS, he said what would a noticeable difference be seen at what FPS. God damn learn to read. [editline]02:10PM[/editline] I'm talking about abp1192 btw.[/QUOTE] I know that. I am saying you wouldn't notice past 72. Sorry for not making it clear what I meant everyone. Wow.
ITT: Idiots who know nothing about the human eye.
I find it hard to believe that 72 fps is the highest the eye can see when real-time is around the thousands.
ITT: idiots who read a wikipedia article and think they know shit also ITT: people who didn't even bother experimenting with their own eyes and claim to know shit [QUOTE=Teh_Medic;24177446]I find it hard to believe that 72 fps is the highest the eye can see when real-time is around the thousands.[/QUOTE] there's no real-time fps, it's continuous motion
[url]http://www.boallen.com/fps-compare.html[/url] Comparing them for your sake.
[QUOTE=blackrack;24177503]ITT: idiots who read a wikipedia article and think they know shit also ITT: people who didn't even bother experimenting with their own eyes and claim to know shit there's no real-time fps, it's continuous motions[/QUOTE] I meant the closest thing to reality, my mistake.
[QUOTE=Schlinky;24177515][url]http://www.boallen.com/fps-compare.html[/url] Comparing them for your sake.[/QUOTE] it works, 60 fps are more fluid than 30, you can see the edges going choppier on 30 than on 60. Of course if you have an LCD you won't notice shit I'm out
The moral of the thread: have at least 70 fps and you are set.
Your eyes don't work like that; it's entirely about your response time. Scientists have shown that the human mind can [I]register[/I] a difference in an image even when the difference is only shown for 1/1000th of a second. However, with that fast a change you won't be able to notice, memorize or visualize the difference, only respond to it.
What would be the max "resolution" an eye can see?
The eyes don't work in that way.
If I recall correctly, human eyes see up to around 80 Hertz depending on the person. That's why looking at old CRT monitors bugs some people out. Fluorescent lights flicker at a rate beyond our threshold, but it still affects some people too.
Nope, fluorescent lights flicker at 50 Hz (50 times a second). Why don't we see it? I think it works like this: brightness eats darkness. I don't know much about this; do some research on the web. Why does anything above 60 FPS, even 900 FPS, look like 60 FPS? That's because most monitors are only able to display 60 Hz.
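The point that a 60 Hz monitor can only ever show 60 distinct images per second can be illustrated with a toy simulation (my own sketch, not tied to any real display API): frames are produced at some rate, but at each refresh the screen only samples the most recently completed one.

```python
def frames_shown(render_fps, refresh_hz, duration_s=1):
    """Count distinct rendered frames a display shows in duration_s seconds."""
    shown = set()
    for i in range(refresh_hz * duration_s):
        # Index of the most recently completed frame at refresh number i
        # (integer math keeps this toy model exact).
        latest_frame = i * render_fps // refresh_hz
        shown.add(latest_frame)
    return len(shown)

print(frames_shown(900, 60))  # 60 - the display can't show more
print(frames_shown(30, 60))   # 30 - fewer frames than refreshes
```

So on a 60 Hz panel, 900 FPS and 60 FPS put the same number of distinct images in front of your eyes; the extra frames only reduce how stale each sampled frame is.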
[QUOTE=Notnotprobydoby;24181558]What would be the max "resolution" an eye can see?[/QUOTE] I read that an average human eye is able to see the same amount of detail as a camera able to take ~500MP images. Don't ask me for the source, it was in a science magazine. [editline]02:59PM[/editline] [QUOTE=xboomguy;24181951]Nope, fluorescent lights flicker at 50 Hz (50 times / second).[/QUOTE] This makes it impossible for me to videotape at high framerates like 600FPS unless I use a proper flicker-free lamp. (I have a big 800W lamp for this.)
[QUOTE=Teh_Medic;24177446]I find it hard to believe that 72 fps is the highest the eye can see when real-time is around the thousands.[/QUOTE] You can see more, but you wouldn't notice anything above it really.
It's all about the person. Yeah, eyes don't work like that, but if we somehow translate them to fps... I would guess about 100 fps.
Let's just wait for monitors able to show 500+ FPS. Then we can bring this up again.
They go at 60fps when the world is set on low; when it's maxed they go at 10fps. My eyes suck ass :hurr:
Well, it starts to become noticeable below 30, but the eye doesn't really have a limit. I mean, if the max was 75fps and we saw something at 300fps, our eyes wouldn't really have a way to 'skip' the frames.