I don't mean to be that "human eyes can't see above 30 fps" kind of guy but c'mon, how much further past 240Hz is really worth it?
How about improving OLED technology so we have infinite contrast without the screen burning in?
[QUOTE=Dr. Evilcop;52573404]I don't mean to be that "human eyes can't see above 30 fps" kind of guy but c'mon, how much further past 240Hz is really worth it?[/QUOTE]
60 to 75 is a very big difference.
75 or 100 to 144Hz is visible too, though not quite as big a jump as 60 to 75.
Every time you double the framerate you halve the delay between frames: 60 fps is about 16.7 ms per frame, 120 is about 8.3, etc. The jump from 240 to 480 only reduces the delay between frames by about 2 ms, which is way too little to make any difference.
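For anyone who wants to check that arithmetic, here's a quick sketch (plain Python, nothing monitor-specific):

```python
# Frame time in milliseconds for a given refresh rate: 1000 / Hz.
def frame_time_ms(hz):
    return 1000.0 / hz

# Each doubling halves the frame time, but the absolute saving shrinks.
for a, b in [(60, 120), (120, 240), (240, 480)]:
    saved = frame_time_ms(a) - frame_time_ms(b)
    print(f"{a}Hz -> {b}Hz: saves {saved:.1f} ms per frame")
```

60 to 120 saves over 8 ms per frame; 240 to 480 saves only about 2 ms, which is the point being argued.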
I bet if you were to play at 480Hz and it suddenly dropped to 30Hz it would feel like you got punched in the face.
These sorts of monitor are basically made for quake and cs players who really want the best fps they can get. If you see this monitor and think "why would I need this" you're not their target market.
[QUOTE=Elstumpo;52573545]These sorts of monitor are basically made for quake and cs players who really want the best fps they can get. If you see this monitor and think "why would I need this" you're not their target market.[/QUOTE]
I will be surprised if people can tell the difference between 240 and 480 hz.
[QUOTE=Kiwi;52573469]This is some serious dick waving.[/QUOTE]
More of a dick spasm at 480fps
I want a 2k 480Hz monitor with a true hdr spectrum.
More FPS = more samples for better natural motion blur.
I want one.
[QUOTE=AntonioR;52573543]I bet if you would to play at 480Hz and it suddenly drops to 30Hz it would feel like you got punched in the face.[/QUOTE]
I would puke
[QUOTE=bananaslamma;52573558]I will be surprised if people can tell the difference between 240 and 480 hz.[/QUOTE]
It shouldn't be that hard, really.
I'm gonna say something that sounds crazy, but it absolutely is true: Doubling the framerate will be noticeable into the [del]millions[/del] [B][I]trillions[/I][/B] as long as on-screen movement is contrasty, bright and fast enough. (NOT normal content, obviously)
For CS:GO and similar games though? Eh, 240Hz is probably enough, and it'd be hard to notice differences unless you like spinning your camera around just to see which one looks less jagged in the retinal afterimage.
[QUOTE=Paul-Simon;52573698]It shouldn't be that hard, really.
I'm gonna say something that sounds crazy, but it absolutely is true: Doubling the framerate will be noticeable into the [del]millions[/del] [B][I]trillions[/I][/B] as long as on-screen movement is contrasty, bright and fast enough. (NOT normal content, obviously)
For CS:GO and similar games though? Eh, 240Hz is probably enough, and it'd be hard to notice differences unless you like spinning your camera around just to see which one looks less jagged in the retinal afterimage.[/QUOTE]
I'm not saying you're wrong, but what you say contradicts stuff I learned in neurobiology classes I took. Could you post a source?
I've never seen a monitor that goes past 144Hz in person.
I can say for a fact that I notice the difference between 90Hz and 144Hz.
I wonder if I could tell the difference any higher?
well the human eye cant see past 30 fps so what is the point
[QUOTE=Cold Finger;52573847]I've never seen a monitor that goes past 144Hz in person.
I can say for a fact that I notice the difference between 90Hz and 144Hz.
I wonder if I could tell the difference any higher?[/QUOTE]
I've never seen above 60 :cry:
Well, it's not good enough for me. I need 1440p, 10 bits per color component, and IPS or better with those specs.
Gaming on a 480Hz monitor? Pfft, you guys think too small.
I wanna see what porn looks like
[QUOTE=Cold Finger;52573847]I've never seen a monitor that goes past 144Hz in person.
I can say for a fact that I notice the difference between 90Hz and 144Hz.
I wonder if I could tell the difference any higher?[/QUOTE]
I've seen a 240Hz TV at a Best Buy and turned on the 120Hz 3D mode. That shit certainly messes with your eyes.
From what I understand, higher frequency monitors have a point of diminishing returns.
[QUOTE=The Saiko;52573837]I'm not saying you're wrong, but what you say contradicts stuff I learned in neurobiology classes I took. Could you post a source?[/QUOTE]
What exactly does it contradict?
No source, although I could make a visual example and do some maths to explain it.
But basically, at those absolutely ridiculous framerates, to compare two monitors visually you'd do something like having a very bright object (against a dark background) move across the screen at an incredible speed: so fast that the object would only be on screen for, say, 4 frames if run at 100 million FPS, but 8 frames if run at 200 million FPS.
You wouldn't see the movement per-se, but you'd be able to tell which monitor was the higher FPS one based on the afterimage. (As in, one monitor left an afterimage of 4 objects flashing on the screen, while the other left an afterimage of 8 objects)
I should be clear that this doesn't matter in any sort of practical application, but flashes of light can be recognised and identified by us no matter how quick as long as they're bright enough.
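To make that thought experiment concrete, here's a toy sketch. The crossing time is a made-up number chosen to reproduce the 4-frames-vs-8-frames example above:

```python
# How many frames an object is drawn while crossing the screen:
# frames = crossing time (s) * framerate (fps).
# round() guards against floating-point error on exact multiples.
def frames_drawn(crossing_time_s, fps):
    return round(crossing_time_s * fps)

crossing = 4e-8  # 40 ns on screen, an absurd illustrative speed
print(frames_drawn(crossing, 100e6))  # 4 flashes in the afterimage
print(frames_drawn(crossing, 200e6))  # 8 flashes
```

The higher-FPS monitor leaves twice as many flashes in the retinal afterimage, which is what would let you tell the two apart.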
I understand diminishing returns, but at what point do you actually stop seeing a difference?
[QUOTE=Citrus705;52574938]I understand diminishing returns but at wat point do you actually stop seeing a difference[/QUOTE]
I think this might be it.
[QUOTE=AntonioR;52573543]I bet if you would to play at 480Hz and it suddenly drops to 30Hz it would feel like you got punched in the face.[/QUOTE]
Just now I remembered playing Splinter Cell: Chaos Theory on my very old rig, barely able to push it past 15 frames per second, and thinking, oh my god, this is playable :v:
[QUOTE=Cold Finger;52573847]I've never seen a monitor that goes past 144Hz in person.
I can say for a fact that I notice the difference between 90Hz and 144Hz.
I wonder if I could tell the difference any higher?[/QUOTE]
90 to 144 is very noticeable though. I found it odd when I could easily see the difference.
[QUOTE=Citrus705;52574938]I understand diminishing returns but at wat point do you actually stop seeing a difference[/QUOTE]
I remember reading somewhere that 285Hz is when the vast majority of people can no longer see a light flashing on and off. 240-280Hz is probably the highest consumer monitors will ever need to go, imo. 480Hz is overkill, well past the point of diminishing returns.
This thread is baffling. Why would people dismiss progress in display tech? I remember when everyone here went wow at VR tech a few years ago and wanted it to get even better. What happened? Is it because this one is a desktop monitor, so people think it's a gimmick?
When a screen with such a refresh rate first goes on sale it may indeed be a marketing gimmick, and video games might not be ready to feed it such high framerates, but it's still progress toward the "perfect" screen that has no blur and no artifacts from blur reduction (the only advantages old heavy CRT screens still hold :v:)...
[QUOTE=Dr. Evilcop;52573404]I don't mean to be that "human eyes can't see above 30 fps" kind of guy but c'mon, how much further past 240Hz is really worth it?[/QUOTE]
It's "worth it" up to 1000Hz in typical screens for eliminating motion blur and even higher in displays that are supposed to trick you into believing you are looking at real life.
[QUOTE=bananaslamma;52573534]Every time you double the framerate you halve the delay between frames. 60 fps is 16 ms, 120 is 8, etc. The jump from 240 to 480 only reduces the delay between frames by 2 ms, which is way too little to make any difference.[/QUOTE]
4 ms to 2 ms is still very visible:
[IMG]https://www.blurbusters.com/wp-content/uploads/2014/03/motion_blur_from_persistence.png[/IMG]
You'll need 1000Hz at 1000 fps for 1ms of persistence without using strobing tech.
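The chart above boils down to one rule of thumb for sample-and-hold displays: blur width equals eye-tracking speed times persistence. Here's a rough sketch (the 1000 px/s tracking speed is just an assumed example of fast game motion):

```python
# Sample-and-hold persistence: each frame is held for the full refresh interval.
def persistence_ms(hz):
    return 1000.0 / hz

# Rule of thumb: blur width (px) = tracking speed (px/s) * persistence (s).
def blur_px(speed_px_per_s, hz):
    return speed_px_per_s * persistence_ms(hz) / 1000.0

speed = 1000  # an assumed 1000 px/s pan
for hz in (60, 240, 480, 1000):
    print(f"{hz}Hz: {blur_px(speed, hz):.1f} px of blur")
```

At that tracking speed, 1000Hz is what gets blur down to a single pixel, matching the 1ms-persistence figure above.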
I'm no vision scientist or display engineer, so I'll use a lot of quotes from now on.
Why even push for higher refresh rates:
[QUOTE]This is in the pursuit of achieving the fuller "Holodeck" experience without strobe effect / wagonwheel effect / mousedropping effect / motion blur effect. The only way to really solve all such image artifacts simultaneously, is attempting to resemble something closer to framerateless continuous-motion (or infinite framerate). Currently, we are stuck with needing finite refresh cycles in order to artifically display moving imagery (films, televisions, monitors, screens, etc), so strobing (ala LightBoost/ULMB) is a much easier/simpler way to achieve low persistence using today's technology.[/QUOTE]
[url]http://forums.blurbusters.com/viewtopic.php?f=7&t=174[/url]
How could GPUs achieve 1000 fps in anything newer than Quake?
The answer is interpolation/reprojection/timewarping:
[QUOTE]
Many modern researchers & VR scientists now agree that to have the equivalent persistence of 2ms flicker (black frame insertion, strobing, LightBoost, ULMB, Oculus OLED rolling scan) is the baseline. 2ms or better to even remotely approach CRT-league motion clarity. But how do we achieve this without strobing? The answer is very simple, but often jawdropping to some. In order to do this without strobing -- no black frame insertion, no strobing, no pulsing, no CRT scanning, no OLED pulsed rolling scan -- you would need all consecutive frames and refresh cycles to be visible for your target persistence. If you want 2ms persistence without black periods, you need (1000/2ms) = 500fps @ 500Hz to have the same motion clarity without adding any blackness between refresh cycles (avoiding common Blur Reduction techniques).[/QUOTE]
[QUOTE]The good news is that interpolation/reprojection/timewarping is gradually becoming more free of artifacts, and eventually may be less objectionable than strobing disadvantages. In yesterday's HDTVs, these interpolators were quite abysmally terrible, and even Oculus' timewarping is not perfect. However, progress in frame rate amplification technologies is currently very rapid and with the newer virtually lagless implementations (Picture this: interpolation 240fps->960fps while keeping input lag of 240fps@240Hz) -- and with the progressively reduced artifacts (thanks to increasingly geometry-aware frame rate amplification) -- eventually, the benefits far outweigh the disadvantages even for mainstream use.
Today, frame rate amplification (Oculus' term "reprojection", "timewarp") is already being done by Oculus at a 2x factor, they call it "reprojection"). GPU is definitely the limiting factor, but 25 years ago, GPUs didn't even exist, now they've come up with their esoteric stuff like shaders, transforms, stream processors, etc, and it's possible to have virtually-lagless darn-near-artifact-free geometry-aware interpolators/translators/etc. GPUs have been constantly gaining added goodies in their processing pipeline and, eventually, there may be dedicate silicon for framerate amplification. Whatever is currently being done today has quite a lot of artifacts at 45fps->90fps but they do vastly diminish when amplifying 240fps->480fps, and then adding enhanced geometry awareness to the frame interpolator/translation/reprojection/timewarping (Which I'll now call "framerate amplifier stage of the GPU"), helps even further to reduce artifacts. Scientifically, it's a furious research subject nowadays -- unbeknownst to the mainstream.
[/QUOTE]
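As a toy illustration of what a frame rate amplifier does: given an object's position in two rendered frames, synthesize an in-between frame. Real reprojection/timewarp works on depth buffers and motion vectors; this is only linear interpolation, to show the doubling idea (e.g. 240 fps rendered becoming 480 fps displayed):

```python
# Toy frame-rate "amplifier": synthesize an in-between frame by linearly
# interpolating an object's position between two rendered frames.
def interpolate(pos_a, pos_b, t):
    return tuple(a + (b - a) * t for a, b in zip(pos_a, pos_b))

frame_a = (100.0, 50.0)   # object position in rendered frame N
frame_b = (110.0, 54.0)   # object position in rendered frame N+1
midframe = interpolate(frame_a, frame_b, 0.5)
print(midframe)  # (105.0, 52.0) -- the synthesized extra frame
```

The artifacts the quote mentions come from everything this toy ignores: occlusion, rotation, and objects that don't move in straight lines between frames.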
[QUOTE]Certainly, most people don't care, but 144Hz was more dismissed in 2012 ("who cares, nobody can see above 30fps") than even 480Hz currently is today. Nowadays you see 144Hz monitors at mainstream stores such as Best Buy and at Staples, not just at niche stores. Still niche -- but far easier and more accessible than getting an HDTV in the late 1990s. We don't dismiss progress, too quick -- it marches on, indeed.
Even today, people "I cant tell apart 120Hz and 144Hz and 165Hz" [B]don't quite realize the bigger jumps needed[/B] (120Hz->240Hz->480Hz->960Hz) to easily notice the steps of motion clarity improvements, especially in the light of increasing resolution and FOV, and the evolving types of games currently being played. And due to GtG limits, the worst/unoptimized 240Hz LCDs to have more motion blur than the best fastest 144Hz LCDs -- and often, people don't run framerates high enough to milk the lowness of the motion blur of those new displays. For a gamer's "I fluctuate 100-to-200fps" upgrading from 144Hz to 240Hz won't see as dramatic clarity improvement as those gamers running nearly permanently->240fps (e.g. CS:GO). [/QUOTE]
[url]https://forums.blurbusters.com/viewtopic.php?f=7&t=3520[/url]
[QUOTE=Elstumpo;52573545]These sorts of monitor are basically made for quake and cs players who really want the best fps they can get. If you see this monitor and think "why would I need this" you're not their target market.[/QUOTE]
I play CS only casually, but the biggest reason I want higher refresh rate displays is motion clarity.
I despise the motion blur my 60Hz display has, and even 120/144Hz displays need backlight strobing to achieve blur-free motion. But strobing has disadvantages; the main two are that the fps has to match the refresh rate perfectly, and that the image gets dimmer, a lot dimmer if you want the best motion clarity.
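The brightness cost of strobing is easy to estimate: the backlight is pulsed once per refresh, so persistence equals the pulse width and perceived brightness scales with the duty cycle. A rough sketch, assuming a hypothetical 300-nit panel at full persistence:

```python
# Strobed backlight: brightness scales with duty cycle
# (pulse width / refresh interval); persistence equals the pulse width.
def strobe_brightness(hz, pulse_ms, full_brightness_nits=300):
    duty = pulse_ms * hz / 1000.0
    return duty * full_brightness_nits

# 120Hz with a 1 ms strobe: CRT-like clarity at roughly 12% of the light.
print(strobe_brightness(120, 1.0))
```

That's why the best motion clarity settings are also the dimmest: shorter pulses mean less blur but proportionally less light.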
Here's somewhat condensed post about all this: [url]https://www.quora.com/Can-I-watch-a-video-with-more-frames-per-second-than-our-human-eye-can-see/answer/Mark-Rejhon?share=701a84cf[/url]
Michael Abrash also wrote a blog post: [url]http://blogs.valvesoftware.com/abrash/down-the-vr-rabbit-hole-fixing-judder/[/url]
I'm looking forward to Blur Busters review of that 480Hz screen. Not really because of the screen itself, but because they'll likely explain everything about frame and refresh rates in one single article.
[QUOTE=Sini;52576359]-words-[/QUOTE]
It's refreshing having someone actually post a bunch of info in one of these threads.
I don't even notice the difference between 60Hz and 120Hz outside of testufo.com, so I'd rather the focus was on other aspects of monitors.