The Great Framerate Non-Debate
    64 replies
From the eyes of a PC peasant who had a shitty/less-than-mediocre rig: the two multiplayer shooters I've invested the most time in, TF2 and Bad Company 2, I must have had between 20-40 FPS most of the time. I don't remember ever noticing or remarking on the frame rate, though I remember being pissed on occasion when the game effectively jumped a few frames and I died. I'd concur that 20 FPS and slightly above is highly impractical for shooters, given the number of bad matches I had due to memorably slow responses caused by low frame rate. But I'd say 30-40 is fine for me. I don't think I've ever experienced a constant 60 FPS in shooters that hogged my old crappy rig's resources, so I certainly don't know the full benefits of 60 FPS.
[QUOTE=Janus Vesta;44980255]Movies are shot at 24 frames per second because film reels were fucking huge and going any higher than that would mean you have a giant, heavy, extremely flamable reel of disaster. There's absolutely no point in filming at 24 FPS any more because, for the most part, we don't use film any more. "It looks cinematic" is a fucking retarded argument, you never heard people arguing against colour because "monochrome is more cinematic!", no one bitched about sound in film because "having a live pianist is more cinematic."[/QUOTE] Actually, people DID complain about sound and colour. [I]A lot of people.[/I] All the big critics back in the days. They considered sound a cheap gimmick used to sell tickets cos "this film... has noises!!" that would ruin cinema because it'd bring it too close to theatre and thus reduce the point of cinema (cos I mean with sound why not just make it a play right) And people complained about colour because they thought it was a dumb, expensive gimmick (much like 3D nowadays- it was viewed similarly) and it was only used as a way to sell tickets. Little did they know, I guess :v:
[QUOTE=mikeyt493;44983394]Actually, people DID complain about sound and colour. [I]A lot of people.[/I] All the big critics back in the days. They considered sound a cheap gimmick used to sell tickets cos "this film... has noises!!" that would ruin cinema because it'd bring it too close to theatre and thus reduce the point of cinema (cos I mean with sound why not just make it a play right) And people complained about colour because they thought it was a dumb, expensive gimmick (much like 3D nowadays- it was viewed similarly) and it was only used as a way to sell tickets. Little did they know, I guess :v:[/QUOTE] the transition from silent film to the era of the talkies took less than a handful of years. Same with the change to colour more or less. People adapt quickly when something works.
To be honest, with Watch Dogs at 30 FPS I notice absolutely no difference at all. I was watching a general PC/PS4 comparison and the guy blurted that Watch_Dogs was at 30 FPS. I genuinely, genuinely thought it was 60 FPS at all times, bar the VERY OCCASIONAL dip when I decided to blow up the entire city. [editline]3rd June 2014[/editline] I thought it was 1080p too. To be honest I couldn't care how overhyped PS4s and Xbox Ones are. As long as I don't notice it I couldn't give a fuck, it's all about the games man.
that feel when toaster of computer so you can't run at glorious 60 fps ever except on really old shit why
I do know that feeling, though if you can scrape enough together you can upgrade.
[QUOTE=HumanAbyss;44983557]the transition from silent film to the era of the talkies took less than a handful of years. Same with the change to colour more or less. People adapt quickly when something works.[/QUOTE] I've watched a couple of shows at 30fps, filmed in 60fps for whatever reason they decided to do that. At first it looks cheap and awkward but you stop noticing it pretty quickly, to the point that things felt choppy when I first watched something in 24fps after a lot of the 30fps stuff.
[QUOTE=Reds;44983682]I've watched a couple of shows at 30fps, filmed in 60fps for whatever reason they decided to do that. At first it looks cheap and awkward but you stop noticing it pretty quickly, to the point that things felt choppy when I first watched something in 24fps after a lot of the 30fps stuff.[/QUOTE] your eyes adjust and the software your brain runs adjusts as well, often so quickly you stop noticing what was pissing you off before.
[QUOTE=Janus Vesta;44980255]Movies are shot at 24 frames per second because film reels were fucking huge and going any higher than that would mean you have a giant, heavy, extremely flamable reel of disaster. There's absolutely no point in filming at 24 FPS any more because, for the most part, we don't use film any more. "It looks cinematic" is a fucking retarded argument, you never heard people arguing against colour because "monochrome is more cinematic!", no one bitched about sound in film because "having a live pianist is more cinematic."[/QUOTE] except a.) both of those things happened and b.) there are serious reasons to not move towards higher framerate film as they cause sickness in vast portions of the population because they don't actually involve motion blur whatsoever and as such tend to fill the brain with more visual information than is comfortable to absorb. film and animation is literally designed so that your brain is forced to fill in the blanks because it creates a more realistic perceived image in that way. and honestly while tb is correct in the presumption that framerate is really important, I don't think it's more important than graphical fidelity. object based motion blur, better lighting techniques, and various other advancements are far more important than increasing framerate to me. it's the way to move forward and keep pushing forward. if the first crysis had focused solely on getting that 60fps framerate, there wouldn't have been such a graphical leap at that time. crysis and bioshock set the standard for that time and set the stage for the future for graphical fidelity. [editline]2nd June 2014[/editline] if you're coming from the perspective of a competitive multiplayer game, yes, framerate is critical. 
but i'd say that for a non-competitive singleplayer experience, pumping those resources into creating a more believable and rich world is probably a better course of action, as there's many times where you'll simply be relaxing and absorbing the world around you. like dark souls, for instance. sure, 60 fps feels better, but it doesn't immerse you in the world any better than visuals do. if from software kept the original lighting as it was in the game i'm willing to bet that more than half of players would be willing to have it be locked at 30fps again.
[QUOTE=BrickInHead;44983761] if you're coming from the perspective of a competitive multiplayer game, yes, framerate is critical. but i'd say that for a non-competitive singleplayer experience, pumping those resources into creating a more believable and rich world is probably a better course of action, as there's many times where you'll simply be relaxing and absorbing the world around you. like dark souls, for instance. sure, 60 fps feels better, but it doesn't immerse you in the world any better than visuals do. if from software kept the original lighting as it was in the game i'm willing to bet that more than half of players would be willing to have it be locked at 30fps again.[/QUOTE] Well a higher frame rate in a frame critical game like Dark Souls can be a game changer in how it feels to play it
[QUOTE=Janus Vesta;44980255]There's absolutely no point in filming at 24 FPS any more because, for the most part, we don't use film any more. "It looks cinematic" is a fucking retarded argument, you never heard people arguing against colour because "monochrome is more cinematic!", no one bitched about sound in film because "having a live pianist is more cinematic."[/QUOTE] You can always tell the ones that don't know fucking shit about the art of cinema when they say things like this. How disgustingly ignorant of the craft can you be.
[QUOTE=Axznma;44984093]You can always tell the ones that don't know fucking shit about the art of cinema when they say things like this. How disgustingly ignorant of the craft can you be.[/QUOTE] Uh, we DON'T really use film anymore as a material. The proof of this is that a large, large part of the producers of film stock have stopped manufacturing it. A large part of the film-development business (dark rooms, etc.) is now going under. It's not gone, it's not dead, and probably won't be for years, but it's largely on its way out. It's sad to see; film is a magical material to work with, and hearing the "clickity clack" of a film camera on a set is magic. But it's dying. Video cameras like the Alexa and Red are amazing, high-quality pieces of technology that are largely eating away at the number of films being shot on film as opposed to video. Ever since video became able to reach high enough quality that large, beautiful feature films could be created on it, it has taken the industry by storm. I say this having worked in the film industry for a few years.
[QUOTE=HumanAbyss;44983843]Well a higher frame rate in a frame critical game like Dark Souls can be a game changer in how it feels to play it[/QUOTE] Like how your weapons break faster because From is shit at coding.
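The "weapons break faster" complaint above is a textbook framerate-dependence bug: wear was evidently applied per *frame* rather than per unit of time, so running at 60 FPS doubles the decay. A minimal sketch of the bug and the usual delta-time fix (all names here are invented for illustration, not From's actual code):

```python
# Hypothetical sketch of framerate-dependent wear vs. the delta-time fix.
WEAR_PER_SECOND = 2.0  # intended design: lose 2 durability points per second

class Weapon:
    def __init__(self, durability=100.0):
        self.durability = durability

    def degrade_buggy(self):
        # framerate-dependent: a fixed amount per frame, implicitly assuming 30 fps
        self.durability -= WEAR_PER_SECOND / 30.0

    def degrade_fixed(self, dt):
        # framerate-independent: scale the loss by this frame's delta time
        self.durability -= WEAR_PER_SECOND * dt

def simulate(fps, seconds, fixed):
    """Run `seconds` of wear at `fps` frames per second; return final durability."""
    w = Weapon()
    dt = 1.0 / fps
    for _ in range(int(fps * seconds)):
        w.degrade_fixed(dt) if fixed else w.degrade_buggy()
    return w.durability

# buggy: ~90 left after 5 s at 30 fps but ~80 at 60 fps; fixed: ~90 at both
```

The buggy version matches the design only at the framerate it was tuned for; the delta-time version behaves identically at any framerate.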
motion blur can be as important as fps too. when done for more than viewport rotation, object motion blur is a good technique that helps give a hint of velocity in a frame; so if you want to portray motion, instead of flashing a ton of frames with no velocity information whatsoever, use object motion blur. games can also run internally at a higher rate than their output to the display, which cuts down on input lag. tailoring your game for 60 fps means more than just a graphical cut-down: the sum of anything that touches your cpu needs to be done in half the time, from physics, ai, sound, etc.; to sum it up, your "gameplay" has to be tailored towards it as well
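The "run internally at a higher rate than the display" idea above is usually implemented as a fixed-timestep loop: the simulation advances in constant slices while rendering happens whenever it can. A minimal sketch, with the 120 Hz tick rate chosen arbitrarily for the example:

```python
# Fixed-timestep loop sketch: simulation ("gameplay") ticks at a constant
# internal rate no matter how fast frames are rendered.
TICK_RATE = 120
TICK_DT = 1.0 / TICK_RATE

def run(frame_times):
    """frame_times: wall-clock duration of each rendered frame, in seconds."""
    accumulator = 0.0
    ticks = 0
    for frame_dt in frame_times:
        accumulator += frame_dt
        # consume the elapsed time in fixed slices; a slow frame simply
        # runs more simulation ticks to catch up
        while accumulator >= TICK_DT:
            ticks += 1  # advance physics/ai/input by exactly TICK_DT here
            accumulator -= TICK_DT
    return ticks
```

Rendering 60 frames of 1/60 s each still yields 120 simulation ticks, so input sampling and physics run at 120 Hz even on a 60 Hz display.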
a
[QUOTE=Neckbird;44984564]motion blur can be as important as fps too, when done for more than viewport rotation object motion blur is a good technique and helps give hint of velocity in a frame, so if you want to portray motion, instead of flashing a ton of frames with no velocity information what so ever, use object motion blur. games also can run internally at a higher rate than their output to display, which cuts down on input lag tailoring your game for 60 fps means more than just a cut down graphically, but the sum of anything that touches your cpu needs to be done in half the time, from physics, ai, sound, etc etc; to sum it up, your "gameplay" has to be tailored towards it as well[/QUOTE] Object motion blur is an expensive technique in and of itself unless you want it to look like ass.
[QUOTE=BrickInHead;44983761]there are serious reasons to not move towards higher framerate film as they cause sickness in vast portions of the population because they don't actually involve motion blur whatsoever and as such tend to fill the brain with more visual information than is comfortable to absorb. film and animation is literally designed so that your brain is forced to fill in the blanks because it creates a more realistic perceived image in that way.[/QUOTE] I think I would like to see you back that up. That sounds like absolute bullshit. How can 60 frames per second ever be 'too much information' when you're getting a literally [I]constant[/I] stream of information into your eyes when you're not watching a video. If 60 frames per second is too much, then looking away from any screen should be instant visual information overload?
i play rome 2 at like 20-25fps because my video card can't really handle a game like that at 1440p, but it doesn't bother me because it's a strategy game that i only play in singleplayer. because the input isn't so twitch-based or fast-moving like in an fps, it's quite bearable. but yeh, i'd never bother playing an fps, especially a multiplayer one, at less than 30-40
[QUOTE=Sherow_Xx;44989801]I think I would like to see you back that up. That sounds like absolute bullshit. How can 60 frames per second ever be 'too much information' when you're getting a literally [I]constant[/I] stream of information into your eyes when you're not watching a video. If 60 frames per second is too much, then looking away from any screen should be instant visual information overload?[/QUOTE] That's because it is absolute bullshit and it's coming directly out of his rear
I bought GTA V on the 360 last summer and had a lot of fun with it, but then I had a stint where I played Forza 4 a lot. After Forza, I couldn't stand to play GTA anymore, just because of the massive frame rate differences.
Framerates of film and video games can't be compared at all. The frames in film/video are photographs created by capturing actual light over a short time; blur created by movement during that time creates the feel of motion in the film. Frames in video games aren't photographs, they're rendered images. They don't connect the way frames in video do. With today's technology, the best you can do is cover it up with more frames.
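One way a renderer can fake the photographic blur described above is to average several instantaneous renders spread across a virtual shutter interval (an accumulation-style approach; real engines mostly use cheaper velocity-buffer approximations). A toy 1-D sketch, with all names invented for the example:

```python
# Toy "shutter-style" blur: average sub-frame samples, mimicking what an
# open camera shutter does with real light.

def render(position, width=10):
    """Render a single bright dot into a row of `width` pixels, no blur."""
    row = [0.0] * width
    row[int(position) % width] = 1.0
    return row

def render_blurred(pos_at, t0, t1, samples=8, width=10):
    """Average `samples` instantaneous renders across the shutter interval [t0, t1)."""
    acc = [0.0] * width
    for i in range(samples):
        t = t0 + (t1 - t0) * i / samples
        for x, v in enumerate(render(pos_at(t), width)):
            acc[x] += v / samples
    return acc

# a dot moving 4 pixels during the frame smears its energy over 4 pixels
# instead of appearing frozen at a single position
frame = render_blurred(lambda t: 4.0 * t, t0=0.0, t1=1.0)
```

The total brightness is conserved; it is just spread along the motion path, which is exactly the velocity cue a photographed frame carries for free.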
[QUOTE=Sherow_Xx;44989801]I think I would like to see you back that up. That sounds like absolute bullshit. How can 60 frames per second ever be 'too much information' when you're getting a literally [I]constant[/I] stream of information into your eyes when you're not watching a video. If 60 frames per second is too much, then looking away from any screen should be instant visual information overload?[/QUOTE] films that are in deep focus on a 2 dimensional plane create serious motion sickness in viewers who haven't spent the time to adapt to high frame rate video, it was an extremely common problem during the release of the hobbit. your eyes naturally blur out a ton of space as they only focus on a matter of 2-3 degrees in front of you, this is also spatially dependent (ie what's behind and in front of the focal point of your eyes). when all of that information is presented without motion blur, it causes headaches. i don't have a source but it happened with a ton of people that watched high frame rate film, it's one of the major reasons (beyond tech and cost) 48fps hasn't picked up after the hobbit. people were getting nauseous watching it. the film appears to be moving in fast motion for the first 20-30 minutes of viewing, which causes pretty serious headaches in a lot of people. once the brain acclimates it starts to go away for some, not so much for others. [editline]3rd June 2014[/editline] [QUOTE=DeEz;44992462]That's because it is absolute bullshit and it's coming directly out of his rear[/QUOTE] are you calling me a bull???
I see the difference between 30 and 60 fps for sure, I just don't really get why people care? Most of the argument in favor of 60 fps I've seen is not arguing in favor of 60 fps but arguing against lower than that, and it's the same for the other way around.
It matters because the games are more responsive and look better, and it's getting tiring to hear excuses on why games don't run at 60 FPS when they were fully capable of doing so.
In answer to the question "Is it ever acceptable to have a game running at 30 fps?" Yes. I would certainly love for them to be running at a higher framerate, but a game running at 30 fps is hardly unplayable, not even unenjoyable.
[QUOTE=Rufia;44993819]In answer to the question "Is it ever acceptable to have a game running at 30 fps?" Yes. I would certainly love for them to be running at a higher framerate, but a game running at 30 fps is hardly unplayable, not even unenjoyable.[/QUOTE] People ask that question in the context of how we damn well can run games in 60 FPS, but developers don't make them like that. It's not unacceptable in the sense of "ugh, this game's in 30 FPS? That's shit, I can't even play it," it's unacceptable in the sense of "they didn't bother." Especially when companies try to bullshit in saying that "30 FPS is better."
[QUOTE=Neckbird;44984564]motion blur can be as important as fps too, when done for more than viewport rotation object motion blur is a good technique and helps give hint of velocity in a frame, so if you want to portray motion, instead of flashing a ton of frames with no velocity information what so ever, use object motion blur. games also can run internally at a higher rate than their output to display, which cuts down on input lag tailoring your game for 60 fps means more than just a cut down graphically, but the sum of anything that touches your cpu needs to be done in half the time, from physics, ai, sound, etc etc; to sum it up, your "gameplay" has to be tailored towards it as well[/QUOTE] motion blur always looked like shit to me, and ironically on my older computers it just made things a little slower. it's totally not worth it. and if motion blur doesn't affect one's computer, then it doesn't matter that they could run the game faster anyway.
back in the day I was fucking bouncing off the walls if I could get Max Payne to hit 30 fps. then I paid $350 for a single piece of hardware in my computer. i expect results
[QUOTE=Janus Vesta;44980255]Movies are shot at 24 frames per second because film reels were fucking huge and going any higher than that would mean you have a giant, heavy, extremely flamable reel of disaster. There's absolutely no point in filming at 24 FPS any more because, for the most part, we don't use film any more. "It looks cinematic" is a fucking retarded argument, you never heard people arguing against colour because "monochrome is more cinematic!", no one bitched about sound in film because "having a live pianist is more cinematic."[/QUOTE] Shooting and presenting at 24fps still exists today for several reasons, mainly to give a movie the film look. There are several factors in making a movie feel like film: presenting it at 24fps, shooting with the 180-degree shutter rule (hence the motion blur), colour correction/grading, and even the sound design. Sure, there are higher frame rate options out there, and sure, we have the storage and the processing power to handle them, but directors love using 24fps. It's really just a historical film frame rate that will probably still be around for decades if not forever. If they were shooting at 48fps, it would still be a very good idea in most cases to follow the 180-degree shutter rule, but it would produce less motion blur than a film shot at 24fps following the same rule.
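The 180-degree shutter rule mentioned above reduces to simple arithmetic: the shutter is open for half of each frame's duration, so per-frame exposure time is (shutter angle / 360) divided by the frame rate. A quick sketch:

```python
# 180-degree shutter rule as arithmetic: at the same shutter angle,
# doubling the frame rate halves the blur captured in each frame.

def exposure_time(fps, shutter_angle=180.0):
    """Seconds the virtual shutter stays open per frame."""
    return (shutter_angle / 360.0) / fps

# 24 fps at 180 degrees -> 1/48 s per frame; 48 fps -> 1/96 s per frame
```

This is why 48fps footage shot at 180 degrees looks crisper per frame than 24fps footage under the same rule, as the post above notes.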
to be honest, the biggest thing about framerate-locked games is that I should be able to run it at 90 fps with everything maxed but I can only get 30? I didn't pay $1300 to be fucking jerked around with a bad port.