• The Framerate Debate: 30 “Feels More Cinematic” Than 60
    58 replies, posted
[QUOTE=Bloodshot12;46200152]Is that bottom left response real[/QUOTE] Yep, second one in the interview [url]http://www.computerandvideogames.com/464912/previews/hands-on-and-interview-the-order-1886s-e3-demo-showcases-style-over-substance/[/url]
Yeah no. I've played 30 fps console games for years and I still prefer 60fps. I don't know how anyone could say they think 30 is better. I would give so much just to play all of those games in 60fps.
Ubisoft is literally shit at optimizing games on all systems now so they wanna spin shitty frame rate and poor optimization as a new feature.
[QUOTE=Warship;46198900]Being able to control the camera isn't very cinematic either, so they should make locked camera presets while they're at it.[/QUOTE] Like in MGS2? Fuck no, please don't.
[QUOTE=megafat;46198834]Movies don't have input lag.[/QUOTE] Assassin's Creed games are so easy that it probably wouldn't matter. I think that's the reason: they make the games super easy, just eye candy with no challenge. That's why they always use that "looks cinematic" argument, because it's basically a movie where you sometimes push a button.
[QUOTE=megafat;46198834]Movies don't have input lag.[/QUOTE] No worries, future games are all gonna be cutscenes with quicktime events anyways. *SIGH*
60fps looks better and, more importantly, plays better. Your reaction times are measurably faster, as shown by that study where people played Quake 3 at 30 and then at 60 fps.
[QUOTE=Snickerdoodle;46199109]So what excuse are they going to use when most cinemas use 60 FPS?[/QUOTE] Games are going to hit 60fps before the majority of movies will. Movies are going to be 24fps for a very long time for a lot of reasons
[QUOTE=Yummy Pie;46199979]If they're able to make it a solid 30 with no drops at all then it's not a huge deal. But they can't even do that, so why not make it 60? Ubisoft proving once again that they're literally worse than EA. [editline]10th October 2014[/editline] This is what developers have become. [thumb]http://i.imgur.com/6tdC8ie.jpg[/thumb][/QUOTE] It's like people are writing a story and then putting gameplay as an afterthought, which is the opposite of what you're supposed to do. Sort of like idea guys when they write huge complex stories, but forget about gameplay.
What the fuck, why is this even called a debate? There's no debate, 60 fps > 30 fps. No debate.
[QUOTE=Satane;46203226]30 fps still has less input lag than 60 fps with vsync[/QUOTE] Depends on the game, and how much overhead the computer has. A game that would run at 70 FPS without vsync has very little input lag with vsync on. A game that would run at 300 FPS, however, gets pretty delayed when vsync is on. That's how I think it works, anyway.
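If the rough model in the post above is right, the numbers are easy to sketch: each frame occupies the screen for 1000/fps milliseconds, and every frame vsync queues up adds roughly one more refresh interval of delay. The function names below are mine, purely illustrative, not from any real engine:

```python
# Back-of-the-envelope frame-time arithmetic behind the input-lag
# discussion. Assumes (simplistically) that each vsync-queued frame
# adds one full refresh interval of latency.

def frame_time_ms(fps):
    """Time one frame spends on screen, in milliseconds."""
    return 1000.0 / fps

def vsync_queue_lag_ms(refresh_hz, queued_frames):
    """Crude worst-case added latency: every queued frame waits a
    full refresh interval before being scanned out."""
    return queued_frames * frame_time_ms(refresh_hz)

for fps in (24, 30, 60, 120):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")

# A 60 Hz display with a 2-frame render queue adds about 33 ms:
print(f"vsync queue: +{vsync_queue_lag_ms(60, 2):.1f} ms")
```

Under this toy model, 30 fps means ~33 ms per frame versus ~17 ms at 60 fps, which is why a vsynced 60 fps game with a deep render queue can indeed feel no better than an uncapped 30.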
I'm going to give them £0, because it's a lot like £40 but it's much more cinematic.
[QUOTE=kenji;46203261]I'm going to give them £0, because it's a lot like £40 but it's much more cinematic.[/QUOTE] I will probably give them $10.50 during a Steam sale in 2 years. That's how much it costs to see a movie at the theater.
I'd rather have last-gen looking games running at 1080/60p than "next-gen" looking games at 900/30p.
[quote]It's a bit like The Hobbit movie, it looked really weird. And in other games it's the same - like the Ratchet and Clank series.[/quote] Say what you like about the slippery slope fallacy, this is definitely going in worrying directions the more publishers and developers chant about 30fps, parity and resolution. First it was unfair and unwarranted comparisons to non-interactive film; now we're at outright lies about examples from this very medium. This is the first time I've seen someone in this position try to use an existing target-60fps series to justify their bullshit, and it comes after months of them chanting "filmic" and "cinematic experience". What's next, claims that 4K is the result of mass hallucination?
[QUOTE=Dark RaveN;46200271]Like in MGS2? Fuck no, please don't.[/QUOTE] Actually the camera works really well for MGS2, but most games shouldn't have it
Metal Gear Solid 5 is going to run at 60 FPS and it's probably going to be cinematic as fuck as well. The cinematic argument is such bullshit, it's ridiculous.
I agree 60fps should not be the standard. 120fps should be the standard I mean come on let's go forward not backwards!
[QUOTE=Warship;46203254]Depends on the game, and how much overhead the computer has. A game that would run at 70FPS without vsync has very little input lag with vsync on. A game that would run in 300FPS however, gets pretty delayed when vsync is on. That's how I think it works, anyway.[/QUOTE] I think it just caps the framerate at your monitor's refresh rate, so frames stop being presented while the monitor is still drawing the previous one (that's what causes those tear lines you see during motion) because the monitor can't display them as fast as they're being rendered. Some games handle it well and you won't get much input lag at all, if any, whereas others will feel like a low-quality game streaming service. It comes down to whether the game skips ahead to the latest rendered frame, or queues every frame up in order and displays them late.

It's best to say fuck vsync and cap your framerate to your refresh rate, or slightly under it, in your video card control panel, and set your maximum pre-rendered frames (flip queue size if you're on AMD) to 1 or 2; this option usually also gives a small boost in framerate. The higher you go, the more input lag you'll get, though
[QUOTE=LATTEH;46203880]I agree 60fps should not be the standard. 120fps should be the standard I mean come on let's go forward not backwards![/QUOTE] but the human eye can't see more than 24 fps!
[QUOTE=LATTEH;46203880]I agree 60fps should not be the standard. 120fps should be the standard I mean come on let's go forward not backwards![/QUOTE] I think we should make every game 300fps, just like in l33t pro Counter-strike 1.6 configs.
[QUOTE=Mitsuma;46199362]Even rendered motion blur is bad in my opinion. It doesn't work as well as real motion blur (...)[/QUOTE] Depends on the method. We can't really make good real-time motion blur though, but we have some extremely good (and accurate) non-real time methods.
I bought a new PC to get above 30 frames. The fact that they think we'll accept 30 as the standard is ridiculous
Hey guys, did you know 30 fps tells a better story than 60 fps? [IMG]http://i.imgur.com/6zGbANF.png[/IMG] [url]http://wccftech.com/30fps-vs-60fps-30fps-better-story-telling-games/[/url]
[QUOTE=Yummy Pie;46210042]Hey guys, did you know 30 fps tells a better story than 60 fps? [IMG]http://i.imgur.com/6zGbANF.png[/IMG] [url]http://wccftech.com/30fps-vs-60fps-30fps-better-story-telling-games/[/url][/QUOTE] Not sure if you're serious or just didn't read the update at the top of the story.
I bought a really good computer 5 years ago, and that was the first time I experienced the difference between 30 and 60 fps. I think it feels way more realistic and gives you gameplay benefits (if a game has high spec requirements, I lower the settings just for the 60 fps).