• Most next gen console games will run at 30 FPS according to Carmack
[QUOTE=alien_guy;38877546]Doubt you can do 1080p @ 120hz through HDMI.[/QUOTE] 120Hz in TVs has nothing whatsoever to do with a 120Hz refresh rate - it's a marketing buzzword. 120Hz in TVs means 24-30 fps raised to 60 fps through interpolation. If there is a single TV on the planet that is actually designed to accept a signal >60Hz, I'd love to see it. On a somewhat related note, the maximum you can push through HDMI before the average RAMDAC gives out is about 108Hz, and the chances of having a monitor capable of it are slim. (My Samsung 750D 120Hz monitor will display such a signal for 60 seconds before cutting out, presumably as protection against damaging components.)
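The refresh ceiling argued about above falls out of simple pixel-clock arithmetic: refresh rate = pixel clock / (total pixels per frame, including blanking). A rough sketch - the 2200x1125 total is the standard CEA-861 1080p timing, and the 165 MHz and 340 MHz clocks are assumed ballpark figures for single-link-DVI-class and HDMI 1.3-class links respectively, not measurements:

```python
# Back-of-envelope: the highest refresh rate a given pixel clock can drive.
# h_total/v_total are the frame dimensions *including* blanking intervals;
# 2200 x 1125 is the standard CEA-861 timing for 1920x1080.
def max_refresh_hz(pixel_clock_hz, h_total, v_total):
    return pixel_clock_hz / (h_total * v_total)

print(round(max_refresh_hz(165e6, 2200, 1125), 1))  # ~66.7 Hz (single-link-class clock)
print(round(max_refresh_hz(340e6, 2200, 1125), 1))  # ~137.4 Hz (HDMI 1.3-class clock)
```

Reduced-blanking timings shrink h_total/v_total and squeeze a somewhat higher refresh out of the same clock, which is how 1080p120 was made to fit at all.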
[QUOTE=PvtCupcakes;38879481]quit spergin yall 30fps is fine. you have autism if you think you can tell the difference.[/QUOTE] You have to have some serious vision issues if you [I]can't[/I] tell the difference between 30 and 60+ fps.
I honestly don't notice a change in FPS above 30. I find even playing at around 18-20 fps perfectly fine as well.
[QUOTE=Darkimmortal;38880334]If there is a single TV on the planet that is actually designed to accept a signal >60hz I'd love to see it.[/QUOTE] There are those 3D TVs with shutter glasses - they run at 120Hz when showing a 3D signal.
[QUOTE=laserguided;38877747]What is he basing this on or is he just saying it because he is saying it.[/QUOTE] Probably just saying it 'cause he hates consoles and he's John Carmack so everyone will believe him.
[QUOTE=PvtCupcakes;38879481]quit spergin yall 30fps is fine. you have autism if you think you can tell the difference.[/QUOTE] What?
I enter this thread with people talking like 30 FPS is a stuttery piece of shit and unplayable. The hell, did I miss something somewhere? When did 30 FPS start looking like 10? I understand 60 FPS looks smooth as hell. I've compared it directly, watching a YouTube video then downloading a 60 FPS version of it. I lowered the graphics of a game to see smooth as hell animations. 60 FPS [I]does[/I] look positively glorious. But when I went back to 30 I didn't think "dear god this is terrible." It's just average. [editline]18th December 2012[/editline] [QUOTE=PvtCupcakes;38879481]quit spergin yall 30fps is fine. you have autism if you think you can tell the difference.[/QUOTE] 30 fps is fine, but the difference between the two is crystal clear.
[QUOTE=Wormy;38881107]Would you recommend a 120Hz monitor? I have thought about buying one, but they are so expensive right now, at least a Samsung monitor that I saw with 120Hz.[/QUOTE] Only if you have the hardware to actually use it - I know plenty of people who barely get 60 fps in newer games.
FPS on consoles does not matter that much because controllers don't require the precision anyway.
[QUOTE=LegndNikko;38881024]I enter this thread with people talking like 30 FPS is a stutter piece of shit and unplayable. The hell, did I miss something somewhere? When did 30 FPS look like 10? I understand 60 FPS looks smooth as hell. I've compared it directly, watching a YouTube video then downloading a 60 FPS version of it. I lowered the graphics of a game to see smooth as hell animations. 60 FPS [I]does[/I] look positively glorious. But when I went back to 30 I didn't think "dear god this is terrible." It's just average. [/QUOTE] I guess it's what you're used to. As someone who has always prioritized framerate over graphics ever since I became a PC gamer, whenever I play at 30 fps it feels way worse after playing at 60 fps all the damn time. It's not unplayable, but it feels and looks so wrong.
[QUOTE=Darkimmortal;38880334]120hz in TVs is nothing whatsoever to do with 120hz refresh rate - it's a marketing buzzword. 120hz in TVs means 24-30 fps raised to 60 fps through interpolation. If there is a single TV on the planet that is actually designed to accept a signal >60hz I'd love to see it. On an somewhat related note, the maximum you can push through HDMI before the average RAMDAC goes splat is about 108hz, and the chances of having a monitor capable of it are slim. (My Samsung 750D 120hz monitor will display an image for 60 seconds before cutting out, presumably as protection against damaging components)[/QUOTE] Hz is defined as cycles per second (not exactly, but you get the point), therefore 120 frames per second can be described as 120Hz - of course I'm not talking about frame interpolation.
-snip for I am wrong-
[QUOTE=MadBomber;38881313]This seems very similar the the whole reason why films are only 24fps. Tradition and trends have stuck so fast that for whatever reason we don't want to break them. We have the capabilities and technology to achieve far faster frames per second but we don't want to move on. This of course is a terrible thing and keeps progression stunted.[/QUOTE] No, it's not like that at all. To film in 48 fps instead of 24 fps you need special cameras, and cinemas have to upgrade their projectors. All devs have to do is choose what frame rate to run at.
[QUOTE=Warship;38880491]There's those 3D TVs with shutter glasses, they run at 120Hz when showing a 3D signal[/QUOTE] They accept a max 60Hz signal even if the panel outputs 120 real Hz or more. [QUOTE=alien_guy;38881261]hz is defined as something per second (not exactly but you get the point) therefore 120 frames per second can be described as 120hz, ofcourse im not talking about retarded frame interpolation.[/QUOTE] On a '120hz' TV, it's not 120 frames per second - it's 60. The panel is still an ordinary 60Hz panel. On a '240hz' TV, it is 120 frames per second, or 120 real Hz. Check your facts before rating me dumb, please - only with monitors are the Hz values reported in marketing related to the actual Hz capability of the panel. TVs can get away with it because 120Hz is double 60Hz (i.e. a normal TV), just as 60 fps is double ~30 fps (i.e. normal TV content).
It has just occurred to me that maybe I can't notice a difference between 30fps and 60fps because there is a chance my monitor might only have a 30Hz refresh rate.
[QUOTE=Raidyr;38879624]Unless I'm mistaken, most console games today run at 30-45 FPS. [editline]18th December 2012[/editline] If you think anything less than 60 is fine for an FPS you have no idea how shooters work.[/QUOTE] That depends entirely on the shooter, and the skill I'm required to be playing it at. Crysis on normal at 30 FPS? Not ideal, but whatever. Playing a game of TF2 online at 30 FPS? Really bad.
[QUOTE=MadBomber;38881313]This seems very similar the the whole reason why films are only 24fps. Tradition and trends have stuck so fast that for whatever reason we don't want to break them. We have the capabilities and technology to achieve far faster frames per second but we don't want to move on. This of course is a terrible thing and keeps progression stunted.[/QUOTE] Not the same at all. Even CoD runs at 60 fps, and they brag about it being an advantage over Battlefield. It's mostly a technology issue: consoles are usually behind in hardware, and devs might decide it's more important to run at 30 fps and get the maximum out of their hardware, like they're doing now.
[QUOTE=IKTM;38877469]It's not that bad. 30 fps is still quite playable.[/QUOTE] Meanwhile, some people play PC games at 120 fps (yes, there is a benefit: better input response - roughly 8 ms per frame instead of 16 ms - and better mouse input).
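The 8 ms vs 16 ms figures in that post are just frame time = 1000 / fps, the minimum interval between the game sampling your input and that input affecting a rendered frame. A quick sketch:

```python
# Frame time in milliseconds at a given framerate: the floor on how quickly
# an input sampled this frame can show up on screen.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(fps, round(frame_time_ms(fps), 1))
# 30 -> 33.3 ms, 60 -> 16.7 ms, 120 -> 8.3 ms
```

Display scanout, buffering, and controller polling add further latency on top, so these are lower bounds, not total input lag.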
That's stupid and all, but does John Carmack even matter anymore?
Doesn't matter too much if all next gen games have really good motion blur effects
[QUOTE=Darkimmortal;38881501]They accept a max 60hz signal even if the panel outputs 120 real hz or more. On a '120hz' TV, it's not 120 frames per second. It is 60. The panel is still an ordinary 60hz panel. On a '240hz' TV, it is 120 frames per second or 120 real hz. Check your facts before rating me dumb please, only with monitors are the hz values reported in marketing related to the actual hz capability of the panel. TVs can get away with it because 120hz is double 60hz (ie normal tv), just as 60 fps is double ~30 fps (ie normal tv).[/QUOTE] I wasn't referring to TVs, though.
[QUOTE=RobbL;38881907]Doesn't matter too much if all next gen games have really good motion blur effects[/QUOTE] True (Crysis is more playable at 20-30fps than most games), but 60fps should still be the target, in my opinion. It reduces input lag, more closely matches the refresh rate, and it ensures everything moves smoothly.
Not too relevant considering this is about console games, but for rhythm gaming, a high framerate is especially important. When the best timing judgement available in a rhythm game is, for example, a 22.5 ms window (Stepmania 'Marvelous') then suddenly your framerate becomes a [i]very big deal.[/i] 60 FPS is a 16 ms delay, which means that if you're hitting exactly to the music, you could still easily hit outside of that timing window. Most rhythm games require ~250 fps before the delay truly becomes unnoticeable. (for a pro, at least - someone new to the genre might not notice even a 20-30 ms delay!) So that's my take on it.
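The rhythm-game point above can be put in numbers: in the worst case, an input that lands just after a frame boundary isn't seen until the next frame, adding up to one full frame time of timing error, which you can compare against the 22.5 ms 'Marvelous' window cited. A toy sketch (the "error under half the window" threshold is an illustrative assumption, not a rule from any game):

```python
# Worst-case timing error from frame quantization: a perfectly timed hit can
# be registered up to one frame time late.
WINDOW_MS = 22.5  # Stepmania 'Marvelous' judgement window, per the post above

def worst_case_error_ms(fps):
    return 1000.0 / fps

for fps in (60, 125, 250):
    err = worst_case_error_ms(fps)
    # Does the frame error stay under half the judgement window?
    print(fps, round(err, 1), err < WINDOW_MS / 2)
# 60 -> 16.7 ms (eats most of the window), 125 -> 8.0 ms, 250 -> 4.0 ms
```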
Not to pull the PC master race card, but this just blows my mind. With my next round of upgrades I fully intend to have at least one 120/144 Hz capable monitor for several reasons. Native 3D support, and higher effective framerates are among them. If this 48fps movie stuff looks like it will catch on, I may want it even higher. I realize that some people cannot notice a distinct difference between 60 and anything higher, but I can in a blind side by side comparison. 30-40 FPS is jarring for me without blur effects and I detest the way those are generally done.
[QUOTE=JeanLuc761;38881959]True (Crysis is more playable at 20-30fps than most games), but 60fps should still be the target, in my opinion. It reduces input lag, more closely matches the refresh rate, and it ensures everything moves smoothly.[/QUOTE] It's visually playable, but I did notice some input lag in Crysis 2 when it was around 25 fps. Thanks to motion blur it looked smooth, but it only looked that way - it still played jerkily.
[QUOTE=Darkomni;38882040]Not too relevant considering this is about console games, but for rhythm gaming, a high framerate is especially important. When the best timing judgement available in a rhythm game is, for example, a 22.5 ms window (Stepmania 'Marvelous') then suddenly your framerate becomes a [i]very big deal.[/i] 60 FPS is a 16 ms delay, which means that if you're hitting exactly to the music, you could still easily hit outside of that timing window. Most rhythm games require ~250 fps before the delay truly becomes unnoticeable. (for a pro, at least - someone new to the genre might not notice even a 20-30 ms delay!) So that's my take on it.[/QUOTE] I'm glad there are people who actually get the importance of fps beyond image quality. There is a significant gameplay advantage to higher fps - there's a reason Valve set TF2's and CSS's default fps cap at 300. Seriously, fps is about input, not just looks (even though sub-60 fps looks godawful).
As long as Call of Duty is a popular franchise, 60 FPS on consoles won't go extinct.
[QUOTE=Silikone;38883717]As long as Call of Duty is a popular franchise, 60 FPS on consoles won't go extinct.[/QUOTE] Every CoD from CoD 2 on runs at 60 FPS on consoles. Doesn't make up for what the franchise turned into, though.
[QUOTE=borisvdb;38878359]I don't understand how next gen consoles expect to compete with PC gaming if they keep the 30 FPS cap. Hell, some gamers nowadays have 120hz monitors.[/QUOTE] I don't see him, or anyone outside of you lot, saying it's a 30 FPS "cap" - he said developers will aim for 30 FPS, as in, they intend for that to be a bare minimum. Because even though they said that about this generation, they are starting to struggle (many games dip into the 20s, some into the 10s). And 30 FPS is pretty playable on a console. You don't notice input lag as much, because there genuinely seems to be less of it (think about it: the console is literally running the game, leaving slightly more CPU time available for inputs).
FPS controls a lot more than just rendering. The number of frames per second is also the number of times per second a game can sample input. Low fps means it samples key presses and mouse movements less often, producing a lag between each sample. At very low fps a game can even miss key presses, or cache them and push them all at once, which also produces unwanted effects. FPS also affects game logic: while many well-designed games scale actions by delta time (the time between the last and current frame), low fps still introduces inaccuracies. They're rarely game-breaking, but they can cause discomfort. Overall, a high framerate is better for everything. After playing PC games at 50+ fps, I have an uneasy feeling when playing on consoles. In games where fast action and combos are needed, high fps is absolutely necessary. If all you need to do is press one button it doesn't matter much, but if you need to land a combo within half a second it starts to matter.
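The delta-time scaling described above can be sketched as a toy loop - a minimal illustration, not taken from any particular engine:

```python
# Toy game loop: movement is scaled by dt (seconds since the last frame),
# so distance travelled is framerate-independent even though the loop only
# runs - and can only sample input - once per frame.
def simulate(fps, duration_s=1.0, speed=100.0):
    dt = 1.0 / fps          # idealized fixed frame time
    position = 0.0
    for _ in range(int(duration_s * fps)):
        position += speed * dt   # delta-time scaling
    return position

# Same distance covered regardless of framerate...
print(round(simulate(30), 6))   # 100.0
print(round(simulate(120), 6))  # 100.0
# ...but at 30 fps input is only sampled every ~33 ms, so a press-and-release
# shorter than that can fall entirely between two samples and be missed.
```

Real engines read wall-clock time for dt (and often use a fixed timestep for physics), but the scaling idea is the same.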