• John Carmack: “it's time to start pushing forward on higher frame-rate, lower latency”
    34 replies
[url]http://www.pcgamesn.com/john-carmack-its-time-start-pushing-forward-these-higher-frame-rate-lower-latency[/url]
Lol, you guys are WAY late on this one.
Too right. It's surprising that with all the advances in technology over the years, these things have been virtually at a standstill. We're even getting next-gen games locked at 30fps, which is arguably a step backwards.
[QUOTE=Bread_Baron;42821993]It's surprising that with all the advances in technology over the years these things have virtually been at a standstill.[/QUOTE] The problem is that if we want to double framerate, we'd basically be holding graphical fidelity constant for a generation of graphics cards. Maybe with the new nvidia refresh technology we can skip the need to double it at once (60Hz->120Hz) and instead do fractional gains (e.g. 60Hz->80Hz will still be better). That would let it gradually improve over generations, with a much lower reduction to the rate of fidelity increase.
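For the curious, the trade-off in that post is easy to put numbers on. A quick sketch in Python, assuming nothing beyond the refresh rates the post mentions:

```python
# Per-frame render budget at each refresh rate mentioned above.
def frame_budget_ms(hz):
    """Milliseconds available to render one frame at a given refresh rate."""
    return 1000.0 / hz

for hz in (60, 80, 120):
    print(f"{hz:>3} Hz -> {frame_budget_ms(hz):.2f} ms per frame")

# 60 -> 120 Hz halves the budget (16.67 ms -> 8.33 ms), i.e. the GPU must
# do the same work twice as fast; a fractional step like 60 -> 80 Hz only
# trims the budget to 12.50 ms, a 25% cut, leaving far more headroom
# for fidelity gains in the same hardware generation.
```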
I'd still say game design and game writing are what actually need some new things, not just the graphics. I mean in many ways games have gone a shitload backwards in design and writing, while the tech just keeps on going forward.
[QUOTE=I am Error;42822036]I'd still say actually game design and game writing need some new things to do, not just the graphics. I mean in many ways games have gone a shitload backwards in design and writing, while the tech just keeps on going forward.[/QUOTE] Amen to that. I don't care how pretty my games are; I don't feel like playing a pretty remake of the same games over and over again.
[QUOTE=Thy Reaper;42822034]The problem is that if we want to double framerate, we'd basically be holding graphical fidelity constant for a generation of graphics cards. Maybe with the new nvidia refresh technology we can skip the need to double it at once (60Hz->120Hz) and instead do fractional gains (e.g. 60Hz->80Hz will still be better). That would let it gradually improve over generations, with a much lower reduction to the rate of fidelity increase.[/QUOTE] Raw graphics are a huge money sponge, and the closer we get to perfection, the easier it is for little things to immediately stick out. That doesn't just mean the uncanny valley; it reaches from uncomfortable headbobbing and non-rendered legs, to clipping between mocap and virtual objects, to fire arrows not setting wood on fire. Entire areas that would fall under graphics, like finding a cheap way to render flowing water, have been entirely neglected in order to up poly counts and texture resolution and make even more particles appear on screen. I'd sacrifice graphics in a heartbeat for something like actually amazing AI, or gameplay that interacts in tons of different ways with the environment, or some decent facial animations. Just put an alright art style on it and you've got me covered graphically. I think another reason why focusing on graphics is kinda wasteful is that it gets overhauled so frequently, whereas a major step forward in something like AI should be recyclable for different games of the same genre. Yeah, you can recycle the general engine, but when you copypaste your entire prop catalogue it starts smelling fast. Recycling AI, on the other hand, shouldn't stick out too much unless you do something as characteristic as Bioshock Infinite's, for example. That said, I imagine it's more complicated to code AI or environmental interactions between who-knows-how-many different objects with different properties.
Hate to quote the Call of Duty guys, but "graphics serve gameplay". You can have pretty graphics, but if they aren't there for a good reason, it's best to remove them. As long as you have a good art style, you can make even low-resolution stuff look really nice (Super Mario Galaxy, Wind Waker, etc.).
Everyone's always complaining about how consoles won't reach 60fps 1080p, but that's mostly because instead of doing that, devs focus on bigger textures, higher polycounts, more shaders etc. If we stopped or slowed down the progress towards "realistic graphics", we could easily have anti-aliased 60fps 1080p games on next-gen consoles. The graphics would certainly look worse than if we didn't, but which would people rather play: a game with very good-looking graphics (obscured by upscaling from 900p, and with a choppy framerate), or a game with alright graphics that is pixel-perfect and renders smoothly?
[QUOTE=latin_geek;42822358]Everyone's always complaining about how consoles won't reach 60fps 1080p but that's mostly because instead of doing that, devs focus on bigger textures, higher polycounts, more shaders etc. If we stopped or slowed down the progress towards "realistic graphics" we could easily have anti-aliased 60fps 1080p games on next-gen consoles. The graphics would certainly look worse than if we didn't, but what would people play between a game that has very good looking graphics (obscured by upscaling from 900p and with a choppy framerate) or a game that has alright graphics but is pixel-perfect, and renders smoothly?[/QUOTE] Nowadays I have a seriously hard time playing any multiplayer game without 60fps. I'd rather play with worse graphics than lower fps. For singleplayer games I try to balance it more. And they could've [B]tried[/B] (not to imply that they haven't, but maybe not enough?) to crank more performance out of the next-gen consoles.
You aren't going to get that, Carmack, until someone overhauls the programming ethics in the field.
[QUOTE=Stiffy360;42822355]Hate to quote the call of duty guys, but "Graphics serve gameplay" You can have pretty graphics, but if they aren't there for a good reason, it's best to remove it. As long as you have a good art style, you can make even low resolution stuff look really nice. (super mario galaxy, windwaker, etc)[/QUOTE] art style IS graphics. Graphics is what you see. Art style is what you see. They're the same thing. Games can have tricked-out DoF and DX11 but with a shitty art style they still look like shit. Wind Waker has good graphics.
[QUOTE=Schmaaa;42822691]art style IS graphics. Graphics is what you see. Art style is what you see. They're the same thing. Games can have tricked-out DoF and DX11 but with a shitty art style they still look like shit. Wind Waker has good graphics.[/QUOTE] When people say graphics they mean graphical fidelity, not art style.
[QUOTE=Schmaaa;42822691]art style IS graphics. Graphics is what you see. Art style is what you see. They're the same thing.[/QUOTE] It really isn't. It's clear what you mean but the difference is that a game with low-fidelity graphics can still look good if the art style is good. Vegetation in 3D games is a good example.
I kind of agree when he says the graphics are more than adequate. We've made a lot of progress in the last 20 years. It wouldn't hurt to stop right here and learn how to do what's being done more efficiently.
[QUOTE=Stiffy360;42822355]Hate to quote the call of duty guys, but "Graphics serve gameplay" You can have pretty graphics, but if they aren't there for a good reason, it's best to remove it. As long as you have a good art style, you can make even low resolution stuff look really nice. (super mario galaxy, windwaker, etc)[/QUOTE] And Okami, to add to the list. Graphics may serve gameplay, but certain genres outright require a higher framerate to play properly. For example, if you have low FPS in an FPS, good luck hitting anything without autoaim or shot homing most of the time. If I had to make a toss-up between something that looked like shit but ran like a dream and something that looked like a dream but ran like shit, I'd choose the former, since at least I could actually PLAY the damn thing without having to adjust to its choppy crap factor. The march towards "photorealism" needs to slow down, especially since no machine can ever hope to simulate true photorealism in the current century. Save the advances in 3D rendering for CGI movies and the like; the next couple of years need to be the years of amazing framerate over photorealism. To counteract games looking a little like arse on the technical side, actually put some thought and passion into your art style, so that people are too enraptured by the art direction to worry about a couple of errant jaggies here and there.
[QUOTE=pentium;42822546]You aren't going to get that Carmack until someone overhauls the programming ethics in the field.[/QUOTE] I was under the impression that a big part of it was actually the GPUs themselves not being built with low latency in mind. I think one of the examples he used in a keynote speech a few years ago was something to the effect of "it takes less time for a TCP packet to get to the other side of the world than it does for a pixel to change color on your screen!" [editline]10th November 2013[/editline] The hardware for these networks was built from the ground up with low latency in mind. The components used in gaming are not always so.
[QUOTE=Marik Bentusi;42822200]Raw graphics are a huge money sponge and the closer we get to perfection, the easier it is for little things to immediately stick out. That doesn't just mean the uncanny valley, it reaches from uncomfortable headbobbing and non-rendered legs to clipping between mocap and virtual objects to fire arrows not setting wood on fire. Entire sections that would fall under graphics like finding a cheap way to render flowing water have also been entirely neglected in order to up poly count and tex res and make even more particles appear on screen. I'd sacrifice graphics in a heartbeat for something like actually amazing AI or gameplay that interacts in tons of different ways with the environment or some decent facial animations. Just put an alright art style on it and you've got me covered graphically. I think another reason why focusing graphics is kinda wasteful is because it gets overhauled so frequently, whereas a major step forward in something like AI should be recycle-able for different games of the same genre. Yeah you can recycle the general engine, but when you copypaste your entire prop catalogue it starts smelling fast. Recycling AI on the other hand, unless you do something as characteristic as Bioshock Infinite for example, shouldn't stick out too much. That said, I imagine it's more complicated to code AI or environmental interactions between who-knows-how-many different objects with different properties.[/QUOTE] I couldn't agree with you more; these obsessive graphical enhancements and improvements might look neat, but if you can't back them up with good gameplay, they're useless. Graphics can enhance the gameplay and provide great atmosphere, but that's also achievable by other means. Actually, I would love to see more improvements in level design (like Dark Souls), AI (F.E.A.R.), or diversity in how different mechanics work together to make things more dynamic.
Also, more optimization would be neat: how can people enjoy good graphics if they need ridiculous rigs, or get stuck with badly optimized console games? The only thing I want from a game graphics-wise is a good art style and a decent framerate.
[QUOTE=SGTNAPALM;42824054]I was under the impression that a big part of it was actually the GPUs themselves not being built with low latency in mind. I think one of the examples he used in a keynote speech a few years ago was something to the effect of "it takes less time for a TCP packet to get to the other side of the world than it does for a pixel to change color on your screen!" [editline]10th November 2013[/editline] The hardware for these networks was built from the ground up with low latency in mind. The components used in gaming are not always so.[/QUOTE] With what type of connection? A direct line through the Earth in a vacuum would still take ~42ms to get to the other side, and that's enough time for a GPU to render 2 frames at 60fps. It might take longer for a monitor to display it, but that's not the fault of the GPU/API/programmer/whatever else people want to blame. Edit: And networking actually has big issues with latency; look up stuff like bufferbloat.
[QUOTE=TheDecryptor;42825204]With what type of connection? A direct line through the Earth in a vacuum would still take ~42ms to get to the other side of the Earth, that's enough time for a GPU to render 2 frames at 60fps. It might take longer for a monitor to display it, but that's not the fault of the GPU/API/programmer/whatever else people want to blame. Edit: And networking actually has big issues with latency, look up stuff like bufferbloat.[/QUOTE] More than just two, actually: by definition, at 60 fps each frame is rendered in 1/60 of a second, about 16.67 ms. Though of course we're talking fractions of a frame there, so it's not worth nitpicking. Dunno how you could possibly have lower latency between two opposite points on Earth than between two things physically within a foot of each other.
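For anyone who wants to check the arithmetic being traded back and forth here, a quick sketch; the only inputs are the Earth's mean diameter and the speed of light in a vacuum:

```python
# Sanity-check of the latency numbers in the posts above.
EARTH_DIAMETER_M = 12_742_000        # mean diameter of the Earth, metres
SPEED_OF_LIGHT_MPS = 299_792_458     # speed of light in a vacuum, m/s

# Best case: a one-way signal straight through the planet in a vacuum.
trip_ms = EARTH_DIAMETER_M / SPEED_OF_LIGHT_MPS * 1000
frame_ms = 1000 / 60                 # one frame at 60 fps

print(f"light through the Earth: {trip_ms:.1f} ms")          # ~42.5 ms
print(f"one frame at 60 fps:     {frame_ms:.2f} ms")         # ~16.67 ms
print(f"frames in that window:   {trip_ms / frame_ms:.2f}")  # ~2.55
```

So both the "~42ms" and "a bit over two frames" figures hold up, and real packets over fibre and routers are of course much slower than this vacuum best case.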
[QUOTE=ironman17;42824005]no machine can ever hope to simulate true photorealism in the current century.[/QUOTE] Um, we went from this [t]http://img.gamefaqs.net/screens/4/6/d/gfs_45139_2_14.jpg[/t] to this [t]http://images.eurogamer.net/2013/articles//a/1/5/8/0/9/1/1/MetroLL_1080p_8.bmp.jpg[/t] in only 11 years. That's a pretty big jump, and to say that we can't get better than this is very silly.
Yeah, it was something like 2.51 frames, but I didn't want to start thinking about buffer flips or whatnot.
[QUOTE=SGTNAPALM;42824054]I was under the impression that a big part of it was actually the GPUs themselves not being built with low latency in mind. I think one of the examples he used in a keynote speech a few years ago was something to the effect of "it takes less time for a TCP packet to get to the other side of the world than it does for a pixel to change color on your screen!" [editline]10th November 2013[/editline] The hardware for these networks was built from the ground up with low latency in mind. The components used in gaming are not always so.[/QUOTE] We have incredible hardware that's running against the lazy fucks of the industry who can't be assed to optimize for a processor, much less a GPU. It's kind of hard to dedicate sufficient effort to a project when you generally think "it will work and we'll make money anyway". I'm not going to use Minecraft as an example, but I'll use COD because it has faster-paced online play. First-person shooters have been developed to death. We know what does and does not work in the genre. How are we shitting out games that can't fit on one DVD and require a system no more than a year old to make them usable, albeit with a steady flow of bugfixes afterwards?
[QUOTE=CakeMaster7;42825527]More than just two actually, by definition at 60 fps each frame is rendered in 1/60 of a second, so each frame is rendered in 16.66~ ms. Though of course we're talking fractions of a frame there, so it's not worth nitpicking. Dunno how you could possibly have lower latency between two opposite points on earth than two things physically within a foot of each other.[/QUOTE] When he said that, he was talking about full-cycle lag: how long it takes from pressing a key for the input to be registered and displayed (which in total is about 35 ms). I can ping a server across the US in 50 ms. The timing is way too close; the full cycle should be much lower.
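The "full cycle" being argued about is just a sum of pipeline stages. A toy breakdown; the stage timings below are illustrative ballpark figures assumed for a 60Hz double-buffered setup, not measurements of any specific system:

```python
# Illustrative input-to-photon latency budget (assumed ballpark values).
stages_ms = {
    "input polling":       4.0,   # e.g. waiting on a ~125 Hz USB poll
    "simulation tick":    16.7,   # one frame of game logic at 60 fps
    "render + buffering": 16.7,   # another frame queued for the GPU
    "scanout":             8.0,   # reaching mid-screen at 60 Hz
    "display processing": 10.0,   # panel lag; varies wildly by monitor
}

for stage, ms in stages_ms.items():
    print(f"{stage:<20} {ms:5.1f} ms")
print(f"{'total':<20} {sum(stages_ms.values()):5.1f} ms")
```

Even with hand-picked assumptions the total lands in the same tens-of-milliseconds range as a cross-country ping, which is the comparison Carmack keeps making.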
It's pretty crazy that a game like, say, GTA5 runs on 3 cores and 512MB of RAM at 25-ish frames per second. To have a game that's only moderately more demanding than GTA5 (or in other cases less so) not running at 60FPS on a console with 8 cores, 8GB of RAM and 10 times the GPU ops/second is nothing less than lazy/unfocused development. If they wanted to, and optimized what they were making, with consoles having that much horsepower it would take 10 years of constant development to create enough stuff in a game to be too much for the console to handle. I think the mentality of some developers is that 25 is enough, others 30, and very few 60. The standard should be 60 now.
[QUOTE=ironman17;42824005] especially since no machine can ever hope to simulate true photorealism in the current century. [/QUOTE] hahaha WHAT [editline]10th November 2013[/editline] Like, really? We're already pretty close to photorealism; some games look good enough that people have to actually examine the image to tell it isn't real. That is major progress from 1998 with Half-Life, or 1993 with Doom. If we can keep up Doom-to-Crysis levels of progression consistently for the next 14 years, then we will undoubtedly have photorealistic graphics.
[QUOTE=be;42826016]hahaha WHAT [editline]10th November 2013[/editline] Like really? We're already pretty close to photorealism, some games are good looking to the point of being able to caused people to have to actually examine the image to know it isn't a game, that is major progress from 1998 with Half-Life, or 1993 with Doom. If we can get Doom to Crysis type of progression consistently for the next 14 years, then we will have undoubtedly photorealistic.[/QUOTE] No we aren't; we're not even doing raytracing yet. We're just using shitty tricks that have their limits. We're so far from perfecting graphics, and we're probably not going to have the machines to pull off proper light simulation in real time for the next ten years or so. [editline]11th November 2013[/editline] Heck, even proper motion blur is a bitch to pull off. [editline]11th November 2013[/editline] And then there's volumetrics, smoke and fluid simulations, breakable/deformable objects, etc.
Even raytracing isn't entirely realistic; you need to implement stuff like caustics and global illumination alongside it. Path tracing is realistic, but it has the downside of taking a long time to converge on a correct answer. When we have a GPU capable of path tracing that can converge a scene a couple of hundred times a second, then we can start talking about photorealism (and then it comes down to stuff like model quality, physics, logic, etc.). No amount of fancy graphics will help if objects bounce around or character models twist their heads in strange ways (or intersect with geometry).
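The "takes a long time to converge" part is plain Monte Carlo statistics: per-pixel noise falls off roughly as 1/sqrt(samples), so halving the noise costs four times the samples. A toy estimator (not a renderer, just the convergence behaviour, with a made-up "true" pixel brightness of 0.5):

```python
import random

def pixel_estimate(n_samples, true_value=0.5):
    """Average n noisy samples of a pixel, the way a path tracer
    averages the contributions of random light paths."""
    total = sum(random.uniform(0.0, 2.0 * true_value) for _ in range(n_samples))
    return total / n_samples

random.seed(42)
for n in (16, 64, 256, 1024):
    err = abs(pixel_estimate(n) - 0.5)
    print(f"{n:>4} samples -> error {err:.4f}")
# Quadrupling the sample count only roughly halves the error, which is
# why converging a scene "a couple of hundred times a second" is such a
# brutal target for real-time hardware.
```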
Should probably mention another reason why it's not realistic on a fundamental level: it uses polygons. Euclideon might have the answer to that, though.
[QUOTE=Mike Tyson;42825568]um we went from this [t]http://img.gamefaqs.net/screens/4/6/d/gfs_45139_2_14.jpg[/t] to this [t]http://images.eurogamer.net/2013/articles//a/1/5/8/0/9/1/1/MetroLL_1080p_8.bmp.jpg[/t] in only 11 years. thats a pretty big jump and to say that we cant get better than this is very silly[/QUOTE] Note that there's more than just raw power involved: there are also advances in digital art. There are games that look pretty damn good on really shit Android phones, for example.