Ultra Settings Suck
[QUOTE=Im Crimson;52028880]How the fuck do you avoid this? Just enable vsync always? Or just annihilate your wallet and get a Gsync/Freesync monitor if you need 144Hz?[/QUOTE] Having a graphics card that gives consistent high framerates in uncapped games, and enabling vsync in the ones that only allow 60Hz, hasn't given me any visual trouble, personally speaking.
[QUOTE=Im Crimson;52028880]How the fuck do you avoid this? Just enable vsync always? Or just annihilate your wallet and get a Gsync/Freesync monitor if you need 144Hz?[/QUOTE] If the game is hard capped at 60fps, you might be able to change your refresh rate to 120Hz; that'll fix the stuttering since 60 divides evenly into 120. You just have to switch it back afterwards.
[QUOTE=Im Crimson;52028880]How the fuck do you avoid this? Just enable vsync always? Or just annihilate your wallet and get a Gsync/Freesync monitor if you need 144Hz?[/QUOTE] Having vsync on when possible isn't a bad solution (make sure it's triple buffered, kids!). But a lot of modern games offer better solutions now, like adaptive resolution paired with half-vsync, where the game caps to a lower bound of your refresh rate divided by 2 (so 30, 72, 120, etc.) and scales the resolution down to around 70-80% of native until it meets that. Not being on the vblank of your monitor and getting off-paced frames is probably more jarring than having half the refresh rate of your display.
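To put rough numbers on the adaptive resolution idea above, here's a minimal sketch of how such a loop might pick its scale factor, assuming a 144Hz display and a 70% floor. The constants, thresholds, and function name are all made up for illustration; this isn't quoting any actual engine.

[code]
# Hypothetical half-vsync + adaptive resolution controller. Illustrative only.
REFRESH_HZ = 144
TARGET_FPS = REFRESH_HZ / 2            # half-vsync lower bound: 72fps on 144Hz
FRAME_BUDGET_MS = 1000.0 / TARGET_FPS  # ~13.9ms per frame

MIN_SCALE, MAX_SCALE = 0.70, 1.00      # worst case ~70-80% of native res

def adjust_render_scale(scale, last_frame_ms):
    """Nudge the resolution scale so frame time fits the half-refresh budget."""
    if last_frame_ms > FRAME_BUDGET_MS:
        scale -= 0.05                  # missed the budget: render fewer pixels
    elif last_frame_ms < FRAME_BUDGET_MS * 0.85:
        scale += 0.05                  # plenty of headroom: creep back to native
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for frame_ms in (16.0, 15.2, 14.1, 12.9, 12.5):  # pretend frame time samples
    scale = adjust_render_scale(scale, frame_ms)
print(scale)  # settles at 0.85 of native in this made-up run
[/code]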
I just turn off post-processing and turn shadows down to their lowest settings in most games if my PC can't handle it. The way I see it, I'm not looking at their shadows anyway, so why does it matter? That blur from PP is so ugly as well, I always turn it off regardless of whether it affects my performance.
[QUOTE=hexpunK;52028998]Not being on the vblank of your monitor and getting off-paced frames is probably more jarring than having half the refresh rate of your display.[/QUOTE] This is really key and a lot of people don't even realise it. It's the number 1 reason I always play with vsync: otherwise it's just intolerable to look at, because there's a constant "swimming" effect as frames come in and out of alignment with the vblank. Even when I see people talk about frame pacing they're often not really talking about the same thing; most talk about having frame times that don't vary a lot from each other, whereas what you really want is a frame at each vblank that has advanced the correct amount of time from the last one.

Vsync isn't really an option as such, it's just how your monitor works (G-Sync changes this if your fps is lower than the max Hz, of course). Your monitor updates at whatever Hz it's set to; if you turn off vsync you're just providing wrong frames to it. You can't get smoother animation than your maximum Hz. In fact, if you turn off vsync you're almost certainly getting far less smooth animation unless you have at least 2x your monitor's Hz in fps (far more if it's highly variable), and even then you're just reducing an error you caused yourself anyway.

All you can do is attempt to reduce latency, but brute-forcing frames is a really wasteful and often worse way to do it for a lot of reasons (the thing about variable latency is that it varies...). Unfortunately I've never come across a game that does anything better for latency than turning vsync off. But the latency of a 60Hz monitor really isn't that bad; usually it's just certain games fucking it up. So it's worth thinking about all that waste for a potential couple of ms less latency, unless you really like fan noise or something. Something like Nvidia's Fast Sync is a better option because it attempts to render at multiples of your monitor's Hz, but even that can only do its best when faced with highly varying frame times.
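The "swimming" point above can be made concrete with a toy simulation, assuming a 60Hz monitor and an uncapped 75fps render rate (numbers picked arbitrarily, and tearing within a refresh is ignored for simplicity):

[code]
# Toy model of off-paced frames vs. vblank. All numbers made up.
REFRESH_MS = 1000.0 / 60   # a 60Hz monitor blanks every ~16.7ms
FRAME_MS = 1000.0 / 75     # the game renders at 75fps with vsync off

frames = [i * FRAME_MS for i in range(10)]   # completion times of frames

prev = 0.0
for i in range(1, 6):
    vblank = i * REFRESH_MS
    latest = max(f for f in frames if f <= vblank)  # newest finished frame
    print(f"vblank {i}: displaying game time {latest:5.1f}ms "
          f"(step of {latest - prev:4.1f}ms)")
    prev = latest
# Most refreshes advance game time by ~13ms, but every fourth jumps ~27ms,
# while real time advances 16.7ms per refresh. That mismatch is the
# "swimming" effect, despite 75fps being higher than 60Hz.
[/code]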
[QUOTE=StrawberryClock;52027369]To be honest, I don't even see why you would NEED AA with a 4K monitor. I'm using 2xAA as my base setting, but only because with my 1080p monitor, long-range marksmanship is made really difficult by aliasing.[/QUOTE] You don't. At 8K, aside from font niceties, it's utterly redundant.
[QUOTE=hexpunK;52028998]Having vsync on when possible isn't a bad solution (make sure it's triple buffered, kids!). But a lot of modern games offer better solutions now, like adaptive resolution paired with half-vsync, where the game caps to a lower bound of your refresh rate divided by 2 (so 30, 72, 120, etc.) and scales the resolution down to around 70-80% of native until it meets that. Not being on the vblank of your monitor and getting off-paced frames is probably more jarring than having half the refresh rate of your display.[/QUOTE] Doesn't vsync cause noticeable input latency, and triple buffering even more so? Also, afaik vsync caps fps at 60, so won't that be a problem at 144Hz? As opposed to a 60 cap at 120Hz.
Honestly, I just want my foliage and base shadows looking nice and everything else I don't give a shit about. [editline]29th March 2017[/editline] [QUOTE=hexpunK;52028998]Having vsync on when possible isn't a bad solution[/QUOTE] It's a pretty shitty solution if you like responsive controls, unfortunately.
[QUOTE=Talishmar;52029806]Doesn't vsync cause noticeable input latency, triple buffering even moreso? Also, afaik vsync caps fps at 60, so won't that be a problem at 144Hz? As opposed to 60 cap at 120Hz[/QUOTE] It syncs to v-blank if implemented correctly. So no, it shouldn't be a problem. Though it's not unheard of for devs to implement it as a framerate lock, which is wholly incorrect and would be a problem. VSync is "vertical-sync", as in, it syncs to the completion of a vertical-blank (a complete refresh, top-to-bottom) of a display. If it isn't syncing to your monitor, it's not actually vsync. This is why GSync and FreeSync are big game changers, all the benefits of VSync, none of the issues of locked refresh rates and framerates.
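The distinction drawn above between real vsync and a mislabeled framerate lock can be sketched in a few lines. This is a simplified illustration only; real vsync is a blocking swap provided by the driver, not something you implement with arithmetic:

[code]
import math

# Illustrative contrast: real vsync vs. a naive 60fps cap. Not real driver code.
REFRESH_MS = 1000.0 / 60

def vsync_display_time(done_ms):
    """Real vsync: the frame becomes visible at the next vertical blank."""
    return math.ceil(done_ms / REFRESH_MS) * REFRESH_MS

def fps_cap_display_time(done_ms):
    """A naive framerate lock: the frame hits the screen whenever it's done,
    wherever the scan-out happens to be -> tearing and uneven pacing remain."""
    return done_ms

for done in (5.0, 21.0, 38.5):
    print(f"frame done at {done:4.1f}ms -> vsync shows it at "
          f"{vsync_display_time(done):4.1f}ms, cap shows it at "
          f"{fps_cap_display_time(done):4.1f}ms (mid-scan)")
[/code]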
[QUOTE=gk99;52029816]Honestly, I just want my foliage and base shadows looking nice and everything else I don't give a shit about. [editline]29th March 2017[/editline] It's a pretty shitty solution if you like responsive controls, unfortunately.[/QUOTE] Triple Buffer or bust
[QUOTE=gk99;52029816]It's a pretty shitty solution if you like responsive controls, unfortunately.[/QUOTE] Can't say I've ever noticed input being particularly worse. If your GPU is keeping up with the refresh rate, then triple buffering shouldn't introduce anything too awful, whereas double buffering could introduce 1-2 frames of latency. If you can't keep up with the refresh rate and it halves to match vblank, then yeah, you're getting effectively double the latency in all situations. It's entirely down to how much of a hack job the implementation of VSync is for that render pipeline.
[QUOTE=Loadingue;52026970]God rays are often uncalled for and should be disabled. You're in a normal city in broad daylight? If there are god rays there then it's realistically not correct. God rays only appear in real life where thin particles are dense in the air (such as dust in old rooms and houses, or ambient humidity in a jungle). God rays in gaming are seriously overrated. Also, reminder that sharp shadows don't always mean better shadows. Shadows become blurrier based on the distance between the object and the surface the shadow is projected on, while the brightness of the light source also impacts this significantly (dimmer means blurrier). Those sharp building shadows on a cloudy day in CSGO? An aberration. So sometimes you're gonna want to compare the different shadow settings and find the best one, based on your environment.[/QUOTE] Planetside 2 always surprised me with how heavy its volumetric god rays are, and yet I can run the game fine.
[QUOTE=hexpunK;52029851]Can't say I've ever noticed input being particularly worse. If your GPU is keeping up with the refresh rate, then triple buffering shouldn't introduce anything too awful, whereas double buffering could introduce 1-2 frames of latency. If you can't keep up with the refresh rate and it halves to match vblank, then yeah, you're getting effectively double the latency in all situations. It's entirely down to how much of a hack job the implementation of VSync is for that render pipeline.[/QUOTE] It depends what kind of triple buffering you're talking about. Pretty much every game that has that option is actually adding another back buffer into a queue and shifting the latency back another frame. Most people think it's the triple buffering where 2 buffers are constantly rendered to while the 3rd displays the most recently completed one. The thing about that is it has all the frame timing problems that just turning vsync off has, only with no tearing. To my eyes it's actually worse, and if I had to pick I would just go straight vsync off. You can get this kind of triple buffering by playing games in windowed mode with vsync off.

Double buffering doesn't add any latency. It's just that one buffer is being displayed while the other is being written to. Using that, you could have 1ms latency just by updating the 2nd buffer 1ms before the vertical blank it gets displayed at. But the average game isn't gonna be doing this.

Vsync is not some awful latency-inducing thing. Don't get me wrong, almost all games use it naively and you'll be getting some multiple of whatever your frame times are as latency, but in itself it's just a tool for synchronising frames. Without vsync, using a computer would be awful; imagine no vsync on the desktop, and most people probably don't think they have latency on the desktop. Every time I see someone complain about screen tearing I just wonder: well, what did you think vsync did, and why did you turn it off?
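To see why the distinction matters for latency, here's a back-of-the-envelope model of the "1ms before the vblank" claim above. The function and numbers are mine, purely for illustration:

[code]
import math

# Toy latency model: time from frame completion to the vblank that shows it.
REFRESH_MS = 1000.0 / 60

def display_latency_ms(done_ms, queue_depth):
    """queue_depth=1: plain double buffering (shown at the next vblank).
    queue_depth=2: the FIFO 'triple buffering' most games actually ship,
    which parks the frame behind another one for a whole extra refresh."""
    next_vblank = math.ceil(done_ms / REFRESH_MS) * REFRESH_MS
    return next_vblank + (queue_depth - 1) * REFRESH_MS - done_ms

# Finish the back buffer 1ms before the vblank:
print(display_latency_ms(15.7, queue_depth=1))   # ~1.0ms, as described above
print(display_latency_ms(15.7, queue_depth=2))   # ~17.6ms: a full refresh later
[/code]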
[QUOTE=Nerfmaster000;52027304]I thought everyone hated 2kilksphilips here, or at least those in the mapping scene.[/QUOTE] nah 2kliksphilip is fine, it's 3kliksphilip you should watch out for
[QUOTE=JackDestiny;52026987]I used to play with V-sync most of the time but I turned it off for competitive games. The screen tearing is noticeable but you get used to it, and now [B]I've disabled V-sync on every game and it feels much more responsive (unless you have a g-sync monitor of course[/B]).[/QUOTE] If you have Nvidia then use Fast Sync in the Control Panel. No screen tearing and no noticeable input lag.
[QUOTE=Rixxz2;52028182]That was a long time ago He actually even made a video addressing it [video=youtube;V9OyRqP8c6s]https://www.youtube.com/watch?v=V9OyRqP8c6s[/video][/QUOTE] heh, in that video at 3:44 I'm recommending him for his tutorials and literally everyone is giving me shit :v:
[QUOTE=Stiffy360;52026536]Deferred rendering really threw a wrench in AA, because it renders the scene into multiple buffers, then composites them. Which is why post-process AA such as temporal is all the rage, as you can just slap it on top. Issue is, it has no scene information, so it just tries to blur what it thinks is an edge, which usually results in a blurrier image.[/QUOTE] Good TAA is incredible. Doom's TAA is downright magic. Unreal 4's is usually pretty decent.
[QUOTE=Brt5470;52030060]Good TAA is incredible. Doom's TAA is downright magic. Unreal 4's is usually pretty decent.[/QUOTE] iirc when I used TAA in DOOM 4 it looked blurrier than FXAA, but that must be because I was using sharpening, a 1024x576 resolution, and an 85% render scale.
[QUOTE=Fox Powers;52030084]iirc when I used TAA on DOOM 4 it looked blurrier than FXAA, must because i was using sharpening, 1024x576 and a 85% render scale[/QUOTE] Their highest end TSSAA was what I was referring to. Try that, because it looks really good. For me at 1440p it's tack sharp, really good reprojection. At lower resolutions I could see TSSAA looking softer with larger pixels.
[QUOTE=Brt5470;52030097]Their highest end TSSAA was what I was referring to. Try that, because it looks really good. For me at 1440p it's tack sharp, really good reprojection. At lower resolutions I could see TSSAA looking softer with larger pixels.[/QUOTE] I'll prolly give it a go if I get the full version of Doom.
[QUOTE=J!NX;52028090]Fucking Lens flare is the worst [t]http://static.gamespot.com/uploads/original/349/3494045/2456257-7102613253-1.gif.gif[/t] Especially when the devs are incompetent about it[/QUOTE] Overblown artistic lens flares are my gaming guilty pleasure.
[QUOTE=Why485;52030233]Overblown artistic lens flares are my gaming guilty pleasure.[/QUOTE] I think Halo Combat Evolved was one of the first games where I saw the sun cutting through the trees, that along with the lens flare was just so awesome to look at.
I'm really glad this video exists. It's nice to hear a more 'technical' channel (even though he's usually more focused on CS:GO mechanics in particular) pointing out that the best possible isn't really necessary.

Playing Fallout 4 was a learning experience for me, because the game hit me with the slap in the face that not only was I unable to run it on ultra, I could barely run it on the lowest settings and had to use external mods to get a good framerate. Despite all that, I pushed through, determined to finish the game before the inevitable spoilers would find me, and I ended up having a pretty good time, not counting all the other aspects of the game that make my dick soft, like the story. When I think back to my time with the game I don't really remember it for the bad shadows or the low draw distance; I remember it for the gameplay and the exploration and the companions and the shitty story. And at the end of the day, that's what really matters.
[QUOTE=Philly c;52029935]Double buffering doesn't add any latency. It's just that one buffer is being displayed while the other is being written to. Using that, you could have 1ms latency just by updating the 2nd buffer 1ms before the vertical blank it gets displayed at. But the average game isn't gonna be doing this.[/QUOTE] Ah, good point, I usually mix up which VSync buffering implementation introduces which latency. We're well past the point where VSync is a genuine problem in 99.999% of games now anyway; most of the prefab engines out there have reasonably fine implementations, so you only really have to keep an eye out for developers who insist on implementing their own engines entirely, as their solution isn't proven yet. If you can already hit the refresh rate of your display consistently, might as well turn it on tbh. Most games aren't going to suffer from any extra latency. Hell, even most competitive shooters aren't going to be a problem if you're just playing with average dudes.
Lens flares in first person don't even make any sense, considering you're supposed to be looking through the character's eyes. Unless the character is some kind of cyborg with camera eyes.
[QUOTE=Kenneth;52037017]Lens flares in first person don't even make any sense, considering you're supposed to be looking through the character's eyes. Unless the character is some kind of cyborg with camera eyes.[/QUOTE] It'd be pretty awesome if there was an RPG where you could get cybernetic augmentations, and getting cyber eyes adds a lens flare to lights.
[QUOTE=kaze4159;52028855]It's because 144 isn't a multiple of 60, so you're not seeing each frame for a consistent amount of time. I couldn't really explain it well so I made an example :v: [video=youtube;0iRK_3E9HSI]https://www.youtube.com/watch?v=0iRK_3E9HSI&[/video] The top bar is moving at 30fps, the bottom at 25fps. 30fps looks fine on 60Hz monitors because it's just showing each frame twice. The bottom bar twitches back and forth because you're seeing each frame either 2 or 3 times depending on how the two sync up.[/QUOTE] I don't think there are 144Hz monitors that don't support variable refresh rates (G-Sync/FreeSync), which solve exactly this problem.
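The 2-or-3 refresh pattern described in the quote is easy to verify with quick arithmetic (illustrative snippet, not taken from the video):

[code]
# 60fps content on a 144Hz panel: each frame spans 144/60 = 2.4 refreshes,
# so frames are held for 2 or 3 refreshes in an uneven pattern (judder).
REFRESH_HZ, CONTENT_FPS = 144, 60

holds = []
for frame in range(10):
    start = round(frame * REFRESH_HZ / CONTENT_FPS)      # first refresh showing it
    end = round((frame + 1) * REFRESH_HZ / CONTENT_FPS)  # first refresh after it
    holds.append(end - start)
print(holds)  # [2, 3, 2, 3, 2, 2, 3, 2, 3, 2] -- vs. 30fps on 60Hz: all 2s
[/code]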
[QUOTE=DrTaxi;52037645]I don't think there are 144Hz monitors that don't support variable refresh rates (G-Sync/FreeSync), which solve exactly this problem.[/QUOTE] The problem is that G-Sync and FreeSync are vendor-locked, which is a pain in the ass. There are some out there that don't support either, though; I picked up an ASUS VG248QE, which doesn't support G-Sync out of the box, and Nvidia stopped selling the upgrade kits you could slap into them to support it.
I don't use ultra settings because I don't want my computer to get hot.
[QUOTE=Kenneth;52037017]Lens flares in first person don't even make any sense, considering you're supposed to be looking through the character's eyes. Unless the character is some kind of cyborg with camera eyes.[/QUOTE] Well, you do get lens flares if you're wearing goggles. Which reminds me of a really fucking clever detail in ARMA II, where if you looked at the sun in first person you'd get a lot of bloom, but looking at the sun in third person would give you a camera lens flare.