• Ultra Settings Suck
Anti-aliasing is just a closer approximation to reality. With powerful enough hardware and appropriate algorithms, why not have it?
To be honest, I don't even see why you would NEED AA with a 4K monitor. I'm using 2xAA as my base setting but only because with my 1080p monitor, long range marksmanship is made really difficult by aliasing.
[QUOTE=StrawberryClock;52027369]To be honest, I don't even see why you would NEED AA with a 4K monitor. I'm using 2xAA as my base setting but only because with my 1080p monitor, long range marksmanship is made really difficult by aliasing.[/QUOTE] You don't. You barely even need it at 1440p. It's only enabled in benchmarks for the sake of it.
[QUOTE=StrawberryClock;52027369]To be honest, I don't even see why you would NEED AA with a 4K monitor. I'm using 2xAA as my base setting but only because with my 1080p monitor, long range marksmanship is made really difficult by aliasing.[/QUOTE] i always go 4xAA or none, since 2x isnt square and 8x isnt noticeable
[QUOTE=abcpea2;52027424]i always go 4xAA or none, since 2x isnt square and 8x isnt noticeable[/QUOTE] I find 2xAA to be a better compromise, even if it's not as clean as 4xAA. 4xAA is just WAY too expensive.
[QUOTE=paul simon;52026368]SSAO is very light performance wise compared to the heavier AO implementations. Besides, I feel AO is an extremely important graphical setting.[/QUOTE] i was gonna say this. games without ao look like those cheap, dreamworks knock off animated movies.
[QUOTE=TFA;52026571]Modern implementations of SMAA are actually really good and take very little FPS. The SMAA used in Reshade can blur along the edges of the depth buffer rather than the image/color information, which provides results almost rivaling MSAA. Check out the post-process solution used here. Off: [t]http://images.akamai.steamusercontent.com/ugc/263845380501641876/4EF4E11F0E998E04E8C4756F0B107AF10E65D39C/[/t] On: [t]http://images.akamai.steamusercontent.com/ugc/263845380501642120/1F9A70C40535DDA610C76A977EA8EB48B069E15F/[/t] Through use of LumaSharpen, overall image quality actually becomes crisper. Source: [url]https://nomansskymods.com/mods/sweetfx-smaa-lumasharpen-fxaa-best-of-both-worlds-after-fx/[/url][/QUOTE] SMAA is the best; in basically every game I play, I disable the in-game AA and inject SMAA if it's not an in-game option.
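For anyone wondering what "detecting edges" actually involves, here's a minimal sketch of the luma-based edge detection pass that SMAA-style post-process AA starts from. It's a toy version in Python/numpy rather than a real shader, and the threshold is an illustrative placeholder, not what any actual ReShade preset ships with:

[code]
# Toy version of SMAA-style luma edge detection: compare each pixel's
# brightness against its left and top neighbours and flag big jumps.
import numpy as np

LUMA_WEIGHTS = np.array([0.2126, 0.7152, 0.0722])  # Rec. 709 luma weights
EDGE_THRESHOLD = 0.1  # illustrative; real shaders expose this as a quality knob

def detect_edges(rgb):
    """rgb: (H, W, 3) float array in [0, 1]. Returns horizontal/vertical edge masks."""
    luma = rgb @ LUMA_WEIGHTS                    # perceived brightness per pixel
    d_left = np.abs(luma[:, 1:] - luma[:, :-1])  # luma delta vs left neighbour
    d_top = np.abs(luma[1:, :] - luma[:-1, :])   # luma delta vs top neighbour
    edges_x = np.zeros(luma.shape, dtype=bool)
    edges_y = np.zeros(luma.shape, dtype=bool)
    edges_x[:, 1:] = d_left > EDGE_THRESHOLD
    edges_y[1:, :] = d_top > EDGE_THRESHOLD
    return edges_x, edges_y
[/code]

SMAA then classifies the shape of each detected edge and blends along it, which is why it smears far less than plain FXAA-style blurring, and why feeding it depth instead of color (as mentioned above) helps: depth edges are real geometry edges.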
[QUOTE=Stiffy360;52026536]Deferred rendering really threw a wrench in AA, because it renders the scene multiple times, then composites it. Which is why post-process AA is all the rage, such as temporal, as you can just slap it on top. Issue is, it has no scene information, so it just tries to blur what it thinks is an edge, which usually results in a blurrier image.[/QUOTE] Temporal AA definitely needs some information about the scene to work properly. It basically blends the new frame with the previous one, so it needs to track the movement of each pixel between frames and reproject the previous frame accordingly. The process isn't exactly perfect though, so it usually introduces some amount of blur in motion.
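To make the reprojection idea concrete, here's a minimal numpy sketch of the blend at the heart of TAA. The blend factor is illustrative, the motion vectors are assumed to be given, and a real implementation would also clamp the history sample against the current frame's neighbourhood to limit ghosting:

[code]
# Toy TAA resolve: reproject last frame's result using motion vectors,
# then blend a little of the new frame into a lot of history.
import numpy as np

BLEND = 0.1  # weight of the new frame; ~90% comes from reprojected history

def taa_resolve(current, history, motion):
    """current, history: (H, W, 3) floats in [0, 1].
    motion: (H, W, 2) per-pixel (x, y) offsets to each pixel's previous position."""
    h, w, _ = current.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Reproject: fetch history from where this pixel was last frame.
    prev_y = np.clip(np.round(ys + motion[..., 1]).astype(int), 0, h - 1)
    prev_x = np.clip(np.round(xs + motion[..., 0]).astype(int), 0, w - 1)
    reprojected = history[prev_y, prev_x]
    # Exponential blend; imperfect reprojection is exactly where the
    # blur-in-motion people complain about comes from.
    return BLEND * current + (1.0 - BLEND) * reprojected
[/code]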
[QUOTE=Sini;52026898]Sadly some games will get awful input lag with V-sync turned on, but I try to have it active whenever possible.[/QUOTE] How about playing in borderless fullscreen? The Windows compositor (DWM) buffers every window, so you can play without screen tearing at high framerates and the mouse feels much smoother.
[QUOTE=AntonioR;52027568]How about playing in borderless fullscreen ? Windows applies triple(or double, I don't know) buffering on all windows, so you can play without screen tearing at high framerates and the mouse feels much smoother.[/QUOTE] You'll usually lose some performance in a lot of games if you play in windowed mode.
[QUOTE=StrawberryClock;52027610]You'll usually lose some performance in a lot of games if you play in windowed mode.[/QUOTE] How much, really? I've heard this but I've never noticed any significant difference myself. XCOM, TW2, TW3, BF3, they all seemed the same to me.
In most games I end up turning ambient occlusion off because of some weird, shit-looking implementation that bugs me more than not having it, like when there's a really noticeable edge between the shadow and the object. You really don't even notice it while playing most of the time, only if you A/B between screenshots.
There are so many kinds of anti-aliasing out there nowadays; I can't keep up. Would it kill game devs to include a tooltip telling you what each type does, and how demanding it is?
All games need to have pictures showing the difference between each graphics setting, like this: [t]https://www.pcinvasion.com/wp-content/uploads/2017/03/gr-wildlands-graphics-5.png[/t]
[QUOTE=Im Crimson;52027674]How much, really? I've heard this but I've never noticed any significant difference myself. XCOM, TW2, TW3, BF3, they all seemed the same to me.[/QUOTE] I think it's just old advice; it used to be a problem because Windows would be rendering the game plus the desktop and everything else, but nowadays that isn't the performance killer it used to be.
[QUOTE=Im Crimson;52027674]How much, really? I've heard this but I've never noticed any significant difference myself. XCOM, TW2, TW3, BF3, they all seemed the same to me.[/QUOTE] It usually shows up as a non-trivial increase in input delay, a lower or even locked framerate, and sometimes rendering issues. On the other hand, some games might not act any different whether fullscreen or windowed. It comes down entirely to the developers and how well coded and optimized the game engine is. There is no consistency from game to game in this regard unless they use the same graphics engine. As a matter of fact, it's the same reason why some games react very poorly to Alt+Tab while others work perfectly.
[QUOTE=Loadingue;52026970]God rays are often uncalled for and should be disabled. You're in a normal city in broad daylight? If there are god rays there then it's realistically not correct. God rays only appear in real life where thin particles are dense in the air (such as dust in old rooms and houses, or ambient humidity in a jungle). God rays in gaming are seriously overrated. Also reminder that sharp shadows don't always mean better shadows. Shadows become blurrier based on distance between the object and the surface the shadow is projected on, [B]while the brightness of the light source also impacts this significantly (dimmer means blurrier).[/B] Those sharp building shadows on a cloudy day in CSGO? An aberration. So sometimes you're gonna want to compare the different shadow settings and find the best one, based on your environment.[/QUOTE] This bit is nonsense, sorry. Only light source size, shape and distance affect the penumbra, i.e. its blurriness. Light source brightness has zero effect on this.
[QUOTE=paul simon;52027961]This bit is nonsense, sorry. Only light source size, shape and distance affect the penumbra, i.e. its blurriness. Light source brightness has zero effect on this.[/QUOTE] Well in this case it's not about the light source itself but the fact that if the light travels through a cloud it gets scattered, so the effective light source becomes the whole cloud rather than the sun's disc, which is far larger and hence gives blurrier shadows, in regards to his CSGO example I mean. Otherwise, you're right.
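The geometry here is just similar triangles, so it's easy to sanity-check with real numbers. A quick illustrative sketch (the function name is made up for the example, and it uses the sun's actual size and distance):

[code]
# Penumbra width from similar triangles: only geometry appears, never brightness.

def penumbra_width(source_diameter, light_to_occluder, occluder_to_surface):
    """All arguments in the same units; returns the width of the soft shadow edge."""
    return source_diameter * occluder_to_surface / light_to_occluder

SUN_DIAMETER = 1.39e9   # metres
SUN_DISTANCE = 1.5e11   # metres; ratio ~0.0093, i.e. ~0.5 degrees angular size

print(penumbra_width(SUN_DIAMETER, SUN_DISTANCE, 1.0))    # ~0.009 m: crisp edge 1 m behind the object
print(penumbra_width(SUN_DIAMETER, SUN_DISTANCE, 100.0))  # ~0.93 m: visibly soft 100 m behind
[/code]

Brightness never enters the formula, which is the point: a dimmer sun wouldn't soften a shadow, but a cloud-sized effective source absolutely does.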
I usually set preset to medium or high then up the textures and models to ultra, because those are the most important to me.
I feel like a retard for saying this but I actually sold my 144Hz monitor at a ~50USD loss and went back to my old 60Hz monitor. 100+FPS on such a monitor is absolutely amazing, the drawback being that framerate fluctuations are extremely noticeable, which IMO damages the experience quite a lot. Sure, I could just limit the framerate in titles which my system could not pull at 100+ for an at least consistent experience, and I did, but when you're used to 100+Hz moving from title to title becomes a pretty jarring experience, as some are at 120+ and butter smooth, while others are at 60 and seem extremely choppy in comparison. It doesn't help that a panel that's native to 144Hz looks and feels a lot worse at 60Hz than a panel native to 60Hz does. I could lock absolutely everything to like 75Hz, but what is even the point of having that monitor then? After I'd used my old monitor for about a week I got used to it, 60Hz no longer felt or looked choppy, and everything was nice, standardized and consistent, which is when I decided to sell the 144Hz one
[QUOTE=Electrocuter;52026740]don't forget DoF shit belongs in movies and photography, not games[/QUOTE] [QUOTE=Episode;52026590]FXAA, Motion Blur, Vsync Disable, every game[/QUOTE] Fucking Lens flare is the worst [t]http://static.gamespot.com/uploads/original/349/3494045/2456257-7102613253-1.gif.gif[/t] Especially when the devs are incompetent about it
i like all the ~fancy~ effects like DoF, Lens Flare, Motion Blur, etc...
[QUOTE=Nerfmaster000;52027304]I thought everyone hated 2kilksphilips here, or at least those in the mapping scene.[/QUOTE] That was a long time ago. He actually even made a video addressing it: [video=youtube;V9OyRqP8c6s]https://www.youtube.com/watch?v=V9OyRqP8c6s[/video]
I have no problem with no anti aliasing. In fact, the jaggies help me see better. Also, forget motion blur, v-sync, post-processing. Give me 60 fps and I am happy.
And then you have those awful mods that add a bunch of shit filters and effects to games that never had them in the first place, making them look fucking awful.
[QUOTE=Rixxz2;52028077]I feel like a retard for saying this but I actually sold my 144Hz monitor at a ~50USD loss and went back to my old 60Hz monitor. 100+FPS on such a monitor is absolutely amazing, the drawback being that framerate fluctuations are extremely noticeable, which IMO damages the experience quite a lot. Sure, I could just limit the framerate in titles which my system could not pull at 100+ for an at least consistent experience, and I did, but when you're used to 100+Hz moving from title to title becomes a pretty jarring experience, as some are at 120+ and butter smooth, while others are at 60 and seem extremely choppy in comparison. [highlight]It doesn't help that a panel that's native to 144Hz looks and feels a lot worse at 60Hz than a panel native to 60Hz does[/highlight]. I could lock absolutely everything to like 75Hz, but what is even the point of having that monitor then? After I'd used my old monitor for about a week I got used to it, 60Hz no longer felt or looked choppy, and everything was nice, standardized and consistent, which is when I decided to sell the 144Hz one[/QUOTE] Why would that be the case? Anyway, of course you're going to have little use for an HFR monitor if you can't run games stably at such framerates. Framerate fluctuations can be felt regardless as input latency; they just can't be seen beyond your refresh rate. The real limit is your GPU, but with 60Hz that's as good as it's ever going to get, so I don't get why you'd switch back, let alone at a loss. You might not get much use out of HFR in new games in that case, but old or graphically simple games will be running at a solid, creamy 144+ fps.
[QUOTE=Talishmar;52028279]Why would that be the case? [/QUOTE] Was at least a problem with the monitor I had, felt a lot more choppy than a native 60Hz monitor, and also resulted in a bunch of weird ghosting issues. I remember reading an article about it like a year ago when I first bought the screen, and while I remember understanding it, I can't remember what it actually said. I'll try finding it again
FPS has never been a huge thing for me aside from needing it in fast-paced online games, though I know I'm in the minority here. I've always been a graphics-over-performance guy, so 30fps doesn't bother me at all (I'm pretty sure I've even played games at lower than 30fps as well). I guess I see it as a cinematic thing, since most movies aren't 60fps either, but I love the graphical effects. It's interesting how some people just can't play games at 30fps. I remember playing Halo: Combat Evolved on the original Xbox, which ran at 30fps max and sometimes dropped below that, and I had no problems with it.
[QUOTE=Rixxz2;52028584]Was at least a problem with the monitor I had, felt a lot more choppy than a native 60Hz monitor, and also resulted in a bunch of weird ghosting issues. I remember reading an article about it like a year ago when I first bought the screen, and while I remember understanding it, I can't remember what it actually said. I'll try finding it again[/QUOTE] It's because 144 isn't an even multiple of 60, so you're not seeing each frame for a consistent amount of time. I couldn't really explain it well so I made an example :v: [video=youtube;0iRK_3E9HSI]https://www.youtube.com/watch?v=0iRK_3E9HSI&[/video] The top bar is moving at 30fps, the bottom at 25fps. 30fps looks fine on 60hz monitors because it's just showing each frame twice. The bottom bar twitches back and forth because you're seeing each frame either 2 or 3 times depending on how the two sync up
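You can count the cadence directly by working out how many refreshes each frame stays on screen for, assuming plain vsync and no variable refresh. A quick sketch:

[code]
# How many display refreshes each game frame occupies. Uneven counts = judder.
import math

def refreshes_per_frame(fps, hz, frames=10):
    counts = []
    for i in range(frames):
        first = math.ceil(i * hz / fps)       # first refresh showing frame i
        last = math.ceil((i + 1) * hz / fps)  # first refresh of the next frame
        counts.append(last - first)
    return counts

print(refreshes_per_frame(60, 144))  # [3, 2, 3, 2, 2, ...] -> 60fps judders on 144Hz
print(refreshes_per_frame(30, 60))   # [2, 2, 2, 2, 2, ...] -> the smooth top bar
print(refreshes_per_frame(25, 60))   # [3, 2, 3, 2, 2, ...] -> the twitchy bottom bar
[/code]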
[QUOTE=kaze4159;52028855]It's because 144 isn't an even multiple of 60, so you're not seeing each frame for a consistent amount of time. I couldn't really explain it well so I made an example :v: [video=youtube;0iRK_3E9HSI]https://www.youtube.com/watch?v=0iRK_3E9HSI&[/video] The top bar is moving at 30fps, the bottom at 25fps. 30fps looks fine on 60hz monitors because it's just showing each frame twice. The bottom bar twitches back and forth because you're seeing each frame either 2 or 3 times depending on how the two sync up[/QUOTE] How the fuck do you avoid this? Just enable vsync always? Or just annihilate your wallet and get a G-Sync/FreeSync monitor if you need 144Hz?