Downsampling risks?
Piecing together my new, powerful rig has been incredibly rewarding. An i7 4790k on an MSI board, 16GB of DDR3, and a 256GB SSD so it all loads quickly. And the crown jewel? My MSI GTX 980. The simple joy of going into games that I once played with kid gloves and cranking up every setting... so satisfying. But of course, when the games run at maximum quality at a smooth 60fps or higher, you start wondering what can be done to enhance the experience further. Before seriously PLAYING a game, I've spent the vast majority of my time looking up config file edits to boost fidelity HIGHER than maximum, and texture enhancement mods. Only recently did I find SweetFX, and a wicked antialiasing method with Nvidia Inspector.

Clearly not ALL games will be good for this. Tomb Raider has ridiculous performance demands for such a modest (but nice) looking game. But right now I'm playing Dishonored for the first time, quite impressed with its aesthetic and fantastic frame rate. Even adding all the things I mentioned, it runs like butter. But then I found out about downsampling, and it's nagging at my curiosity. The Inspector method, with in-game AA disabled, makes the game look FAR clearer than the initial max settings, but it still looks like it could stand some improvement.

Now, downsampling seems straightforward enough, but a few of the guides on the subject have given me pause. And the warning Nvidia makes about voiding warranties when attempting a custom resolution just about stopped me cold. A few guides I read spoke of the potential to damage the monitor when downsampling. They mentioned that streaming more data than the monitor was made for puts a strain on it. It seems odd to me, because I would think the resulting crushed image would be rendered on the GPU, and THEN sent to the monitor at its proper maximum resolution. Not the monitor having to make sense of a torrent of data stuffing it. But I didn't design the thing, so I don't know.

My monitor is a modest ASUS VH236H, linked to the GPU through an HDMI cable. It's native 1080p. Nothing special, but I'm in no position to replace it. It might not KILL me, but it'd be a stupid reason to drop $200. But just about every other article or video I read on the subject forgoes any such disclaimer. They all read like, "A quick and easy way to have better AA!"

So what's the truth of the matter? Is damage possible, but exceedingly rare? Is it an issue reserved for older cards or monitors, or for old DVI inputs instead of HDMI? I'm too curious to miss out, but too cautious to take a blind leap. Anyone have an opinion on this?
Why not use DSR instead? Your card supports it, and you can't damage anything that way. It's a lot more reliable and easier to set up, too.
Oh? Do tell... ...Or I could google it myself, I'm sure.
[url=http://www.geforce.com/hardware/technology/dsr/technology]It's basically downsampling, but done in software on your card: it renders the image at, for example, 4K and sends it to your monitor at its native 1080p resolution, so there's no risk of damaging anything[/url] [editline]7th March 2015[/editline] Basically, go to "Global 3D settings" in your Nvidia control panel, enable the 4x or 2x DSR factors, go into any game, and change the resolution to something higher than it normally lets you. If it looks blurry, mess around with the smoothness setting.
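If it helps to picture what the card is doing: it renders the whole frame at the higher resolution, filters it back down to native on the GPU, and only then sends it out, so the monitor never sees anything but 1080p. Here's a rough Python sketch of the downscale step; the helper name is made up, and I've used a plain box filter for simplicity (I believe the real DSR filter is a Gaussian, which is what the smoothness slider adjusts):
[code]
import numpy as np

def box_downsample(frame, factor=2):
    # Toy stand-in for the driver's filter: average every factor x factor
    # block of rendered pixels into one output pixel, so each pixel the
    # monitor receives is built from several rendered samples.
    h, w, c = frame.shape
    return frame.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

# A fake 4K render (2160x3840 RGB) being scaled down to native 1080p.
hi_res = np.random.rand(2160, 3840, 3)
native = box_downsample(hi_res, factor=2)
print(native.shape)  # (1080, 1920, 3) -- this is all the monitor ever gets
[/code]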
Here it is in picture form: [t]http://3kv.in/~techbot/ShareX/2015/03/07/NVIDIA_Control_Panel_13-23-46.png[/t]
I don't like DSR very much; anything that isn't 3D is extremely hard to read.
[QUOTE=std DONOR;47278612]I don't like DSR very much; anything that isn't 3D is extremely hard to read.[/QUOTE] That's downsampling for you.
DSR does seem to help my 1080p screen look a little 'smoother', but it's too slight most of the time. It feels... mostly placebo. The effect is there, but not nearly enough to bother. If you're running a game that your computer gets over 60 stable, or a billion fps on, and you DSR, it's obviously worth it to a degree. But it's otherwise overrated. The only cost is that it's harder on the GPU, and that's it. [editline]7th March 2015[/editline] It is pretty great for screenshots, though, from what I know
[QUOTE=J!NX;47278665]DSR does seem to help my 1080p screen look a little 'smoother', but it's too slight most of the time. It feels... mostly placebo. The effect is there, but not nearly enough to bother. If you're running a game that your computer gets over 60 stable, or a billion fps on, and you DSR, it's obviously worth it to a degree. But it's otherwise overrated. [editline]7th March 2015[/editline] It is pretty great for screenshots, though, from what I know[/QUOTE] It can look better than AA at times, but the trade-off is a small UI if the game doesn't scale.
[QUOTE=sparky28000;47278689]It can look better than AA at times, but the trade-off is a small UI if the game doesn't scale.[/QUOTE] Yeah, pretty much. Alien: Isolation has like no AA, so DSR is a huge bonus for people, and I noticed it helps a little for TF2, but not really that much in other games. [editline]7th March 2015[/editline] To be fair though, some games have huge HUDs with no way to change them, and nothing is more annoying than that. So a small UI isn't always that bad.
[quote]And the warning Nvidia makes about voiding warranties when attempting a custom resolution just about stopped me cold. A few guides I read spoke of the potential to damage the monitor when downsampling. They mentioned that streaming more data than the monitor was made for puts a strain on it. It seems odd to me, because I would think the resulting crushed image would be rendered on the GPU, and THEN sent to the monitor at its proper maximum resolution. Not the monitor having to make sense of a torrent of data stuffing it. But I didn't design the thing, so I don't know.[/quote] I wouldn't be worried about the non-native resolution warranty warning; they're just covering themselves.
The point of downsampling is to act as a last-ditch effort to make a game look better, because it's kinda like an anti-aliasing method. My monitor's native res is 1600x900. I play CSGO at 2560x1440. When I'm done with the game, my desktop automatically returns to my native resolution. Try downsampling any game that runs great, like Source games or anything else you run well over 60fps.
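If you're picking resolutions by hand like that, note that DSR's factors multiply the total pixel count, not the width and height, so each axis scales by the square root of the factor. A quick sketch (hypothetical helper, just to show the arithmetic):
[code]
import math

def render_resolution(native_w, native_h, factor):
    # A DSR factor multiplies the pixel count, so each axis grows by sqrt(factor).
    s = math.sqrt(factor)
    return round(native_w * s), round(native_h * s)

print(render_resolution(1600, 900, 2.56))  # (2560, 1440), the combo above
print(render_resolution(1920, 1080, 4.0))  # (3840, 2160), i.e. 4x DSR renders 4K on a 1080p panel
[/code]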
Yeah, if the in-game UI doesn't scale then it looks pretty bad (I'm looking at you, GMod), and text rendering can go funny too, since it's done stupidly in a lot of games (so instead of having high-resolution text downscaled, it's low-resolution text upscaled and then scaled back down, making the whole thing blurry). [QUOTE=Cold;47278816]I wouldn't be worried about the non-native resolution warranty warning; they're just covering themselves.[/QUOTE] It's probably to stop people trying it, then getting confused when the monitor rejects the signal and they have no idea how to revert the setting. I've never come across a monitor that's handled an out-of-range signal badly; all of them just say it doesn't work and behave like there's no signal at all.
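To illustrate that text blur, here's a one-dimensional toy example in Python (the numbers are invented; the point is the up-then-down resampling): a crisp 1px stroke survives a perfectly aligned 2x round trip, but any sub-pixel misalignment from the scaling smears it into grey.
[code]
import numpy as np

text = np.array([0.0, 1.0, 0.0, 1.0])      # crisp 1px-wide strokes at native res
up = np.repeat(text, 2)                     # nearest-neighbour upscale to the render res
aligned = up.reshape(-1, 2).mean(axis=1)    # downscale while perfectly aligned
shifted = np.roll(up, 1)                    # half-pixel misalignment from scaling
blurred = shifted.reshape(-1, 2).mean(axis=1)

print(aligned)  # [0. 1. 0. 1.] -- survives intact
print(blurred)  # [0.5 0.5 0.5 0.5] -- every hard edge is now a grey smear
[/code]
Text rendered at the high resolution in the first place would just get supersampled like the rest of the frame and come out fine.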
[QUOTE=~Kiwi~v2;47279838]It's really just sending the same signal, but at the native res. For example, at 1600x900 it would send out a 1600x900 signal, but the picture would be rendered at 3200x1800. So yeah, there are no risks to this, except for losing your eyesight because of the text (YES, I'M ALSO LOOKING AT YOU, GMOD AND GUILD WARS 2)[/QUOTE] GW2 has native supersampling.