• Both Nvidia and AMD are preparing for 8K: “for the human eye that resolution is close to perfection”
    58 replies
[url]http://www.pcgamesn.com/both-nvidia-and-amd-are-preparing-for-8k-for-the-human-eye-that-resolution-is-close-to-perfection[/url]
I've found that 4K is still too expensive so how will 8K rake in the money?
[QUOTE=Midas22;46080630]I've found that 4K is still too expensive so how will 8K rake in the money?[/QUOTE] I doubt anyone who has that kind of money has time to play video games. We'll probably be seeing 8K personal office setups for a while.
[quote]And, in truth, they’d need to be even more powerful than that. Due to the way graphics cards push information to the screen they typically need to produce twice as much data than the monitor requires, otherwise you end up with visual artifacts.[/quote] Is he talking about vsync? If so, that's one of the worst explanations I've seen.
I'd rather that they work on making 4K the new, viable standard before they jump the gun and move on to 8K. When each side is able to produce an affordable graphics card that doesn't need to be SLI'd/CFX'd to max out games at a 4K resolution and stay at a stable framerate throughout the process, then I'll be satisfied.
I wonder how long before 8K becomes the norm, if it ever does. Imagine how much of an overkill graphics cards would be for any current game.
And of course, please expect the whole "#PCMASTERRACE" posts to come swarming in...
[quote]Due to the way graphics cards push information to the screen they typically need to produce twice as much data than the monitor requires, otherwise you end up with visual artifacts. So, an 8K monitor would actually be receiving 16K resolution data. If you had 16K you’d offer a perfect experience, no one would see artifacts if they had 20/20 vision[/quote] Uhhh, what? Is he talking about anti-aliasing/downsampling?
[QUOTE=Thunderbolt;46080698]Uhhh, what? Is he talking about anti-aliasing/downsampling?[/QUOTE] I have no idea. My only guess was buffering for vsync, but even then it's not "double", and not everyone needs or wants vsync; I disable it in every possible way I can on my system. Downsampling isn't required, and post-process AA is basically just as useful at 4K given how small the details are, so I don't see why AA would be the reason. On top of that, the monitor receives the same data unless the downsampling happens inside the monitor. And on top of that, 16K is not double 8K. RPS dun fucked up. Edit: I keep assuming tech articles are RPS given how often they are completely wrong. Edit 2: [QUOTE=GoDong-DK;46080731]Could they mean SSAA? That would render the whole picture at a much higher resolution, then downsample it. Aliased edges might be the visual artifacts he's talking about.[/QUOTE] Even then, it doesn't make sense to me. Downsampling is needed less and less the higher the resolution gets. AA has a purpose at all resolutions, but at that point a pixel is defining a mere hair of a letter, not half of someone's head, so FXAA would do. In almost all circumstances, the claim that you need double the information doesn't work for me.
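For what it's worth, the "16K is not double of 8K" point checks out: the "K" names refer to horizontal resolution, so each step up doubles both axes and quadruples the pixel count. A quick sanity check (using the standard 16:9 UHD dimensions):

```python
# Pixel counts for common "K" resolutions (16:9 UHD variants).
resolutions = {
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),
    "16K":    (15360, 8640),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

# Doubling the "K" number quadruples the actual amount of pixel data.
print(pixels["8K UHD"] / pixels["4K UHD"])  # 4.0
print(pixels["16K"] / pixels["8K UHD"])     # 4.0
```

So whatever the article meant by "twice as much data", a 16K render target is 4x an 8K one, not 2x.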
Could they mean SSAA? That would render the whole picture at a much higher resolution, then downsample it. Aliased edges might be the visual artifacts he's talking about.
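That render-high-then-downsample idea can be sketched in a few lines. This is a toy illustration assuming a plain 2x2 box filter on grayscale values, not any vendor's actual SSAA resolve:

```python
def downsample_2x2(image):
    """Average each 2x2 block of a supersampled image (list of rows
    of grayscale values) down to one output pixel -- the core idea
    behind 2x2 SSAA."""
    h, w = len(image), len(image[0])
    return [
        [
            (image[y][x] + image[y][x + 1]
             + image[y + 1][x] + image[y + 1][x + 1]) / 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# A hard black/white edge rendered at 4x4 becomes a 2x2 image where
# the jagged step turns into an intermediate gray -- that's the
# "anti-aliasing" part.
supersampled = [
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [0, 255, 255, 255],
    [0, 255, 255, 255],
]
print(downsample_2x2(supersampled))  # [[0.0, 255.0], [127.5, 255.0]]
```

The aliased staircase in the high-resolution render averages out to smooth gradients at the display resolution, which is why SSAA looks good and costs so much.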
[QUOTE=Sam Za Nemesis;46080803]That's MSAA[/QUOTE] [url]http://en.wikipedia.org/wiki/Multisample_anti-aliasing[/url] [i]In graphics literature in general, "multisampling" refers to any special case of supersampling where some components of the final image are not fully supersampled[/i]
I have yet to buy a 1080p monitor. 1440p monitors here start at 500€.
Even if Moore's Law were still reliable, it'd take at least 4 to 5 years before SLI/Crossfire setups could run 8K at reasonable framerates. Yeah, it's going to take a while :v:
[QUOTE=AntonioR;46081104]I have yet to buy a 1080p monitor. 1440p monitors here start at 500€.[/QUOTE] you've also yet to buy more than 3.5gb ram
[QUOTE=Sam Za Nemesis;46080803]That's MSAA[/QUOTE] MSAA is exactly when you [I]don't [/I]render the whole picture at a higher resolution :s
[QUOTE=Pernoccuous;46081679]you've also yet to buy more than 3.5gb ram[/QUOTE] Actually, after 9 years I finally bought a new PC last week (and it has 8GB of RAM). Unfortunately the 1080p TV I wanted to use as a monitor didn't give a good picture, so I'm in the process of getting a new one.
You won't need to do any kind of super-sampling when you have an 8k resolution.
In addition to videogame concerns, it's not like anybody has internet good enough to stream 4K; 8K would be a nightmare.
[QUOTE=Elspin;46081983]In addition to videogame concerns it's not like anybody has internet good enough to stream 4k, 8k would be a nightmare[/QUOTE] You only need like 30-40mbit to stream/watch 4k.
I bought my first 1080p monitor like four months ago. I'm fine for now I think. :v: I had two VGA 22 inch monitors before. Yes it was awful. (I still have one as a secondary)
[QUOTE=Atlascore;46082030]Yeah, I think 4k is a long way away from becoming mainstream, it requires a fast connection to stream it, and a 4k movie take forever to download on the average person's connection. As for the gaming market, 4k is simply impossible to run on the vast majority of current gaming PCs, the average gamer isn't going to be able to run at 4k for a very long time, even the most top of the line cards available today struggle with it.[/QUOTE] Netflix only requires like 25mbit to watch a 4K movie. I think that's well within reach of millions of people all around the world.
[QUOTE=Cold;46081962]You won't need to do any kind of super-sampling when you have an 8k resolution.[/QUOTE] Imagine how hard a computer would shit itself if it did 8K at 2x2 SSAA.
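Rough framebuffer math for that scenario (assuming just a plain RGBA8 color buffer; a real renderer adds depth buffers, G-buffers, and more on top):

```python
w, h = 7680, 4320            # 8K UHD display resolution
ssaa = 2                     # 2x2 SSAA doubles each axis

render_w, render_h = w * ssaa, h * ssaa   # effectively a 16K render target
bytes_per_pixel = 4                       # RGBA8, color only

mib = render_w * render_h * bytes_per_pixel / 2**20
print(render_w, render_h)    # 15360 8640
print(round(mib))            # ~506 MiB for a single color buffer
```

Half a gigabyte of VRAM per color buffer, before shading a single one of those ~133 million pixels per frame: "shit itself" is about right.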
[QUOTE=Cold;46082020]You only need like 30-40mbit to stream/watch 4k.[/QUOTE] Haha, are you fucking kidding me? The top speed you can get where I live is 25mbit, and that's [i]advertised[/i] (in other words, way higher than what you actually get), and we're relatively near a major population center. The truth is internet completely blows chunks in the vast majority of rural areas, and that's just for watching 4K, so until Google Fiber stops dragging its ass on coming to Canada, any 4K or 8K technology can fuck off until everything else catches up :v:
tbh we don't need 8k atm, guys. Seriously, the only reason Nvidia and AMD are pushing 8K right now is because they want to be first to the innovation! fucking firstfags
Just because it sucks for you guys doesn't mean it sucks for everyone. [url]http://www.netindex.com/download/1,8/OECD/[/url] OECD index is 25.9mbit, Worldwide is 20mbit. Plenty of people can watch a 4k stream.
I remember hearing the same when 1080p came around, then 4K. Now they're saying it's 8K. They gotta sell new technology somehow.
[QUOTE=Cold;46082300]Just because it sucks for you guys doesn't mean it sucks for everyone. [url]http://www.netindex.com/download/1,8/OECD/[/url] OECD index is 25.9mbit, Worldwide is 20mbit. Plenty of people can watch a 4k stream.[/QUOTE] Plenty of people can watch a 4K stream under ideal conditions, at non-peak hours, while doing literally nothing else... which is really not a big market. I mean hell, take a look at Amazon's TV listings: 1080(p/i) - 1854 TVs, 4K - 100 TVs. Nobody is buying the things because the vast majority of content is released at 1920x1080 and streaming is a nightmare at 4K. But once again, this isn't the point! The point is that starting to release 8K stuff now is even [i]worse[/i], because barely anyone wants 4K at the moment. 4K should exist - for things to become mainstream they have to slowly build up in quality - but if 8K TVs were released right now, basically nobody would buy them. Do I want 8K? Yeah, totally, that would be great. But it's total nonsense to start trying to push it right now.
4K would be great for PC monitors larger than 27", but for TVs sitting more than 2m away, I think it only pays off on screens larger than 100cm, and for users with really good eyesight. That could get affordable for the average user in 3-4 years. I don't know where to place 8K. Monitors over 40", and TVs viewed from over 2m? That's strictly the premium market for now.
Nice to see them planning for it, but I still think it's probably a way off yet. Maybe even longer for broadcasting: I recall the BBC, which used 8K at the Olympics (and actually helped to develop it), saying it could be another 10 or 20 years before it starts making an appearance in most people's homes.
[quote]“8K, or anything above 4K is going to require multiple GPUs,” Scott Herkelman, Nvidia’s head of GeForce GTX told me when I asked about what would be needed for 8K. “4K for most GPUs is pretty tough, the 980 handles it well but it’s still one of those things that the more GPUs you have the better it looks.”[/quote] In 5-10 years it won't.