IIRC This is being used in VR headsets to help bring the performance requirements to more reasonable levels.
This again?
[QUOTE=IrishBandit;51556937]IIRC This is being used in VR headsets to help bring the performance requirements to more reasonable levels.[/QUOTE]
Not in the Rift or the Vive, but FOVE is seemingly the first one to adopt this. No consumer release yet though.
Foveated rendering is going to be very important with 4k displays and up in VR.
Actually it isn't going to be important at all except on current-gen consoles, and on a 4K screen the effect frankly looks terrible.
The only reason this is being pushed as hard as it is is that current-gen consoles don't have the horsepower to properly do either VR or 4K in the first place.
[QUOTE=27X;51558917]Actually it isn't going to be important at all except on current-gen consoles, and on a 4K screen the effect frankly looks terrible.
The only reason this is being pushed as hard as it is is that current-gen consoles don't have the horsepower to properly do either VR or 4K in the first place.[/QUOTE]
What about low-end or mid-range PCs?
[QUOTE=Fox Powers;51559023]What about low-end or mid-range PCs?[/QUOTE]
Because even with top-of-the-line GPUs, VR struggles with graphical fidelity. You can't just throw more horsepower at it the way you can with a regular flat-screen game.
HiAlgo Boost is already a thing for those, and beyond that, any dev is fully capable of scaling and exposing a plethora of options for those platforms.
It is literally down to laziness on the dev's part. Look at DOOM and Warframe: options out the ass.
If you're not seeing those options, that's entirely on the devs not providing them.
They are going to have a lot of fun trying to bring down the latency of the eye trackers. Considering how fast our eyes are, it would be jarring as fuck to look around on a screen or in VR and feel everything go blurry for a split second because you moved your eyes.
Shadow Warrior 2 has something like this, but without the eye tracking; it just renders the edges of the screen at 60% of the screen resolution. Decent increase in performance for how hard it is to notice.
[t]https://cms-images.idgesg.net/images/article/2016/10/shadow-warrior-2-nvidia-multi-res-shading-performance-100687525-orig.png[/t]
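Roughly how it works, if you want a sketch (the 60% edge scale is the game's own setting; the 3x3 split and the size of the centre region here are just my guesses, not the actual VRWorks numbers):
[code]
// Illustrative only -- not the actual NVIDIA Multi-Res Shading API.
// Splits the frame into a 3x3 grid: the centre cell keeps full resolution,
// the outer cells get shaded at a reduced scale.
#include <cstdio>

struct Viewport { int x, y, width, height; float scale; };

int main() {
    const int frameW = 2560, frameH = 1440;   // example frame size (assumed)
    const float centreFrac = 0.6f;            // centre region covers 60% of each axis (my guess)
    const float edgeScale  = 0.6f;            // edges shaded at 60% resolution, like the SW2 option

    // Column/row boundaries of the 3x3 grid, in full-resolution pixels.
    const int xs[4] = {0, int(frameW * (1 - centreFrac) / 2), int(frameW * (1 + centreFrac) / 2), frameW};
    const int ys[4] = {0, int(frameH * (1 - centreFrac) / 2), int(frameH * (1 + centreFrac) / 2), frameH};

    for (int row = 0; row < 3; ++row) {
        for (int col = 0; col < 3; ++col) {
            const bool centre = (row == 1 && col == 1);
            Viewport vp{xs[col], ys[row], xs[col + 1] - xs[col], ys[row + 1] - ys[row],
                        centre ? 1.0f : edgeScale};
            // Pixels actually shaded for this cell after applying the scale.
            std::printf("cell (%d,%d): %dx%d on screen, shaded at %dx%d\n",
                        row, col, vp.width, vp.height,
                        int(vp.width * vp.scale), int(vp.height * vp.scale));
        }
    }
}
[/code]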
[QUOTE=I_love_garrysmod;51559163]Shadow Warrior 2 has something like this, but without the eye tracking; it just renders the edges of the screen at 60% of the screen resolution. Decent increase in performance for how hard it is to notice.
[t]https://cms-images.idgesg.net/images/article/2016/10/shadow-warrior-2-nvidia-multi-res-shading-performance-100687525-orig.png[/t][/QUOTE]
Same technique is used by Battlefield 1.
[video]https://youtu.be/ZZkSd3-uh40[/video]
[QUOTE=CommanderPT;51559083]They are going to have a lot of fun trying to bring down the latency of the eye trackers. Considering how fast our eyes are it would be jarring as fuck if you look around on a screen or in VR and feel everything go blurry for a split second because you moved your eyes.[/QUOTE]
I imagine latency being less of a problem here than in, say, VR head tracking, where there are a lot of devices communicating through several processes. Here there's one device, directly wired, which should work fine if synchronised with the rendering process. If the process went: process game > get eye position > draw frame, there'd be almost no latency at all, besides the physical limitations of the display of course.
My concern would be whether the additional processing is worth the savings, and what kind of hardware your average user would need to make this work, especially in a VR headset. I also wonder whether the process would make gameplay footage unwatchable, or whether seeing where the player is looking would make it more interesting to watch.
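Something like this is the ordering I mean; every function here is a made-up placeholder, not a real engine or eye-tracker API:
[code]
// Rough sketch of the ordering described above.  Everything here is a
// hypothetical placeholder, not a real engine or eye-tracker API.
#include <cstdio>
#include <utility>

// Assumed stub: a real tracker would report gaze with a few ms of latency.
std::pair<float, float> pollEyeTracker() { return {0.5f, 0.5f}; }

void simulateGame(float dt)           { (void)dt; /* run game logic */ }
void renderFoveated(float u, float v) { std::printf("foveate around (%.2f, %.2f)\n", u, v); }
void presentFrame()                   { /* swap buffers */ }

void frame(float dt) {
    simulateGame(dt);                // 1. process game
    auto [u, v] = pollEyeTracker();  // 2. get eye position as late as possible...
    renderFoveated(u, v);            // 3. ...so the sharp region is built around
    presentFrame();                  //    the freshest gaze sample available
}

int main() { frame(1.0f / 90.0f); }  // one simulated 90 Hz frame
[/code]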
but can it do unlimited graphics??
Legitimately looks terrible
This would probably actually give me motion sickness
[QUOTE=Grenadiac;51559496]Legitimately looks terrible
This would probably actually give me motion sickness[/QUOTE]
It's not something you can judge by looking at a small, low-resolution video, let alone one that doesn't track your eyes. Might as well say HDR looks terrible after watching HDR videos on a non-HDR monitor.
While that applies to an extent to VR, it does not look great at 4K, and it is extremely immersion-breakingly obvious at that resolution as well.
[QUOTE=27X;51560019]While that applies to an extent to VR, it does not look great at 4K, and it is extremely immersion-breakingly obvious at that resolution as well.[/QUOTE]
Your posts in this thread are confusing/pretty vague. I'm going to assume that what you mean is that you have a 4K monitor and the video looks terrible on it?
[QUOTE=27X;51558917]Actually it isn't going to be important at all except on current-gen consoles, and on a 4K screen the effect frankly looks terrible.
The only reason this is being pushed as hard as it is is that current-gen consoles don't have the horsepower to properly do either VR or 4K in the first place.[/QUOTE]
What?
This benefits [B]everybody[/B], and it's one of the holy grail technologies of VR that will make it significantly easier to run. It enables huge pixel densities on VR headsets that are simply impractical to render even with the best possible hardware configuration. The current pixelated look of VR would be a thing of the past. It would also give developers a hell of a lot more room to play with in terms of performance, so VR games can look much better.
The real challenge is getting the latency of eye tracking low enough that the effect becomes totally transparent.
This is going to give people motion sickness and headaches unless the eye tracking in their VR headset is absolutely perfect
[QUOTE=Laserbeams;51560488]This is going to give people motion sickness and headaches unless the eye tracking in their VR headset is absolutely perfect[/QUOTE]
Fortunately, eye tracking is not a new technology, and it should be possible to get it working seamlessly after some R&D.
[QUOTE=Laserbeams;51560488]This is going to give people motion sickness and headaches unless the eye tracking in their VR headset is absolutely perfect[/QUOTE]
Fixed foveated rendering is what is being implemented now. Due to how the optics work in VR, you basically waste a ton of pixels at the periphery of the lenses, so AMD and nVidia have introduced technology to render those areas at lower resolution with no perceived quality impact.
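As a rough sketch of the fixed version (the radii and scale factors below are made up for illustration, not AMD's or nVidia's actual values):
[code]
// Minimal sketch of *fixed* foveated rendering: the resolution scale depends
// only on distance from the lens centre -- no eye tracking involved.
// Radii and scales are made-up example values, not AMD/nVidia defaults.
#include <cmath>
#include <cstdio>

float shadingScale(float u, float v) {                // u, v in [0,1] across one eye's view
    const float r = std::hypot(u - 0.5f, v - 0.5f);   // distance from the lens centre
    if (r < 0.25f) return 1.0f;                        // inner region: full resolution
    if (r < 0.40f) return 0.5f;                        // middle ring: half resolution
    return 0.25f;                                      // periphery: quarter resolution
}

int main() {
    for (float r : {0.0f, 0.3f, 0.5f})
        std::printf("r = %.2f -> scale %.2f\n", r, shadingScale(0.5f + r, 0.5f));
}
[/code]
Since there's no eye tracking involved, the scale only depends on where a pixel sits relative to the lens centre, which is why it's "fixed".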
-snip-
[QUOTE=bunguer;51561523]You need to research a bit more about the topic before posting these things, it's quite clear you are mixing up concepts.
"it applies to an extent to VR, it does not look great at 4K" is a sentence that doesn't even make sense.
The main use of this technology is VR, it doesn't make much sense at all to apply it to a standard monitor, even if you had eye tracking.
Whether the VR screen is 2K or 4K or 8K is irrelevant to how useful this tech is. All we're doing is rendering at low quality the things that are outside your focus, aka everything in your peripheral vision. This decrease in quality isn't even noticeable, because your peripheral vision is unable to pick up those details anyway.[/QUOTE]
You need to do more research, because I clearly delineated between VR and regular screens. Three video technology vendors (not game companies, but the companies who supply those companies with licensed tech) are already using variants of this for regular non-VR games, which is my point. It won't be for VR only and it isn't for VR only. [i]A variation of it is already in use in four AAA titles right now[/i] and it looks like crap. So whoever this "we" you speak of is should perhaps go back and look more closely at the launch titles of the past six months. This is the same thing as sticking every post-process feature in a game whether it needs it or not, same as the 'everything is brown' motif, same as the 'everything is desaturated to hell and back and covered in blurry TXAA' motif we're currently enjoying. Trend; no more, no less.
It's "fine" (I use the term very loosely) in a theoretical VR system with fast enough response time and refresh rate, it is not fine in anything currently out on the market, period, because the [I]hardware is not robust or fast enough to fully implement the effect[/I] and it looks awful, like the smeary DoF first implemented in Unreal 2.
Have you considered that maybe the reason this looks bad in non-VR games is that when you're looking at a regular screen you actually [I]do[/I] pick up all the detail because that screen doesn't come anywhere close to filling your field of vision?
-snip-
I'm going to add that while using foveated rendering on a plain old 2D screen becomes less useful the further you get from the screen, if the hardware to implement it is a camera, or enough like one, it could be used for other things.
Like tracking your head position/rotation for a pseudo-3D effect, or your facial expression to animate your player character.
Interesting technology. I had no idea this was being worked on, because funnily enough I had an idea like this floating in my head a while back, when I realized that we gloss over most of the fancy details displayed by our fancy graphics cards anyway, due to the limitations of human vision and how we process it.
I don't know how reliable the whole eye tracking thing would be right now, though.
[QUOTE=Nabile13;51559193]Same technique is used by Battlefield 1.
[video]https://youtu.be/ZZkSd3-uh40[/video][/QUOTE]
Not the same thing; dynamic resolution renders the entire frame at either a higher or lower resolution depending on overall performance.
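Roughly like this, if you want a sketch (the frame budget and step sizes are made-up values, not what BF1 actually uses):
[code]
// Quick sketch of dynamic resolution as described above: one scale factor for
// the whole frame, nudged up or down depending on how long the last frame took.
// The budget and step values are illustrative, not what BF1 actually uses.
#include <algorithm>
#include <cstdio>

float updateResolutionScale(float scale, float lastFrameMs, float budgetMs = 16.6f) {
    if (lastFrameMs > budgetMs)              scale -= 0.05f;  // over budget: render smaller
    else if (lastFrameMs < budgetMs * 0.8f)  scale += 0.05f;  // lots of headroom: render bigger
    return std::clamp(scale, 0.5f, 1.0f);                     // never below 50% or above native
}

int main() {
    float scale = 1.0f;
    for (float ms : {18.0f, 19.0f, 14.0f, 12.0f}) {
        scale = updateResolutionScale(scale, ms);
        std::printf("frame took %.1f ms -> next frame at %.0f%% resolution\n", ms, scale * 100);
    }
}
[/code]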
[QUOTE=Rixxz2;51563445]Not the same thing, dynamic resolution renders the entire frame in an either higher or lower resolution depending on overall performance[/QUOTE]
That doesn't explain the line artifact, which is a sign of frame stitching.