HDR on the Old PS4: No Games, No Movies, Just a Menu Option
[url]http://arstechnica.co.uk/gaming/2016/09/ps4-hdr-no-games-media-useless/[/url]
[QUOTE]With the release of PlayStation 4 software update 4.0 last week, Sony brought High Dynamic Range (HDR) output to existing PS4 consoles. The announcement of the HDR update, as well as the speediness of its rollout was unexpected, particularly as it was thought existing PS4 consoles would be incapable of HDR output due to featuring older HDMI 1.4 ports rather than the required HDMI 2.0a ports.
While our investigation into the PS4's HDMI controller—via the electronics sleuthing of NeoGaf user Jeff Rigby—led us to believe that the PS4's HDMI port might indeed feature the required bandwidth to push a 2.0 signal, that still leaves the question of what exactly PS4 users can do with HDR support right now, and what, if any, HDR content is on the way.
As it turns out: not very much.[/QUOTE]
[QUOTE]In terms of games, there's nothing available to play in HDR right now. It appears the HDR update for existing PS4's took many developers by surprise, and so far most have only announced PS4 Pro updates. Thekla, the Jonathan Blow-led studio behind The Witness, did confirm that it is working on HDR support for the game on PS4, but that it's still some way off release.
"We don't know the technical details of how it will work, yet," said Blow in a blog post, "but provided that nothing prevents us, the Witness patch will use HDR on all PS4s when available."
[/QUOTE]
[QUOTE]Loading up Netflix on the PS4 while plugged into the UHD Premium Samsung TV resulted in only standard HD content being listed, with no HDR or 4K videos available. It was the same story with Amazon Prime Video. To make sure I wasn't going loopy or there wasn't anything wrong with the TV, I plugged in an Nvidia Shield, which kicked the TV into HDR mode when it played compatible content on Netflix.
In order to play back HDR media on the older PS4, content providers like Netflix and Amazon would have to separate 4K resolution from HDR—and neither currently has plans to do so.
"HDR can be independent from 4K," a Netflix spokesperson told me. "HDR can also be enjoyed at HD resolution, independent of 4K, at bandwidths slightly higher than current HD. For now, with Netflix content, the two technologies are tied. There aren't any current plans to change that."
Amazon told a similar story: "Unfortunately there's nothing to announce from Amazon on HDR content outside of 4K. As soon as this changes we'll let you know."[/QUOTE]
tl;dr: don't buy a PS4 for HDR content; either get a Pro, or an XB1-S / Scorpio when they come out.
So the feature is there, it's just up to the developers to implement it in their games.
Sony playing the catch-up game now.
Not surprised; I doubt anyone could have planned for them activating HDR on all PS4s. Give it a few months' time and there'll be more support, I'm sure.
[QUOTE=Dr.C;51087072]What makes this HDR different from the HDR we've been getting since Lost Coast?[/QUOTE]
Nothing at all; consoles can just do it now. It doesn't sound like a very intense change; it's just that for a while TVs didn't have this as a feature. Now that 4K and HDR are all the rage in the tech world, we're finally seeing Sony push for it. Not the biggest feature in the world, but at least they did what Xbox did for the XB1S.
What makes this HDR different from the HDR we've been getting since Lost Coast?
[QUOTE=Dr.C;51087072]What makes this HDR different from the HDR we've been getting since Lost Coast?[/QUOTE]
HDR in Lost Coast was basically just eye adaptation. When you looked at the sky, clouds would stop being overbright and you'd start seeing details in them. When you looked at a dark cave, the shadows would get more detail, but the whole sky would turn white.
[media]https://www.youtube.com/watch?v=QbFeBpMB87w[/media]
HDR TVs are a completely different thing.
On normal TVs and monitors you only have 256 values for Red, for Green, and for Blue. HDR TVs have much more range for each color, so whites can be more detailed and blacks too.
If you look at gradients on a normal monitor, you'll notice edges between colors (called banding). HDR games on HDR TVs would have none of that.
[img]http://i.imgur.com/qGZAncG.jpg[/img]
It feels like jumping from 30fps to 90fps in terms of visual impact (I know it's weird to compare color to framerate).
Going from 1080p to 4K doesn't make nearly as much of a difference as HDR does.
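To make the banding point concrete, here's a rough C++ sketch (made-up numbers, just an illustration): quantize a perfectly smooth full-screen gradient to 8-bit and to 10-bit and count how many flat bands you end up with.
[code]
#include <cmath>
#include <cstdio>
#include <set>

int main() {
    const int width = 3840;  // pixels across a full-screen gradient
    std::set<int> levels8, levels10;
    for (int x = 0; x < width; ++x) {
        double v = double(x) / (width - 1);            // the ideal smooth ramp, 0..1
        levels8.insert(int(std::lround(v * 255.0)));   // what an 8-bit panel can show
        levels10.insert(int(std::lround(v * 1023.0))); // what a 10-bit panel can show
    }
    // Every distinct level is one flat "band" of pixels in the gradient.
    std::printf("8-bit : %zu distinct steps (~%d px wide bands)\n",
                levels8.size(), width / int(levels8.size()));
    std::printf("10-bit: %zu distinct steps (~%d px wide bands)\n",
                levels10.size(), width / int(levels10.size()));
    return 0;
}
[/code]
The 8-bit ramp collapses into a few hundred visible steps, while the 10-bit one has steps too narrow to pick out.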
I'm not sure where your eyes come from, but going from 1080p to 4K is a huge deal, just as huge as true HDR.
[QUOTE=Dr.C;51087072]What makes this HDR different from the HDR we've been getting since Lost Coast?[/QUOTE]
HDR is basically a public-facing marketing term for 10-bit color.
8-bit (99% chance your monitor is this right now) is only 256 shades of each color, so it's only possible to have 16,777,216 colors on your screen.
10-bit does 1,024 shades of each color, allowing over 1 billion colors.
If we ever get to 12-bit (which won't be for a while), that's 4,096 shades and we will have over 68 billion colors.
[IMG]http://i.imgur.com/jWvLs6w.jpg[/IMG]
[IMG]http://i.imgur.com/WeEyzvK.jpg[/IMG]
tl;dr HDR gives you more colors. It ain't a marketing trick.
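If you want the arithmetic behind those numbers, a quick sketch:
[code]
#include <cstdint>
#include <cstdio>
#include <initializer_list>

int main() {
    for (int bits : {8, 10, 12}) {
        std::uint64_t shades  = 1ull << bits;             // shades per channel
        std::uint64_t colors  = shades * shades * shades; // R x G x B combinations
        std::printf("%2d-bit: %6llu shades per channel, %llu total colors\n",
                    bits,
                    (unsigned long long)shades,
                    (unsigned long long)colors);
    }
    return 0;
}
//  8-bit:  256  ->     16,777,216 colors
// 10-bit: 1024  ->  1,073,741,824 colors
// 12-bit: 4096  -> 68,719,476,736 colors
[/code]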
Oh, so it does mean 10-bit after all, nice. Hope it goes mainstream for all monitors soon, but what a weird transition for developers.
I wouldn't imagine it would require much more work from developers.
As long as their screen textures are stored as floats and are higher than 8 bits, and their final image is normalized within the 0-1 range, then it shouldn't matter.
[QUOTE=Karmah;51087553]I wouldn't imagine it would require much more work from developers.
As long as their screen textures are stored as floats and are higher than 8 bits, and their final image is normalized within the 0-1 range, then it shouldn't matter.[/QUOTE]
It'll take a while for developers to build an art workflow that can use the expanded range without degrading the image for sRGB users.
I personally think they should just tonemap accordingly and let everything that would normally clip and be pushed into bloom instead go into the higher ranges of Rec. 2020.
HDR really confused me. In all the marketing they always say how the blacks are darker and the colors are more vivid or something, so I thought it was just a fancy term for better contrast. I knew about the benefits of 10-bit color, but didn't know that was what it was.
Are there any TV's/monitors that support this currently? Is now a good time to buy them, or do I need to wait close to 2-3 years before they are below $1,000?
What is the "32 bit color depth" that I see in Nvidia Control Panel when looking at my monitor? Is that completely unrelated?
[QUOTE=cpt.armadillo;51088570]What is the "32 bit color depth" that I see in Nvidia Control Panel when looking at my monitor? Is that completely unrelated?[/QUOTE]
[URL]https://en.wikipedia.org/wiki/Color_depth[/URL]
Yeah, unrelated. It's basically just the total number of colors you can display.
[QUOTE=NO ONE;51088554]Are there any TV's/monitors that support this currently? Is now a good time to buy them, or do I need to wait close to 2-3 years before they are below $1,000?[/QUOTE]
You need to get out more. You can buy a 50" with 4K and HDR for $500. I was just at Best Buy.
[QUOTE=NO ONE;51088554]Are there any TV's/monitors that support this currently? Is now a good time to buy them, or do I need to wait close to 2-3 years before they are below $1,000?[/QUOTE]
HDR seems to be as rare as 4K was in the late 2000s and early 2010s.
As in, it's relatively new to the consumer market and will cost some big bucks to get. Watch it become a standard though, as it's just a boon overall to have.
Buying a new TV or monitor will become confusing very soon because you'll have to choose between higher refresh rates, 4K resolution, or HDR support. Although I'm pretty sure the TVs already on sale have HDR and are 4K; they're just expensive.
[QUOTE=darth-veger;51088573][URL]https://en.wikipedia.org/wiki/Color_depth[/URL]
Yeah, unrelated. It's basically just the total number of colors you can display.[/QUOTE]
HDR usually comes with higher bit depth, and for the PS4 and PS4 Pro this means 10-bpc output.
[QUOTE=cpt.armadillo;51088570]What is the "32 bit color depth" that I see in Nvidia Control Panel when looking at my monitor? Is that completely unrelated?[/QUOTE]
Expanding on darth-veger's post: 32-bit color depth is RGBA (8 bits per channel) and not HDR.
Expanding on HDR rendering as a concept: games have been using HDR rendering for a while now (Halo 3 being a good example, and Lost Coast since it was mentioned). Using an HDR rendering pipeline just means you can have a scene with extremely bright, dark, or high-contrast lighting and it all looks right; you can tick the camera exposure up or down just like with a real camera. Typically games do this automatically: when the world is dark, the exposure gets bumped up, and when you look at the bright sky, it gets bumped down. This tries to make sure the game always looks properly exposed. But since our displays are very limited, all that nice HDR data then has to be thrown away and tonemapped into an 8-bit-per-channel color range, or 16.7 million colors, hopefully taking into account things like screen contrast, the luminance of the viewing environment around the screen itself, gamma correction, and more fun (tedious) nuances, because you need to make a perceptually correct image (meaning the colors and exposure look good in your viewing environment).
HDR screens simply let you display the raw HDR render, which not only appears more detailed and vibrant but also [B]features a higher dynamic range of brightness/darkness[/B]. So with far more physically accurate color and luminance, you can produce images that are automatically more perceptually correct.
[B]TLDR;[/B] HDR rendering lets you make more realistic renders and has been around for a good while now. HDR screens let you display the HDR render, with more color and greater dynamic range, instead of an LDR, 8 bpc approximation of it. HDR screens can immediately be utilized with tons of movies, games and photography with almost zero work involved. It's a great step forward.
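If it helps to see that last step as code, here's a rough sketch of the SDR path I'm describing (the names and the exact curve are made up; every game uses its own tonemap):
[code]
#include <algorithm>
#include <cmath>
#include <cstdint>

// Simple Reinhard-style curve: compresses scene values in [0, inf) into [0, 1).
float tonemap(float c) { return c / (1.0f + c); }

// Approximate sRGB gamma encode (real sRGB uses a piecewise curve).
float linear_to_srgb(float c) { return std::pow(c, 1.0f / 2.2f); }

// One channel of a scene-referred HDR pixel -> an 8-bit display value.
// "exposure" would normally be driven by the auto-exposure described above.
std::uint8_t to_sdr8(float hdr, float exposure) {
    float c = tonemap(hdr * exposure);              // squash the huge range into 0..1
    c = linear_to_srgb(std::clamp(c, 0.0f, 1.0f));  // perceptual/gamma encode
    return (std::uint8_t)std::lround(c * 255.0f);   // quantize: the HDR detail is gone
}
[/code]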
[editline]22nd September 2016[/editline]
[QUOTE=glitchvid;51088280]It'll take a while for developers to build an art workflow that can use the expanded range without degrading the image for sRGB users.
I personally think they should just tonemap accordingly and let everything that would normally clip and be pushed into bloom instead go into the higher ranges of Rec. 2020.[/QUOTE]
If they're already using an HDR pipeline, all they need to do is have an optional bypass of the conversion to 8-bit in their final postprocessing steps. For most engines this would be trivial work.
Textures will remain sRGB. No reason to change that workflow, except for normal maps moving to 16-bit, which is already happening. Maybe some UI texture tweaks would be warranted though.
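And a rough sketch of what that bypass might look like: keep the float data and encode it with the HDR10 transfer function (SMPTE ST 2084, "PQ") into 10-bit values instead of crushing it to 8-bit sRGB. The paper-white mapping and function names here are just illustrative, and a real HDR10 path would also convert to Rec. 2020 primaries first.
[code]
#include <algorithm>
#include <cmath>
#include <cstdint>

// PQ (SMPTE ST 2084) encode: linear light, where 1.0 == 10,000 nits, -> [0, 1].
float pq_encode(float linear) {
    const float m1 = 2610.0f / 16384.0f;
    const float m2 = 2523.0f / 4096.0f * 128.0f;
    const float c1 = 3424.0f / 4096.0f;
    const float c2 = 2413.0f / 4096.0f * 32.0f;
    const float c3 = 2392.0f / 4096.0f * 32.0f;
    float p = std::pow(std::clamp(linear, 0.0f, 1.0f), m1);
    return std::pow((c1 + c2 * p) / (1.0f + c3 * p), m2);
}

// One channel of a scene-referred HDR pixel -> a 10-bit HDR10 code value.
std::uint16_t to_hdr10(float hdr, float exposure) {
    const float paper_white_nits = 100.0f;           // assumed mapping for 1.0 (illustrative)
    float nits = hdr * exposure * paper_white_nits;  // highlights above 1.0 survive here,
    return (std::uint16_t)std::lround(               // instead of clipping at 255
        pq_encode(nits / 10000.0f) * 1023.0f);
}
[/code]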
Does anybody actually encounter color banding in their games? The only game I can recall seeing it in is Mirror's Edge 1, just because of its blended lighting and fast color transitions.
Most games are too heavily textured (i.e. there's natural dithering in the image) for color banding to really become a problem; the banding always gets broken up by noise in the image, and the same dithering and noise would still show up in 10-bit images.
[QUOTE=NO ONE;51088554]Are there any TV's/monitors that support this currently? Is now a good time to buy them, or do I need to wait close to 2-3 years before they are below $1,000?[/QUOTE]
I got an X800D.
It's real nice.
[QUOTE=hypno-toad;51088700]Does anybody actually encounter color banding in their games? The only game I can recall seeing it in is Mirror's Edge 1, just because of its blended lighting and fast color transitions.
Most games are too heavily textured (i.e. there's natural dithering in the image) for color banding to really become a problem; the banding always gets broken up by noise in the image, and the same dithering and noise would still show up in 10-bit images.[/QUOTE]
I rarely notice color banding in games, but one that comes to mind is Dragon's Dogma.
[QUOTE=DOG-GY;51088658]If they're already using an HDR pipeline all they need to do is have an optional bypass of the conversion to 8 bit in their final postprocessing steps. For most engines this would be trivial work.
[/QUOTE]
I meant for the final tonemap and color correction: if you overexpose a scene in post knowing you have the expanded range, you'll blow out a lot of the detail on lower bit-depth screens.
IIRC it's approached differently by each dev; the Tomb Raider developers used a modified filmic curve on HDR to blow out whites as they approached normal sRGB max brightness.
It's also worth noting that the standard monitors use (sRGB) and the one HDR TVs use (Rec. 2020) have different coefficients.
I'm personally not a fan of the Rec. 2020 standard. It's great that TV manufacturers are actually doing something about [I]real[/I] image quality (especially after Samsung killed their OLED plans), but I'd rather see proper 16 bpc sRGB become the standard.
I have yet to see a true HDR panel in person, but this will change since I was invited back to CES this year.
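For anyone wondering what a "filmic curve" actually looks like, this is the widely published one from John Hable's Uncharted 2 talk, just to show the shape of curve being discussed, not the exact modified curve Tomb Raider shipped with.
[code]
// Hable / "Uncharted 2" filmic operator: a toe for the blacks and a long
// shoulder that rolls highlights off gradually instead of clipping at 1.0.
float hable(float x) {
    const float A = 0.15f, B = 0.50f, C = 0.10f,
                D = 0.20f, E = 0.02f, F = 0.30f;
    return ((x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F)) - E / F;
}

// Linear HDR value -> [0, 1] display value.
float filmic_tonemap(float hdr, float exposure) {
    const float white_point = 11.2f;  // input value that maps to pure white
    return hable(hdr * exposure) / hable(white_point);
}
[/code]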
[QUOTE=DOG-GY;51088658]
Textures will remain sRGB. No reason to change that workflow except for normal maps moving to 16 bit which is already happening. Maybe some UI texture tweaks would be warranted though.
[/QUOTE]
Are 16-bit normal maps moving into production for games? I know they were used in visualization, but I didn't expect to see them used mainstream, mainly due to their significant memory footprint and [del]diminishing returns for visual fidelity.[/del] depends on your topo and UVs, can be useful.
[QUOTE=ManFlakes;51089715]I rarely notice color banding in games, but one that comes to mind is Dragon's Dogma.[/QUOTE]
10-bit LUT is a hell of a drug:
[t]https://puu.sh/p22UI.jpg[/t]
(if you have an 8-bit anything in the chain (monitor, cable, card), this pic is pointless)
[QUOTE=ManFlakes;51089715]I rarely notice color banding in games, but one that comes to mind is Dragon's Dogma.[/QUOTE]
The banding in Dragon's Dogma is fucking obscene. But I believe it's more down to the low-ass resolution everything is in the game. The vignetting in particular is noticeable at night with a light source nearby, but that could be caused by whatever they use to define the vignette only having a fairly small resolution, leading to sharp changes when it's sampled at higher resolutions.
[editline]22nd September 2016[/editline]
Wait, probably wrong on that thinking about it. dw lol
But there's a lot of low resolution shit in the game still.
Currently, existing games on PC don't support HDR output. I think there's a push from graphics developers to call the kind of HDR that's new on TVs "VDR", as in Variable Dynamic Range, because the point of these screens is that they support a wider range of brightnesses. Even if games are using high-precision formats internally, they'll still output 8-bit sRGB without some kind of change. For instance, don't believe 27X: the above screenshot is still 8-bit, just with a fancy high-contrast filter slapped on it. If it were raw HDR data, the bright areas would just be blown out to complete white. However, HDR monitors do have an automatic conversion process.
Dragon's Dogma normally used a low-precision render target for rendering (I think something like 565 rather than 8-bit colour), which is why the banding was so noticeable. Using ENB forces it to use a higher internal precision, which removed a lot of the problems, but even so some banding was left over.
The proper solution - and one that's still relevant on HDR displays, at least according to [url=http://gpuopen.com/wp-content/uploads/2016/03/GdcVdrLottes.pdf]this interesting presentation[/url] - is dithering the output to give the illusion of smooth gradients between otherwise chopped-up colours. That's something you can also see in modded Dragon's Dogma, as ENB added it as a feature when people noticed there was still banding after he forced it to some ridiculous 64-bit format.
That presentation also cleared up what this new HDR meant for me as well, so check it out.
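A minimal sketch of the dithering trick from that presentation (illustrative only; real implementations use a per-pixel hash or blue-noise texture rather than a PRNG): add about one quantization step of noise before rounding, so hard bands dissolve into fine grain.
[code]
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <random>

// Quantize a 0..1 value to 8 bits, with +/- half a step of noise added first
// so that flat gradients turn into grain instead of visible bands.
std::uint8_t quantize8_dithered(float value, std::mt19937 &rng) {
    std::uniform_real_distribution<float> noise(-0.5f, 0.5f);
    float dithered = value * 255.0f + noise(rng);
    dithered = std::clamp(dithered, 0.0f, 255.0f);
    return (std::uint8_t)std::lround(dithered);
}
[/code]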
Lol, there are several games that have been out for some time that support HDR: Halo 3, Alien: Isolation, and GRID 1 and 2, for example. As for whites, you might wanna check the rock highlights, 'cause they're over 600,600,600. Also, hence the reference to a 10-bit LUT, not a raw HDR output, which you conveniently missed.
The "high bit" version never made it past beta as Boris has said on the forums, the above also doesn't have dithering, nor would it be of much use in a compressed screenshot anyway, secondly pretty sure Boris/Kingeric's usage for dithering colors is for TFT and is 8+2, not 10bit native.
Dogma used a 6-bit target and tiny textures to leave enough free memory for the PS3 to handle the actively loaded open-world format and real-time model changes happening on the fly in a stage under MTF.
[QUOTE=27X;51091356]Lol, there are several games that have been out for some time that support HDR: Halo 3, Alien: Isolation, for example. As for whites, you might wanna check the rock highlights, 'cause they're over 600,600,600. Also, hence the reference to a 10-bit LUT, not a raw HDR output, which you conveniently missed.[/QUOTE]
The PNG you posted is 8 bpc (which isn't a surprise; PNG has no HDR spec*), so there's no way for it to contain values above 255.
As for games supporting HDR, there's a difference between the buffers being in higher precision, and actually tonemapping to a higher precision, which is what the current HDR (Rec 2020) is all about.
In fact, Halo 3 isn't exactly HDR either: it uses two separate 8-bit buffers instead of a 16-bit float, so it can represent more stops by tonemapping between the two buffers, but none of the calculations are done in higher precision.
*PNG supports 16-bit non-float.
Exactly, it's compressed, guess I didn't clarify that enough.
[QUOTE]In fact, Halo 3 isn't exactly HDR either, it uses two separate 8-bit buffers, instead of a 16-bit float, so it can represent more stops by tonemapping the two buffers, none of the calculations are done in higher precision.[/QUOTE]
That I did not know. Guess that leaves Isolation, Grid 1 and 2 and ResEvil V.
And all PBR games.