• Exploring the limits of real time rendering
    23 replies
[video=youtube;qlyzo9ll9Vw]https://www.youtube.com/watch?v=qlyzo9ll9Vw[/video] anyone else want a new Pikmin? [URL="https://overview.artbyrens.com/"]The creator[/URL]'s worked on BF3 and BF4 in the past and mentions using photogrammetry to get these results. [QUOTE]My setup: Dual 1080TI in SLI Intel Core i7 5960X Asus X99-E WS Corsair 64GB DDR4 2400MHz Acer 28" Predator XB281HK G-Sync 4K[/QUOTE]
The limit is actually the nonexistent bokeh in the cheap depth-of-field effect. [editline]7th February 2018[/editline] Everything else checks out tho
Bokeh or fuck off.
Photogrammetry is cool and all, and I love Rens' work, but since nobody has an actually good technology to decompose the real object into proper albedo, specular, etc components, it ends up tending to look a bit odd without a lot of manual work. Some lighting inevitably stays baked in like pre-PBR work though the detail itself is always immaculate. The tech exists in the research sphere tho not ready for prime-time. Not to mention it can only recreate what already exists in the world, and matching the look of photogrammetry with custom content is a bitch. One or the other always sticks out like a sore thumb. I remember looking around at the environments when I tried out my buddy's Battlefront 2 demo and on any human made assets you have sharp, broken geometric angles, texture stretching, uneven texel density but as soon as you hit stuff they scanned the overall quality jumps and the style changes. If you don't mind another video OP, another branch off exploring the limits of photogrammetry: Alec Moody (artist on Tribes Ascend, Borderlands 2, The Order 1886) is [url=http://polycount.com/discussion/184167/development-log-building-a-complete-car-from-scans-vr-game-dev/p1]recreating an entire car from scans[/url] and making a "game" from it. Shit even measures the amount of force on bolts as you screw them in more and more. [media]https://youtu.be/YlsobMyM8rQ[/media] Skip to ~33 for a non-VR timelapse. Oh, and about the bokeh looking bad. That one was weird. UE4 has decent bokeh for objects beyond the focal plane but shits the bed when they're before the focal plane. Has to do with separability of rasterized objects in deferred rendering. Not sure why it looked bad across the board here. Maybe it didn't scale the bokeh size correctly when he rendered at a super high res.
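The decomposition problem described above (baked lighting stuck in scanned textures) can be illustrated with a toy sketch. This is only an illustration under strong assumptions: it presumes you already have a per-texel irradiance estimate, which is exactly the hard part real de-lighting tools have to solve. Every function name and array here is made up synthetic data, not anything from an actual photogrammetry pipeline.

```python
# Minimal de-lighting sketch: given an estimate of the light that hit each
# texel, dividing it out of the capture recovers an approximate albedo.
# All data below is synthetic illustration data.
import numpy as np

def delight(captured_rgb, irradiance):
    """Divide baked lighting out of a captured texture.

    captured_rgb: HxWx3 linear-light capture with lighting baked in
    irradiance:   HxW   estimated incoming light per texel
    """
    albedo = captured_rgb / np.clip(irradiance, 1e-4, None)[..., None]
    return np.clip(albedo, 0.0, 1.0)

# Synthetic example: a flat 0.5-grey albedo lit by a light gradient
true_albedo = np.full((4, 4, 3), 0.5)
light = np.linspace(0.2, 1.0, 16).reshape(4, 4)
captured = true_albedo * light[..., None]   # what the camera would "see"
recovered = delight(captured, light)
print(np.allclose(recovered, true_albedo))  # → True
```

In practice the irradiance is unknown and has to be estimated (e.g. from ambient occlusion or a lighting probe), which is why results still need manual cleanup, as the post says.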
That Depth of Field effect looked hilariously cheap compared to everything else and completely pulverized the suspension of disbelief. Aside from that it all looks gorgeous.
[QUOTE=Oicani Gonzales;53115636]whats worse is ue4 has bokeh out of the box lol[/QUOTE] [quote=me;420]Maybe it didn't scale the bokeh size correctly when he rendered at a super high res.[/quote] probably this. the bokeh likely didnt scale up so its smaller than the blur scale.
[QUOTE=GHOST!!!!;53115559]Bokeh or fuck off.[/QUOTE] [t]http://puu.sh/ziT30.png[/t] [editline]7th February 2018[/editline] Or do you mean bokeh in some other form? Like "texture based" bokeh or whatever, for lack of a better word. I think this is using the CircleDOF that UE4 has. I think it's more photographically correct and all, but at the cost of being heavy to render and quite grainy. [editline]7th February 2018[/editline] Found some info on it: [quote=UE4 Documentation]The Circle DoF is the newest addition and matches real world cameras more closely. Similar to the Bokeh DoF you can see circular shaped Bokeh with sharp HDR content. Very large Bokeh are a weakness of this algorithm, they appear noisy and not as clear as the BokehDOF can do them. The method has good performance (quite better than Bokeh DoF) and good quality, especially in the transition areas. The defaults of CircleDOF are intentionally weak to be more pleasant. To get the maximum effect you can tweak settings. Use a low Aperture for a large Bokeh, get close to an object and change the Field of View to be lower. Play with the focal distance to get some scene content out of focus, ideally in front of and behind the focal plane. Using showflag VisualizeDepthOfField might help to find the right settings.[/quote]
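The "bokeh didn't scale with the render resolution" theory is easy to sanity-check with the standard thin-lens math. This sketch uses generic photography formulas, not anything from UE4's actual DoF code, and all parameter values are made-up examples: the circle of confusion is fixed in sensor space, so a correct implementation must double the blur size in pixels when the output resolution doubles.

```python
# Thin-lens circle-of-confusion calculation, illustrating why bokeh size
# must scale with output resolution. Names and values are illustrative.

def coc_diameter_mm(focal_mm, f_stop, focus_mm, subject_mm):
    """Circle of confusion on the sensor, in millimetres (thin-lens model)."""
    aperture_mm = focal_mm / f_stop
    return (aperture_mm * abs(subject_mm - focus_mm) / subject_mm
            * focal_mm / (focus_mm - focal_mm))

def coc_pixels(coc_mm, sensor_width_mm, image_width_px):
    """Project the sensor-space CoC into pixels: grows linearly with width."""
    return coc_mm / sensor_width_mm * image_width_px

# 50mm lens at f/1.8, focused at 2m, subject at 0.6m, 36mm-wide sensor
coc = coc_diameter_mm(50.0, 1.8, 2000.0, 600.0)
print(coc_pixels(coc, 36.0, 1920))  # blur radius needed at 1080p
print(coc_pixels(coc, 36.0, 3840))  # 4K needs exactly twice as many pixels
```

If the engine blurs with a fixed pixel-radius kernel instead of rescaling it, a super-high-res render would show exactly the undersized, cheap-looking bokeh people are complaining about.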
[QUOTE=DOG-GY;53115615]Alec Moody (artist on Tribes Ascend, Borderlands 2, The Order 1886) is [url=http://polycount.com/discussion/184167/development-log-building-a-complete-car-from-scans-vr-game-dev/p1]recreating an entire car from scans[/url] and making a "game" from it.[/QUOTE] This is super impressive! 
I'm curious to see where this goes, loved My Summer Car and if he takes all of this stuff and adds driving or at least engine testing to it, it could be a killer app for VR, not to mention a fantastic education aid.
Photogrammetry is a mixed bag in my opinion. Yea it looks cool but it's also very constrained. This scene would totally fall apart if you tried to add something animated in there. It looks good how it is because it's carefully tweaked for this exact scene; add anything dynamic and it would fall apart. Also as people have mentioned, UE's bad DoF pretty much destroyed the scene. There is also a certain UE look to it which you can still notice, so personally it's a good scene but practically it's just not very useful.
Photogrammetry is cool but it's always going to look real because it's capturing real world objects and textures; that's where most of the photoreal heavy-lifting is happening in this video, and whether it's realtime or pre-rendered doesn't change that much. I'm not complaining though, building natural environment assets by hand is a complete bitch and scanning real world objects is a huge time saver.
On the subject of DoF, the upcoming Media Molecule game Dreams has a pretty interesting approach: [t]https://puu.sh/l43vP.png[/t]
There's some people doing neat things with unity as well. [media]https://www.youtube.com/watch?v=GXI0l3yqBrA[/media] [media]https://www.youtube.com/watch?v=R8NeB10INDo[/media] [media]https://www.youtube.com/watch?v=tSDsi2ItktY[/media]
I feel like this is why HL3 is taking so long.
[QUOTE=DOG-GY;53115615]Not to mention it can only recreate what already exists in the world, and matching the look of photogrammetry with custom content is a bitch. One or the other always sticks out like a sore thumb. I remember looking around at the environments when I tried out my buddy's Battlefront 2 demo and on any human made assets you have sharp, broken geometric angles, texture stretching, uneven texel density but as soon as you hit stuff they scanned the overall quality jumps and the style changes.[/QUOTE] This makes me incredibly excited because some people will create physical sculptures and scan them in instead of creating them digitally from scratch.
I actually had a talk with someone today about how long it might take before traditional offline rendering for CG is phased out completely in favor of real time rendering. Honestly, I give it 10 to 15 years. [editline]edit[/editline] Y'all can disagree all you want, technology improves and expands exponentially. In the last 20 years we went from something like Toy Story needing 4 hours per frame to render, to what we're seeing in this video, live. 10-15 years for renderers like V-Ray to get phased out is not unrealistic.
[QUOTE=triplej05;53116016]I feel like this is why HL3 is taking so long.[/QUOTE] How completely unrelated.
[QUOTE=triplej05;53116016]I feel like this is why HL3 is taking so long.[/QUOTE] No way source 2 can achieve this kind of graphics. In any case Valve is in shambles right now; it's literally all new people, almost all the veterans left. Even if they were actually working on HL3 it could never live up to the hype or expectations, it might as well just be another Black Mesa.
[QUOTE=Zephyrs;53115972][media]https://www.youtube.com/watch?v=tSDsi2ItktY[/media][/QUOTE] Shame this one dropped quality like crazy. Side note, a few years ago Unity hired a really great rendering researcher who has made countless contributions to improving realism. They have some cool things coming in the future. [QUOTE=Ott;53116113]This makes me incredibly excited because some people will create physical sculptures and scan them in instead of creating them digitally from scratch.[/QUOTE] This would be a ballin idea to apply to everything in a game world. Imagine a game that puts you on the scale of claymation figures or has characters made from real craft objects.
[QUOTE=DOG-GY;53116492]Shame this one dropped quality like crazy. Side note, a few years ago Unity hired a really great rendering researcher who has made countless contributions to improving realism. They have some cool things coming in the future.[/QUOTE] I think the quality drop off is more that they were trying to render skin in episode 3 instead of plastic. Skin has always been one of the major sticking points in achieving photo-realism. Or really anything organic with a waxy/oily surface. The plant leaves in the OP look like shit too, and for much the same reason.
There are practically movie-quality skin shaders for realtime out there. Not sure what they're using in the Adam short. For me it was the cloth. It didn't have enough geometry and appears to lack a microfiber shader, which is standard fare these days. The transparent plastic of the chick's headgear also bothered me. It seemed to lack backfaces and refraction. With the plant leaves in the OP, I'd say it's much more due to two things: 1) Standard techniques for photogrammetry can't capture the fine level details of leaves. You need to take care to capture the transmission and roughness qualities along with the Albedo. 2) A. Unreal 4's directional light, unless Rens is on the latest preview builds, [I]shockingly has never been physically based at all[/I]. They're finally addressing this with 4.19, despite the correct formulas sitting on a Frostbite engine GDC slide for like two full years. 2) B. On top of this, they have never had a physically based camera or post processor, which means in the end, light values still get interpreted with shitty pre-PBR code and spat back out with shitty pre-PBR code. This combined with A just inherently limits the realism of the engine by making artists hack away to a good result. It's almost certainly why people say UE4 has a look about it.
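For reference, the physically based camera the Frostbite material describes boils down to a couple of short formulas. This sketch uses the standard saturation-based exposure constants from that published course (K = 12.5 implicit in the EV definition, lens saturation factor 1.2); it is a generic illustration of the model, not UE4 or Frostbite source code.

```python
# Physically based exposure sketch: real camera settings -> EV100 -> a
# scale factor applied to scene luminance before tonemapping. Constants
# follow the published Frostbite PBR course notes; values are examples.
import math

def ev100_from_camera(aperture_f, shutter_s, iso):
    """Exposure value normalized to ISO 100 from physical camera settings."""
    return math.log2(aperture_f ** 2 / shutter_s * 100.0 / iso)

def exposure_from_ev100(ev100):
    """Scale factor so the sensor's max luminance maps to white."""
    max_luminance = 1.2 * 2.0 ** ev100  # saturation-based sensor limit
    return 1.0 / max_luminance

# A sunny-16 style daytime setup: f/16, 1/100 s shutter, ISO 100
ev = ev100_from_camera(16.0, 1.0 / 100.0, 100.0)
print(ev, exposure_from_ev100(ev))
```

The point of routing everything through real units like this is that a sun at ~100,000 lux and an interior at ~300 lux can coexist in one scene and both expose correctly, instead of artists eyeballing arbitrary intensity values per scene.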
[QUOTE=Minigun;53116178]No way source 2 can achieve this kind of graphics.[/QUOTE] The Vive demos showed off photogrammetry.
[QUOTE=Minigun;53116178]No way source 2 can achieve this kind of graphics.[/QUOTE] While I agree with the rest of your post, I wouldn't be so sure about this. I've lost all hope for a (decent) Half-Life game in the foreseeable future, especially with them allowing shit like Prospekt and Hunt Down the Freeman to be sold, which just shows how little they care about the IP at this point. But I've still got hope for Source 2 being an excellent engine. I may be wrong, but afaik Valve wants to make it as accessible as Unity and UE4, tapping into the same market and making it a direct competitor to both. That makes complete sense given Valve's current approach: a relatively small investment up front, then a continuous stream of money from other people's labour. Source 2 may turn out to be complete garbage with excellent map-making tools taped on, but I still think Valve's smarter than that, and they certainly have the economy to make both Epic and Unity Technologies sweat.
[QUOTE=triplej05;53116016]I feel like this is why HL3 is taking so long.[/QUOTE] its not coming out for the 100000th time, lose your faith and give up on it already