Striking Unreal Engine 4 tech demo looks good enough to live in
[QUOTE=zerosix;47055339]what were you fucking expecting realtime GI?[/QUOTE]
If everything is baked, then once you put an actual game into that environment, the dynamic objects will look weird and out of place. The only genre I can think of where this could work is a hidden-object style game.
The point is, it's not showcasing UE4 or what it can do, and it's rather misleading, since this technique isn't very viable for game use. Visualization, sure, but not games.
Do you know how many modern titles (that actually look pretty good) exist with baked lighting?
Most of them.
[QUOTE=xalener;47055425]Do you know how many modern titles (that actually look pretty good) exist with baked lighting?
Most of them.[/QUOTE]Besides P.T. which is pretty much a hidden object game, point me to one that looks like this.
[QUOTE=nagachief;47054731]You DO realize that realtime high resolution raytracing is still incredibly difficult for GPUs right? Not to mention the only test programs I could find for it are motherfucking CUDA-only.[/QUOTE]
There are other ways to do reflections (in UE4, even), besides raytracing, that look a lot better than this.
It's probably just kinda taxing in this demo because the bathroom has like 5 mirrors.
[editline]1st February 2015[/editline]
The thing that's most uncanny about this is the unreal movement of the camera, because a lot of scenes really do feel more like a photo than they do like a videogame.
[editline]1st February 2015[/editline]
[QUOTE=gjsdeath;47055274]Yeah. We can just use cubemaps for reflections and I'm sure the physically based rendering doesn't mean anything anyways.
That was sarcasm. Although it's true you actually can use really high texture resolutions and poly counts in Source (and I assume Unity?). Physically based rendering and shading make a huge difference.
As for lighting and shadowing, I honestly think a good mix of realtime and pre-rendered is the best. If you try to do everything in real time, it can't be half as good as if it were pre-rendered. But at the same time, you need to make sure the realtime methods don't clash with the static ones. Unreal does a pretty good job at mixing dynamic and static methods: they make sure the dynamic shadows don't overlap lightmapped shadows from the same source. Plus now they have realtime GI for some areas.
Really, it comes down to the content. If you have a mostly static environment, why would you waste cycles rendering realtime GI and shadows? You could be using those resources on another, more important part of your game. But of course if you have a rather dynamic environment, then more dynamic methods are useful.
Honestly, I think the demo was cool. For the most part the reflections were done really well, but they went a little overboard with them. Most importantly, the materials looked like what they were supposed to, which is why I think physically based rendering and shading will be an important next step in making games look realistic.[/QUOTE]
Physically based rendering is just a way of defining a surface/material. There is nothing stopping you from creating and using a PBR shader in Source or Unity (the standard shader in Unity 5 is PBR, and you could probably get various PBR shaders from the Asset Store right now).
Also the mirrors and all reflective surfaces in this demo are some kind of cubemap.
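To illustrate the point that PBR is mostly a convention for describing materials plus a few shared shading terms rather than some engine-exclusive feature, here's a minimal sketch in Python of the Fresnel-Schlick approximation that nearly every PBR shader uses (the function name and the 0.04 dielectric constant are the usual conventions, not any specific engine's API):

```python
def fresnel_schlick(cos_theta, f0):
    """Schlick's approximation of Fresnel reflectance.

    f0 is the reflectance at normal incidence: ~0.04 for most
    dielectrics, taken from the albedo color for metals.
    cos_theta is the cosine of the angle between view and normal."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

# Grazing angles reflect far more light than head-on ones, which is
# part of why PBR materials hold up under arbitrary lighting.
head_on = fresnel_schlick(1.0, 0.04)  # looking straight at the surface
grazing = fresnel_schlick(0.0, 0.04)  # looking along the surface
```

Any engine that lets you write shaders can evaluate this; the "PBR" part is agreeing on inputs like albedo, metallic, and roughness so materials behave consistently.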
Yeah, I'm not dumb, I know proper realtime GI is way out of reach - although [URL="https://www.facebook.com/SonicEther"]Sonic Ether[/URL] has done some pretty impressive GI shaders in Minecraft - but I was at least expecting something special. The only thing I criticized the artist for was the low-res reflections; since they're not realtime reflections, he could have at least bumped up the resolution.
I know it's just a visualisation and not a demo - unlike some others, it seems. It's an industry I'm currently working towards, so I really appreciate the work that went into creating the material shaders; after all, it's those shaders which really make things look real. I particularly liked the wood material in the kitchen and the white marble on the wall - all the little imperfections really sold it, only let down somewhat by the janky reflections.
[editline]1st February 2015[/editline]
I wish I could talk more about UE4 but I simply can't: I have no experience, and I have no idea how PBR works even after reading Allegorithmic's Beginner's Guide to PBR. I've been pestering my university tutors to get Unreal 4 for us, but sadly I'm not doing another 3D or rendering-related module until next year, so I'll likely not have access even if they do get it.
[QUOTE=RichyZ;47056144]crysis 3 used baked lighting lol[/QUOTE]
Are you sure about that? I seem to remember Crytek making a big deal about their realtime lighting and the ability to change time of day on a whim.
[editline]1st February 2015[/editline]
I mean I know both solutions exist within the engine but you seem to be implying that it had none.
[editline]1st February 2015[/editline]
I know this isn't the best example, seeing as a lot of it is using Image Based Lighting, but here's what I mean.
[media]http://www.youtube.com/watch?v=YHNWj8HmWwQ[/media]
Tried it on my 3570K/GTX 660 and it runs a bit worse than AC Unity, which is pathetic considering Unity has very similar graphics. I know it's not the same thing, but this is just one single "map" - what would happen if you added interactive elements?
The mirrors need a bit of work too; go near one and it turns into a pixelated mess. The textures are really nice though, I don't think I've seen anything like them so far.
Lighting is covered in this video - just look at the grass in the LOD part. I spent ages just dicking about in that level.
[media]http://www.youtube.com/watch?v=4qGK5lUyCwI[/media]
I love how this turned from an article showing off a UE4 tech demo into a "which engine has the better lighting" thread.
[QUOTE=gk99;47036390]A single apartment with baked lighting running 25FPS at 720p
I mean is this supposed to be impressive or
I mean yeah it looks good, but until I see a more open scale at a steady framerate I'm not really hyped.[/QUOTE]
I don't understand the point of all this cynicism. Nobody is heralding these videos as being demonstrable proof that Unreal Engine can now simulate every minute detail of the world's geometry and physical light properties in real time on consumer hardware. The point is that it is an exceptional level of realtime rendering quality that hasn't been seen before on consumer hardware, nor achievable with a one man team and a $19 license. It's like people want to assert themselves as being knowledgeable and indifferent just by letting everyone know they understand the basic concept of "baked lighting" and the meaning of the word "static". Yeah, you can't open the cupboards or blow a hole in the wall. But you can render in real time a Parisian apartment with a level of graphical fidelity that actually challenges the viewer to discern between it and reality. Is that really not enough to impress you?
[editline]2nd February 2015[/editline]
[QUOTE=itisjuly;47026037]To be fair if this is all baked it's not impressive at all. You could do the same in Unity or even source.[/QUOTE]
Maybe with immense amounts of work put into creating your own renderer and implementing that into those engines, yes. But without the work of numerous industry leaders in the field of computer graphics collaborating and working for years on the technology to render these scenes as well as the tools to create them, you wouldn't get very far. It's like saying your 1996 Honda Civic is just as impressive and powerful as a Ferrari because hypothetically you could spend a fortune installing a Ferrari motor and transmission in it.
UE4 has the capability to do mirrors using camera trickery like most game mirrors work, but this demo is an interior walkthrough from a hobbyist, so who even cares at this point?
Even on a professional level, the people you'd be demoing apartment walkthroughs to aren't going to be pissing themselves over how it used cubemap captures from single points instead of Portal-style rendertarget cameras to simulate a bathroom mirror
[QUOTE=itisjuly;47055446]Besides P.T. which is pretty much a hidden object game, point me to one that looks like this.[/QUOTE]
Pretty damn sure PT relies on baked lighting.
[editline]2nd February 2015[/editline]
[QUOTE=dai;47057890]UE4 has the capability to do mirrors using camera trickery like most game mirrors work, but this demo is an interior walkthrough from a hobbyist, so who even cares at this point?
Even on a professional level, the people who you'd be demoing apartment walkthroughs to aren't going to be pissing themselves over how it used cubemap captures from single points instead of portal's rendertarget cameras to simulate a bathroom mirror[/QUOTE]
Yeah, it's completely stupid that people look at this visualization made by one guy and decide to knock the entirety of UE4 over things like baked lighting.
Guess they'd prefer if it ran at 3FPS and had real-time GI.
[QUOTE=itisjuly;47054907]You know what isn't though? Render textures. If postal 2 can have realtime mirrors, I don't see why this can't.[/QUOTE]
Postal 2 probably used render targets, which essentially re-render the entire scene. That's insanely expensive. It's why you had 'cheap' (DX8, cubemapped) and 'fancy' (DX9, render target) water in the Source engine; back then it was a really fucking big deal. That's how mirrors have worked for the most part. The problem in this scene is that it's already quite taxing; having to re-render the scene several times, once per mirror, would most likely cripple even a tri-SLI 980 setup.
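A back-of-the-envelope sketch of why render-target mirrors blow up the frame cost while cubemaps don't (the cost units here are made-up illustrative numbers, not profiler data from any real engine):

```python
def frame_cost(base_scene_cost, mirror_count, mirror_type):
    """Rough per-frame cost model for a scene with mirrors.

    Render targets re-draw the whole scene once per visible mirror;
    a static cubemap is captured once ahead of time, so at runtime
    it's just a texture lookup with near-zero marginal cost."""
    if mirror_type == "render_target":
        return base_scene_cost * (1 + mirror_count)
    if mirror_type == "cubemap":
        return base_scene_cost  # lookup cost is negligible
    raise ValueError(f"unknown mirror type: {mirror_type}")

# The bathroom in the demo has ~5 mirrors: with render targets the
# scene would be drawn 6 times per frame instead of once.
expensive = frame_cost(10.0, 5, "render_target")  # 60.0 units
cheap = frame_cost(10.0, 5, "cubemap")            # 10.0 units
```

Which is exactly the DX8/DX9 water trade-off: the cubemap version looks wrong up close but costs almost nothing, while the render-target version multiplies your scene cost.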
On the topic of reflections: screen space reflections in UE4 have the same issue I've seen in every other game that uses them, though they still look rather nice if used well.
Screen Space Reflections
[thumb]http://dl-web.dropbox.com/get/SoulsBaseSSR_001.png?_subject_uid=115198093&w=AADJ20y7XMoRAOP4WrXwmrwSQXgZ0wMgMGv99n9zn7dLxA[/thumb]
Without SSR
[thumb]http://dl-web.dropbox.com/get/SoulsBaseShadowsOnly_001.png?_subject_uid=115198093&w=AACNUWLjsLn6WxjtbNUpFtubxyDhWqJgH3Hq5JYksllLVA[/thumb]
As seen on a potato computer
[thumb]http://dl-web.dropbox.com/get/SoulsBaseWL_001.png?_subject_uid=115198093&w=AABHSFaq49sNHMTjNLvFGvTFcQJXuA6QLUECW-6Ji5USvQ[/thumb]
[QUOTE=itisjuly;47026037]To be fair if this is all baked it's not impressive at all. You could do the same in Unity or even source.[/QUOTE]
Engines don't use the same techniques to bake lighting, results will vary.
Another thing on the note of light baking: you need to bake for optimization anyways. You'd have to be a total fucking idiot to run a static scene with complex, realistic global illumination via realtime raycasting and then sit there whining about why it isn't running 60fps at 4K. It's impractical at best: either your half-decent-looking scene runs at 0.5 fps, or if you force a higher framerate you get extremely grainy approximations until you hold still long enough for it to render out the scene.
[media]http://www.youtube.com/watch?v=BpT6MkCeP7Y[/media]
Even with this visibly awful effect, gaming sites have been climbing over each other to herald the dawn of realtime raytracing in mainstream gaming, just from the mere mention of it being explored at a fundamental level.
Bake shit; processing is better spent on pretty much anything else. The only time ridiculous 100% dynamic lighting should be an option is if your scene is a product showcase with adjustable lighting, and there are far better options than a game engine for those kinds of presentations, like Autodesk Showcase.
Well, Light Propagation Volumes are currently default in CryEngine and experimental in UE4; they have little overhead and don't require any pre-processing (although there is caching going on), and they provide fully dynamic GI.
You don't need to jump straight from no GI to ditching your whole rasterization pipeline for a path tracer.
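As a toy illustration of the LPV idea (inject light into a coarse grid, then iteratively propagate it to neighbouring cells each frame), here's a deliberately simplified 1D version in Python; a real LPV uses a 3D grid of spherical-harmonic coefficients, but the cheap-iteration structure is the same:

```python
def propagate(grid, steps, retain=0.5):
    """LPV-style iterations over a 1D grid of light energy.

    Each step, every cell keeps `retain` of its energy and splits the
    rest between its two neighbours. Cheap per frame and fully
    dynamic: move the light source and just re-inject it."""
    for _ in range(steps):
        nxt = [0.0] * len(grid)
        for i, energy in enumerate(grid):
            nxt[i] += energy * retain
            spill = energy * (1.0 - retain) / 2.0
            if i > 0:
                nxt[i - 1] += spill
            if i < len(grid) - 1:
                nxt[i + 1] += spill
        grid = nxt
    return grid

# Inject one unit of light at the center cell and let it bleed
# outward for two frames, giving a rough indirect-light falloff.
lit = propagate([0.0, 0.0, 1.0, 0.0, 0.0], steps=2)
```

The per-frame cost depends only on the (coarse) grid size, not the scene's polygon count, which is why LPVs can be "fully dynamic GI" without path-tracer costs; the price is that the result is a blurry low-frequency approximation.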
[editline]2nd February 2015[/editline]
Also, raytracing is most likely not gonna happen unless there's some major spike in silicon performance, or we get something like GPUs with ASIC raytracers.
There is a raytracer card: [url]http://arstechnica.com/gadgets/2013/01/shedding-some-realistic-light-on-imaginations-real-time-ray-tracing-card/[/url]
And relevant comment:
[QUOTE]John Carmack, id Software:
I wrote the following (slightly redacted) up a little while ago for another company looking at consumer level ray tracing hardware as it relates to games. I do think workstation applications are the correct entry point for ray tracing acceleration, rather than games, so the same level of pessimism might not be appropriate. I have no details on Imagination’s particular technology (feel free to send me some, guys!).
------------
The primary advantages of ray tracing over rasterization are:
Accurate shadows, without explicit sizing of shadow buffer resolutions or massive stencil volume overdraw. With reasonable area light source bundles for softening, this is the most useful and attainable near-term goal.
Accurate reflections without environment maps or subview rendering. This benefit is tempered by the fact that it is only practical at real time speeds for mirror-like surfaces. Slightly glossy surfaces require a bare minimum of 16 secondary rays to look decent, and even mirror surfaces alias badly in larger scenes with bump mapping. Rasterization approximations are inaccurate, but mip map based filtering greatly reduces aliasing, which is usually more important. I was very disappointed when this sunk in for me during my research – I had thought that there might be a place for a high end “ray traced reflections” option in upcoming games, but it requires a huge number of rays for it to actually be a positive feature.
Some other “advantages” that are often touted for ray tracing are not really benefits:
Accurate refraction. This won’t make a difference to anyone building an application.
Global illumination. This requires BILLIONS of rays per second to approach usability. Trying to do it with a handful of tests per pixel just results in a noisy mess.
Because ray tracing involves a log2 scale of the number of primitives, while rasterization is linear, it appears that highly complex scenes will render faster with ray tracing, but it turns out that the constant factors are so different that no dataset that fits in memory actually crosses the time order threshold.
Classic Whitted ray tracing is significantly inferior to modern rasterization engines for the vast majority of scenes that people care about. Only when two orders of magnitude more rays are cast to provide soft shadows, glossy reflections, and global illumination does the quality commonly associated with “ray tracing” become apparent. For example, all surfaces that are shaded with interpolated normal will have an unnatural shadow discontinuity at the silhouette edges with single shadow ray traces. This is most noticeable on animating characters, but also visible on things like pipes. A typical solution if the shadows can’t be filtered better is to make the characters “no self shadow” with additional flags in the datasets. There are lots of things like this that require little tweaks in places that won’t be very accessible with the proposed architecture.
The huge disadvantage is the requirement to maintain acceleration structures, which are costly to create and more than double the memory footprint. The tradeoffs that get made for faster build time can have significant costs in the delivered ray tracing time versus fully optimized acceleration structures. For any game that is not grossly GPU bound, a ray tracing chip will be a decelerator, due to the additional cost of maintaining dynamic accelerator structures.
Rasterization is a tiny part of the work that a GPU does. The texture sampling, shader program invocation, blending, etc, would all have to be duplicated on a ray tracing part as well. Primary ray tracing can give an overdraw factor of 1.0, but hierarchical depth buffers in rasterization based systems already deliver very good overdraw rejection in modern game engines. Contrary to some popular beliefs, most of the rendering work is not done to be “realistic”, but to be artistic or stylish.
I am 90% sure that the eventual path to integration of ray tracing hardware into consumer devices will be as minor tweaks to the existing GPU microarchitectures.
John Carmack [/QUOTE]
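Carmack's point about constant factors swamping the O(log n) traversal advantage can be sketched numerically. The cost constants below are made up purely to show the shape of the argument, not measured from any real GPU:

```python
import math

def raster_cost(n_tris, per_tri=1.0):
    """Rasterization: roughly linear in primitive count."""
    return n_tris * per_tri

def raytrace_cost(n_tris, n_rays, per_step=50.0):
    """Ray tracing: each ray walks an acceleration structure in
    ~log2(n) steps, but every step (node fetch, intersection test)
    is far more expensive than rasterizing one triangle."""
    return n_rays * math.log2(n_tris) * per_step

# At 1M triangles with ~2M primary rays (one per 1080p pixel), the
# "slower-growing" algorithm is still orders of magnitude behind
# because of its constant factor; the curves only cross at triangle
# counts that wouldn't fit in memory anyway.
n = 1_000_000
linear = raster_cost(n)
logarithmic = raytrace_cost(n, 2_000_000)
```

Under this toy model the log-scaling claim is true and still practically irrelevant, which is Carmack's "no dataset that fits in memory actually crosses the time order threshold."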
[QUOTE=Jazer;47058133]Engines don't use the same techniques to bake lighting, results will vary.[/QUOTE]
You can bake outside the engine in your modeling app. In that case results will be consistent.