• Ray tracing support found in EA's Battlefield V
    91 replies
Excuse you, real top fraggers play on the lowest settings possible. Who knows when that 0.001ms input delay might win you an important scrimmage?
10 years ago? Pfft, that's nothing, kid, watch this: https://steamuserimages-a.akamaihd.net/ugc/849343453243286092/3A3BC28D562535333DB001891A576B2F7F870510/ https://steamuserimages-a.akamaihd.net/ugc/849343453243279300/9E7528192CDD830A58944B70870AC0BDF2410B87/
my eyes are burning
This was a legitimately hilarious and downright unfair advantage before they fixed the thing with the leaves on the branches. On another note, other than the blocky artifacts at the edges of each particle effect, the un-textured look was kind of pleasant to look at in a weird, simplistic way.
it looks quite odd when it comes to diffuse reflection. the flamethrower bits especially look like they're specular only.
Hey how'd you do that... I could use that, my computer is so slow!
It does look like it has a bit, but it's certainly too dark and falls off unnaturally. None of it really picks up on the characters either. Imagine how IRL they'd be well lit both directly by the nearby flame and via its secondary reflections coming back up from the glossy wet pavement. Not sure how many real bounces they're doing here, so the latter may not end up contributing. I also noticed quite a few artifacts, seemingly as if rays were culled early and it reverted to cubemaps. That's just speculation based on one viewing though, so they could be from anything.
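The "falls off unnaturally" point above is easy to illustrate: physically, a point light like a flame attenuates with the inverse square of distance, while games often use a cheaper falloff clamped to a fixed radius that dies out too abruptly. A toy Python sketch with made-up numbers (not taken from Battlefield V or any real engine):

```python
# Hypothetical comparison of light-falloff models. All constants here
# are illustrative, not from any actual renderer.

def inverse_square(intensity, distance):
    """Physically based falloff: energy spreads over a sphere's surface."""
    return intensity / (distance ** 2)

def linear_clamped(intensity, distance, radius):
    """Common game-engine shortcut: fades linearly to zero at a fixed radius."""
    return intensity * max(0.0, 1.0 - distance / radius)

# A flame with intensity 100 lighting characters 2 m and 5 m away:
for d in (2.0, 5.0):
    print(f"{d} m: physical={inverse_square(100, d):.1f}, "
          f"clamped={linear_clamped(100, d, 4.0):.1f}")
```

At 5 m the clamped model has already hit zero while the physical one still contributes a few units, which produces exactly that too-dark, unnatural-falloff look around light sources.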
Another important point is that this is an extension that uses available hardware for DX12: Nvidia has also apparently been working on a similar extension for Vulkan. You can read a pretty detailed slide deck on their ideas and concepts so far here. This will start as a vendor-exclusive extension, but I imagine (based on interest expressed by others and vague hints from various Vulkan working group members) that some of it might make it back into the core API eventually. AMD is also working on a raytracing acceleration library. Nvidia's extensions and work on RTX just take advantage of the tensor cores on Volta GPUs. Otherwise it looks like it just executes on the regular compute hardware, and Nvidia cards tend to present several (8 on the 1070) potentially concurrent hardware queues for executing compute jobs, which can be used to accelerate it even without the unique "dedicated" hardware.
I was stupid and, without doing any research, assumed everything behind it was proprietary, both hardware- and software-wise, because that's how it usually is with Nvidia. I'm very glad I was wrong.
Hence why I said 10-15 years from now. With the current pace of technical progress we'll reach a point where pre-rendering will become obsolete. Spending hours or even minutes on rendering a frame will become a thing of the past.
I don't think this is necessarily true. Faster rendering (and advanced AI) could lead to more use of CGI. We might get movies that are completely realistic but also completely artificial. Of course, if everything is computer generated, render times may be the same.
You don't seem to understand. Prerendering and real-time rendering will always use different techniques. Even if prerendering gets so fast that you're practically rendering in real time (which won't happen; you can always do something more time-consuming that gives you better results), it's still going to be a fundamentally different system. Real-time rendering involves compromises and prioritizes speed over accuracy. Offline rendering prioritizes accuracy over speed. Tl;dr you're comparing apples to oranges, and one isn't going to replace the other.
I understand perfectly fine, and it's not at all apples to oranges. We'll see in 10-15 years.
There's nothing to see, really. Realtime applications are engineered to cut corners whenever and wherever possible. Whether it be background object detail, texture resolution, particle simulation, max number of ray bounces, whatever; and this is just to get an acceptable frame rate. Movies may use that tech for realtime previews of scenes, but they will always render the final result using non-realtime methods, because the limitations drop away. You no longer have to roughly approximate things, or use cheap tricks to eke out those extra frames. This tech is designed to quickly approximate something that looks really cool. It's awesome, but it isn't going to revolutionize the film tech industry, because it isn't trying to. Plus, rendering engines have already been using raycasting to render for a long-ass time (with beautiful results), and yet it's only just now become practical to integrate it into AAA quality games, and even then the rays seem to be sparse.
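The "max number of ray bounces" corner-cutting mentioned above can be sketched in toy form: a trace that simply stops gathering light once its bounce budget runs out, so indirect illumination beyond that depth is lost. Purely illustrative Python with invented numbers (not DXR or any shipping renderer):

```python
# Toy model of a ray-bounce budget. Hypothetical scene: every surface
# reflects half the incoming light, and a light source contributes
# along each bounce of the path. Capping the bounce count discards
# whatever indirect light deeper bounces would have gathered.

REFLECTANCE = 0.5   # fraction of light each bounce carries onward
EMITTED = 100.0     # light contributed at each bounce

def trace(max_bounces):
    """Sum the light gathered along a path capped at max_bounces."""
    total, carried = 0.0, 1.0
    for _ in range(max_bounces):
        carried *= REFLECTANCE
        total += carried * EMITTED
    return total

for cap in (1, 2, 8):
    print(f"{cap} bounce(s): {trace(cap):.1f}")
```

With one bounce you capture half the converged result, with two bounces three quarters, and only around eight bounces does it approach the full answer. Real-time renderers sit at the cheap end of that curve; offline renderers let it converge, which is one reason the two won't merge.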
Yes indeed, this is just as bad as CUDA is.
You'd probably say the same 10-15 years ago. Nvidia RTX and similar technologies can't match modern offline path tracer renderers. And they won't in 10-15 years either. The world of real-time and offline renderers will stay strongly divided. https://files.facepunch.com/forum/upload/490/c16505ad-0eee-4525-a55f-794b7619e94b/image.png This image was made with a Ray Tracing renderer released in 1991. I strongly doubt we'll see this quality of reflections and refractions in RTX powered games.
Explain
What's interesting is that in previous titles, Battlefield had an AMD bias, utilising their tech... It appears that 5 is the first title in a long time where this is no longer the case. Searched after making this just to make sure, and I've been beaten to the punch: https://www.reddit.com/r/Amd/comments/8ln43w/battlefield_5_is_an_nvidia_title_i_thought_amd/
There's already a standard to do such tasks, and nvidia chose not to follow it.
I find it amusing how the ray tracing looks very good, but the flame particles look straight out of Duke Nukem 3D
You're not wrong, but they do look worse than they normally would due to the particle effect being shown in slow motion.
I think those are people with more than zero hindsight predicting that NVidia will gatekeep the hell out of this tech and make sure to bribe publishers to cripple the alternatives
That's why I'm waiting for the next gen on 7nm. It should have the same leap the 9xx had to the 10xx, and RT should get that efficiency boost too.
OpenCL came later
That PUBG screenshot is straight up hacking territory, holy shit.
That's not the point, NVIDIA made no effort to truly standardize the technology.
Huh, if the most powerful card can't even keep up with 1080p 60fps consistently, then I think I'll just wait for the 1080 Ti to drop in price and pick up one of those.