• Ray tracing support found in EA's Battlefield V
    91 replies, posted
https://www.bit-tech.net/news/gaming/pc/ray-tracing-support-found-in-eas-battlefield-v/1/
NVIDIA's Livestream just confirmed this.
if it's Nvidia exclusive then no thanks
What's the difference between ray tracing and the way things are currently rendered? Is there a screenshot I can see which illustrates the difference?
https://www.youtube.com/watch?v=1_L43c7aQj0
It's gonna be tessellation all over again. I can see all the terrible AMD performance stats already.
still not gunna make me buy it
It's hard to give you a comparison screenshot because ray tracing is fundamentally different. I'll attempt to explain the difference without a screenshot (skip to the bottom for a layman explanation).

Typically, video games have used a technique called rasterization. This is a technique where individual polygons are rendered through a pipeline. Modern video games use a variety of shaders (vertex shaders, geometry shaders, pixel shaders, etc.). Shaders all take some form of input and produce output. Arguably, the most important shader is the pixel shader. This shader receives a fragment of a polygon as it will be seen on screen, and using information such as its world position and normal it can calculate things like lighting and occlusion.

https://files.facepunch.com/forum/upload/109887/9b6150c1-881f-47d9-9170-6bf7afb73111/image.png

There are many varieties of rasterization pipeline. Over the years, game render engines have evolved and achieved better, faster and more realistic graphics through the use of many different approximate algorithms and lighting techniques. For example, screen-space ambient occlusion was a big step up in realism because it provided extra depth to scenery where regular shadows could not.

Ray tracing is different, but not entirely foreign in how it works. As the name implies, it works by shooting rays into a scene. For each pixel, a ray with a direction corresponding to that pixel is shot into the scene. This ray is then checked for intersection with each object. Upon intersection, a lighting contribution is calculated, typically using calculations that approximate nature. After this calculation, the ray is bounced off the surface and the process starts again, until some maximum depth. Think of it as a light ray traced in reverse.

There's a subset of ray tracing called path tracing. It's slower but typically gives more realistic results because it's unbiased and closer to how light behaves in nature.
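To make the loop above concrete, here's a minimal sketch of it in Python: one ray per call, nearest-sphere intersection, a simple diffuse lighting term, then a mirror bounce up to a maximum depth. All the scene values (sphere positions, the light direction, the 0.5 reflectivity) are made up for illustration; a real renderer is enormously more involved.

```python
import math

MAX_DEPTH = 3
LIGHT_DIR = (0.0, 1.0, 0.0)          # made-up directional light pointing "up"

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def sub(a, b):
    return (a[0]-b[0], a[1]-b[1], a[2]-b[2])

def scale(v, s):
    return (v[0]*s, v[1]*s, v[2]*s)

def normalize(v):
    return scale(v, 1.0 / math.sqrt(dot(v, v)))

def intersect_sphere(origin, direction, center, radius):
    """Return the nearest positive hit distance t along the ray, or None."""
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b*b - 4.0*c               # direction is unit length, so a = 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None   # small epsilon avoids self-intersection

def reflect(d, n):
    return sub(d, scale(n, 2.0 * dot(d, n)))

def trace(origin, direction, spheres, depth=0):
    """One ray: find nearest hit, shade it, bounce, repeat to MAX_DEPTH."""
    if depth >= MAX_DEPTH:
        return 0.0
    hit = None
    for center, radius in spheres:
        t = intersect_sphere(origin, direction, center, radius)
        if t is not None and (hit is None or t < hit[0]):
            hit = (t, center)
    if hit is None:
        return 0.1                                   # dim background light
    t, center = hit
    p = (origin[0] + direction[0]*t,
         origin[1] + direction[1]*t,
         origin[2] + direction[2]*t)
    normal = normalize(sub(p, center))
    direct = max(0.0, dot(normal, LIGHT_DIR))        # Lambertian diffuse term
    bounced = trace(p, reflect(direction, normal), spheres, depth + 1)
    return direct + 0.5 * bounced                    # 0.5 = surface reflectivity
```

Calling `trace` once per pixel, with directions spread across the camera's view, gives you an image; that per-pixel independence is also why the technique maps so well onto GPU hardware.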
The key difference between path tracing and plain ray tracing is that in path tracing, bounce directions are randomly sampled, and a lighting contribution is only counted when the ray eventually hits a light source. Then there's reverse path tracing, which starts from the light source (like in nature) but is even slower, since fewer rays end up hitting the camera.

Layman explanation / TLDR: Ray tracing more closely follows how light works in nature and is therefore able to produce more realistic imagery. It's slower to render because there are more factors involved in the calculation of each ray. Movie studios use ray tracing (or a derivative technique) to produce realistic CGI, which can take days or weeks to render. Now that it's coming to games with hardware backing, we can expect a substantial increase in graphics quality.

Here's a comparison screenshot anyway, for funsies: https://www.imgtec.com/wp-content/uploads/2017/02/comparisonV8.png
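The "random bounces, count only when you hit the light" idea boils down to Monte Carlo estimation: average many random paths and the noise converges toward the true answer. Here's a toy sketch of just that estimation step, with geometry abstracted away into made-up probabilities (the 0.25 light-hit and 0.5 absorption chances are purely illustrative, not real scene behavior):

```python
import random

def sample_path(rng, max_depth=4, hit_light_prob=0.25, absorb_prob=0.5):
    """Follow one random path: at each bounce the ray either reaches the
    light (contributes 1), is absorbed (contributes 0), or bounces again."""
    for _ in range(max_depth):
        r = rng.random()
        if r < hit_light_prob:
            return 1.0            # path reached a light source: count it
        if r < hit_light_prob + absorb_prob:
            return 0.0            # absorbed: this path contributes nothing
        # otherwise: bounce in a new random direction and keep going
    return 0.0                    # gave up at max depth

def render_pixel(samples=100_000, seed=0):
    """Average many random paths; more samples = less noise."""
    rng = random.Random(seed)
    return sum(sample_path(rng) for _ in range(samples)) / samples
```

With these numbers the true probability of reaching the light within four bounces is about 0.332, and the estimate lands close to that; with only a handful of samples per pixel you instead get the characteristic path-tracing noise, which is exactly what the denoising work mentioned later in this thread exists to clean up.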
In addition to Natrox's answer, practically speaking the biggest difference between the two methods historically is that ray tracing is computationally much more expensive, and consequently unsuitable to real-time game graphics. It's been widely used in static rendering to create photorealistic lighting effects, but it's only now that specialized hardware is making it competitive with conventional lighting techniques for gaming.
And it'll suffer the same fate as PhysX.
Why do I have this feeling that games with this enabled will run like absolute shit even with a 2080? It's fucking ray tracing, it's been the holy grail of real time rendering for decades
Considering it's quite common for people to want to run at 120+ FPS these days to match their refresh rate, I'm guessing it's going to be a while before its use is tenable.
it'll go away for a time and someone will make a better system to replace it? that doesn't sound so bad tbh
It won't be on the entire screen. https://www.youtube.com/watch?v=moKV5_BpxjM
AMD will make a better system and then release it for everyone. Or even better, it just be a fucking pipedream that will never come back because outside of some select examples, all of the shit we've wanted from PhysX have basically been forgotten about.
https://www.youtube.com/watch?v=rpUm0N4Hsd8
It's difficult to tell without a proper benchmark, but given these new cards will have a discrete hardware accelerator specifically for ray tracing, the impact on performance will likely be smaller than you think.
It hasn't been mentioned in this thread so it should be noted that the RTX 2000 series GPUs have dedicated hardware for raytracing; it's not as simple as "the cards have gotten powerful enough to do more real-time ray tracing". Current AMD cards do not have equivalent dedicated hardware for raytracing on their cards.
If it's PC and Nvidia exclusive, yeah probably. Too bad cause games really need to upgrade from screen space reflections
Which means running it on anything else will probably run like shit, just like how PhysX started if you ran it in software mode without a PhysX card.
Sure they'll be monsters, but it'll probably use ray tracing in specific places and for specific reasons, where lower resolutions and gross approximations will be easily forgivable. Also, there has been a lot of research and development in the last few years regarding data structures and hybrid-ish reconstruction/denoising algorithms that make both path tracing and ray tracing much faster than ever.
Which is why it's ass. Also, PhysX was created by a company called Ageia; Nvidia bought them.
Not impressed to be honest, I've seen plenty of reflections in games that are good enough. Meanwhile who cares about accurate physics anymore? Bullshit.
I saw it from the leaks, didn't believe it then and still find it sort of hard to believe. Real-time ray tracing sounds like a dream for the future, but if it's actually viable with current tech and software I'll be pretty excited for its potential within the next few years.
I'm going to stay skeptical about how it performs until I see actual benchmarks from real gameplay.
It uses an API by Microsoft, so it's open to AMD as well. The problem is more on the hardware side. AMD has no extra ray tracing cores (to put it simply), so they can't achieve anywhere close to the performance you need to run this on current hardware. Not fully sure if there's more to it on the software side, but until AMD changes their hardware this will be mostly Nvidia exclusive. In the end it's just a few extra graphical effects.
Of course. This is a hardware solution; if you don't have the hardware, you don't get access to the benefits. At some point in the future it will propagate enough that hardware accelerated ray tracing will be a standard feature on all GPUs, much like programmable shaders and H.264 decoding in the past. Until then, though, there are going to be many competing standards and a premium for entry.
AMD will open it up on a card that isn't all that competitive so it won't sell well. Rinse repeat.
I'm not really sure how some of you are mad that this might not be available on non-nvidia hardware. Software developers can't engineer their way around some problems, and hardware literally not supporting a feature is a pretty damn solid reason. If it can't run, it can't run.
I've said it before and I'll say it again - 10 to 15 years from now, engines like VRay or Arnold will be so insanely obsolete in favor of live-rendered graphics such as this.