NVIDIA talks about RTX 2080 graphics cards and virtual reality
When NVIDIA announced the RTX 2080 graphics cards, I was there. I watched Jensen Huang announce the new graphics cards with amazing features like real-time raytracing and variable rate shading. Of course, after the announcement, a lot of questions started coming to my mind, especially about how these new graphics cards relate to VR and AR.
So, I sent some questions to the company and, after some time, I finally got some answers from NVIDIA China! Of course, they haven’t revealed any secrets to me, but it has been interesting reading some details anyway. Here is the short interview I had with them, to which I added some personal notes.
We’ve seen that NVIDIA has been able to offer real-time raytracing on consumer graphics cards. What difficulties did you have to overcome to offer this kind of incredible functionality?
As Jensen said during the keynote: it took 10 years of development. We worked with key industry players and academia on the right algorithm (BVH) and then developed the right hardware to accelerate that algorithm (RT Cores).
BVH stands for Bounding Volume Hierarchy and basically means that the whole scene gets represented by a hierarchical tree of simple bounding volumes that defines the scene structure. At the leaves of this tree, there are the actual 3D elements of the scene. The parents of these leaves are simple 3D objects (e.g. boxes) that each completely enclose their children, and traversing the tree from the bottom to the top, you get bigger and bigger boxes containing all the elements below them.
Why do we need to represent the scene as a hierarchical set of boxes? The answer is that computations using boxes are much cheaper than those involving complex meshes. So, when doing ray tracing, when you cast a ray to see if it intersects some element in the scene (lights or other objects), you just verify the collision of the ray with the bounding boxes: if the ray doesn’t collide with a big bounding box, you are sure that it can’t collide with any of the elements contained in that box, and you save a lot of computational time (you avoid checking the collision with that whole branch of the tree). If it does collide, you start checking the collision of the ray with the children’s bounding boxes, applying the same fast-rejection process. In the end, you only do the heavy calculations if the ray reaches a final leaf containing the true mesh of an object. Doing ray tracing thus means traversing the BVH tree for each ray, as in the sketch below.
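To make this more concrete, here is a minimal C++ sketch of that fast-rejection traversal, under my own simplifying assumptions (axis-aligned boxes, a naive recursive tree): it is just an illustration of the idea, not NVIDIA’s actual RT Core implementation.

```cpp
#include <algorithm>
#include <utility>
#include <vector>

// A simple axis-aligned bounding box (one of the "simple 3D objects" of the tree)
struct AABB { float min[3], max[3]; };

struct Ray { float origin[3], dir[3]; };

// One node of the BVH: either an inner node with children,
// or a leaf pointing to an actual mesh of the scene
struct BVHNode {
    AABB bounds;
    std::vector<BVHNode*> children;  // empty for leaves
    int meshIndex = -1;              // valid only for leaves
};

// Slab test: true if the ray intersects the box
// (simplified: assumes non-zero direction components)
bool rayHitsBox(const Ray& r, const AABB& b) {
    float tMin = 0.0f, tMax = 1e30f;
    for (int axis = 0; axis < 3; ++axis) {
        float invD = 1.0f / r.dir[axis];
        float t0 = (b.min[axis] - r.origin[axis]) * invD;
        float t1 = (b.max[axis] - r.origin[axis]) * invD;
        if (invD < 0.0f) std::swap(t0, t1);
        tMin = std::max(tMin, t0);
        tMax = std::min(tMax, t1);
        if (tMax < tMin) return false;  // slabs don't overlap: miss
    }
    return true;
}

// Stand-in for the expensive per-triangle test done only at the leaves
bool rayHitsMesh(const Ray&, int /*meshIndex*/) {
    // a real renderer would intersect the ray with every triangle here
    return true;
}

// The fast-rejection traversal described above: whole subtrees
// are skipped as soon as a bounding box is missed
bool traverse(const Ray& ray, const BVHNode& node) {
    if (!rayHitsBox(ray, node.bounds))
        return false;                            // prune this entire branch
    if (node.meshIndex >= 0)
        return rayHitsMesh(ray, node.meshIndex); // leaf: do the heavy test
    for (const BVHNode* child : node.children)
        if (traverse(ray, *child))
            return true;
    return false;
}
```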
Basically, NVIDIA is hinting at the fact that the optimizations given by BVH and by fast BVH traversal have been the key to offering real-time raytracing, and that RT Cores have been the right hardware to accelerate that BVH-based algorithm. (In this post series from 2012, NVIDIA already talked about BVH and its implementations.)
When will this integration be ready inside the various game engines for us developers? Will they already be available when the RTX 2080 cards are released?
RTX games use the Microsoft DX12 DXR ray tracing API. DXR will be broadly available in the Fall release, which we expect to be early October. Both Battlefield V and an RTX patch for Shadow of the Tomb Raider will be available soon after that.
Microsoft has already released an SDK for ray tracing that supports NVIDIA’s algorithms. You can refer to this Microsoft blog post, with a final link to the SDK and tutorials on GitHub, for more info about it. Reading the post, I also got a confirmation of the statements above regarding BVH. In fact, the DXR APIs introduce a new concept:
The acceleration structure is an object that represents a full 3D environment in a format optimal for traversal by the GPU. Represented as a two-level hierarchy, the structure affords both optimized ray traversal by the GPU, as well as efficient modification by the application for dynamic objects.
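In DXR, this two-level hierarchy means bottom-level structures built from the actual triangle geometry and a top-level structure holding transformed instances of them. The real API works through D3D12 descriptor structures, so take the following small C++ sketch of mine as purely conceptual types, not the actual DirectX ones:

```cpp
#include <vector>

// Bottom level: built once per unique mesh, contains the triangles
// (conceptually, a BVH over the geometry itself)
struct BottomLevelAS {
    std::vector<float> vertices;   // the mesh data the structure is built over
};

// Top level: one entry per object in the scene, each referencing a
// bottom-level structure plus a transform. Moving a dynamic object only
// means updating its instance transform here, not rebuilding the
// bottom-level geometry: this is the "efficient modification" part.
struct Instance {
    const BottomLevelAS* geometry;
    float transform[3][4];         // world matrix of this instance
};

struct TopLevelAS {
    std::vector<Instance> instances;
};
```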
The Microsoft Windows 10 October update, which includes official DXR support, is already being rolled out by Microsoft… but I, for instance, have not received it yet. (If you want to verify whether you have already upgraded to it, check your Windows 10 version: if the build number is 1809, then you have the October update.)
After the update is rolled out, games and applications will be able to exploit the DXR APIs and so offer raytracing rendering. That’s why the games supporting RTX have not been released yet: they are waiting for the operating system to support real-time ray tracing.
The presentation at Gamescom talked about your partnership with Unreal Engine for offering real-time RTX. What about Unity, instead? Will it also be available for Unity developers?
Yes.
At the previous link, there is, in fact, a reference saying that both Unreal Engine and Unity will implement the DXR APIs. According to online forums, UE4 should support the DXR APIs in version 4.22. We are currently at version 4.20, so Unreal developers will have to wait a bit before playing with real-time raytracing (unless they want to use the DXR APIs directly from code). Version 4.22 is expected to be released around the end of the year, according to some. Regarding Unity, I haven’t been able to find any news on a possible release date.
How difficult will it be to integrate real-time raytracing into our projects? What are the operations to be performed?
Please refer to Jensen’s “it just works” message. Game developers can just use Microsoft’s standard DXR APIs to access NVIDIA’s RTX technology. Developers use DXR to build BVHs and to replace their shadow and/or reflection functions with ray tracing, shading, and denoising. It’s a relatively focused change, which doesn’t require reworking the engine or artwork.
So, some changes are required to transform your game into one that supports real-time raytracing. We are not talking about massive changes, but certainly about ones that will require some time to be implemented. Reading the above-linked Microsoft DXR post, I found that, apart from the BVH hierarchy that has to be constructed, there are new specific shader types that can be used:
A set of new HLSL shader types including ray-generation, closest-hit, any-hit, and miss shaders. These specify what the DXR workload actually does computationally. When DispatchRays is called, the ray-generation shader runs. Using the new TraceRay intrinsic function in HLSL, the ray generation shader causes rays to be traced into the scene. Depending on where the ray goes in the scene, one of several hit or miss shaders may be invoked at the point of intersection. This allows a game to assign each object its own set of shaders and textures, resulting in a unique material.
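These shaders are written in HLSL, but to show how the pieces fit together, here is a CPU-side C++ analogy of mine of that control flow: DispatchRays runs the ray-generation logic for every pixel, and each traced ray ends up in either a hit or a miss routine. Again, this is just an illustration of the flow, not real DXR code.

```cpp
struct Ray { float origin[3], dir[3]; };
struct Color { float r, g, b; };

// "miss shader": what a ray returns when it hits nothing (e.g. sky color)
Color missShader(const Ray&) { return {0.2f, 0.3f, 0.8f}; }

// "closest-hit shader": shading for the nearest surface the ray touched
Color closestHitShader(const Ray&) { return {0.9f, 0.9f, 0.9f}; }

// stand-in for the hardware BVH traversal of the earlier sketch
bool sceneIntersect(const Ray&) { return false; }

// analogy of the TraceRay intrinsic: traverse the scene,
// then invoke the hit or the miss routine
Color traceRay(const Ray& ray) {
    return sceneIntersect(ray) ? closestHitShader(ray) : missShader(ray);
}

// analogy of the "ray-generation shader", run once per pixel by DispatchRays
void dispatchRays(int width, int height) {
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x) {
            Ray ray{{0, 0, 0}, {x - width / 2.0f, y - height / 2.0f, 1.0f}};
            Color c = traceRay(ray);  // one primary ray per pixel
            (void)c;                  // a real renderer would write c to the image
        }
}
```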
What will the performance of real-time raytracing with the newest RTX graphics cards be inside Virtual Reality applications? As a VR enthusiast, I would love to see realtime raytracing in VR, but I doubt that these graphics cards can perform RTX at 90 FPS for 2 eyes, as required for VR. Will it be possible to use real-time raytracing in VR at least with the RTX 2080 Ti? Will we have a satisfactory framerate?
Ray tracing performance depends on how the developer uses the technology and what effects they want to ray trace. Furthermore, NVIDIA has many additional technologies like DLSS, Variable Rate Shading, and Texture Space Shading that together can significantly enhance VR performance.
This has basically been a non-answer. And I can understand it… the framerate depends on what we are going to render. I guess that even in VR, if we just render a dull cube in the center of the void, real-time raytracing can guarantee 90 FPS.
But I found it interesting anyway that the answer has not been a complete no: it seems that the company is hinting that, thanks to DLSS (Deep Learning Super Sampling, an innovative new anti-aliasing technology), Variable Rate Shading (the technology that lets you decide which parts of the frame to render at higher quality and which ones at lower quality, and that, if used with eye tracking hardware, can offer you foveated rendering) and Texture Space Shading (which can help a lot in virtual reality because it can make the graphics card render the right eye view by recycling a lot of pixels already shaded for the left eye view), it may be possible for some VR applications to work with real-time raytracing. What is not clear is how complex a VR experience can be and still work in real time with raytracing. I hope that once DXR support is completely rolled out, some people will start experimenting in this direction.
For sure, in the future, eye tracking can really help a lot in decreasing rendering times and will be a game changer in this sense.
UPDATE: As redditor /u/octoplow pointed out, we should also not forget Multi-View Rendering (MVR), which is very relevant for owners of wide-FOV headsets like the Pimax or the StarVR One.
Redditor /u/slindev has commented on my considerations in an interesting way:
It’s written as if the author thinks that you can either use raytracing or you can’t, while what’s really the case is, as stated by Nvidia in that interview, that it depends on what for and how exactly the raytracing is used in a specific title or game scene. Scene complexity will probably affect it a bit, but likely not THAT much (it will increase the required memory, and updating the acceleration structure will also not be free). However, the number of intersection tests, which will mostly depend on the number of rays used by the application, is gonna have a massive impact. A hard shadow should get away with just one ray per pixel; if you do shadows at a lower resolution and scale up the result, you’ll get away with even less. Direct reflections could also be done cheaply, while full realtime GI with a couple of bounces for the light will end up being expensive or noisy. And again, it might be possible at a lower resolution than what’s actually displayed. So it depends a lot on what the developers decide to be a good balance between quality and performance for the raytracing. It’s meant to enhance things beyond what current rendering usually does, but it’s not ready to replace it yet.
I also think that this could be very interesting for audio and not just for graphics.
If we don’t consider raytracing, according to your benchmarks, what is the performance gain in using these new RTX graphics cards in a VR application (with respect to the old GTX cards)?
RTX 20-series cards are up to 50% faster than the previous generation. RTX technologies like Variable Rate Shading help further enhance performance in VR games like “In Death”.
So, even if we don’t consider real-time raytracing, these new graphics cards are much better than the previous ones.
What are the other functionalities offered by the newest RTX graphics cards for AR and VR experiences?
VirtualLink: Turing GPUs are designed with hardware support for VirtualLink, a new open industry standard being developed to meet the power, display, and bandwidth demands of next-generation VR headsets through a single USB Type-C connector.
What are the next features that NVIDIA will implement in the future regarding AR and VR?
In addition to DLSS, Variable Rate Shading and Texture Space Shading can be applied to improve performance for a given image quality in VR and AR.
And that’s it for this interesting interview… I really want to thank NVIDIA a lot for its kind answers.
These new graphics cards seem very interesting and I hope to get one of them one day! The new technologies they offer, like real-time raytracing, are really new, and we will still need time to understand how to exploit them properly in virtual reality… at this time they are not implemented in the most popular game engines, so they are really at the cutting edge. But I really hope that in the future they will help a lot in creating more credible virtual reality scenes that will offer us a greater sense of immersion.
And while we wait for that moment, why don’t you subscribe to my newsletter to receive in your inbox interviews like this one? 😉
(Header image by NVIDIA)
Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.
Very interesting reading. Going to be fascinating to see what this new card can do once my PC is built; just waiting for a replacement Asus ROG Hero Z390 motherboard as the first one was DOA 🤔
WOHAAA! Even if, as per the news of the day, some people are having issues with the RTX 2080… maybe there are some defective models out there.
Yes, have read of faulty RTX cards; it seems to mainly be FE (Nvidia Founders Edition) rather than AIB (add-in board) partners like Asus and MSI.
Talking of faults, the retailer says I damaged the motherboard (it arrived damaged) and is refusing a refund; my credit card company is pursuing them for my money.
With all the problems building a new PC, the standalone headset is definitely the future of consumer VR!
Oh no! So sorry for that 🙁
Standalone is the future… maybe with some more computational power, though!
Credit card company flexed muscles 💪 retailer issued refund 👍
But seriously, how many consumers will spend $$$ and hassle on building a high-end PC?
What I loved about Daydream: no hassle… into VR in seconds. Wireless streaming of YoutubeVR in HD, ChromeVR, WebVR, free to spin around, no tether 😘
Glad that you got your money back!
Thanks. Have another motherboard coming today from a different supplier.
Then the most difficult question… which PC VR headset? I have been spoiled by using the Vive Pro, Vive Focus, and Lenovo Mirage. Can I live with the Rift? Or wait for the Rift S? Can I wait… do I purchase the StarVR, or try for the Pimax 5K?
Depends on the money you have. I have read wonderful things about the StarVR, but it will be damn expensive. Pimax 5K+ seems nice, but there may be some disturbing distortions at the edge of your vision.
I love the Vive Focus, you know that… but if you buy it, you have almost no content for it. Mirage Solo is a better UX, but it is more uncomfortable and does not have integrated audio.
PC is built, time for a headset! I’m probably getting a Rift; at £400 with all the free content and the great Touch controllers, it’s a no-brainer. The Oculus exclusive content looks great too. £400 also retains some budget in case we see any SteamVR partner headset with Knuckles in 2019?
Great choice! Rift + Touch is great… I own it as well! Fingers crossed for Knuckles… even if a friend of mine has tried them and said that the Touch are more ergonomic.
Picked up the Marvel bundle this morning. Then the Oculus software wouldn’t install. Then Windows became unstable and needed reinstallation. Of course, the installation did not work… several times.
My mind hurts with PCs and I’ve been building/using them for many years. Oculus software seems to like Chrome, crashed with Explorer; installing now 👍
I’ve always liked Touch and loved the Rift headset when I’ve used it, so much better than the original Vive. The Vive Pro using the same lenses was a letdown; if it had the new Valve lenses and Knuckles, it would have been a sale from me.
Come on… all this stress will have a great VR reward in the end!
Got my DisplayPort cable, just setting up now. I am going to find a meditation app as I think Robo Recall will not help 🤭