Gamescom 2018 Day 0: NVIDIA reveals RTX 2080 series graphics cards
Gamescom has not begun yet, but it has already been exciting. Today, the day before the start of the fair, NVIDIA organized the event at which the new GTX 11 graphics cards were expected to be announced.
Thanks to my friend Fabio Mosca, CTO of AnotheReality, I learned how to participate in the event. I arrived there one hour before the start time, but I already found a huge line of people excited to attend this very important launch.
At 18.20 the event started and NVIDIA CEO Jen-Hsun Huang took the stage. He immediately said that all the rumors we had heard were wrong, and so we all got excited. After that, he launched into a super long introduction about all the cool stuff NVIDIA has done lately. We all just wanted to hear about the new graphics cards, but instead he started talking about how the Turing architecture is disruptive, delivers awesome performance, and can perform real-time ray tracing thanks to a dedicated part of the GPU called the RT core.
He said that with Turing, every frame can combine various calculations: standard rasterization, innovative real-time ray tracing, and DNN processing, which is basically an AI algorithm that predicts the color of pixels the graphics card has not actually rendered (wow, the graphics card can predict pixel colors?). Then he spent basically 1 hour, REALLY 1 HOUR, talking about how cool scenes rendered with ray tracing are: ray tracing can show better transparencies, render multiple reflections between objects, handle area lights correctly, produce more realistic soft shadows, etc… The first examples were cool, but after a while I wanted to go on stage and say: “Hey Mr. Huang, we got the concept, can’t we please move on and see these new GTX 11 cards?”.
He continued showing how things look with and without ray tracing, and of course the ray-traced versions were far better, as you can see in the video below.
But sometimes this was exaggerated or superfluous. For instance, he showcased some reflection demos where, without ray tracing, a car had no reflections at all, while we all know that present games already feature nice faked reflections on cars even without ray tracing. He went on highlighting how perfect the reflection of light in an enemy’s eye was in a new FPS game, when actually, while playing those games, no one gives a single f*ck about the reflections in the enemies’ eyes, because you are on a killing spree and all you want is to kill people fast before getting killed. Then there was a Battlefield V demo where the flames of a flamethrower looked so horrible, like moving pixel-art sprites (really, I think Doom had better flame graphics)… we all thought that no one would want to see that terrible pixel art reflected on every surface of the game (seriously NVIDIA, there are free fire assets on the asset store that are far better than that!).
Anyway, he presented games like Assetto Corsa, the new Tomb Raider, the new Battlefield V, Metro Exodus, etc… which will all use this ray tracing technology to improve their graphical output.
All this mega introduction and mega praise of ray tracing was so boring I was about to fall asleep. But at a certain point he said, “and how are you going to play all these cool games that exploit ray tracing?“. And so started the presentation of…
… the RTX 2080! So, no GTX 11 series, guys and girls. NVIDIA is now all about RTX. The whole boring introduction was only there to explain NVIDIA’s shift of attention towards real-time ray tracing technology. Real-time ray tracing has become the key feature for NVIDIA, which in fact will now declare how many rays per second a graphics card can calculate. It is now like the main metric for the company.
The new RTX line will come in three flavors: RTX 2070, RTX 2080 and RTX 2080 Ti. You can read the main RTX specs in the image below. As you can see, NVIDIA now declares how many rays per second the graphics cards can evaluate, and it also offers a number of RTX-OPS. RTX is the real-time framework of Turing, which calculates the final shading by mixing rasterization, ray tracing, AI and other stuff.
The prices will be $499 for the 2070, $699 for the 2080 and $999 for the 2080 Ti. Some people have complained about the price of the 2070: the low-tier NVIDIA graphics card has usually been cheaper (around $350). NVIDIA considers these graphics cards a complete revolution for the graphics ecosystem and has showcased how all three of them are far more powerful than the GTX 1080 and GTX 1080 Ti.
You can preorder these graphics cards from today, and they will be available from September 20th.
Feedback from the presentation has been pretty mixed. For sure these graphics cards are a revolution, a complete paradigm change. Real-time ray tracing is very welcome and can be important for sectors like architecture and design, for instance. And we all appreciate the addition of the new VirtualLink connector, which lets you connect your headset to the graphics card with just a little USB-C connector.
BUT, there are enormous buts. First of all, these graphical improvements from real-time ray tracing are not always noticeable: most of the examples he showcased were of things that looked good even before the introduction of RTX. Then, only 21 games will support it at launch… all the other games won’t exploit it. Then, of course, developers have to do work to add these ray-tracing features to their games. NVIDIA claims that it works out of the box, but honestly, I don’t think a Unity developer can get all this cool ray tracing for free without any effort in writing shaders and such. Then there is the comparison with the old graphics cards: the whole comparison was made in terms of ray tracing… but how do the old and new graphics cards compare in games that don’t use ray tracing?
Then we have the problem of VR. As Scott Hayden of Road To VR pointed out to me: “Ok, this is great on a single screen at 30 FPS, but what about 90 FPS on dual screen in VR?“. NVIDIA has not answered this and in fact never mentioned VR during the presentation. And this is not good, considering that ray tracing could add a completely new dimension of realism to virtual reality experiences, something that could really increase immersion and so presence. I think that in VR the benefits of ray tracing can be even more noticeable. I really hope that at least the RTX 2080 Ti is capable of that; otherwise, it would be a great pity. Maybe foveated rendering, once it is implemented in headsets, will help achieve this.
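To give a rough idea of why Scott’s question matters, here is a back-of-the-envelope sketch (the headset figures are my own assumptions for a Vive-class panel, 2160×1200 total across both eyes, not numbers NVIDIA gave): rendering for a first-gen VR headset at 90 FPS needs several times the pixel throughput of the 1080p demos shown on stage.

```python
# Rough pixel-throughput comparison: flat-screen demo vs first-gen VR.
# Assumed figures: 1080p at 30 FPS on stage; 2160x1200 (both eyes) at 90 Hz.
flat_rate = 1920 * 1080 * 30   # pixels per second for the stage demo
vr_rate = 2160 * 1200 * 90     # pixels per second for a Vive-class headset

print(vr_rate / flat_rate)     # -> 3.75
```

And that 3.75× is before any supersampling, which VR users apply all the time to fight aliasing.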
So, it has been a great announcement, but I would have hoped for something a little better and more VR-oriented.
After the presentation there was a little party with food and drinks, all offered by NVIDIA, so participating in this event was really worth it! And that’s it for today: now I need some sleep. Subscribe to my newsletter and share my article to support me in these sleepless days… and see you tomorrow for my first day of Gamescom!
Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.
Well, I have to admit I pulled the trigger about 2 minutes after Nvidia’s live stream from Gamescom ended and pre-ordered an RTX 2080 Ti Founders Edition direct from Nvidia, which is due towards the end of September.
It wasn’t inexpensive at £1099, but I’m already seeing pricing for AIB cards from Asus, Zotac, etc. and these are ranging from £1149 to £1350!!
After quickly hitting the limits of my GTX 1070 when supersampling my Vive, I look at this new GPU as a “brute force” solution for current-generation headsets until we get better optimisation in VR applications.
If you look into the technical specifications for the new RTX cards, the sheer amount of compute power, memory speed and memory bandwidth is very impressive, regardless of ray tracing / AI function.
Thanks for the feedback on the specifications… yesterday I was super tired and had not looked at them. Great to know that it is a great leap forward.
What puzzles me is that they almost didn’t care about them and didn’t care about VR during the presentation. But I think they wanted to highlight the great importance of realtime raytracing and also the reason for the name change from GTX to RTX…
I was not surprised by the lack of information about VR, their move to RTX is a paradigm shift in computer graphics, and since VR is still a tiny niche, they put their marketing muscle behind the RTX aspect as this will impact the wider games and graphics industry in a much larger way?
There seems to be a lot of negativity about the RTX launch on the usual internet sites and forums, but I have no doubt about the differences in sheer power for brute forcing poorly optimised VR applications. Until we see good optimisation practice (which you do see in some VR apps), it’s a game of throwing as much horsepower as possible at the task!
A quick comparison between the 2080 Ti and the 1080 Ti:
- CUDA cores: 4352 vs 3584
- Clock speed: 1635 MHz vs 1582 MHz
- Memory speed: 14 Gbps vs 11 Gbps
- Memory bandwidth: 616 GB/s vs 484 GB/s
Then of course you have Tensor cores on the RTX, which were previously only on the Tesla and Titan V, and that brings another aspect: mixed-precision fused multiply-add (FMA) operations.
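As a side note, the two bandwidth figures above follow directly from the memory speed once you factor in the bus width (the 352-bit bus for both cards comes from their public spec sheets, not from my list). A quick sketch:

```python
def bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    """Peak memory bandwidth = per-pin data rate x bus width, in bytes/s."""
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gb_s(14, 352))  # 2080 Ti -> 616.0 GB/s
print(bandwidth_gb_s(11, 352))  # 1080 Ti -> 484.0 GB/s
```

So the whole 27% bandwidth jump comes from faster GDDR6 memory, since the bus width is unchanged between the two cards.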
I’ve done some more digging around and found this, which explains the generational leap and the huge cost difference between the 1080 (Pascal) and 2080 (Turing). You can see the additional hardware Turing has and the massive difference in compute power, especially looking at those Tensor cores.
https://uploads.disquscdn.com/images/2047661376b6b1f97e33ba253484dc2a3c68ef1392bec9db6e144b0b0d84e2af.jpg
Well, thanks for the valuable comparison. Regarding the leap, they made it very clear during the presentation: Turing does a lot of things together, while Pascal does almost only shading. So it is a huge step forward.
And yes, VR is a niche. Our little lovely niche 🙂
This is like the Apple Watch version 1, or ARCore v1.0. So probably not amazing for VR yet, but very promising for the future. I’d love to get my hands on one of these once Unity supports it without bugs / low-level coding required 🙂
Might be great for real-time AR, where we just want a little area of the screen to look realistic?
We’d love to get real-time shadows per frame (we rasterise a hack at the moment on mobile)
I’d love to hear about Unity support. Today I got a contact at Unity; I hope to know more about this soon.
Regarding AR, I think these calculations could be useful there, too… but in AR we have another problem that we don’t have in VR, and that is light estimation. Virtual elements can be ultra-realistic, but if their lighting doesn’t match that of the real world, nothing looks right…
Yeah, we won’t get ‘proper’ light estimation until we have an HDR 360° camera on every smartphone, and I can’t see that happening any time soon. I guess RTX isn’t coming to mobile any time soon anyway!!!
For desktop VR, maybe we just need to wait for next-gen HMDs with eye tracking, and then we’ll get real-time rays.
It’s super cool tech to watch for now but I’m not going to bet our next production milestone on it!
Yes, NVIDIA CEO mentioned you on stage and said that the new RTX is great but can’t compete with you at all. He almost decided to close NVIDIA because of that. It was a sad moment.
I’ll try to check at the NVIDIA booth if there are some VR demos; I’m curious as well. I’ll also try to ask the developer via e-mail…
YAY! Now I’ll try to kill AMD and I’ll call this year closed haha
Great, let us know of any other news on this!