Concluding their Gamescom festivities for the newly introduced GeForce RTX 20-series, NVIDIA has revealed a bit more about the hardware, its features, and its expected performance this evening: new Ansel RTX features in GeForce Experience, as well as some game performance metrics pitting the GeForce RTX 2080 against the GeForce GTX 1080. After recent hands-on demos featuring real-time raytracing, NVIDIA is now offering numbers for out-of-the-box and Deep Learning Super Sampling (DLSS) performance in traditionally rendered games.

NVIDIA RTX Support for Games
As of August 20, 2018
Game                            Real-Time Raytracing    Deep Learning Super Sampling (DLSS)
Ark: Survival Evolved           -                       Yes
Assetto Corsa Competizione      Yes                     -
Atomic Heart                    Yes                     Yes
Battlefield V                   Yes                     -
Control                         Yes                     -
Dauntless                       -                       Yes
Enlisted                        Yes                     -
Final Fantasy XV                -                       Yes
Fractured Lands                 -                       Yes
Hitman 2                        -                       Yes
Islands of Nyne                 -                       Yes
Justice                         Yes                     Yes
JX3                             Yes                     Yes
MechWarrior 5: Mercenaries      Yes                     Yes
Metro Exodus                    Yes                     -
PlayerUnknown's Battlegrounds   -                       Yes
ProjectDH                       Yes                     -
Remnant: From the Ashes         -                       Yes
Serious Sam 4: Planet Badass    -                       Yes
Shadow of the Tomb Raider       Yes                     -
The Forge Arena                 -                       Yes
We Happy Few                    -                       Yes

Starting with NVIDIA’s DLSS (and real-time raytracing, for that matter), the list of supported games is already known; what NVIDIA is disclosing today are some face-value 4K performance comparisons. For now, all we can say about DLSS is that it uses tensor core-accelerated neural network inferencing to generate what NVIDIA says will be high-quality, super sampling-like anti-aliasing. For further technical background, this is a project NVIDIA has been working on for a while, and the company has published blogs and papers covering some of the processes involved. At any rate, the provided metrics are sparse on settings and other details, and notably the measurements include several games rendered in HDR (though HDR shouldn't have a performance impact).
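To make the concept a little more concrete, here is a minimal sketch of what neural super sampling looks like in the abstract: the game renders a frame (aliased, and possibly below the display resolution), and a trained network infers a cleaner frame at the output resolution. This is strictly an illustration of the general technique; the tiny untrained PyTorch network, tensor shapes, and names below are placeholders of ours, not NVIDIA's DLSS model, training data, or API.

```python
# Conceptual sketch of neural super sampling (not NVIDIA's DLSS implementation).
# Assumption: a trained model maps an aliased, lower-resolution frame to a
# clean frame at the display resolution. The tiny network below is an
# untrained stand-in so the example runs end to end.
import torch
import torch.nn as nn

class ToyUpsampler(nn.Module):
    def __init__(self, scale: int = 2):
        super().__init__()
        self.scale = scale
        # One conv layer standing in for a real super-sampling network.
        self.refine = nn.Conv2d(3, 3, kernel_size=3, padding=1)

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        # Cheap spatial upscale followed by a learned "refinement" pass.
        up = nn.functional.interpolate(
            frame, scale_factor=self.scale, mode="bilinear", align_corners=False
        )
        return self.refine(up)

# A 1920x1080 frame rendered without AA, upscaled toward a 4K output.
low_res_frame = torch.rand(1, 3, 1080, 1920)   # N, C, H, W in [0, 1]
model = ToyUpsampler(scale=2)
with torch.no_grad():                          # inference only
    output = model(low_res_frame)
print(output.shape)                            # torch.Size([1, 3, 2160, 3840])
```

If, as with most super-resolution approaches, the frame is shaded below the output resolution, the framerate gain comes from shading fewer pixels and paying only the (tensor core-accelerated) inference cost on top.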

Otherwise, NVIDIA presented a non-interactive Epic Infiltrator 4K demo, later shown on the show floor, comparing Temporal Anti-Aliasing (TAA) to DLSS; on average, DLSS provided near-identical or better image quality at a lower performance cost, which translated directly into higher framerates. To be perfectly honest, I spent the entire floor time talking with NVIDIA engineers and driver/software developers, so I have no pictures of the floor demo (not that anything less than a direct screenshot would really do it justice). Ultimately, the matter of DLSS is somewhat nuanced, and there isn’t much we can add at the moment.

Overall, the idea is that even in traditionally rasterized games without DLSS, the GeForce RTX 2080 brings around 50% higher performance than the GeForce GTX 1080 under 4K HDR 60Hz conditions. Because this excludes real-time raytracing and DLSS, it amounts to ‘out of the box’ performance. That said, no graphics settings or driver details accompanied the disclosed framerates, so I'm not sure I'd suggest reading into these numbers and bar charts one way or another.

Lastly, NVIDIA announced several new features, filters, and supported games for GeForce Experience’s Ansel screenshot feature. Relating to GeForce RTX, one of the features is Ansel RT for supported ray-traced games, where a screenshot can be taken with a very high number of rays, unsuitable for real-time but not an issue for static image rendering.
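The "more rays" point is essentially about Monte Carlo sample accumulation: a still screenshot has no frame-time budget, so many more rays per pixel can be averaged, and the noise shrinks roughly with the square root of the sample count. The snippet below is a generic, self-contained illustration of that relationship; it is not Ansel SDK code, and the noise model is made up purely for the demonstration.

```python
# Generic illustration of why more rays help a static screenshot:
# averaging N noisy per-ray samples shrinks the error roughly as 1/sqrt(N).
# This is not Ansel SDK code; the "noisy_sample" model is invented for the demo.
import numpy as np

rng = np.random.default_rng(0)
true_radiance = 0.5                     # value a fully converged render would reach

def noisy_sample(n_rays: int) -> float:
    # Each ray returns the true radiance plus zero-mean noise; we average them.
    samples = true_radiance + rng.normal(0.0, 0.2, size=n_rays)
    return samples.mean()

for n_rays in (1, 16, 256, 4096):       # real-time budget vs. screenshot budget
    estimates = np.array([noisy_sample(n_rays) for _ in range(1000)])
    rms = np.sqrt(np.mean((estimates - true_radiance) ** 2))
    print(f"{n_rays:5d} rays/pixel -> RMS error {rms:.4f}")
```

Going from a handful of rays per pixel in real time to thousands for a one-off capture is what makes offline-quality output practical for a screenshot feature.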

Ansel RTX also leverages a concept similar to the tensor core-accelerated DLSS with its ‘AI Up-Res’ super resolution, which also works for games that have not integrated the Ansel SDK.

In terms of GeForce RTX performance, this is more or less a teaser of things to come. But as always with unreleased hardware, judgement should be reserved until objective measurements and further details are available. We will have much more to say when the time comes.

Comments

  • eva02langley - Thursday, August 23, 2018 - link

    Edit:

    It is not AMD's fault that Nvidia is overcharging for "mainstream" GPUs; it is *****NVIDIA's***** own greed and that of their investors. Stop blaming AMD for Nvidia's behavior.
  • Manch - Thursday, August 23, 2018 - link

    Well milk~ isn't wrong. The fact that AMD doesn't have anything to compete with is one of the reasons NVIDIA can get away with this. With no competition to undercut them, they don't have to drop prices. Intel did the same for years with no competition. Now look: we're getting 6-core and 8-core CPUs at great prices. Competition is always a win for the consumer.
  • TrackSmart - Wednesday, August 22, 2018 - link

    HOLD THE PRESSES: The launch price of the 2080 is the same as the launch price of the 1080 Ti! NVidia is comparing their new product to a lower-priced part. The 1080 Ti is about 30% faster than a vanilla 1080, so push all of those bars up to 1.3 instead of 1.0, for a more realistic comparison.

    In an apples-to-apples comparison, versus the 1080 Ti, we are looking at a ~10% to 25% performance uplift compared to a similarly priced card from the previous generation. Note that I'm ignoring DLSS speed gains, which are only relevant if anti-aliasing of some kind is used in the comparison -- something not really needed when running at 4k resolution.

    I may be wrong, but I call this an unfair comparison.
  • shabby - Wednesday, August 22, 2018 - link

    Don't worry, when reviews come out we will get all the answers. With the 1080ti hitting $529 the 2080 will seem overpriced and should drop rather quickly.
  • bloodyduster - Wednesday, August 22, 2018 - link

    The chart is not using price as a basis of comparison, but rather the class of GPU. It is comparing this year's 2080 to last year's 1080. They are comparing the performance gain from last year's model, which seems like a fair comparison to me
  • Alistair - Wednesday, August 22, 2018 - link

    It's usually fair, because the price has usually been similar or not that different. Not this time.
  • TrackSmart - Wednesday, August 22, 2018 - link

    This! The question is how much more performance-per-dollar Nvidia is providing with this new set of cards compared to the previous generation. The name they have applied to the new cards merely reflects marketing. Unless reviewers call BS, Nvidia gets away with showing "50%" more performance by applying a favorable new naming convention!

    Think about what comparing a $700 launch price 1080 Ti to a $1000 launch price 2080 Ti really means when looking at the performance per dollar that Nvidia are providing. The same applies to this "2080 vs 1080" comparison.
  • eva02langley - Thursday, August 23, 2018 - link

    Let's be realistic here: no cards are at MSRP, they are all around the Founders Edition price bracket at +-$50... and that is in the US; elsewhere in the world, we are talking crazy prices. Europe gets screwed big time on a 2080 Ti.

    1400-1500 euros = $1600-1700 US
  • PeachNCream - Thursday, August 23, 2018 - link

    Reviewers will not call BS outright because they won't get pre-NDA-lift hardware to play with if they besmirch the good name of the OEM by being rightfully critical. Benchmark graphs may show the reality, but the text around them will likely be very forgiving or carefully phrased to moderate the tone. It's the world we live in now: journalists can't realistically be asked to purchase GPUs out of the business expense account, and they wouldn't get the hardware in advance of an NDA lift even if they did, which would ultimately endanger the reviewer in question's job security.
  • Icehawk - Thursday, August 23, 2018 - link

    Read HardOCP then, they have been on NV's shitlist for a while and buy their own cards - they said NV is actually going to give them RTXs this time around but I doubt they are going to sugar coat anything.

    I agree with the price:perf thing, the models don't line up at all anymore so I don't give a flying F what they call it - how much does it cost and what improvement will I get? I was hoping to move from my 970 as it's pretty maxed running 2-4k games and I recently went from 2k>4k on my main screen so I'd like a bit more performance. I imagine some of the pricing woes are due to the insane memory pricing and large amount of it on these cards and not just NV sticking it to us because AMD is pulling another Radeon.
