Gaming Performance (Discrete GPU)

For our gaming tests, we are using our AMD Ryzen 9 5950X paired with an NVIDIA RTX 2080 Ti graphics card. Our standard test suite consists of 12 titles, tested at four configurations:

  • Stage 1: Actual Gaming (1080p Maximum Quality, or equivalent)
  • Stage 2: All About Pixels (‘4K Minimum’ Quality)
  • Stage 3: Medium Low (‘1440p Minimum’)
  • Stage 4: Lowest Lows (720p Minimum or lower)

The final three configurations form a set of CPU-limited gaming tests, and help find the point at which we move from being CPU limited to GPU limited. Some users baulk at this testing, finding it irrelevant, however these configurations have been widely requested over the years. The counterpoint is the first configuration, at 1080p Maximum Quality: this was requested because 1080p is the most popular gaming resolution, and Maximum Quality because this graphics card should be able to handle almost everything at that resolution at very playable frame rates.

All the details for our gaming tests can be found in our #CPUOverload article.

Stage 1: Actual Gaming
AMD Ryzen 9 5950X, SMT On vs SMT Off
Game                Settings        Average FPS    95th Percentile
Chernobylite        1080p Max       100%           -
Civilization 6      1080p Max       103%           -
Deus Ex: MD         1080p Max       99%            100%
Final Fantasy 14    1080p Max       102%           -
Final Fantasy 15    8K Standard     100%           99%
World of Tanks      1080p Max       100%           102%
World of Tanks      4K Max          103%           102%
Borderlands 3       1080p Max       101%           103%
F1 2019             1080p Ultra     103%           106%
Far Cry 5           1080p Ultra     104%           104%
GTA V               1080p Max       99%            100%
RDR 2               1080p Max       100%           100%
Strange Brigade     1080p Ultra     101%           101%

In real-world gaming situations, there’s very little to pick between having SMT enabled or disabled. Almost universally it is either a wash or a smidgen better to have it enabled, with F1 2019, Civilization 6, and Far Cry 5 seemingly the biggest beneficiaries. I’ve also added in a second World of Tanks result at 4K, just because that benchmark doesn’t really have a proper settings menu.

Stage 2: All About Pixels
AMD Ryzen 9 5950X, SMT On vs SMT Off
Game                Settings        Average FPS    95th Percentile
Chernobylite        4K Low          99%            -
Civilization 6      4K Min          105%           -
Deus Ex: MD         4K Min          98%            100%
Final Fantasy 14    4K Min          102%           -
Final Fantasy 15    4K Standard     100%           100%
Borderlands 3       4K Very Low     101%           104%
F1 2019             4K Ultra Low    100%           100%
Far Cry 5           4K Low          101%           100%
GTA V               4K Low          100%           101%
RDR 2               8K Min          100%           100%
Strange Brigade     4K Low          100%           100%

At our high-resolution, minimal-quality settings, the only outlier is Civilization 6 on the average frame rates, which are a bit higher with SMT enabled.

Stage 3: Medium Low
AMD Ryzen 9 5950X, SMT On vs SMT Off
Game                Settings           Average FPS    95th Percentile
Chernobylite        1440p Low          100%           -
Civilization 6      1440p Min          105%           -
Deus Ex: MD         1440p Min          97%            96%
Final Fantasy 14    1440p Min          102%           -
Final Fantasy 15    1080p Standard     101%           105%
World of Tanks      1080p Standard     101%           101%
Borderlands 3       1440p Very Low     103%           105%
F1 2019             1440p Ultra Low    99%            99%
Far Cry 5           1440p Low          99%            99%
GTA V               1440p Low          100%           99%
RDR 2               1440p Low          100%           100%
Strange Brigade     1440p Low          100%           100%

At these middling settings we start to see a little more variation: Borderlands 3 gains a few percent from SMT, while Deus Ex: MD begins to drop off a little with SMT enabled.

Stage 4: Lowest Lows
AMD Ryzen 9 5950X, SMT On vs SMT Off
Game                Settings          Average FPS    95th Percentile
Chernobylite        360p Low          106%           -
Civilization 6      480p Min          102%           -
Deus Ex: MD         600p Min          91%            91%
Final Fantasy 14    768p Min          102%           -
Final Fantasy 15    720p Standard     99%            102%
World of Tanks      768p Min          101%           100%
Borderlands 3       360p Very Low     108%           110%
F1 2019             768p Ultra Low    102%           105%
Far Cry 5           720p Low          100%           101%
GTA V               720p Low          99%            98%
RDR 2               384p Low          100%           103%
Strange Brigade     720p Low          95%            95%

This is perhaps our most varied set of results, with Deus Ex: MD showing an almost 10% drop with SMT enabled. Deus Ex: MD is usually considered a CPU-heavy title, but so is Chernobylite, which sees a 6% gain, while Borderlands 3, a more modern game, is 8-10% faster with SMT enabled. That said, I doubt anyone is actually playing at these resolutions.

Overall Gaming Performance

If we take full averages of all the data points, then we are seeing a rough +1% gain in performance from SMT across the board, even in the more complex scenarios.

Resolution Average Comparison
AMD Ryzen 9 5950X, SMT On vs SMT Off
Stage      Settings       aka                  Average FPS    95th Percentile
Stage 1    1080p Max      Actual Gaming        101%           101%
Stage 2    4K+ Min        All About Pixels     101%           101%
Stage 3    1440p Min      Medium Lows          101%           101%
Stage 4    < 768p Min     Lowest Lows          100%           101%

In reality, any loss or gain is highly dependent on the title in question, and can swing from one side of the line to the other. It’s clear that Deus Ex: MD prefers SMT off, while F1 2019 and Borderlands 3 prefer SMT on, but we are talking about fine margins here.
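
As a footnote on how these numbers are put together: each percentage in the tables is simply the SMT On result for a title expressed as a percentage of the SMT Off result, and the stage figures average those per-title percentages. The short Python sketch below illustrates that bookkeeping; the FPS values in it are invented for illustration, and it assumes a plain arithmetic mean, since the exact aggregation method isn’t spelled out here.

    # Illustrative only: these FPS values are made up, not our benchmark data.
    smt_off = {"Chernobylite": 78.0, "Civilization 6": 112.0, "F1 2019": 155.0}
    smt_on  = {"Chernobylite": 78.2, "Civilization 6": 115.5, "F1 2019": 159.6}

    # SMT On as a percentage of SMT Off; >100% means SMT On was faster.
    ratios = {game: 100.0 * smt_on[game] / smt_off[game] for game in smt_off}

    # Assumed aggregation: a simple arithmetic mean of the per-title percentages.
    stage_average = sum(ratios.values()) / len(ratios)

    for game, pct in ratios.items():
        print(f"{game:16s} {pct:5.1f}%")
    print(f"{'Stage average':16s} {stage_average:5.1f}%")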

Comments

  • MrSpadge - Thursday, December 3, 2020 - link

    > We’ve known for many years that having two threads per core is not the same as having two cores

    True, and I still read this as an argument against SMT in forums. IMO it should be pointed out clearly that the cost of implementing either also differs drastically: +100% core size for another core and ~5% for SMT.
  • WaltC - Thursday, December 3, 2020 - link

    Intel began its HT journey in order to pull more efficiency from each core--basically, as performance was being left on the table. Interestingly enough, after Athlon and A64, AMD roundly criticized Intel because the SMT thread was not done by a "real core"...and then proceeded to drop cores with two integer units--which AMD then labeled as "cores"...;) Intel's HT approach proved superior, obviously. IIRC. It's been awhile so the memories are vague...;)

    The only problem with this article is that it tries to make calls about SMT hardware design without really looking hard at the software, and the case for SMT is a case for SMT software. Games will not use more than 4-8 threads simultaneously so of course there is little difference between SMT on and off when running most games on a 5950. You would likely see near the same results on a 5600 in terms of gaming. SMT on or off when running these games leaves most of the CPU's resources untouched.

    Programs designed and written to utilize a lot of threads, however, show a robust, healthy scaling with SMT on versus no SMT. So--without a doubt--SMT CPU design is superior to no SMT from the standpoint of the hardware's performance. The outlier is the software--not the hardware. And of course the hardware should never, ever be judged strictly by the software one arbitrarily decides to run on it. We learn a lot more about the limits of the software tested here than we learn about SMT--which is a solid performance design in CPU hardware.
  • WarlockOfOz - Friday, December 4, 2020 - link

    Very valid point about how games won't see a difference between 16 and 32 threads when they only use 6. Do you know if this type of analysis has been done at the lower end of the market?
  • WaltC - Friday, December 4, 2020 - link

    It's been common knowledge, established a few years ago when AMD started pushing 8-core (and greater) CPUs, that games don't require that many cores and that 6 cores is optimal for gaming right now. And if you occasionally do more than game and need more than 6 threads, then SMT is there for you. As the new consoles are 8-core CPU designs, over time the number of cores required for optimal game performance will increase.
  • Flying Aardvark - Friday, December 4, 2020 - link

    Consoles are 8-core now, with 2 reserved for the OS. Count on 6-cores being optimal for gaming for quite some time.
  • Kangal - Friday, December 4, 2020 - link

    Those Jaguar cores were more like a 4c/8t processor, to be fair. And they weren't that much better than Intel's Atom cores, a far cry from Intel's Core-i SkyLake architecture. And current gen consoles were very light on the OS, so maybe using 1 full core (or 2 shared threads), leaving only 3 cores for games, but much better than the 2-core optimised games from the PS3/360 era.

    The new gen consoles will be somewhat similar, using only 1-full core (2-threads) reserved for the OS. But this time we have an architecture that's on-par with Intel's Core-i SkyLake, with a modern full 8-core processor (SMT/HT optional). This time leaving a healthy 7-cores that's dedicated to games. Optimisations should come sooner than later, and we'll see the effects on PC ports by 2022. So we should see a widening gap between 4vs6-core, and to a lesser extent 6vs8-core in the future. I wouldn't future-proof my rig by going for a 5700x instead of a 5600x, I would do that for the next round (ie 2022 Zen4).
  • AntonErtl - Sunday, December 6, 2020 - link

    The 8 Jaguar cores are in no way like 4c/8t CPUs; if you use only half of them, you get half the performance (unless your application is memory/L2-bandwidth-limited). Their predecessor Bobcat is about twice as fast as a Bonnell core (Atom proper), and a little slower than Silvermont (the core that replaced Bonnell), about half as fast as Goldmont+ (all at the clock rates at which they were available in fanless mini-ITX boards), one third as fast as a 3.5GHz Excavator core, and one sixth as fast as a 4.2GHz Skylake.
  • Oxford Guy - Sunday, December 6, 2020 - link

    Worse IPC than Bulldozer as far as I know. Certainly worse than Piledriver.

    Really sad. The "consoles" should have used something better than Jaguar. It's bad enough that the "consoles" are a parasitic drain on PC gaming in the first place. It's worse when they not only drain life with their superfluous walled gardens but also by foisting such a low-grade CPU onto the art.
  • Kangal - Thursday, December 24, 2020 - link

    The Jaguar cores share a lot of DNA with Bulldozer, but they aren't the same. It's like Intel's Atom chips compared to Intel Core-i chips. With that said, 2015 Puma+ was a slight improvement over 2013 Jaguar, which was a modest improvement over the initial 2011 Bobcat lineup. All this started in 2006 with AMD choosing to evolve their earlier Phenom2 cores, which are derivatives of the AMD Athlon-64.

    So just by their history, we can see they're inline with Intel's Atom architecture evolution, and basically a direct competitor. Where Intel had slightly less performance, but had much lower power-draw... making them the obvious winner. Leaving AMD to fill in the budget segments of the market.

    As for the core arrangement, they don't have full proper cores as people expect them. Like the Bulldozer architecture, each core had to share resources like the decoder and floating-point unit. So in many instances, one core would have to wait for the other core. This boosts multithreaded performance with simple calculations in orderly patterns. However, with more complex calculations and erratic/dynamic patterns (ie regular PC use), it causes a hit to single-thread performance and notable hiccups. So my statement was true. This is more like a 4c/8t chip, and it is less like a Core-i and much more like an Atom. But don't take my word for it, take Dr Ian Cutress. He said the same thing during the deep dive into the Jaguar microarchitecture, and recently in the Chuwi Aerobox (Xbox One S) article.
    https://www.anandtech.com/show/16336/installing-wi...

    Now, there have been huge benefits to the Gaming PC industry, and game ports, due to the PS4/XB1. The first being the x86-64bit direct compatibility. Second was the cross-compatibility thanks to Vulkan and DirectX (more so with PS4 Pro and XB1X). The third being that it forced game developers to innovate their game engines, so that they're less narrow and more multi-threaded. With PS5/XseX we now see a second huge push with this philosophy, and the improvements of fast single-thread performance and fast-flash storage access. So I think while we have legitimate reasons to groan about the architecture (especially in the PS4) upon release, we do have to recognize the conveniences that they also brought (especially in the XB1X). This is just to show that my stance wasn't about console bashing.
  • at_clucks - Monday, December 7, 2020 - link

    @Kangal, Jaguar APUs in consoles are definitely not "like a 4c/8t processor" because they don't use CMT. They are full 8 cores. Their IPC may be comparable with some newer Atoms although it's hard to benchmark how the later "Evolved Jaguar" cores in the mid generation console refresh compares against the regular Jaguar or Atom.
