Test Bed and Setup

As per our processor testing policy, we take a premium motherboard suitable for the socket and equip the system with a suitable amount of memory running at the manufacturer's maximum supported frequency. This memory is also typically run at JEDEC subtimings where possible. We note that some users are not keen on this policy, stating that the maximum supported frequency is sometimes quite low, that faster memory is often available at a similar price, or that JEDEC timings can hold back performance. While these comments make sense, ultimately very few users apply memory profiles (XMP or otherwise), as they require interaction with the BIOS; most users fall back on JEDEC-supported speeds. This includes home users as well as industry customers who want to shave a cent or two off the cost or stay within the margins set by the manufacturer. Where possible, we will extend our testing to include faster memory modules, either at the same time as the review or at a later date.

Test Setup
CPU AMD Ryzen 5 3600 (Ryzen 3000 series)
Motherboard GIGABYTE X570 I Aorus Pro (1.12e)
CPU Cooler AMD Wraith
DRAM G.Skill FlareX 2x8 GB DDR4-3200 C14
GPU Sapphire RX 460 2GB (CPU Tests)
MSI GTX 1080 Gaming 8G (Gaming Tests)
PSU Corsair AX860i
SSD Crucial MX500 2TB
OS Windows 10 1909

Many thanks to...

We must thank the following companies for kindly providing hardware for our multiple test beds. Some of this hardware is not in this test bed specifically, but is used in other testing.

Hardware Providers
  • Sapphire RX 460 Nitro
  • MSI GTX 1080 Gaming X OC
  • Crucial MX200 + MX500 SSDs
  • Corsair AX860i + AX1200i PSUs
  • G.Skill RipjawsV, SniperX, FlareX
  • Crucial Ballistix DDR4
  • Silverstone Coolers
  • Silverstone Fans


Scale Up vs Scale Out: Benefits of Automation

One comment we get every now and again is that automation isn’t the best way of testing – there’s a higher barrier to entry, and it limits the tests that can be done. From our perspective, despite taking a little while to program properly (and get it right), automation means we can do several things:

  1. It guarantees consistent breaks between tests for cooldown to occur, rather than variable cooldown times based on 'if I'm looking at the screen'
  2. It allows us to test several systems simultaneously. I currently run five systems in my office (limited by the number of 4K monitors, and space), which means we can process more hardware at the same time
  3. We can leave tests to run overnight, which is very useful when on a deadline
  4. With a good enough script, new tests can be added very easily

Our benchmark suite collates all the results and spits out data as the tests are running to a central storage platform, which I can probe mid-run to update data as it comes through. This also acts as a mental check in case any of the data might be abnormal.
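As an illustration, the core of such a harness can be quite small. This is a minimal sketch rather than our actual suite; the test registry, cooldown length, and results file are all invented for the example:

```python
import json
import time
from pathlib import Path

# Hypothetical test registry: name -> callable returning a result dict.
# With this structure, adding a new test is just adding one entry.
TESTS = {
    "encode": lambda: {"fps": 45.2},      # placeholder workloads
    "compress": lambda: {"mips": 61250},
}

COOLDOWN_SECONDS = 0             # would be e.g. 120 on real hardware
RESULTS = Path("results.jsonl")  # stand-in for the central storage platform

def run_suite():
    """Run every registered test with an identical cooldown between them,
    appending each result as soon as it finishes so the shared store can
    be probed mid-run for partial data."""
    for name, workload in TESTS.items():
        record = {"test": name, **workload(), "finished": time.time()}
        with RESULTS.open("a") as fh:
            fh.write(json.dumps(record) + "\n")  # stream out immediately
        time.sleep(COOLDOWN_SECONDS)             # consistent break, every time

run_suite()
```

Because each result is appended the moment a test finishes, a second process can read the file mid-run, which is the mental-check behaviour described above.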

We do have one major limitation, and that rests on the side of our gaming tests. We are running multiple tests through one Steam account, some of which (like GTA) are online only. As Steam only lets one system play on an account at once, our gaming script probes Steam’s own APIs to determine if we are ‘online’ or not, and to run offline tests until the account is free to be logged in on that system. Depending on the number of games we test that absolutely require online mode, it can be a bit of a bottleneck.
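The gating logic can be sketched as follows. The game names and lists are made up for illustration, and `account_free` stands in for the real check, which would query Steam's web API rather than take a boolean:

```python
# Illustrative game lists -- not the actual test suite.
ONLINE_ONLY = {"gta_v"}            # needs the Steam account logged in here
OFFLINE_OK = {"civ6", "f1_2019"}   # can run while another system is online

def pick_next_test(queue, account_free):
    """Return the next runnable game from the queue. Offline-capable titles
    run at any time; an online-only title is skipped while another system
    holds the account (account_free stands in for the real Steam API probe)."""
    for game in queue:
        if game in OFFLINE_OK or account_free:
            return game
    return None  # everything left needs the account; wait and poll again

print(pick_next_test(["gta_v", "civ6"], account_free=False))  # → civ6
```

A scheduler calling this in a loop keeps each system busy with offline titles until the shared account frees up, which is why a long list of online-only games becomes the bottleneck.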

Benchmark Suite Updates

As always, we do take requests. It helps us understand the workloads that everyone is running and plan accordingly.

A side note on software packages: we have had requests for tests on software such as ANSYS, or other professional-grade software. The downside of testing this software is licensing and scale. Most of these companies do not particularly care about us running tests, and state it's not part of their goals. Others, like Agisoft, are more than willing to help. If you are involved in these software packages, the best way to see us benchmark them is to reach out. We have special versions of software for some of our tests, and if we can get something that works and is relevant to the audience, then we shouldn't have too much difficulty adding it to the suite.


114 Comments

  • Sonik7x - Monday, May 18, 2020 - link

    Would be nice to see 1440p benchmarks across all games, also would be nice to see a comparison against an i7-5930K, which is also a 6c/12T CPU
  • ET - Monday, May 18, 2020 - link

    Would be even nicer to see newer games. Anandtech reviews seem to be stuck in 2018, both for games and for apps, and that makes them a lot less relevant a read than they could be.
  • Dolda2000 - Monday, May 18, 2020 - link

    You exaggerate. The point of a benchmark suite can't really be to contain the specific workload you're going to put on the CPU (since that's extremely unlikely to be the case anyway), but to be representative of typical workloads, and I think the application selection here is quite adequate for that. In comparison, I find it much more important to have more comparison data in the Bench database. There may be a stronger case to be made for games, but I find even that slightly doubtful.
  • MASSAMKULABOX - Saturday, May 23, 2020 - link

    Not only that, but slightly older games are much more stable and have had most of the performance issues ironed out. New games are getting patches and downloads all the time, which often affect performance. I also want to see "E" versions, i.e. 35/45 W
  • ThreeDee912 - Monday, May 18, 2020 - link

    They already mentioned in the 3300X review they'll be going back and adding in new games like Borderlands 3 and Gears Tactics: https://www.anandtech.com/show/15774/the-amd-ryzen...
  • flyingpants265 - Monday, May 18, 2020 - link

    I haven't used AnandTech benchmarks for years. They don't use enough CPUs/GPUs, they never include enough results from the previous generations, which is the most important thing when considering upgrades and $ value.

    Also, the "bench" tool does not include enough tests or hardware.
  • jabber - Tuesday, May 19, 2020 - link

    Yeah, nothing annoys me more than Tech Site benchmarks that only compare the new stuff to hardware that came out 6 months before it. If I see, say, a new GPU tested, I want to see how it compares to my RX480 (which a lot of us will be looking to upgrade this year) rather than just a 5700XT.
  • johnthacker - Tuesday, May 19, 2020 - link

    Eh, nothing annoys me more than Tech Site benchmarks that only compare the new stuff to other new stuff. If I have an existing GPU or CPU and I'm not sure if it's worth it for me to upgrade or stick with what I've got, I want to see how something new compares to my existing hardware so I can know whether it's worth upgrading or whether I might as well wait.
  • Pewzor - Monday, May 18, 2020 - link

    I mean Gamer's Nexus uses old games as well.
  • Crazyeyeskillah - Tuesday, May 19, 2020 - link

    Just to make this crystal clear, the reason they HAVE to use older games is because all of the PAST data has been run using those games. Most review sites only get sample hardware for a week or less to run the tests then return it in the mail. You literally wouldn't have anything to compare the data to if you only ran tests on the latest and greatest games and benchmarks.

    When I see people making this complaint I understand that they are new to computers, and just want them to understand that there is a reason why benchmarks are limited. Most hardware review sites don't make any money, or if they do, it's enough to pay one or MAYBE two staff members (poorly). Ad revenue is garbage due to ad blockers on all your browsers, and legitimate sites that don't spam clickbaity rumors as news are shutting down. Just look what happened to Hardocp.com, one of the last true honest review sites.

    The idea that hardware sites all have stockpiles of every system imaginable and the thousands of hours it would take to constantly setup and run all the new games and benchmarks is pretty comical.
