The AMD Ryzen 7 5800X3D Review: 96 MB of L3 3D V-Cache Designed For Gamers
by Gavin Bonshor on June 30, 2022 8:00 AM EST - Posted in
- CPUs
- AMD
- DDR4
- AM4
- Ryzen
- V-Cache
- Ryzen 7 5800X3D
- Zen3
- 3D V-Cache
Gaming Performance: 1080p
All of our game testing results, including other resolutions, can be found in our benchmark database: www.anandtech.com/bench. All gaming tests were with an RTX 2080 Ti.
For our gaming tests in this review, we re-benched the Ryzen 7 5800X processor to compare it directly against the newer Ryzen 7 5800X3D on Windows 11. All previous Ryzen 5000 processors were tested on Windows 10, while all of our Intel Alder Lake (12th Gen Core Series) testing was done on Windows 11.
We are using DDR4 memory at the following settings:
- DDR4-3200
Civilization VI
Final Fantasy 14
Final Fantasy 15
World of Tanks
Borderlands 3
Far Cry 5
Gears Tactics
Grand Theft Auto V
Red Dead Redemption 2
Strange Brigade (DirectX 12)
Strange Brigade (Vulkan)
Focusing on our test suite at 1080p, the AMD Ryzen 7 5800X3D again performs well against both the other Ryzen 5000 processors and Intel's Alder Lake chips. Intel's 12th Gen Core, with its higher IPC and faster core frequencies, pulls ahead in some games, but only in titles where the extra L3 cache has no effect on performance.
In titles that favor V-Cache, the performance differences are conclusive: wherever the extra L3 cache can be utilized, the 5800X3D and its 96 MB of 3D V-Cache sit comfortably above the competition.
125 Comments
Qasar - Thursday, June 30, 2022 - link
Makaveli, he won't; according to only him, the M1 is the best thing since sliced bread.
GeoffreyA - Thursday, June 30, 2022 - link
Lor', the Apple Brigade is already out in full force.
at_clucks - Saturday, July 2, 2022 - link
Look, if we're being honest, the M line punches above its weight, so to speak, and yes, it does manage to embarrass traditional (x86) rivals on more than one occasion. This being said, I see no reason to review it here and compare it to most x86 CPUs. The reason is simple: nobody buys an M CPU, they buy a package. So comparing the M2 against the R7 5800X3D is pretty useless. And even if you compare "system to system" you'll immediately run into major discrepancies, starting with the obvious OS choice, or the less obvious "what's an equivalent x86 system?".
With Intel vs. AMD it's easy, they serve the same target and are more or less a drop in replacement for each other. Not so with Apple. The only useful review in that case is "workflow to workflow", even with different software on different platforms. Not that interesting for the audience here.
TheMode - Tuesday, July 5, 2022 - link
I never understood this argument. Sure, some people will decide never to buy any Apple product, but I wouldn't say that this is the majority. Let's assume that M3 gets 500% faster than the competition for 5% of the power; I am convinced that some people will be convinced to switch over no matter the package.
GeoffreyA - Wednesday, July 6, 2022 - link
I'd say it's interesting to know where the M series stands in relation to Intel and AMD, purely out of curiosity. But even if it were orders of magnitude faster, I would have no desire to go over to Apple.
mode_13h - Thursday, July 7, 2022 - link
Yes, we want to follow the state of the art in tech. And when Apple is a leading player, that means reviewing and examining their latest, cutting-edge products.
Jp7188 - Friday, July 8, 2022 - link
Perhaps that could make sense in a separate piece, but the M1 doesn't really have a place in a gaming-focused review. M1 gaming is still in its infancy as far as natively supported titles go.
Skree! - Friday, July 8, 2022 - link
Skree!
mode_13h - Sunday, July 10, 2022 - link
I'm going to call spam on this. Whatever it's about, I don't see it adding to the discussion.
noobmaster69 - Thursday, June 30, 2022 - link
Better late than never, I guess. Am I the only one who found it puzzling that Gavin recommends DDR4-3600 and then immediately tests with a much slower kit? And ran gaming benchmarks with a 4-year-old GPU?