A Quick Note on Architecture & Features

With pages upon pages of architectural documents still to get through in only a few hours, for today’s launch news I’m not going to have time to go in-depth on the new features or the architecture. So I want to very briefly hit the high points of the major features, and provide answers to what are likely to be common questions.

Starting with the architecture itself, one of the biggest changes for RDNA is the width of a wavefront, the fundamental unit of work. GCN in all of its iterations was 64 threads wide, meaning 64 threads were bundled together into a single wavefront for execution. RDNA drops this to a native 32 threads wide. At the same time, AMD has expanded the width of their SIMDs from 16 slots to 32 (aka SIMD32), meaning the size of a wavefront now matches the SIMD size. This is one of AMD’s key architectural efficiency changes, as it helps them keep their SIMD slots occupied more often. It also means that a wavefront can be passed through a SIMD in a single cycle, instead of over 4 cycles on GCN parts.
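
To put that change in concrete terms, here’s a minimal sketch (plain Python, using only the figures quoted above) of how many cycles a SIMD needs to issue one instruction across a full wavefront under each scheme:

```python
# Minimal sketch: issue latency is just wavefront width divided by SIMD width.

def issue_cycles(wavefront_width: int, simd_width: int) -> int:
    """Cycles for a SIMD to push one instruction through an entire wavefront."""
    assert wavefront_width % simd_width == 0
    return wavefront_width // simd_width

print("GCN  (wave64 on SIMD16):", issue_cycles(64, 16), "cycles")  # 4 cycles
print("RDNA (wave32 on SIMD32):", issue_cycles(32, 32), "cycle")   # 1 cycle
```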

In terms of compute, there aren’t any notable feature changes here as far as gaming is concerned. How things work under the hood has changed dramatically at points, but from the perspective of a programmer, there aren’t really any new math operations here that are going to turn things on their head. RDNA of course supports Rapid Packed Math (Fast FP16), so programmers who make use of FP16 will get to enjoy those performance benefits.
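
As a rough illustration of what Rapid Packed Math means for throughput, the sketch below simply doubles the FP16 rate relative to FP32; the ALU count and clock speed are illustrative placeholders, not official 5700 series specifications:

```python
# Back-of-the-envelope sketch: Rapid Packed Math packs two FP16 operations
# into each 32-bit lane, doubling peak FP16 throughput over FP32.
# The ALU count and clock below are placeholders, not official specs.

def peak_tflops(alus: int, clock_ghz: float) -> float:
    """Peak TFLOPS assuming one FMA (2 FLOPs) per ALU per clock."""
    return alus * clock_ghz * 2 / 1000.0

fp32 = peak_tflops(alus=2560, clock_ghz=1.9)
fp16 = fp32 * 2  # Rapid Packed Math: 2x the FP16 rate

print(f"FP32 peak: ~{fp32:.1f} TFLOPS")
print(f"FP16 peak (Rapid Packed Math): ~{fp16:.1f} TFLOPS")
```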

With a single exception, there also aren’t any new graphics features. Navi does not include any hardware ray tracing support, nor does it support variable rate pixel shading. AMD is aware of the demand for these features, and hardware ray tracing support is on their roadmap for RDNA 2 (the architecture formerly known as “Next Gen”). But none of that is present here.

The one exception to all of this is the primitive shader. Vega’s most infamous feature is back, and better still, it’s enabled this time. On Navi the primitive shader is compiler-controlled, and thanks to some hardware changes that make it more useful, it now makes sense for AMD to turn it on for gaming. Vega’s primitive shader, though functional in hardware, was difficult to extract a real-world performance boost from, and as a result AMD never exposed it on Vega.

Unique among consumer cards, the new 5700 series supports PCI Express 4.0. Designed to go hand-in-hand with AMD’s Ryzen 3000 series CPUs, which are introducing support for the feature as well, PCIe 4.0 doubles the amount of bus bandwidth available to the card, from ~16GB/sec to ~32GB/sec. The real-world performance implications of this are limited at this time, especially for a card in the 5700 series’ performance segment. But there are situations where it will be useful, particularly on the content creation side of matters.
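
For those curious where the ~16GB/sec and ~32GB/sec figures come from, here’s a quick sketch of the math; it assumes an x16 link and the 128b/130b line coding that both PCIe 3.0 and 4.0 use:

```python
# Quick sketch: per-direction bandwidth of an x16 PCIe link.
# PCIe 4.0 doubles the per-lane signaling rate; the encoding is unchanged.

def pcie_x16_bandwidth_gb_s(gt_per_s: float, lanes: int = 16) -> float:
    """Per-direction bandwidth in GB/s, accounting for 128b/130b encoding."""
    encoding_efficiency = 128 / 130
    return gt_per_s * encoding_efficiency / 8 * lanes

print(f"PCIe 3.0 x16: ~{pcie_x16_bandwidth_gb_s(8.0):.1f} GB/sec")   # ~15.8
print(f"PCIe 4.0 x16: ~{pcie_x16_bandwidth_gb_s(16.0):.1f} GB/sec")  # ~31.5
```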

Finally, AMD has partially updated their display controller. I say “partially” because while it’s technically an update, they aren’t bringing much new to the table. Notably, HDMI 2.1 support isn’t present – nor is the more limited option of supporting just HDMI 2.1’s Variable Refresh Rate (VRR). Instead, AMD’s display controller is a lot like Vega’s: DisplayPort 1.4 and HDMI 2.0b, including support for AMD’s proprietary FreeSync-over-HDMI standard. So AMD does have variable refresh capabilities for TVs, but it isn’t the HDMI standard’s own implementation.

The one notable change here is support for DisplayPort 1.4 Display Stream Compression. DSC, as implied by the name, compresses the image going out to the monitor to reduce the amount of bandwidth needed. This is important going forward for 4K@144Hz displays, as DP1.4 itself doesn’t provide enough bandwidth for them (leading to other workarounds such as NVIDIA’s 4:2:2 chroma subsampling on G-Sync HDR monitors). This is a feature we’ve talked about off and on for a while, and it’s taken some time for the tech to get standardized and reach the point where it’s viable in a consumer product.
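
To show why DSC matters for these displays, here’s a rough, hedged sketch comparing the uncompressed bandwidth a 4K@144Hz, 10-bit signal needs against what a DisplayPort 1.4 (HBR3) link can actually deliver; the blanking allowance is an approximation, as exact timings vary by monitor:

```python
# Rough sketch: uncompressed 4K@144Hz bandwidth vs. DisplayPort 1.4 capacity.
# The blanking overhead is an approximation; exact timings vary by monitor.

def video_gbps(width: int, height: int, hz: int, bits_per_pixel: int,
               blanking_overhead: float = 1.05) -> float:
    """Approximate raw video bandwidth in Gbps, with a rough blanking allowance."""
    return width * height * hz * bits_per_pixel * blanking_overhead / 1e9

# DP 1.4 (HBR3): 4 lanes x 8.1 Gbps, less 8b/10b encoding overhead
dp14_payload_gbps = 4 * 8.1 * 8 / 10  # ~25.9 Gbps usable

needed = video_gbps(3840, 2160, 144, bits_per_pixel=30)  # 10-bit RGB

print(f"4K@144Hz 10-bit RGB needs: ~{needed:.1f} Gbps")
print(f"DP 1.4 can carry:          ~{dp14_payload_gbps:.1f} Gbps")
print("DSC (or subsampling) required:", needed > dp14_payload_gbps)
```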

Comments

  • Hixbot - Saturday, June 15, 2019

    I'm not following your point. Nvidia's terrible pricing is not being challenged by AMD. If it were, I think they would get a lot of credit from myself and others, just as they are in the CPU market. Nobody is praising Nvidia here, I was just expecting more value from AMD.
  • Korguz - Saturday, June 15, 2019

    Hixbot, I think AMD would still be harped on, and what I mean is: they would still be criticized, like they are now, for not having ray tracing and because of the price of the cards. Instead, I think the common complaint would be: AMD had an extra year to work on their cards, and all they can do is match the performance? They suck, and their cards are still overpriced.
  • BenSkywalker - Saturday, June 15, 2019

    So what performance level are you seeing in your games with RTX on using your 2070, and where do you think it should be?

    I just played through Quake 2 RTX on a 2060 and I thought it was great. I've been playing Metro Exodus and Tomb Raider with ray tracing on too, and I'm not having problems with either of them. Minecraft isn't my thing, but my kids think the ray tracing in that game is great too (that one will run on older hardware).

    Part of the disconnect in this conversation is people talking about how bad performance is at 4K ultra with ray tracing, when mid-range cards can't run these games at those settings without ray tracing anyway.
  • Korguz - Saturday, June 15, 2019

    BenSkywalker, I don't have any 20 series cards; I have a 1060. But going by the reviews, and word of mouth from those that do have a 20 series card, it seems the performance isn't there. Maybe that's it, resolution 1440p or higher, but it seems if you mention you play at 1080p, you get made fun of.
  • BenSkywalker - Sunday, June 16, 2019

    I'll say that all the people I know IRL who actually play the games and have tried ray tracing on RTX hardware thought performance was solid, on a 2060. Now, to be fair, no online competitive shooters were used, just actual gamers playing actual games. As an example, Tomb Raider at max settings without ray tracing at 1440p delivers nearly identical performance to 1080p with ray tracing at ultra settings. Now, you can run RT at medium at 1440p and be quite playable (mid 40s to mid 50s), but between the two settings, double blind, everyone I had compare them said the 1080p with ray tracing was much better (full disclosure, I picked a spot with ray-traced shadows on screen).

    But my tournament-level frame rates are down... As opposed to what image-enhancing technology?

    Five years from now it'll be a joke that anyone argued against it in the first place, and most of those arguing rabidly now will deny they ever did.
  • Korguz - Sunday, June 16, 2019

    BenSkywalker, what games are you trying RT with? I'm not sure, but Tomb Raider isn't really all that taxing, is it? I would assume the "competitive shooters used online" you mention may be a lot more taxing, and the performance may not be all that bearable with RT enabled on a 2060.
    I'm not arguing against it, but maybe, unlike some, for the price it just isn't worth it... yet. I am going to assume you are in the US, but for those of us in Canada (and maybe other parts of the world as well), take your US prices, add 200 at the low end to around 500 at the top end, and decide if the prices are worth it. 2060s here start at $500 (when not on sale), and prices go as high as $2100. For a lot of those I know, it just isn't worth it, as we have bills to pay, kids to feed, etc.
  • BenSkywalker - Monday, June 17, 2019

    Metro Exodus, Tomb Raider, Minecraft, and Quake 2 RTX. Take Metro Exodus: fps are higher using 100% shader resolution with ray tracing on than with ray tracing off and shader resolution maxed out. Digital Foundry made a video about Quake 2 RTX that really gives a great example of the impact (the performance drop is *huge*, but so is the visual impact).

    So you say it isn't worth the premium. In no way whatsoever would I say that's wrong; I know first-hand that readily available disposable income isn't always sitting around in large quantities. But what about when there is no premium?

    That's what I'm seeing with this launch. The 5700 is more expensive than the 2060, barely edges it in traditional rendering, and doesn't give the option to play with ray tracing *at all*.

    There's only a handful of games and it's a big performance hit; both are completely valid points. But the pricing issue is kind of out the window now that AMD has decided to avoid the value position.
  • Korguz - Monday, June 17, 2019

    BenSkywalker, looking at the requirements for Metro Exodus, I'm a little surprised that it runs that well for you :-) But to then add in a 22(?)-year-old game that would run (exaggerating here) 400 fps on modern hardware, add RT to it, and have it run at "only" 200 fps, is a little moot. What would the performance be, and I don't mean patching in RT support, if the game was made for RT from the start? The friends I talked to, and I think this is part of the reason why they don't think it is worth it to get a 20 series card yet, don't play any of Metro Exodus, Tomb Raider, Minecraft, or Quake 2 RTX, so the money spent on RT and the other features the 20 series brings to the table wouldn't be used. To compare the 5700 series to the low-end 20 series is also a little lopsided, as AMD is aiming these two cards well above that, so comparing the 2060 series to, say, a 5600 or even a 5500 type series might be a little more even. But it really comes down to how much one is willing to spend on a video card, how much they can afford to, and whether the games one plays would see any benefit from that purchase. One of the friends I talked to is one of those with a lot of disposable income, and even he says the 20 series isn't worth the cash right now :-)
  • BenSkywalker - Monday, June 17, 2019

    The canned benchmark from Metro Exodus is like a torture test; the game runs quite nicely. Quake 2 is actually 32 years old, and you were way off on your estimates (think more like 1.5k to sub-100), but I'd say watch the Digital Foundry video.

    Now, the rest of your comments: they are directly refuting AMD's claims. AMD is calling the 5700 a 2060 competitor, saying it is 10% faster for 8.5% more money. That's not my interpretation, that's what they have come out and said. AMD is saying, based on their hand-picked benchmarks, that they are going to charge almost exactly the same $/fps as Nvidia, but with no option for ray tracing. Again, this isn't me spinning anything; it's all in their slides, their words.

    Is it unreasonable to assume AMD chose benchmarks that made them look slightly better than average? Combine that with the price premium and you may find that, in the real world, a factory-overclocked 2060 at the same price as the 5700's MSRP is dead even with AMD, but with the option of playing with RTX.
  • BenSkywalker - Monday, June 17, 2019

    That should say Quake 2 is 22 years old.
