
  • BigMamaInHouse - Monday, March 18, 2019 - link

    So now you can wonder what AMD is going to do with all that compute power on Polaris and Vega :-).
  • BigMamaInHouse - Monday, March 18, 2019 - link

    P.S.: Vega supports 2:1 FP16!
  • Sttm - Monday, March 18, 2019 - link

    Offer 6 fps on the Vega VII instead of 5 fps on the 1080ti.

    :)
  • uefi - Monday, March 18, 2019 - link

    Unless DXR allows running the FP32 and/or INT32 work solely on a second card. It should be more common for games to utilise a second GPU for compute tasks. That's something low-level APIs like Vulkan promise, right?
  • Kevin G - Tuesday, March 19, 2019 - link

    Oh, the promise is there from hardware manufacturers, but that takes effort from software developers, and anything that requires effort on behalf of a minority of users (a second GPU) is a non-starter.

    The sole exception would be a handful of developers that see fit to leverage a discrete GPU and Intel integrated graphics together, due to the sheer market share of this combination. Even then, I wonder how useful leveraging it would actually be in the real world.
  • WarlockOfOz - Tuesday, March 19, 2019 - link

    So long as the IGP provides more benefit than is lost coordinating the different processors, it would be nice to see it used. I'm guessing it's a fairly hard problem to solve at a general level; otherwise something like Ryzen Vega + GPU Vega (which I presume would be the easiest to do technically, the easiest to get both sides to agree to, and the easiest to benefit from cross-marketing) would be a thing.
  • CiccioB - Tuesday, March 19, 2019 - link

    The cost of adding a second GPU capable enough to support real-time raytracing without dedicated units is the same as having those units inside a single GPU.
    That simplifies everything: the programming model, general support (you are not targeting the 0.x% of users with two big enough GPUs), power consumption, and calculation efficiency in general (dedicated units use less energy for the same work, and living inside a single GPU avoids constantly copying data back and forth between two GPUs).

    You may think this will stop being an issue once chiplets come into use for MCM architectures, but again, implementing the dedicated units near the general-purpose ALUs (and filtering cores) just increases efficiency a lot.
  • shing3232 - Tuesday, March 19, 2019 - link

    But it would be subject to the same bandwidth and power constraints. They would rather have another dedicated RT chip until they reach 7nm.
  • CiccioB - Monday, March 18, 2019 - link

    The same that was and is available with Pascal and Volta, which is to say not enough raw power to do anything serious with RT.
  • blppt - Monday, March 18, 2019 - link

    In a way, it makes sense from a business point of view: allow the Pascal user to experience the eye candy of enabled RT at a framerate so bad that they'll be more inclined to dump money on the RTX upgrade.
  • cmdrdredd - Monday, March 18, 2019 - link

    Go back and read the article again. It says DXR will allow non-RT products to enable "some" ray tracing effects; it doesn't say that all the same effects will be enabled. They even went so far as to say it's a stripped-down version in order to keep a playable frame rate. That doesn't necessarily mean 60fps, but you can get global illumination or better shadows etc. using DXR at perhaps something near 30fps, which IMO isn't terrible for certain types of games.
  • blppt - Monday, March 18, 2019 - link

    Either way, it lets you whet your appetite on "Ray Tracing Lite" which might entice you to upgrade to the slow-selling RTX.
  • Qasar - Monday, March 18, 2019 - link

    The only thing that would entice me to upgrade to RTX would be for nvidia to drop the price: between $200 for the low-end entry 2060 and $1k at the high end :-) Current RTX 20 series cards range from $470 CDN to as high as $2100 CDN at the very top...
  • nathanddrews - Tuesday, March 19, 2019 - link

    "New GPUs offer better image quality via new features, more at 11."
  • CiccioB - Tuesday, March 19, 2019 - link

    Yes, enabling RT effects on everything is a marketing move, both to get developers started on those new features and to whet players' appetites with the salty sauce.
    But this has never been a real problem.
    MS DXR is a general API, as all DX APIs are. You can implement it in HW, in SW, or a mix. There are no specific HW requirements on anything, just functionalities that can be achieved however anyone wants, with whatever architecture they want (or are able to create).
    So why invest so much money in things like new HW architectures each year?
    Because what changes between using dedicated HW units and not is performance, which is ultimately what games buy.

    Is RT possible on HW without dedicated units?
    Yes, it has always been. See the demos nvidia made with Volta and then the same done with Turing to see what dedicated units can achieve.
  • CiccioB - Tuesday, March 19, 2019 - link

    "what games buy" -> "what gamers buy"
  • haukionkannel - Tuesday, March 19, 2019 - link

    Yep! Good marketing to make old cards look really slow!
  • blppt - Tuesday, March 19, 2019 - link

    It actually is, if your goal is to move more silicon.

    Otherwise, given the meh speed increases for non-RT games that a 2080 Ti provides over late-model, heavily factory-OC'd 1080 Tis, gamers have little incentive to drop $1200 on a new video card.
  • Cullinaire - Monday, March 18, 2019 - link

    The thing that interests me most about RT is the end of weird and fuzzy shadows, especially self-shadowing.
  • CiccioB - Tuesday, March 19, 2019 - link

    Shadows, illumination, reflections, refraction too (with enough calculation power), but also AI. What is lacking in today's games is gameplay, which in most cases is highly repetitive (who said FarCry?).
    AI applied to opponent strategy may really change the final quality of games, which is not only graphics effects and pumped-up texturing (to balance the low polygon counts that consoles can barely stand).
  • azrael- - Tuesday, March 19, 2019 - link

    Meanwhile everyone else is talking about CryTek's Neon Noir raytracing demo. You know... the one rendered on a Vega 56 without dedicated RT hardware.

    https://www.youtube.com/watch?v=1nqhkDm2_Tw
  • CiccioB - Tuesday, March 19, 2019 - link

    Everyone else being those that just discovered that raytracing is possible without dedicated units.
    Lately there's a WOW effect among the ignorant masses!
    You know, it has always been possible to do raytracing. You do not even need a GPU to do it.
    The problem is not the feasibility of the process; the problem is the performance.
    That's the same thing being said here: you can do RT effects, but without dedicated units the performance can be as little as 1/10th of what those units deliver.
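    To make that point concrete, here is a minimal illustrative sketch (not from any of the posts): a few lines of plain Python are enough to intersect a ray with a sphere, the core operation of any raytracer. Feasibility was never the question; throughput is.

    ```python
    import math

    def intersect_sphere(origin, direction, center, radius):
        """Distance along a normalized ray to a sphere's surface, or None on miss.

        Solves the quadratic |origin + t*direction - center|^2 = radius^2.
        """
        oc = tuple(o - c for o, c in zip(origin, center))
        b = 2 * sum(d * v for d, v in zip(direction, oc))
        c = sum(v * v for v in oc) - radius * radius
        disc = b * b - 4 * c  # a == 1 because the direction is normalized
        if disc < 0:
            return None  # the ray misses the sphere
        t = (-b - math.sqrt(disc)) / 2  # nearest intersection
        return t if t > 0 else None

    # One primary ray from the origin, pointing down +z, at a unit sphere 5 units away:
    hit = intersect_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)  # hits at t = 4.0
    ```

    Run that per pixel, per bounce, across millions of rays every frame, and the gap between general-purpose ALUs and dedicated intersection hardware is exactly the performance gap being discussed.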
  • shing3232 - Tuesday, March 19, 2019 - link

    That demo gets decent performance out of a Vega 56.
  • edzieba - Tuesday, March 19, 2019 - link

    It also used a lot of the performance optimisations for RT, bumped all the way up to 11 to the point of artefacting. Look around the 1:18 mark at the spinning mirrors with ghosting artefacts from temporal filtering. There look to be about 7 frames visible at once, so the RT framerate could easily be 1/6 or less of the raster render rate (i.e. a 30FPS display rate with a 5FPS render rate for the RT reflections).
  • SirPerro - Tuesday, March 19, 2019 - link

    People could mine cryptocurrencies with "decent performance" until they discovered that specialized hardware basically redefined "decent performance".

    The same happens with AI training. Yeah, performance was "decent" until it stopped being decent. And once high-end GPUs and TPUs are in the game, suddenly the baseline and the use cases completely change the landscape.

    No matter the performance of that demo, a correct implementation in hardware would completely crush it.
  • bigvlada - Tuesday, March 19, 2019 - link

    Ordinary people got the ability to play with raytracing in 1986 with Turbo Silver (later known as Imagine) on a vanilla 512KB Amiga 500 using the Sculpt 3D program.

    Nothing new under the sun; a while ago the ignorant masses were being persuaded that 3D gaming was only possible on Voodoo cards. PowerVR, the Rendition Vérité 1000 and the nVidia Riva 128 showed them how wrong they were.
  • bigvlada - Tuesday, March 19, 2019 - link

    Edit: Sculpt 3D appeared in 1987.
  • blppt - Wednesday, March 20, 2019 - link

    The Riva 128 had some really bad rendering issues IIRC. It might have delivered slightly higher framerates than the equivalent Voodoo of the time, but those graphical anomalies....

    (I had both a Riva 128 and a Voodoo 1, then a Voodoo 2)
  • ET - Tuesday, March 19, 2019 - link

    Looks to me like the 2080+RT+DLSS means that it's all rendered at a lower resolution. (Standard DLSS does this, and all the non-RT work seems compressed.) That would make the comparison to the other architectures a lot less meaningful.
  • PeachNCream - Tuesday, March 19, 2019 - link

    Great! Now I can run the two ray-tracing titles that will come out between now and the end of 2020 on Pascal at 640x480! Where do I go to sign up for that great stuff?
  • JackTheBear - Tuesday, March 19, 2019 - link

    That's the best part! You don't have to sign up. You already have it.
  • sonny73n - Wednesday, March 20, 2019 - link

    How can a normal person spend hundreds to thousands of dollars on a graphics card just so he can sit on his ass all day playing with it? The cheapest RTX card, the 2060, is around $400 with tax. Does he work? Does he have bills to pay? What about savings? Does he have chores around the house? Does he have any friends besides the virtual ones he games with? Except for rich spoiled brats and drug dealers, of course.

    I don't understand how people can invest so much money in a rig just to play stupid boring 3D games. They must be eye-candy suckers who have nothing better to do and nothing better to spend their money on. I used to build and troubleshoot systems for gamers. I have tried many 3D games, including some recent releases such as Metro Exodus, BF5, BF1, CoD4, Assassin's Creed etc... If the online social interaction were taken away from these games, they'd be just a bunch of boring crap. They're so much less meaningful and fun compared to simple 2D games such as Angry Birds, Flappy Bird or chess on my low-spec phone. But hey, it's your life. Being useless and stupid is your right after all.
  • catavalon21 - Wednesday, March 20, 2019 - link

    What you or I value for the cost of an entertainment dollar is not necessarily what the next person values. Some people eat out every weekend. Some people go to the movies every weekend. Some people play video games every weekend. The cost of a computer, including an expensive video card, for people to whom that's their happy place, isn't necessarily higher over the life of that hardware than what others choose to do. Yes, you have to be able to afford it, and many of us will never have the latest cutting-edge hardware in a rig. For those who can, a $3500 USD computer, if it lasts 3 years, is an investment (rig only) of $22 per week. For those who can afford "their happy place" at that price, have a marvelous time. The rest of us will buy a rung (or three) down the hardware tier.

    Never judge someone else's values.
  • sonny73n - Wednesday, March 20, 2019 - link

    Well, that's if the "gamers" can stay happy with their current hardware for a while. I live with a hardcore gamer, I know. And he's my 24 yo son. Don't get me wrong, he's a good kid and all. But spending all his free time outside school in front of his PC isn't exactly the value one's life should have. Like I said, if you want to be an unproductive member of society, it's your choice, but you would be a loser in my book. I do judge people based on their values and actions. Don't we all?

    If you ever stepped into this world as an NPC for the first time, you would say "wow, this is a real 3D world" and you would think how wonderful it is to interact with others through your god-given 5 senses. GPU makers try to fool people with their new 3D-imitation tech (ray tracing in this case), but the display is physically 2D, pixel alignment and all. Only eye-candy suckers would fall for this trick and fork out more money to upgrade. Now back to the topic, I'll just quote someone else; he said it better than I ever could.

    “Giving more customers the opportunity to test RTX features is a solid move, but we don’t expect to see much in the way of playable frame rates. Nor does Nvidia have any reason whatsoever to provide them. The point is to push players into upgrading, not offer acceptable performance on older hardware. This is not to say that Nvidia would take steps to cripple the performance of ray tracing on older GPUs, but they certainly don’t have any reason whatsoever to optimize it.”
  • Qasar - Thursday, March 21, 2019 - link

    sonny73n, as catavalon21 said: to each their own. While you see someone spending hundreds of dollars on hardware to play games as a waste, others have no issue with it; that's their choice. Same for those that eat out all the time or go to the theater to watch a movie: their choice, again, to each their own. Myself, I don't spend hundreds to thousands of dollars on a graphics card, or comp hardware for that matter; I spend what I can afford and feel comfortable with. And I have made some good friends playing the games I do, even some local ones that we get together with in person quite often.
    While you consider 3D games to be stupid and boring, the games you listed are the ones I consider stupid and boring, and so much less meaningful and fun compared to 3D games. Again, to each their own. But hey, it's your life. Being useless and stupid is your right after all. :-)
  • Tams80 - Wednesday, March 20, 2019 - link

    And you can be ignorant, condescending, rude, arrogant, disdainful, uncomprehending and vain.
    But hey, that's your right.
  • webdoctors - Wednesday, March 20, 2019 - link

    I don't know how old you are, but pre-1999 any decent home PC ran upwards of $1500, and $400 of that was just for the CRT monitor.

    Desktop computers last a looong time; I'm still rocking my Sandy Bridge-based system from 6+ years ago, and it was a cheap upgrade because I sold the prior components when I upgraded to it. Nowadays you only need to keep upgrading the video card, and it's relatively "cheap" compared to the cost of AAA games, which are often $30-60 a pop.

    Look at how many people spend $2k+ to get leather seats in their cars; isn't that a huge waste of money? It's all relative. My wife blows way more than $500/month on tons of random garbage, not even including cosmetics. Some coffee mug with a cat on it, or her 20th handbag.
  • sonny73n - Wednesday, March 20, 2019 - link

    “NVIDIA is bringing DirectX 12 DXR raytracing support to the company's GeForce 10 series and GeForce 16 series cards.”

    Does that mean Windows 7 is not supported? Or perhaps that depends on the partially functional DX12 Microsoft will bring to Windows 7?
