It's almost ironic that the one industry we deal with that is directly related to entertainment has been the least exciting for the longest time. The graphics world has lately been littered with controversies over surprisingly petty things; the majority of articles you'll see relating to graphics these days have nothing to do with how fast the latest $500 card runs. Instead, we're left to argue about the definition of the word "cheating". We pick at pixels in hopes of differentiating two of the fiercest competitors the GPU world has ever seen, and we debate over 3DMark.

What's interesting is that all of the things we have occupied ourselves with in recent times have been present throughout the industry's history. Graphics companies have always had questionable optimizations in their drivers; they have almost always differed in how they render a scene; and yes, 3DMark has been around for quite some time now (only recently has it become "cool" to take issue with it).

So why is it that, in the age of incredibly fast, absurdly powerful DirectX 9 hardware, we find it necessary to bicker about everything but the hardware? Because, for the most part, we've had absolutely nothing better to do with it. Our last set of GPU reviews was focused on two cards - ATI's Radeon 9800 Pro (256MB) and NVIDIA's GeForce FX 5900 Ultra - both of which carried a hefty $499 price tag. What were we able to do with this kind of hardware? Run Unreal Tournament 2003 at 1600x1200 with 4X AA enabled and still have power to spare, or run Quake III Arena at fairytale frame rates. Both ATI and NVIDIA have spent millions of transistors and expensive die space, and even sacrificed current-generation game performance, to bring us some very powerful pixel shader units in their GPUs. Yet we have been using those GPUs while letting their pixel shading muscles atrophy.

Honestly, since the Radeon 9700 Pro, we haven't needed any more performance for today's games. Take the most popular game in recent history, the Frozen Throne expansion to Warcraft III: it runs just fine on a GeForce4 MX; a $500 GeForce FX 5900 Ultra was in no way, shape or form necessary.

The argument we heard from both GPU camps was that you were buying for the future: a card you bought today could not only run all of your current games extremely well, but would also guarantee good performance in the next generation of games. The problem with this argument was that there was no guarantee when that "next generation" of games would arrive, and by the time it did, prices on these wonderfully expensive graphics cards might have fallen significantly. Then there's the fact that how well cards perform in today's pixel-shaderless games honestly says nothing about how DirectX 9 games will perform. And this brought us to the joyful issue of using 3DMark as a benchmark.

If you haven't noticed, we've never relied on 3DMark as a performance tool in our 3D graphics benchmark suites. The only times we've included it, we've either used it in the context of a CPU comparison or to make sure fill rates were in line with what we were expecting. With 3DMark 03, the fine folks at Futuremark had a very ambitious goal in mind: to predict the performance of future DirectX 9 titles using their own shader code, designed to mimic what various developers were working on. The goal was admirable; however, if we're going to make a recommendation to millions of readers, we're not going to base it solely on one synthetic benchmark that may or may not be indicative of the performance of future games. What separates the next generation of games from what we've seen in the past is that the performance of one game is much less indicative of the performance of the rest of the market. As you'll see, we're no longer memory bandwidth bound; performance will instead be determined by each game's pixel shader programs and by how the GPU's execution units handle them.
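To see why that shift matters, here's a minimal back-of-the-envelope sketch (our own illustration; every figure below is an assumption for the sake of the example, not a measurement of any shipping card) of how a shader-bound frame rate ceiling falls out of shader length alone:

```python
# Toy model of a shader-bound frame rate ceiling. Every number below is an
# assumption for illustration, not a measured spec of any real GPU.

PIXELS_PER_FRAME = 1024 * 768    # 1024x768 resolution
SHADER_LENGTH = 30               # instructions per pixel (assumed DX9-class shader)
PIPELINES = 8                    # pixel pipelines (assumed, R300-class)
CORE_CLOCK_HZ = 325e6            # core clock (assumed)

# If each pipeline retires roughly one shader instruction per clock, the
# GPU's arithmetic throughput, not its memory bandwidth, caps the pixel rate:
instr_per_second = PIPELINES * CORE_CLOCK_HZ
pixels_per_second = instr_per_second / SHADER_LENGTH
fps_ceiling = pixels_per_second / PIXELS_PER_FRAME

print(f"Shader-bound ceiling: ~{fps_ceiling:.0f} fps")  # ~110 fps with these numbers

# Double the shader length and the ceiling halves; no amount of extra memory
# bandwidth buys it back. That is the sense in which pixel shader programs,
# and how the execution units handle them, now set performance.
```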

All of this discussion isn't for naught, as it brings us to why today is so very important. Not too long ago, we were able to benchmark Doom3 and show you a preview of its performance; but with the game being delayed until next year, we have to turn to yet another title to finally take advantage of this hardware - Half-Life 2. With the game almost done and a benchmarkable demo due out on September 30th, it isn't a surprise that we were given the opportunity to benchmark the demos shown off by Valve at E3 this year.

Unfortunately, the story here isn't as simple as how fast your card will perform under Half-Life 2; of course, given the history of the 3D graphics industry, would you really expect something like this to be without controversy?

Comments

  • uturnsam - Friday, November 28, 2003 - link

    #110 continued
    Now I know why the guy behind the counter told me to steer clear of the ATI Radeon cards because of the "known compatibility problems" when running games.

    (Computer sales guy thinking: I just read the article in the AnandTech post)

    Translated: I have a shitload of NVIDIA cards, and if I don't lie my ass off to my customers it will be game over for me!!!

    The only reason I started looking at ATI cards was that I decided to spend what I saved on the CRT monitor (over the $$LCD) on a higher-performing card. Mr. $Sales$ had me convinced I would be buying an inferior card with ATI. Worth shopping around and scouring reviews :O)
  • uturnsam - Friday, November 28, 2003 - link

    I was going to buy a GeForce 5600 but looked at a 9600 Pro today; the thing is, I was wondering if I should really blow the budget and lash out on a 9800 Pro.
    I am so glad I came across this article. I will stick with the 9600 Pro, save some cash, sleep better at night, and know that when Half-Life 2 is released I will be getting the best performance for the outlay.

  • Anonymous User - Thursday, October 16, 2003 - link

    You can count on your 9500 landing in between the 9800 and the 9600, with frame rates about 30% above the 9600's. The four pixel pipelines will help.
  • Anonymous User - Tuesday, September 30, 2003 - link

    I would like to see a test of the DX8 paths on some of the really older cards, for those of us who are too broke for these new ones!!

    For instance, I have a GeForce2 GTS that I love very much and that works just fine with everything else. I don't want to have to upgrade for one game.
  • Anonymous User - Sunday, September 21, 2003 - link

    I would like to see how they compare with a 5900 using the Detonator 44.03 driver. Yes, I know it's an older driver, but in my tests it provided higher benchmarks than the 45.23 driver.

    Has anybody else noticed this?
  • Anonymous User - Friday, September 19, 2003 - link

    So NVIDIA's shaders (FP16/FP32) are not really comparable with ATI's shaders (FP24, the MS DX9 standard)!
    Too bad that, one way or another, they try to cheat again and again...
    Very bad idea!
  • Anonymous User - Tuesday, September 16, 2003 - link

    #104, the benchmarks and Anand's analysis show that HL2 is limited by GPU shader power, not by memory bandwidth or fill rate... the 9600 will be limited more by that than by memory or fill rate.
  • Anonymous User - Monday, September 15, 2003 - link

    I think #84 mentioned this, but I didn't see a reply. In the benches, the 9600 Pro pulled the exact same frame rates at 1024 and 1280 (to within 0.1 fps, which could just be round-off error).

    I don't think I've ever seen a card bump up the resolution without taking a measurable hit (unless it was CPU-limited). In every other game, the 9600 takes a hit going from 1024 to 1280. And the 9700 and 9800 slow down when the resolution goes up, even though they're basically the same architecture. Someone screwed up, either the benchmarks or the graphs.
  • Anonymous User - Monday, September 15, 2003 - link

    #61 Did you take the time to see that Valve limited their testing? AnandTech had no say in the tests because they were under severe time constraints. Also, try to make coherent sentences.
  • Anonymous User - Sunday, September 14, 2003 - link

    It's not as if GIFs gobble bandwidth; I (as CAPTAIN DIALUP) don't even notice them loading. They're tiny. Even though I don't have trouble receiving this Flash stuff, it pisses me off, because sometimes the same scores will load for all the pages. Why not have a poll or something on this?
