"The physical specifications are standard for a GeForce FX 5900 Ultra, and the ASUS V9950 Ultra uses 450 MHz core and 850 MHz memory clock speeds. These high clock speeds and the new 256-bit memory bus allow the GeForce FX 5900 Ultra to have a theoretical advantage over the Radeon 9800 Pro 256-MB, both in terms of fill-rate and memory bandwidth." — from Sharky Extreme.
Dear ATI fanboys... when you look at this you will notice that the Radeon, which is a substandard card, cannot match the GeForce's specifications. Fucking shame. Now, before you guys all go out to buy a Radeon (that is, if it will even work), wait for the newer Nvidia Detonator drivers. Don't base anything on the beta drivers; it just shows the ATI fannies are idiots.
Guys, I bought a Radeon 9600 Pro 4 weeks ago. Believe me, it is a slow card. I tried playing Halo on it, but had to drop to 800x600 and the frame rate was still very poor. I recently bought the GeForce FX 5900. I can now play the game at 1280x1024, with all the settings, no problem. PLUS the textures look better on the GeForce than on the Radeon. Now shut up and start playing games; I presume that is the reason you bought the cards...
[quote]Whats better for DOOM 3? GEFORCE FX 5950 or RADEON 9800XT?[/quote]
What does this have to do with HL2? Doom3 was practically funded by nvidia and uses Nvidia code paths. In other words it is completely OPTIMIZED for nvidia cards, whereas in HL2 it has to be OPTIMIZED all over again since it sucks at DX9.
In other words: Nvidia cards don't work with standard drivers.
Guess what??? No one cares what you ATI idiots have to say, so buy what you want and shut the hell up... continue on with your merry life... JESUS, if NVidia beat ATI by 200 fps, you'd still praise ATI. FUCK MAN... don't be so goddamn linear.
Cheating, emm... I don't care how they fix it, just as long as they fix it. They could use duct tape and chicken wire for all I care; I just want my FPS up to playable. :-P
Stupid nvidia is back cheating again. :( Their GeForce FX cards suck in DX9, so of course they try to minimize the damage to their reputation by cheating. And some newbies believe nvidia is honest...
I think it was a smart move for ATI to get other people to use their chip to make ATI video cards. ATI has been assed up in the production of their cards. Maybe once ASUS puts out an ATI video card it will lower the return rate of ATI video cards. Go to any Best Buy or Circuit City and ask them about the high return rate of the ATI products. When I returned mine they didn't even ask me what happened; they just said "Oh, another bad ATI card." I wanted to run an ATI video card but it just didn't work. And I don't think Nvidia cards are toys. ATI cards are great for games and porn. But try running a 5k 3d app on an ATI card and it runs like crap. ATI is working hard on that problem, which you guys seem to forget about. Not everyone uses a 2000 dollar computer to play games on. I really don't think ATI or Nvidia has the best of anything. It's just all we have to pick from at the moment. Let's not fight each other about it; let's fight Nvidia and ATI about it. What I would like to know is why a 2k card like the Wildcat video card with 512 megs of ram can't even run Quake3?
Well, ATI's cards are still the best. Nvidia makes just toy cards; ATI cards are great in multimedia and they have better picture quality. I have a Radeon 9700 Pro and I have to say that this is the best card I have ever had in my machine. And I haven't had any problems with the drivers, and they are getting better all the time. Before this I had a GF2 Ti and a GeForce 256.
And it's great to see that companies like ASUS have started making ATI cards instead of nvidia toys.
Funny how company after company is leaving nvidia's "quality products" and starting to make ATI "crap". I guess that Gigabyte, Hercules and ASUS are all idiots, or? Next in line are Gainward, Chaintech and MSI. Who will make nvidia's "quality products" then?
lol, years of crap ATI cards and lame fanboys... now it's just ATI's cards being better in the few DX9 games, and they're 10x worse than the worst Nvidia fanboy :)
To me, it's just years of quality against years of crap product for idiots... enjoy your (temporary) small victory... ATIDIOTS :)
Good drivers can make all the difference when it comes to how hardware works... that's why they call them drivers. It's not magic, but I'm sure there are some tricks going on that we can't see.
Nvidia and ATI use little tricks to get the most out of their drivers. Nvidia pulled a "cheat/trick" out of their ass and somehow got the fps up to a playable level. And the driver is still in beta. Maybe with more hard work they can get more fps out of the drivers and make the Nvidia users happy about playing DX9 games. I think Nvidia was more geared to OpenGL, because I can run Quake3 at 1600x1200 with 8x AA and 8x aniso @ 125 fps constantly, no matter how complex the level gets. And on a 23" TFT display it looks nice. Why did everyone jump on the MS DX9 bandwagon? Doesn't that mean that other OSes will not be able to run DX9 games? I thought it was better to use OpenGL so that it could be ported to different systems. Is Unreal 2004 a DX9 game or an OpenGL game? And why does DX9 run so slow on both ATI and Nvidia hardware? If you ask me, Half Life on an ATI or nvidia card should run at more than 125 fps at 1024x768. And please don't start that "all you need is 30 fps" BS... I may not be able to see over 30 fps, but I can damn sure feel it. And to #78: I come in here to better understand what I spent 300 bucks on. That's not too pathetic, is it? :-P
Oh yeah! I think it would have been better if Valve had put in both a pure OpenGL and a DX9 mode.
To all the guys who are screaming about how nVidia is using a mixed codepath for HL2... everyone knows this. And to say nVidia is cheating isn't exactly correct. That they are to blame? Probably. But nVidia didn't make the mixed mode in HL2. Valve did. That's right. So before you go bashing nVidia about cheating, remember that Valve MADE the codepath. They could've chosen to only have DX8, DX8.1 and DX9 code paths, no mixed mode... but they didn't. Now, why did they make this mixed mode? Cuz they want to sell more games. Understandable. But I think it's wrong to get so upset at nVidia. They have inferior hardware. Don't buy it if you don't like it. Stop whining about it. If you bought ATI, be glad you'll get to appreciate HL2 "How it was meant to be played™" ;)
Also, I'd like everyone to remember who started this whole cheating business in the first place, by going back to fall 2001, when a certain graphics card was released. I seem to remember someone having drivers that detected that a game (quake3.exe) was started, and doctored IQ accordingly to increase speed (and I guess the same could be said about 3dfx with their miniGL driver)...
I have a Radeon 9700 Pro. I'm very happy with it. Before that I had a GeForce3 Ti200, and I was very happy with that at the time too. And the next card I buy depends on who -I-, not anyone else, think is the better (R420 or NV40). Stop following what hardware sites say blindly. I take everything with a grain of salt. I read as many reviews about a product I'm curious about as possible. Then I make up my own opinion. Anandtech is a damn good hardware site. Still, I don't trust it blindly. However, if two hardware sites have different results, I tend to lean towards AT's conclusions, until I get more sources to base my opinion on. And just cuz he posted a few benchmarks from an unknown source, IN HIS P E R S O N A L weblog, DOES NOT make him a sellout.
Stop bashing Anand. I wouldn't be surprised if he stopped updating his weblog on account of the comments from this post alone :/
LOL, chill guys... when anand has his own benchmarks he'll tell us. Right now I'm rather waiting for the IQ tests from anand and derek. And we don't know what will happen to Half Life 2 after some of the source was leaked...
#78, HAHA, isn't it pathetic? I think ATI trolls are now probably the #1 most hated bunch of dolts in the hardware community. For that matter, anyone who posts all day about NVIDIA or ATI (these two ATI trolls named Digitalwanderer and Hellbinder) is usually just a loser with nothing better to do with their time.
heh... it's funny so many ppl are saying that AT is biased towards NV... AT recommended ATi cards all the way in their most recent article, and these numbers are in ATi's favor as well...
If this is true, why bother comparing benchmarks from HL2 at all?
Better yet, let's wait for the damn game to come out, and other next-gen titles, and compare performance and IQ with those, before getting into a worthless and useless debate.
Here comes another delay. Next, 1/2Life 2 will be a Nintendo 64 game only...
This is so sad... Now they will have to redo a lot of the code to keep the cheaters out. Even the Steam source code was taken. I don't know about you, but I'm not typing my credit card number into anything that has to do with Valve.
Maybe now the nvidia guys can look at the source code and make some better drivers. :-P
Anyway... I'm sorry to hear that Valve lost 5 years of work. I'm sure this will only delay the game for just a few more months.
#65, you're a true idiot. Repeat that as much as you like, you look dumber each time you do it. The 9800XT review was posted BEFORE Anand received the benchmark numbers he posted. Christ, get a clue you dope.
Oh, and I really hope that these numbers turn out to be true and that the IQ is at least close to ATI’s. That would make a lot of FX owners much happier, and it would of course shut the annoying ATI fanboys up.
I bet half the ATI fanboys never used a GeForce FX 5900 card. You guys probably just echo the review sites; otherwise you wouldn't be bitching and moaning about poor image quality... Gabe Newell said himself that going from DX8.1 to DX9 would look 95% the same. Going from DX9 FP24 to FP16 would probably look 99.99% the same, judging from what he said...
lol @ #68... I have the exact same experience... I went out and bought an ATI 9700 Pro (based on ATI fanboys' recommendations)... same experience as you... traded it for a GeForce FX 5900... everything running without a single hitch...
Yeah yeah... go on fanboys, call me an idiot, cuz it worked perfectly for you... I bet you have the same setup as me too :roll: and I bet you love going through all the extra hassle just to get your ATI shit working...
Why do you people become fanboys? You spend 200 to 300 dollars on a video card and then sit and spew out how good your new video card is and how bad X card is. The joke is on you. I read all reviews with a grain of salt. I go out and buy all the video cards that are out and try them in my system. Then I keep the best card. I went out and got an ATI card, put it in my system, and it didn't even turn on my 23" TFT display. I was like, goddamn it... Then I had to go get the Nvidia card and it worked just fine. I can run all my 3d programs and games just fine. Buying a 300 dollar video card just to play 1 fucking game is just lame. As for the nvidia DX9 problem: I ran Aquamark with the current drivers and was getting 14 to 20 fps. I was like, shit... WTF, I'm taking this nvidia card back... then my friend gave me some leaked drivers and the FPS went up to 36 to 47. So things can be fixed just in drivers.
My point is we need to be on the side of the gamer, not on the side of the people that make the hardware. They don't give a shit about us. They pimp us out like little bitches and watch us fight amongst ourselves. I could see it if ATI or Nvidia sent out checks to people who got other people to buy their products. For the people that run out and buy an ATI card just to get ready to play Half Life: don't. Wait until the game comes out and you have more video cards to pick from. I remember when Nvidia and HALO got me to buy a GeForce 3 card. I ran out, got my video card like a good little sucker, and got bitch slapped when Halo turned into an Xbox-only game. And what happens if Half Life 2 turns out to be a bad game? What happens if someone else comes out with a video card that blows away both the Nvidia and ATI lines? Don't be a fanboy; you will always lose in the end. I know, I was a Voodoo fanboy for a long time... Ouch... Never again...
What we need to do is come up with a way to tell both ATI and Nvidia that paying 300 bucks or more for a video card that's outdated in 6 months is just wrong. I can get a GameCube for 100 bucks... damn!!
I don't think it's entirely fair flaming Anand for posting this as it's not the main page, -however- it wasn't exactly that smart a move as without further details and IQ samples the flame war evident here could have been seen coming a mile off.
Sharing is cool and all, but when the information is so incomplete it just feeds the fanboy fires.
Okay people, it's still clear the Radeon cards are faster. If you have one, good for you. If you want to buy one, even better. But for fuck's sake stop stuffing ATi down everybody's throats. Yes, yippee doo day, the Radeon's faster; no shit, people. You'll get a Noddy badge for stating what everybody knows.
At least nVidia's doing something about it, and believe it or not, they don't owe you an apology since you're not buying their graphics cards anyway. (So F**K off!)
Thanks for being unbiased, Anandtech, good on you. And no, I actually do trust your site somewhat more than Extreme, Tom's or the Inquirer.
This post says: "I wouldn't draw any conclusions based on this data yet, just wanted to share :)"
The CONCLUSION of the 9800XT review says: "and although the latest drivers have closed the gap significantly, ATI is still ahead in Half Life 2. The numbers we’ve seen indicate that in most tests ATI only holds single digit percentage leads (< 5%)"
So you really think that nvidia pimps like Tom and Anand are trustworthy? Nvidia has them in their pocket, and that, "gentlemen", is the whole issue! Nuff said.
Well, there's clearly been some controversy around this stuff. Can you fanboy queers (ATI & NVIDIA) reserve judgement until some real numbers come out from Anandtech or Tom's Hardware? I don't understand why you'd so readily believe the first HL2 numbers that came out anyway.
"This is some dangerous marketing stuff, meaning some uninformed people that might buy an ATI card might now be confused or changing lanes to Cheatzilla. "
Ya, we never saw this before... like, um, on Shader Day? Gabe Newell proclaiming Nvidia sucks, ATI rules, and we should all buy an ATI card to run HL2. Then he decides not to even release the game until all the shader benchmarks are worthless and pointless. I would feel more sorry for people who ran out to get a 9800 Pro only to find out all they get is a voucher for the game they bought the card for; by the time they get the game, they could have spent the same amount of money on an NV40 or R420 and gotten better performance.
It's amazing how the nvidia fanboys bash someone when the other party is just stating facts. Go get a real card (ATI); nvidia is cheating crap.
Sounds a lot like the same stuff we went through with the 5900FX NV35: lots of air, a lot of lies, a lot of trickery, and when you want to use the card decently we'll end up with the same garbage. Scary stuff. And I already wasted $500 on an FX 5900 Ultra that gave cute performance test results too, until you actually wanted to enable some quality and play some decent games, and the card becomes worse than a GeForce 4.
dood do you guys get together in little groups at your homes and just hate on nvidia til the bright and early morning? why so much animosity? how sad is that. "look fool my shit goes 5 fps faster!!" who gives a shit? nvidia was the top dog for a while, now it's ATi's turn. this is how business goes. you watch, soon ATi will slip up and nvidia will be back on top. so just save your breath and shut the hell up, and don't call me an nvidia apologist either, cause i have a 9800 pro in my system right now, but before that i had a geforce 3. so just find something else to get your panties in a bunch ladies...
This gen, both ATI and nvidia screwed up. Most companies screw at least something up in their new products, but most of the time it can be fixed or disabled. Okay, so nvidia's issues in this gen (doesn't affect GF4/GF3 etc) can't be fixed. Fine, but who's to say the NV40 won't be amazing? We have 4+ companies offering DX9 technology now or in the near future; these sorts of mistakes won't be a common thing for long. In the end, buy what ya need, or wait for next gen.
Yeah... nVidia is cheating. If you can't grasp that concept yet, then you are a dumb mofo loser! It's time to take the nVidia shrine down and face the facts that they are Cheating!
And to the losers above: nVidia left the discussions about DX9 because MS didn't bend to their demands of putting their stuff in DX9. So why don't you guys read up on what really happened and why the FX sucks in DX9... Pffft.
Why is it called cheating? Because we know it has architectural problems. These big hardware sites never publicize the hardware problems of the FX series properly. And every time nvidia cheats with a new driver, they just accept it and spew it to readers. They just do that over and over again. That's why ppl criticize nvidia and hardware sites like [H], AT, and Tom's over and over again.
It's amazing how blatant and ridiculous some of the ATI fanboys are. Every time something shows nVidia in a good light it's immediately "cheating," sometimes without a single shred of evidence. It is indeed possible that GeForce cards can perform well, much to the chagrin of fanATIcs. And these numbers aren't even verified, so no point in getting feathers ruffled over nothing.
Hey, if the optimizations/cheats they made in the drivers make HL2 all that much faster, and the IQ differences are negligible or you need to blow up a single frame to 400% to notice, then kudos to nvidia. Conversely, if the IQ differences are drastic enough to notice during actual gameplay, then I'm not interested.
I swear you fanatics could drive a guy to nvidia just on grounds of being so irritating....
Ffs, everyone is already jumping to conclusions. Just f*cking wait till the cards are both for sale and tested with the official HL2 benchmark before everyone starts calling Nvidia a cheat. To comment on the 51.75 driver: it's a BETA. What did everyone expect? That it was perfect the first time? As far as I know there are no perfect drivers. Yes, I own a Nvidia card, and no, I am not a fanboy (before some stupid flame war begins about fanboys). I am thinking of purchasing a Radeon card in the near future (R420) or staying with Nvidia (NV40).
Anand did a fine job publishing this news. I looked with interest at this "bench" and can't wait to see some official figures; maybe Nvidia did find a way to improve performance without losing IQ. Only time will tell.
All he did was put up some preliminary results from a source HE considers reliable. People visiting Anandtech consider Anand to be a reliable person. If he was going to jump ship (or sell his soul...) to the marketing side, he would have done so a LONG time ago!
WAIT for official benchmarks before making any judgement calls...
Anand, PLEASE don't be taken aback by these people's comments. I, for one, really hope that you will keep posting stuff like this! Get us on the inside of what is going on!!! Thanks... :)
... Do you feel the fresh air in your face?!? Do you see birds, dogs, people in the streets?? No?!? Ok... turn off your comp and get a life... please... there's a lot of things to see and feel... get a life...
I would say calling DX9 a standard is a bit of a stretch. It's not as if it was defined by some independent standard body, like say the OpenGL ARB. Let's call it what it is - a Microsoft specification.
And it seems rather obvious that Microsoft has been somewhat concerned of late with Nvidia's dominance in the graphics industry. Nvidia had the audacity to refuse to lower the already slim margin on the XGPU, despite repeated aggressive demands from Microsoft. Aside from that, Nvidia has consistently pushed a multi-OS strategy and OpenGL as an alternative API. This obstinacy clearly is intolerable.
Could it be a coincidence that MS gave their seal of approval to Nvidia's struggling competitor and at the same time left Nvidia high and dry with their ambitious 32-bit architecture?
I would say the surprise is not that Nvidia is behind. It's how well they manage to keep up. And it remains to be seen how much of a tradeoff the mixed precision path will be.
Um, relax folks. "...take them with a grain of salt." ATI will always run at higher quality anyhow, so no big deal. The ATI folks will be happy, and the nVidia folks will be happy because they can at least RUN the game at acceptable frame rates at the expense of visual quality. Oh Well.
They didn't make the hardware change anything, they had to make the software use a different hardware path that doesn't run at the correct DX9 specifications.
Do you have any clue what you're talking about?
Nvidia just has a different architecture that doesn't run at the standard DX9 specs; thus it is an inferior card and isn't in any way "superior and flexible", since it isn't the standard.
It's like comparing a rocket engine to a car motor. Sure, it'll generate more thrust, but you can't just stick it on wheels and call it an efficient car. You need to change the road layout completely, but that still doesn't make it a car.
Hahahaha... in the face of the MS and ATI's "let's make the R3xx design the actual DX9 spec" conspiracy, Nvidia comes to the rescue once more and makes their superior and flexible architecture beat the cronies at their own game. Look who have egg on their faces now :)
I think it is HIGH TIME that site owners & reviewers remember who is reading your "stories". You have an obligation to your readers to inform them with the truth, and not to kiss and s#ck up to your paymasters. In the last few weeks Anandtech has IMO shown which "side" they are on! And with this final insult to me and your readers, I will end my visits to this site! PS: I do not stand on one side or another, but I do care for good and honest information!!
This is some dangerous marketing stuff, meaning some uninformed people that might buy an ATI card might now be confused or changing lanes to Cheatzilla.
In other words it stops potential buyers who 'think' that info coming from such a "reliable source" (pun intended) should be taken seriously, just to find out later that the game might look like this: http://www.iol.ie/~baz8080/crap.jpg
I, for a second, thought that [T] people hacked this site and posted their senseless blabbering.
At least the source should have been revealed.
To the guy on NVidia's drivers: ATI may have bugs in games (and I have only seen one, in NOLF2), but when NVidia has them, it affects hardware. I lost a GF2MX thanks to that "double speed bug after coming from standby" in the 6.xx and 7.xx drivers before I found out. How about the fan that doesn't spin?
Well, you guys are seriously a bunch of ungrateful little sh*ts. This is his WEBLOG. It's NOT Anandtech front page news; it's something that he found interesting and decided to post in his friggin' journal. I hope you sacs don't go on your friends' Xanga and uJournal pages and bitch and moan at them for writing about things they find interesting.
If you don't like what you read here, go get your news somewhere else.
I'm personally a fan of both Nvidia and ATI. They each have their pluses and minuses. While the developers obviously have to worry about what is best to develop for, and obviously get tired of dealing with certain companies' agendas *cough Nvidia cough*, I personally couldn't care less as long as the end result is that it works, regardless of cheats, optimizations, etc.
To me, it all comes down to the games you play. If the games you play are largely DirectX games, then ATI is the way to go, but if you run OpenGL games, Nvidia is your choice. I own both a 5800 Ultra and an ATI 9800 Pro. In games like Battlefield that use DirectX, the ATI card averages about 5-10fps more than the Nvidia card, while in a game like Quake3, the Nvidia cards running in OpenGL run about 15-20fps faster.
The games I like to compare are the games that have both DirectX and OpenGL options, such as Nascar Racing 2003. On an Nvidia card in OpenGL, I get around 100-110fps, while in D3D I only get around 30fps. But on an ATI card, in OpenGL, I get around 20fps, while in D3D, I get 80-90fps.
I think a lot of people forget that the graphics API makes a HUGE difference in performance, and as such is a key indicator in the type of performance you will get depending on the card you are using.
Perfect example of this is Nvidia's capability to be "twice as fast" as the ATI in Doom3 (OpenGL), while the ATI is "twice as fast" as the Nvidia in Half-Life 2 (DirectX).
While I have no affinity to either card manufacturer, Nvidia's drivers seem much more polished, while ATI's can be troublesome. ATI made drastic improvements with the Catalyst 3.7's and I was able to uninstall, install, overwrite, etc. those drivers many times with no real problems. Same could not be said for the older Catalyst drivers. So for that, I'm extremely pleased.
Yeah, I'd hate to delve into that benchmark and discover all the cheats and shortcuts taken to achieve those numbers. I trust Valve more than "unnamed, but reliable sources".
I think it's gonna take more than a driver update or two to dig the NV35 out of the mess it is currently in. nVidia shot and missed. Get with it.
Hey, really unprofessional to post benchmark scores
- of a benchmark you didn't do yourself
- of a game/benchmark not available yet to the public
- using drivers that are not yet and/or will never be published, let alone WHQL'ed
- where competitors are not doing the same workload (codepaths anyone?)
- without verification of IQ
- without even STATING THE GODDAMN source (not that it isn't pretty obvious)
See, even without having to use the words "cheat" or "optimisation" this is unacceptable, even as a "first look" or whatever.
So when you run Nvidias card at crappy IQ and DX 8.1 with inserted clip planes and cheats to even enhance screenshots it is almost as fast as an ATI card running full precision full blown DX9 with all the bells and whistles on.
Absolutely amazing!!!!
I guess I'll run right out and get an Nvidia card!!!
Oh wait, I can buy a $200 ATI card that spanks the crap out of Nvidia's best $500 card.. Um ... maybe not....
Let me guess the source of the benchmarks.... Nvidia.
Oh, for chrissakes, these aren't HIS numbers. HE didn't do any benchmarking. They were given to him, so any comments about not doing IQ diffs are ridiculous. Just have some patience and wait for the official benchmarks/Part II of the 9800XT review to come out before commenting.
To #5, if you don't like it, don't read it. Go somewhere else. It's not like these benchmarks are official. Anand even said to take it with a grain of salt. If anything, those numbers are interesting. Guess we'll find out how it stands when the benchmark is released and Anand does a real benchmark.
Maybe it's worth waiting until Anand can do some in house benchies. If he doesn't have the demo/time/etc, then it's not his fault he has to use someone else's numbers and not give us much info about them.
Wow I feel sorry for Valve. They wasted 5x the time optimizing the NV codepath compared to the standard DX9 path. All of this time, work, and effort was in vain.
In the end, NVidia just optimizes Half Life 2 its own way.
Even though performance is very close, its comparing apples to oranges. You are comparing the NV path, with mixed precision, and missing certain high quality effects such as HDR(which looks AMAZING on screenshots), and are comparing it to ATI's full blown DX9.
You might as well just benchmark Nvidia cards at 640x480 and compare them to ATI cards at 1600x1200. Who is this source, and why didn't they benchmark BOTH the FX and the 9800 at full precision?
It is using the pp (partial precision) codepath... as Anand says, mixed precision... if it is like the NV35, it will likely be using FP16/FP32...
I do however find it quite shocking that there is no additional column of DX9-codepath performance figures from Anand for the NV38... that comparison, and the resulting performance gains, would be what we would be looking at for comparison...
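For readers wondering what the "mixed precision" being argued about actually costs: the NV3x partial-precision path runs shader math in FP16, full precision on NV3x is FP32, and ATI's R3xx computes in FP24 throughout. As a rough, purely illustrative sketch (this is NOT shader or driver code, and Python has no FP24 type, so only FP16 vs FP32 is shown), here is how rounding error grows as mantissa bits shrink when every operation is rounded to the working precision:

```python
# Illustrative only: quantize each intermediate result to FP16 or FP32,
# the way shader hardware rounds after every arithmetic instruction.
import struct

def quantize_fp16(x: float) -> float:
    """Round x to the nearest IEEE half-precision (FP16) value."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

def quantize_fp32(x: float) -> float:
    """Round x to the nearest IEEE single-precision (FP32) value."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

def accumulate(quantize, value, steps):
    """Repeatedly add `value`, rounding to the target precision each step."""
    total = 0.0
    for _ in range(steps):
        total = quantize(total + quantize(value))
    return total

exact = 1.0  # the true value of 0.001 added 1000 times
fp32_sum = accumulate(quantize_fp32, 0.001, 1000)
fp16_sum = accumulate(quantize_fp16, 0.001, 1000)

print(f"FP32 sum: {fp32_sum:.6f}  (error {abs(fp32_sum - exact):.2e})")
print(f"FP16 sum: {fp16_sum:.6f}  (error {abs(fp16_sum - exact):.2e})")
# The FP16 result drifts noticeably further from 1.0 than the FP32 one.
```

This per-instruction rounding is why long shader chains (water, specular, HDR-style effects) are where FP16 artifacts would show up first, while short shaders can look nearly identical at any of the three precisions.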
Hopefully Valve will keep shipping updates alongside HL2 that stop cheats as they find them as time goes on. Very, very odd numbers if you ask me; they don't jump that high "automagically", and we all know nvidia's hardware is not up to it. Like everyone is saying, there are going to be some serious IQ problems.
Performance like this doesn't come from thin air. Probably reducing everything to as low as FX12. Oh and you also failed to mention that these scores are without certain features like HDR enabled, since FX cards can't do it at all. Maybe throw in some static clip planes specifically for the benchmark camera sequences.....
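To illustrate what "static clip planes for the benchmark camera sequences" would mean in practice, here is a toy sketch (all names, numbers, and structures are invented for illustration; this bears no relation to any actual driver code): if a timedemo's camera path is known in advance, geometry can be pre-culled against hand-placed planes instead of doing the full visibility work a real playthrough would require.

```python
# Toy sketch of the alleged "static clip plane" trick. All names and
# values here are hypothetical illustrations, not real driver behavior.

def plane_side(plane, point):
    """Signed distance of a point from a plane (a, b, c, d) for
    ax + by + cz + d = 0."""
    a, b, c, d = plane
    x, y, z = point
    return a * x + b * y + c * z + d

def draw_list(objects, static_planes):
    """Keep only objects on the visible side of every static plane.
    A real renderer culls against the actual view frustum each frame;
    the 'cheat' is that these planes are fixed for one known camera path."""
    visible = []
    for obj in objects:
        if all(plane_side(p, obj["pos"]) >= 0 for p in static_planes):
            visible.append(obj)
    return visible

# A scene where one object sits behind the demo camera's known path.
scene = [
    {"name": "crate",  "pos": (0.0, 0.0,  5.0)},
    {"name": "barrel", "pos": (0.0, 0.0, -5.0)},  # never on screen in the demo
]
planes = [(0.0, 0.0, 1.0, 0.0)]  # hand-placed: keep only z >= 0

print([o["name"] for o in draw_list(scene, planes)])  # fewer draws -> higher fps
```

The trick only pays off because the demo camera never turns around; in normal gameplay the culled object would pop in and out of existence, which is exactly why people keep asking for IQ checks taken off the canned benchmark path.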
It's unbelievable that you would post these numbers coming from a "reliable source" without any sort of IQ comparison from tests in the lab. And what source might that be... Derek Perez? Brian Burke? That's Anand for you.
It will look fine, maybe not as nice as ATI's, but we don't know that yet. Plus I've seen some pics of PS 1.4 and PS 2.0, and to tell you the truth I could barely tell the diff (Tomb Raider); it was there, but barely noticeable.
Project-X - Thursday, October 9, 2003 - link
Does anyone here speak German?
Anonymous - Thursday, October 9, 2003 - link
haha
Anonymous - Thursday, October 9, 2003 - link
"The physical specifications are standard for a GeForce FX 5900 Ultra, and the ASUS V9950 Ultra uses 450 MHz core and 850 MHz memory clock speeds. These high clock speeds and the new 256-MB memory bus allow the GeForce FX 5900 Ultra to have a theoretical advantage over the Radeon 9800 Pro 256-MB, both in terms of fill-rate and memory bandwidth." from Sharky Extreme....
Dear ATI fanboys..........when you look at this you will notice that the radeon.....which is a substandard card......cannot match the GeForce's specifications....fucking shame....now, before you guys all go out to buy a radeon....that is, if it will work....wait for the newer Nvidia Detonator drivers.....don't base anything on the beta drivers, it just shows the ATI fannies are idiots.
Anonymous - Thursday, October 9, 2003 - link
Guys, I bought a Radeon 9600 Pro 4 weeks ago. Believe me, it is a slow card. I tried playing Halo on it, but had to drop to 800x600 and the frame rate was still very poor. I recently bought the GeForce FX 5900. I can now play the game at 1280x1024, with all the settings, no problem. PLUS the textures look better on the GeForce than the Radeon.........now shut up and start playing games, I presume that is the reason you bought the cards....
Anonymous - Tuesday, October 7, 2003 - link
"In other words: Nvidia cards don't work with standard drivers." BS
Jahara - Monday, October 6, 2003 - link
[quote]Whats better for DOOM 3? GEFORCE FX 5950 or RADEON 9800XT?[/quote]
What does this have to do with HL2? Doom3 was practically funded by nvidia and uses Nvidia code paths. In other words it is completely OPTIMIZED for nvidia cards, whereas in HL2 it is OPTIMIZED against, since the FX sucks at DX9.
In other words: Nvidia cards don't work with standard drivers.
Anonymous - Sunday, October 5, 2003 - link
JR, guess what? If you want people to take you seriously, you have to use "you" instead of "u"!
I think I saw a /. comment that put the situation pretty well... anyone remember it?
JR - Sunday, October 5, 2003 - link
guess what??? no one cares what u ATI idiots have say so buy what u want and shut the hell up.... continue on with ure merry life... JESUS, if NVidia beat ATI by 200 fps with, ude still praise ATI. FUCK MAN.... don't be so god damn linear
DustbusterII - Sunday, October 5, 2003 - link
What's better? OpenGL or DirectX?
Anonymous - Sunday, October 5, 2003 - link
Definitely the GeForce FX line. Radeons run Doom3 like crap.
DustbusterII - Sunday, October 5, 2003 - link
What's better for DOOM 3? GEFORCE FX 5950 or RADEON 9800XT?
anbot - Sunday, October 5, 2003 - link
It's easier for NV to cheat on HL2 now that they have the source code.
StormGFX - Saturday, October 4, 2003 - link
Cheating, emm.. I don't care how they fix it.. just as long as they fix it.. They could use duct tape and chicken wire for all I care, I just want my FPS up to playable. :-P
Anonymous - Friday, October 3, 2003 - link
Stupid nvidia is back cheating again. :( When their GeForce FX cards suck in DX9 they of course try to minimize the damage to their reputation by cheating.
And some newbies believe nvidia is honest...
Keith - Friday, October 3, 2003 - link
I wonder how well my ATI Radeon 9700 Pro, on a 2.00 GHz system with 512 MB DDR400, will run HL2?
StormGFX - Friday, October 3, 2003 - link
I think it was a smart move for ATI to get other people to use their chip to make ATI video cards. ATI has been assed up in the production of their cards. Maybe once ASUS puts out an ATI video card it will lower the return rate of ATI video cards.
Go to any Best Buy or Circuit City and ask them about the high return rate of the ATI products. When I returned mine they didn't even ask me what happened, they just said "Oh, another bad ATI card." I wanted to run an ATI video card but it just didn't work. And I don't think Nvidia cards are toys. ATI cards are great for games and porn. But try running a 5k 3D app on an ATI card and it runs like crap. ATI is working hard on that problem, which you guys seem to forget about. Not everyone uses a 2000 dollar computer to play games on. I really don't think ATI or Nvidia has the best of anything. It's just all we have to pick from at the moment. Let's not fight each other about it. Let's fight Nvidia and ATI about it. What I would like to know is why a 2k card like the Wildcat video card, with 512 megs of RAM, can't even run Quake 3?
Anonymous - Friday, October 3, 2003 - link
Well, ATI's cards are still the best. Nvidia makes just toy cards; ATI cards are great in multimedia and they have better picture quality. I have a Radeon 9700 Pro and I have to say that this is the best card I have ever had in my machine. And I haven't got any problems with the drivers, and they are getting better all the time.
Before this I had a GF2 Ti and a GF256.
And it's great to see that companies like ASUS have started making ATI cards instead of nvidia toys.
Anonymous - Friday, October 3, 2003 - link
Funny how company after company is leaving nvidia's "quality products" and starting to make ATI "crap". I guess that Gigabyte, Hercules and ASUS are all idiots, or? Next in line are Gainward, Chaintech and MSI. Who will make nvidia's "quality products" then?
Anonymous - Friday, October 3, 2003 - link
lol, years of crap ATI cards and lame fanboys.. now it's just ATI's cards being better in the few DX9 games, and they're 10x worse than the worst Nvidia fanboy :) To me, it's just years of quality against years of crap product for idiots... enjoy your (temporary) small victory.. ATIDIOTS :)
Really enjoying...
ebe
StormGFX - Friday, October 3, 2003 - link
To #76: Good drivers can make all the difference when it comes to how hardware works... That's why they call them drivers. It's not magic, but I'm sure there are some tricks going on that we can't see.
Nvidia and ATI use little tricks to get the most out of their drivers. Nvidia pulled a "cheat/trick" out of their ass and somehow got the fps up to a playable level. And the driver is still in beta. Maybe with more hard work they can get more fps out of the drivers and make Nvidia users happy about playing DX9 games. I think Nvidia was more geared to OpenGL, because I can run Quake 3 at 1600x1200 with 8x AA and 8x aniso at 125 fps constantly, no matter how complex the level gets. And on a 23" TFT display it looks nice. Why did everyone jump on the MS DX9 bandwagon? Doesn't that mean that other OSes will not be able to run DX9 games? I thought it was better to use OpenGL so that it could be ported to different systems. Is Unreal 2004 a DX9 game or an OpenGL game? And why does DX9 run so slow on both ATI and Nvidia hardware.. If you ask me, Half-Life on an ATI or nvidia card should run at more than 125 fps at 1024x768. And please don't start that all-you-need-is-30-fps BS... I may not be able to see over 30 fps, but I can damn sure feel it. And to #78: I come in here to better understand what I spent 300 bucks on. That's not too pathetic, is it? :-P
Oh yeah! I think it would have been better if valve put in both a pure Opengl and DX9 mode.
Thats my last 2cents.
Morten - Friday, October 3, 2003 - link
To all the guys who are screaming about how nVidia is using a mixed codepath for HL2... everyone knows this. And to say nVidia is cheating isn't exactly correct. That they are to blame? Probably. But nVidia didn't make the mixed mode in HL2. Valve did. That's right. So before you go bashing nVidia about cheating, remember that Valve MADE the codepath. They could've chosen to only have DX8, DX8.1 and DX9 code paths.. no mixed mode... but they didn't. Now, why did they make this mixed mode? Cuz they want to sell more games. Understandable. But I think it's wrong to get so upset at nVidia. They have inferior hardware. Don't buy it if you don't like it. Stop whining about it. If you bought ATI, be glad you'll get to appreciate HL2 "How it was meant to be played™" ;)
Also, I'd like everyone to remember who started this whole cheating business in the first place, by going back to fall 2001, when a certain graphics card was released. I seem to remember someone having drivers that detected that a game (quake3.exe) was started, and doctored IQ accordingly to increase speed (and I guess the same could be said about 3dfx with their MiniGL driver)...
I have a Radeon 9700 Pro. I'm very happy with it. Before that I had a GeForce3 Ti200. Also, I was very happy with that at the time. And the next card I buy depends on who -I-, not anyone else, think is better (R420 or NV40). Stop blindly following what hardware sites say. I take everything with a grain of salt. I read as many reviews about a product I'm curious about as possible. Then I make up my own opinion. Anandtech is a damn good hardware site. Still, I don't trust it blindly. However, if two hardware sites have different results, I tend to lean towards AT's conclusions, until I get more sources to base my opinion on.
And just cuz he posted a few benchmarks from an unknown source, IN HIS P E R S O N A L weblog, DOES NOT make him a sellout.
Stop bashing Anand. I wouldn't be surprised if he stopped updating his weblog on account on the comments from this post alone :/
Luagsch - Friday, October 3, 2003 - link
LOL, chill guys... when Anand has his own benchmarks he'll tell us. Right now I'm rather waiting for the IQ tests from Anand and Derek. And we don't know what will happen to Half-Life 2 after some of the source was leaked...
Anonymous - Friday, October 3, 2003 - link
#78, HAHA, isn't it pathetic? I think ATI trolls are now probably the #1 most hated bunch of dolts in the hardware community. For that matter, anyone who posts all day about NVIDIA or ATI (these two ATI trolls named Digitalwanderer and Hellbinder) is usually just a loser with nothing better to do with their time.
Anonymous - Friday, October 3, 2003 - link
heh ... its funny so many ppl are saying that AT is biased towards NV ... AT recommended ATi cards all the way in their most recent article, and these numbers are in ATi's favor as well...
Some of you ppl don't make sense.
Anonymous - Friday, October 3, 2003 - link
http://www.halflife2.net/ 2nd news posting
If this is true, why bother comparing benchmarks from HL2 at all?
Better yet, let's wait for the damn game to come out, and other next-gen titles, and compare performance and IQ with those before getting into a worthless and useless debate.
Blackie - Friday, October 3, 2003 - link
WOHAAAAAAA
Nvidia cards MAGICALLY double the frame rate!!!!
Who make the detonators?
Harry Potter or Arsenio Lupin?????
ROTFL
Anonymous - Thursday, October 2, 2003 - link
#71: I posted #54 exactly once. Someone else (or other people) took it from there and reposted.
StormGFX - Thursday, October 2, 2003 - link
Half-Life source code leaked. Here comes another delay. Next, Half-Life 2 will be an N64 game only..
This is so sad.. Now they will have to redo a lot of the code to keep the cheaters out. Even the Steam source code was taken. I don't know about you, but I'm not typing my credit card number into anything that has to do with Valve.
Maybe now the nvidia guys can look at the source code and make some better drivers. :-P
Anyway.. I'm sorry to hear that Valve lost 5 years of work. I'm sure this will only delay the game for a few more months.
Anonymous - Thursday, October 2, 2003 - link
Wrong. Fanboys don't shut up on either side regardless of the facts, as demonstrated several times in this thread.
Anonymous - Thursday, October 2, 2003 - link
#65, you're a true idiot. Repeat that as much as you like; you look dumber each time you do it. The 9800XT review was posted BEFORE Anand received the benchmark numbers he posted. Christ, get a clue, you dope.
Oh, and I really hope that these numbers turn out to be true and that the IQ is at least close to ATI's. That would make a lot of FX owners much happier, and it would of course shut the annoying ATI fanboys up.
experienced one - Thursday, October 2, 2003 - link
I bet half the ATI fanboys have never used a GeForce FX 5900 card. You guys prolly sound like the review sites and wouldn't be bitching and moaning about poor image quality...
Gabe Newell said himself that going from DX8.1 to DX9 would look 95% the same. Going from DX9 FP24 to FP16 would probably look 99.99% the same, judging from what he said...
experienced one - Thursday, October 2, 2003 - link
lol @ #68... I have the exact same experience... I went out and bought an ATI 9700 Pro (based on ATI fanboys' recommendations)... same experience as you... traded it for a GeForce FX 5900... everything running without a single hitch...
yayaya.. go on fanboys, go call me an idiot, cuz it worked perfectly for you.... I bet you have the same setup as me too :roll: and I bet you love going through all the extra hassle just to get your ATI shit working...
StormGFX - Thursday, October 2, 2003 - link
Why do you people become fanboys? You spend 200 to 300 dollars on a video card, then sit and spew out how good your new video card is and how bad X card is. The joke is on you. I read all reviews with a grain of salt. I go out and buy all the video cards that are out and try them in my system. Then I keep the best card. I went out and got an ATI card, put it in my system, and it didn't even turn on my 23" TFT display. I was like, god damn it... Then I had to go get the Nvidia card, and it worked just fine. I can run all my 3D programs and games just fine.. Buying a 300 dollar video card just to play 1 fucking game is just lame. As for the nvidia DX9 problem: I ran Aquamark with the current drivers and was getting 14 to 20 fps.. I was like, shit... WTF, I'm taking this nvidia card back.. Then my friend gave me some leaked drivers and the FPS went up to 36 to 47. So things can be fixed just in drivers.
My point is, we need to be on the side of the gamer, not on the side of the people that make the hardware. They don't give a shit about us. They pimp us out like little bitches and watch us fight amongst ourselves. I could see it if ATI or Nvidia sent out checks to people who got other people to buy their products. For the people that run out and buy an ATI card just to get ready to play Half-Life: don't. Wait until the game comes out and you have more video cards to pick from. I remember when Nvidia and HALO got me to buy a GeForce 3 card. I ran out, got my video card like a good little sucker, and got bitch-slapped when Halo turned into an Xbox-only game. And what happens if Half-Life 2 turns out to be a bad game? What happens if someone else comes out with a video card that blows away both the Nvidia and ATI lines? Don't be a fanboy; you will always lose in the end.. I know, I was a Voodoo fanboy for a long time... Ouch... Never again..
What we need to do is come up with a way to tell both ATI and Nvidia that paying 300 bucks or more for a video card that is outdated in 6 months is just wrong. I can get a GameCube for 100 bucks... damn!!
Jaiph - Thursday, October 2, 2003 - link
I don't think it's entirely fair flaming Anand for posting this, as it's not the main page. -However- it wasn't exactly that smart a move, as without further details and IQ samples the flame war evident here could have been seen coming a mile off.
Sharing is cool and all, but when the information is so incomplete it just feeds the fanboy fires.
Gouhan - Thursday, October 2, 2003 - link
Okay people, it's still clear the Radeon cards are faster. If you have one, good for you. If you want to buy one, even better. But for fuck's sake stop stuffing ATi down everybody's throats. Yes, yippee doo day, the Radeon's faster; no shit, people. You'll get a Noddy badge for stating what everybody knows.
At least nVidia's doing something about it, and believe it or not, they don't owe you an apology since you're not buying their graphics cards anyway. (So F**K off!)
Thanks for being unbiased, Anandtech; good on you. And no, I actually do trust your site somewhat more than Extreme, Tom's or the Inquirer.
Anonymous - Thursday, October 2, 2003 - link
Again:
This post says:
"I wouldn't draw any conclusions based on this data yet, just wanted to share :)"
The CONCLUSION of the 9800XT review says:
"and although the latest drivers have closed the gap significantly, ATI is still ahead in Half Life 2. The numbers we’ve seen indicate that in most tests ATI only holds single digit percentage leads (< 5%)"
Anyone else find this ironic?
Anonymous - Thursday, October 2, 2003 - link
Most claims of nvidiots usually come from another kind of idiot. Judging from this thread, well, you get my point.
Look, are we even remotely on topic or is this going to be another flamefest?
Anonymous - Thursday, October 2, 2003 - link
So you really think that nvidia pimps like Tom and Anand are trustworthy? Nvidia got them in their pocket, and that "gentlemen" is the whole issue! Nuff said
Anonymous - Thursday, October 2, 2003 - link
Well there's clearly been some controversy around this stuff. Can you fanboy queers (ATI & NVIDIA) reserve judgement until some real numbers come out from Anandtech or Tom's Hardware; I don't understand why you'll so readily believe the first HL2 numbers that came out anyway.
Anonymous - Thursday, October 2, 2003 - link
I'm sure all these intelligent Half-Life players know a lot about hardware and 3D engines...
Benchmark Bob - Thursday, October 2, 2003 - link
Hi, my name is Bob and I like to do benchmarks. I am not an NVidia fanboy, so by saying that I am objective and have nothing to gain from NVidia.
Here are my numbers:
Benchmark        Radeon    NVidia
e3_bugbait       71.4      123354.32
e3_c17_01        57.6      123456789.10
e3_c17_02        49.9      4564565.32
e3_phystown      74.5      5555555.55
e3_seafloor      53.9      l337l337.00
e3_techdemo_5    83.5      13579.13579
e3_techdemo_6    76.9      0101010101.01
e3_under_02      77.3      455455455.455
I assure you that these numbers are real and not falsified. Look at how much faster nvidia is!
Anonymous - Thursday, October 2, 2003 - link
nVidia is the king of the hill. Once again. :-)
David Kirk-BillytheKid has spoken.
Genx87 - Thursday, October 2, 2003 - link
"This is some dangerous marketing stuff, meaning some uninformed people that might buy an ATI card might now be confused or changing lanes to Cheatzilla."
Ya, we never saw this before, like, um, on Shader Day? Gabe Newell proclaiming Nvidia sucks, ATI rules, and we should all buy an ATI card to run HL2. Then he decides to not even release the game until all the shader benchmarks are worthless and pointless. I feel more sorry for people who ran out to get a 9800 Pro only to find out all they get is a voucher for the game they bought the card for, and by the time they get the game they could have spent the same amount of money on an NV40 or R420 and gotten better performance.
Anonymous - Thursday, October 2, 2003 - link
its amazing how the nvidia fanboys say something to bash someone when the other party is just telling facts. Go get a real card ( ati ) nvidia is cheating crap
Anonymous - Thursday, October 2, 2003 - link
This post says:
"I wouldn't draw any conclusions based on this data yet, just wanted to share :)"
The CONCLUSION of the 9800XT review says:
"and although the latest drivers have closed the gap significantly, ATI is still ahead in Half Life 2. The numbers we’ve seen indicate that in most tests ATI only holds single digit percentage leads (< 5%)"
Anyone else find this ironic?
bruno - Thursday, October 2, 2003 - link
Sounds a lot like the same stuff we went through with the FX 5900 (NV35): lots of air, a lot of lies, a lot of trickery, and when you want to use the card decently we'll end up with the same garbage. Scary stuff. And I already wasted $500 on an FX 5900 Ultra that gave cute performance test results too, until you actually wanted to enable some quality and play some decent games, and the card becomes worse than a GeForce 4.
Anonymous - Thursday, October 2, 2003 - link
It is not the sucky FX series that is annoying. It is the never-ending nvidia cheats, and the major hardware sites endorsing them, that is extremely annoying.
mike - Thursday, October 2, 2003 - link
dood do you guys get together in little groups at your homes and just hate on nvidia til the bright and early morning? why so much animosity? how sad is that. "look fool my shit goes 5 fps faster!!" who gives a shit? nvidia was the top dog for a while, now it's ATi's turn. this is how business goes. you watch, soon ATi will slip up and nvidia will be back on top. so just save your breath and shut the hell up, and don't call me an nvidia apologist either, cause i have a 9800 pro in my system right now, but before that i had a geforce 3. so just find something else to get your panties in a bunch ladies...
Anonymous - Thursday, October 2, 2003 - link
Ati is good, ATI outperforms everything
Ati is good, ATI outperforms everything
Ati is good, ATI outperforms everything
Do you repeat those 1000 times in front of the mirror, every day?
Anonymous - Thursday, October 2, 2003 - link
Yes it will, because nVidiot fuck-ups will always defend and forgive ncheatia no matter what!
TB - Thursday, October 2, 2003 - link
This gen, both ATI and nvidia screwed up. Most companies screw at least something up in their new products, but most of the time it can be fixed or disabled. Okay, so nvidia's issues in this gen (they don't affect the GF4/GF3, etc.) can't be fixed. Fine, but who's to say the NV40 won't be amazing? We have 4+ companies offering DX9 technology now or in the near future; these sorts of mistakes won't be a common thing for long. In the end, buy what ya need, or wait for next gen.
God - Thursday, October 2, 2003 - link
Yeah.. nVidia is cheating. If you can't grasp that concept yet, then you are a dumb mofo loser! It's time to take the nVidia shrine down and face the facts that they are Cheating!
And to the losers above: nVidia left the discussions about DX9 because MS didn't bend to their demands for making their stuff in DX9. So why don't you guys read up on what really happened and why the FX sucks in DX9... Pffft.
Anonymous - Thursday, October 2, 2003 - link
Why is it called cheating? Because we know the hardware has architectural problems. These big hardware sites never properly publicize the hardware problems of the FX series. And every time nvidia cheats with a new driver, they just accept it and spew it to readers. They just do that over and over again.
That's why ppl criticize nvidia and hardware sites like [H], AT, Tom's over and over again.
Anonymous - Thursday, October 2, 2003 - link
It's amazing how blatant and ridiculous some of the ATI fanboys are. Every time something shows nVidia in a good light it's immediately "cheating", sometimes without a single shred of evidence. It is indeed possible that GeForce cards can perform well, much to the chagrin of fanATIcs. And these numbers aren't even verified, so no point in getting feathers ruffled over nothing.
Anonymous - Thursday, October 2, 2003 - link
Hey, if the optimizations/cheats they made in the drivers make HL2 all that much faster and the IQ differences are negligible, or you need to blow up a single frame to 400% to notice, then kudos to nvidia. Conversely, if the IQ differences are drastic enough to notice during actual gameplay, then I'm not interested.
I swear you fanatics could drive a guy to nvidia just on grounds of being so irritating....
UnI - Thursday, October 2, 2003 - link
Ffs, everyone is already jumping to conclusions. Just f*cking wait till the cards are both for sale and tested with the official HL2 benchmark before everyone starts calling out Nvidia for cheating. To comment on the 51.75 drivers: it's a BETA, what did everyone expect? That it would be perfect the first time? As far as I know there are no perfect drivers. Yes, I own an Nvidia card, and no, I am not a fanboy (before some stupid flame war begins about fanboys). I am thinking of purchasing a Radeon card in the near future (R420), or staying with Nvidia (NV40).
Anand did a fine job on publishing this news. I looked with interest at this "bench" and can't wait to see some official figures; maybe Nvidia did find a way to improve performance without losing IQ. Only time will tell.
UlricT - Thursday, October 2, 2003 - link
WHY THE HELL IS EVERYONE FLAMING ANAND?
All he did was put up some preliminary results from a source HE considers reliable. People visiting Anandtech consider Anand to be a reliable person. If he was going to jump ship (or sell his soul..) to the marketing side, he would have done so a LoNG time ago!
WAiT for official benchmarks before making any judgement calls...
Anand, PleAsE don't be taken aback by these people's comments. I, for one, really hope that you will keep posting stuff like this! Get us on the inside of what is going on!!! Thanks... :)
Anonymous - Thursday, October 2, 2003 - link
.... do you feel the fresh air in your face?!? Do you see birds, dogs, people in the streets?? No?!? Ok... turn off your comp and get a life.... please... there's a lot of things to see and feel.... get a life....
Andy - Thursday, October 2, 2003 - link
I would say calling DX9 a standard is a bit of a stretch. It's not as if it was defined by some independent standards body, like, say, the OpenGL ARB. Let's call it what it is: a Microsoft specification.
And it seems rather obvious that Microsoft has been somewhat concerned of late with Nvidia's dominance in the graphics industry. Nvidia had the audacity to refuse to lower the already slim margin on the XGPU, despite repeated aggressive demands from Microsoft. Aside from that, Nvidia has consistently pushed a multi-OS strategy and OpenGL as an alternative API. This obstinacy clearly is intolerable.
Could it be a coincidence that MS gave their seal of approval to Nvidia's struggling competitor and at the same time left Nvidia high and dry with their ambitious 32-bit architecture?
I would say the surprise is not that Nvidia is behind. It's how well they manage to keep up. And it remains to be seen how much of a tradeoff the mixed precision path will be.
Bay - Thursday, October 2, 2003 - link
Um, relax folks. "...take them with a grain of salt." ATI will always run at higher quality anyhow, so no big deal. The ATI folks will be happy, and the nVidia folks will be happy because they can at least RUN the game at acceptable frame rates at the expense of visual quality. Oh well.
Jahara - Thursday, October 2, 2003 - link
"superior and flexible architecture"?They didn't make the hardware change anything, they had to make the software use a different hardware path that doesn't run at the correct DX9 specifications.
Do you have any clue what you're talking about?
Nvidia just has a different architecture that doesn't run at the standard DX9 specs, thus it is an inferior card and isn't in any way "superior and flexible", since it isn't standard.
It's like comparing a rocket engine to a car motor. Sure it'll generate more thrust, but you can't just stick it on wheels and call it an efficient car. You'd need to change the road layout completely, and even that doesn't make it a car.
Reality - Thursday, October 2, 2003 - link
Well, at least the game will run at playable framerates on Nvidia hardware now. Too bad the game will look very sub-par compared to ATi's offerings. Glad I bought my 9800 Non Pro (modified to Pro) for $250, instead of paying $300+ for a 5900 Non Ultra. Hahah.
BeyondBeyond3D - Thursday, October 2, 2003 - link
Hahahaha... in the face of the MS and ATI's "let's make the R3xx design the actual DX9 spec" conspiracy, Nvidia comes to the rescue once more and makes their superior and flexible architecture beat the cronies at their own game. Look who has egg on their faces now :)
HawK - Thursday, October 2, 2003 - link
I think it is HIGH TIME that site owners & reviewers remember who is reading your "stories".
You have an obligation to your readers to inform them with the truth, and not to kiss and s#ck up to your paymasters.
In the last few weeks Anandtech has IMO shown which "side" they are on!
And with this final insult to me and your readers, my visits to this site will end!
ps; I do not stand on one side or the other, but I do care for good and honest information!!
BetrayerX - Thursday, October 2, 2003 - link
You Anand defenders are missing one point. This is some dangerous marketing stuff, meaning some uninformed people that might buy an ATI card might now be confused, or change lanes to Cheatzilla.
In other words it catches potential buyers who 'think' that info coming from such a "reliable source" (pun intended) should be taken seriously, only for them to find out later that the game might look like this:
http://www.iol.ie/~baz8080/crap.jpg
I, for a second, thought that [T] people hacked this site and posted their senseless blabbering.
At least the source should have been revealed.
To the guy on NVidia's drivers:
ATI may have bugs in games (and NOLF2 is the only one where I have seen one), but when NVidia has them it affects hardware. I lost a GF2MX thanks to that "double speed bug after coming back from standby" in the 6.xx and 7.xx drivers before I found out.
How about the fan that doesn't spin?
Anonymous - Thursday, October 2, 2003 - link
Quote from Anand: "I just thought you'd like to see what we're seeing, I wouldn't draw any conclusions based on this data yet, just wanted to share :)"
Half of you already did come to conclusions, you should be happy he shared this with you!
Anonymous - Thursday, October 2, 2003 - link
well you guys are seriously a bunch of ungrateful little sh*ts. this is his WEBLOG. it's NOT anandtech front page news-- it's something that he found interesting and decided to post in his friggin journal. i hope you sacs don't go on your friends' xanga and ujournal pages and bitch and moan at them for writing about things they find interesting. if you don't like what you read here, go get your news somewhere else
zim2323 - Thursday, October 2, 2003 - link
I'm personally a fan of both Nvidia and ATI. They each have their pluses and minuses. While the developers obviously have to worry about what is best to develop for and obviously get tired of dealing with certain companies' agendas *cough Nvidia cough*, I personally couldn't care less as long as the end result is that it works, regardless of cheats, optimizations, etc. To me, it all comes down to the games you play. If the games you play are largely DirectX games, then ATI is the way to go, but if you run OpenGL games, Nvidia is your choice. I own both a 5800 Ultra and an ATI 9800 Pro. In games like Battlefield that use DirectX, the ATI card averages about 5-10fps more than the Nvidia card, while in a game like Quake3, the Nvidia card running in OpenGL is about 15-20fps faster.
The games I like to compare are the games that have both DirectX and OpenGL options, such as Nascar Racing 2003. On an Nvidia card in OpenGL, I get around 100-110fps, while in D3D I only get around 30fps. But on an ATI card, in OpenGL, I get around 20fps, but in D3D, I get 80-90fps.
I think a lot of people forget that the graphics API makes a HUGE difference in performance, and as such is a key indicator in the type of performance you will get depending on the card you are using.
Perfect example of this is Nvidia's capability to be "twice as fast" as the ATI in Doom3 (OpenGL), while the ATI is "twice as fast" as the Nvidia in Half-Life 2 (DirectX).
While I have no affinity to either card manufacturer, Nvidia's drivers seem much more polished, while ATI's can be troublesome. ATI made drastic improvements with the Catalyst 3.7's and I was able to uninstall, install, overwrite, etc. those drivers many times with no real problems. Same could not be said for the older Catalyst drivers. So for that, I'm extremely pleased.
What are you guys' thoughts on this?
TMS - Thursday, October 2, 2003 - link
Go NVDA! Those are some schweeet numbers!
nyo - Thursday, October 2, 2003 - link
Yeah, I'd hate to delve into that benchmark and discover all the cheats and shortcuts taken to achieve those numbers. I trust Valve more than "unnamed, but reliable sources". I think it's gonna take more than a driver update or two to dig the NV35 out of the mess it is currently in. nVidia shot and missed. Get with it.
God - Thursday, October 2, 2003 - link
pathetic humans... get a life before it's too late
MM/XF - Thursday, October 2, 2003 - link
Hey, really unprofessional to post benchmark scores
- of a benchmark you didn't do yourself
- of a game/benchmark not available yet to the public
- using drivers that are not yet and/or will never be published, let alone WHQL'ed
- where the competitors are not running the same workload (codepaths, anyone?)
- without verification of IQ
- without even STATING THE GODDAMN source (not that it isn't pretty obvious)
See, even without having to use the words "cheat" or "optimisation" this is unacceptable, even as a "first look" or whatever.
The big hardware sites really ARE dead.
Anonymous - Thursday, October 2, 2003 - link
Um... where did they come from then?
Anonymous - Thursday, October 2, 2003 - link
I know where these numbers came from. Anand forgot to mention that the NVidia card was benched with 16-bit colour!
Anonymous - Thursday, October 2, 2003 - link
WOW!!! So when you run Nvidia's card at crappy IQ and DX 8.1, with inserted clip planes and cheats to even enhance screenshots, it is almost as fast as an ATI card running full precision, full-blown DX9 with all the bells and whistles on.
Absolutely amazing!!!!
I guess I'll run right out and get an Nvidia card!!!
Oh wait, I can buy a $200 ATI card that spanks the crap out of Nvidia's best $500 card.. Um ... maybe not....
Let me guess the source of the benchmarks.... Nvidia.
Pathetic, really pathetic.
Shalmanese - Thursday, October 2, 2003 - link
Oh, for chrissakes, these aren't HIS numbers. HE didn't do any benchmarking. They were given to him, so any comments about not doing IQ diffs are ridiculous. Just have some patience and wait for the official benchmarks/Part II of the 9800XT review to come out before commenting.
Anonymous - Thursday, October 2, 2003 - link
Yes, those nvidiot fanboys are hilarious!
Anonymous - Thursday, October 2, 2003 - link
Jesus, the fanboys are getting worse and worse these days.
Anonymous - Thursday, October 2, 2003 - link
i better not start seeing "the way they're meant to be surfed" anandtech.com ads all over this website soon. fishy indeed
Morten - Thursday, October 2, 2003 - link
To #5, if you don't like it, don't read it. Go somewhere else. It's not like these benchmarks are official. Anand even said to take it with a grain of salt. If anything, those numbers are interesting. Guess we'll find out how it stands when the benchmark is released and Anand does a real benchmark.
Wotan81 - Thursday, October 2, 2003 - link
Yet another benchmark bought by nvidia! LOL
Anonymous - Thursday, October 2, 2003 - link
Maybe it's worth waiting until Anand can do some in-house benchies. If he doesn't have the demo/time/etc, then it's not his fault he has to use someone else's numbers and not give us much info about them.
Anonymous - Thursday, October 2, 2003 - link
Wow, I feel sorry for Valve. They wasted 5x the time optimizing the NV codepath compared to the standard DX9 path. All of this time, work, and effort was in vain. In the end, NVidia just optimizes Half-Life 2 its own way.
Even though performance is very close, it's comparing apples to oranges. You are comparing the NV path, with mixed precision and missing certain high-quality effects such as HDR (which looks AMAZING in screenshots), against ATI's full-blown DX9.
Anonymous - Thursday, October 2, 2003 - link
You might as well just benchmark Nvidia cards at 640x480 and compare them to ATI cards at 1600x1200. Who is this source, and why didn't they benchmark BOTH the FX and 9800 at full precision?
Sazar - Thursday, October 2, 2003 - link
it is using the pp codepath... as anand says, mixed precision... if it is like the nv35 it will therefore likely be using fp16/32... I do however find it quite shocking that there is no additional column of dx9-codepath performance figures from anand for the nv38... that comparison and the subsequent performance gains would be what we would be looking @ for comparison...
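[Editor's note: for anyone unclear on what the fp16/fp32 distinction in the mixed-precision path actually means, here is a minimal illustrative sketch, not anything from the HL2 codepaths themselves. It uses only Python's standard `struct` module, whose `'e'` and `'f'` formats pack IEEE 754 half-precision (fp16) and single-precision (fp32) floats, and shows how much accuracy a shader value would lose at each precision; the `roundtrip` helper is purely for demonstration.]

```python
import struct

def roundtrip(value, fmt):
    """Pack a float at the given precision, then unpack it back.
    'e' = IEEE 754 half precision (fp16), 'f' = single precision (fp32)."""
    return struct.unpack(fmt, struct.pack(fmt, value))[0]

x = 1.0 / 3.0                # an exact value a shader might compute
half = roundtrip(x, 'e')     # what an fp16 register would hold
single = roundtrip(x, 'f')   # what an fp32 register would hold

print(f"fp32: {single!r}  error: {abs(single - x):.2e}")
print(f"fp16: {half!r}  error: {abs(half - x):.2e}")
```

With only 10 fraction bits, fp16 lands several orders of magnitude further from the true value than fp32's 23 bits do, which is why dropping a DX9 shader from full to partial precision can change the rendered image, not just the frame rate.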
Anonymous - Thursday, October 2, 2003 - link
Hopefully Valve will keep shipping updates along with HL2 that stop cheats as they find them as time goes on. Very, very odd numbers if you ask me; they don't jump that high "automagically", and we all know nvidia's hardware is not up to it. Like everyone is saying, there are going to be some serious IQ problems.
Anonymous - Thursday, October 2, 2003 - link
Performance like this doesn't come from thin air. Probably reducing everything to as low as FX12. Oh, and you also failed to mention that these scores are without certain features like HDR enabled, since FX cards can't do it at all. Maybe throw in some static clip planes specifically for the benchmark camera sequences... It's unbelievable that you would post these numbers, coming from a "reliable source", without any sort of IQ comparison from tests in the lab. And what source might that be... Derek Perez? Brian Burke? That's Anand for you
Anonymous - Thursday, October 2, 2003 - link
it will look fine, maybe not as nice as ati's, but we don't know that yet. plus i've seen some pics of ps 1.4 and ps 2.0 and to tell you the truth i could barely tell the diff {tomb raider}; it was there but barely noticeable
Anonymous - Thursday, October 2, 2003 - link
I am tired of this BS. Why are these major hardware sites so ready to believe whatever nvidia spews and spread falsehoods to the masses?
Anonymous - Thursday, October 2, 2003 - link
yeah, not possible. IQ will be the killer...
Anonymous - Thursday, October 2, 2003 - link
No IQ comments, eh? Which path is nVidia running? I refuse to believe they pulled this rabbit out of their hat that fast...