35 Comments
Nate420 - Monday, December 22, 2003 - link
I think the people bitching about the article being biased to one side or another are biased themselves. Let go of your hangups with ATI and Nvidia and read the article again. IMO the article was well written and about as unbiased as it could be. The fine folks at Anandtech are human after all.
titus - Saturday, December 20, 2003 - link
Here's something no one pointed out: check the Jedi Knight: Jedi Academy screenshots, and notice how the NVIDIA card applies the light sabre alpha-blending almost in *post* processing, so that even the head and body (which obstruct our view of the light sabre) glow as much as the areas that aren't obstructed.
ATI's alpha blending works in that only visible areas 'glow'; however, where it does glow, it is as irregular as heck.
Point is, they both have faults with alpha blending.
valnar - Wednesday, December 17, 2003 - link
Derek (#30), I bought a Leadtek GeForce4 Ti4400 some time ago on the strength of a review saying it was a speedy, stable and "cool" card for DirectX games. It was all of those, but a piece of visual junk otherwise. DVDs looked horrible because of the lack of a gamma adjustment; I had to settle for brightness/contrast alone, which made everything washed out. The overall contrast between 0 black and 255 white in any photo program was nowhere near my current Radeon 9600, or even an older Matrox G400. That makes it hard to do any kind of calibration for photo work, let alone enjoy the occasional DVD.
My 2 cents...
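A quick aside on the gamma point above: brightness/contrast is a linear remap of the 0-255 range, while gamma is a power curve that leaves black and white anchored, which is why the linear controls end up looking washed out. Here is a minimal sketch of the difference (my own illustration, not from the original comment; the +40 brightness offset and the 1.4 gamma value are arbitrary assumptions):

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Brightness/contrast is a linear remap; gamma is a power curve.
// A linear lift that brightens midtones also raises black and clips white,
// which is why it looks "washed out" compared to a real gamma control.
int brightnessContrast(int in, float contrast, float brightness) {
    float out = in * contrast + brightness;
    return std::clamp(static_cast<int>(out + 0.5f), 0, 255);
}

int gammaAdjust(int in, float gamma) {
    float out = 255.0f * std::pow(in / 255.0f, 1.0f / gamma);
    return std::clamp(static_cast<int>(out + 0.5f), 0, 255);
}

int main() {
    const int samples[] = {0, 32, 128, 224, 255};
    std::printf("  in   bright+40   gamma 1.4\n");
    for (int in : samples) {
        std::printf("%4d  %10d  %10d\n",
                    in,
                    brightnessContrast(in, 1.0f, 40.0f),
                    gammaAdjust(in, 1.4f));
    }
    // The linear lift turns 0 into 40 (grey blacks) and clips the top end,
    // while the gamma curve keeps 0 at 0 and 255 at 255.
}
```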
DaveBaumann - Sunday, December 14, 2003 - link
“WRT dx 10, it still seems to me that nvidia is closer to ps3 support than ati. I will definitely give the Meltdown presentations some more in depth study time though.”
Derek – As I said, PS3.0 is already part of *DX9*; DX10 will be PS/VS4.0. Please read our DirectX Next article to gain an understanding of the directions DX10 is taking:
http://www.beyond3d.com/articles/directxnext/
We need something to fill the gap between DX9 PS/VS2.0 and DX10 PS/VS4.0, that being the update to DX9.0 that will allow hardware targets for PS/VS3.0. Regardless of whether NVIDIA are closer to PS3.0 or not, that doesn’t mean ATI will have issues supporting it – ATI hadn’t supported true multisampling FSAA before R300 and yet leapfrogged NVIDIA in this respect; NVIDIA hadn’t supported PS1.4 before the FX series and yet went to PS2.0 Extended. What they currently support doesn’t necessarily have much bearing on what they are going to support.
“I know (as do many who were present when David Kirk was speaking about the issue) that the FX hardware supports Multiple Render Targets.”
FX natively supports METs, not MRTs. The FX series would probably need the driver to support MRTs via packing and unpacking instructions.
I asked a developer about the difference between MRTs and METs, and this was his reply:
“MRTs are a lot more flexible: effectively up to 4 render targets can be rendered to simultaneously, and the only real restrictions are same size and, for some chipsets (ATI, but not AFAIK PowerVR), same format.
METs are special textures that are wider than usual (usual being up to 4 channels, i.e. RGBA); METs aren't really anything special, they're just >4-channel textures (RGBAXYZW). It may be possible in METs to have different formats for each 'logical' texture, i.e. 8-bit-per-channel RGBA and float16 XY.
MRTs are much better, as you can mix and match standard render targets as you like. METs have to be special cases, whereas MRTs can be used much more like any render target.
Both are handy, but MRT is the preferred route (AFAIK METs were added to DX9 to try and cope with the GeForce FX-style pack/unpack render targets). MRT is much more the theoretical idea: you can choose to render to multiple places at the same time.”
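For readers who haven't met the terms, here is a minimal D3D9 sketch of the MRT path the developer describes. It is only an illustration of the API shape; the formats, sizes and helper name are my own assumptions, not anything from the article or either vendor's drivers:

```cpp
#include <d3d9.h>

// MRT sketch: bind up to four same-sized render targets, then let a single
// pixel shader (one that declares COLOR0..COLOR3 outputs) write to all of
// them in one pass. Error handling and state setup are omitted for brevity.
void RenderWithMRT(IDirect3DDevice9* device, UINT width, UINT height)
{
    D3DCAPS9 caps;
    device->GetDeviceCaps(&caps);
    // NumSimultaneousRTs is 1 on hardware without MRT support.
    DWORD targets = caps.NumSimultaneousRTs < 4 ? caps.NumSimultaneousRTs : 4;

    IDirect3DSurface9* rt[4] = {};
    for (DWORD i = 0; i < targets; ++i) {
        // Same size for every target; some chipsets also require same format.
        device->CreateRenderTarget(width, height, D3DFMT_A8R8G8B8,
                                   D3DMULTISAMPLE_NONE, 0, FALSE,
                                   &rt[i], NULL);
        device->SetRenderTarget(i, rt[i]);
    }

    // ... SetVertexShader / SetPixelShader / DrawPrimitive as usual; the
    // pixel shader's oC0..oC3 outputs land in rt[0]..rt[3] respectively.

    for (DWORD i = 0; i < targets; ++i) {
        if (i > 0) device->SetRenderTarget(i, NULL);  // target 0 must stay bound
        if (rt[i]) rt[i]->Release();
    }
}
```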
[Derek] “I'm not sure if flexible float buffers fall into the same category, but I will look into it.”
NVIDIA has some odd addressing restrictions with their float texture support, outside of DX’s standard requirements (for any revision), that restrict its use for output buffers; a conditional flag needs to be added to DX9 to support these restrictions on NV’s hardware. This flag is likely to be added in the update to DX9 that will also allow hardware PS/VS3.0 support next year.
DerekWilson - Sunday, December 14, 2003 - link
Dave (27+29), thanks for responding so thoroughly.
WRT dx 10, it still seems to me that nvidia is closer to ps3 support than ati. I will definitely give the Meltdown presentations some more in depth study time though.
I wasn't quite clear on a couple of things with NV supporting DX9... Not all of the features are exposed with current drivers even though the hardware is capable of them. I know (as do many who were present when David Kirk was speaking about the issue) that the FX hardware supports Multiple Render Targets. We could not get any confirmation on a reason this feature is not yet enabled in software (and I will leave the speculation to the reader). I'm not sure if flexible float buffers fall into the same category, but I will look into it.
Of course, my original assessment that these facts don't matter except from an experience standpoint for NVIDIA still stands.
I really appreciate your comments Dave, please keep them coming.
And Valnar (#25):
I am really looking forward to doing just what you are asking for in future reviews. I don't feel comfortable doing such analysis on any real current display technology, and am looking for ways to capture the output of cards over DVI. I can use Photoshop for as much analysis as necessary to compare images at that point (including spectrum/brightness and crispness or blurriness of text).
Of course, this stuff won't test the card's RAMDAC, which is still very important. I'm hoping to also come up with some solid 2D benches in the coming year that should include such analysis.
Please keep the comments and suggestions coming as I am very open to including the kinds of information you all want.
Thanks,
Derek Wilson
DaveBaumann - Saturday, December 13, 2003 - link
"So what about the missing 24bit fpu in the FX series which have 16 and 32bit the latter being very slow. Doesnt this mean that the FX series is not even compliant?"No. DirectX9 specification say that the *minimum* precision requirement for "full precision" pixel shaders is FP24 - this being the minimum requirement means that anything above that is also deemed as full precision. FP16 is the minimum requirement for partial precision shaders.
"And given the scarcity of DX9 games at present, inevitable that FXs will run into trouble in the future"
It's not an inevitability, but it may make developers' lives a little more difficult in the interim since, as NVIDIA themselves point out, they will have to work a little harder to make shaders optimal for FX hardware. Whether or not that means some development houses will put off serious use of DX9 for the time being is a real question.
However, I'd say the DX9 talk doesn't tell the whole story, as ATI's shaders appear more effective in many cases even with DX8 shaders. For instance, look at these performance numbers from a test application:
http://www.beyond3d.com/forum/viewtopic.php?p=1991...
GF FX 5950, 52.16:
PS 1.1 - Simple - 938.785889M pixels/sec
PS 1.4 - Simple - 885.801453M pixels/sec
9800 PRO
PS 1.1 - Simple - 1489.901123M pixels/sec
PS 1.4 - Simple - 1489.822754M pixels/sec
Pumpkinierre - Saturday, December 13, 2003 - link
Thank you #27 for explaining compliant versus full DX9. So what about the missing 24-bit FPU in the FX series, which has 16- and 32-bit, the latter being very slow. Doesn't this mean that the FX series is not even compliant? And given the scarcity of DX9 games at present, isn't it inevitable that FXs will run into trouble in the future?
DaveBaumann - Friday, December 12, 2003 - link
“The FX cards support full DX9. In fact, they support more than the minimum to be DX9 cards and support some features that people speculate will be in DX10 (fp32, longer shader programs, etc...).”
Derek, the FX series supports the full set of features required for DX9 “compliancy”; it does not support “full” DX9 – there are numerous optional features within the DirectX specifications that IHVs can choose to support or not. Two such features within DX9 are flexible float buffers and MRTs, neither of which the FX series supports.
The lack of float buffer support is causing some head scratching among developers, as it would generally make their lives easier (check with some to see if they would find it useful if all vendors supported it) – float buffers are already used in some games, which means the FX series can't support some options (check Tomb Raider). MRTs also have their uses, and we've published some research work from one developer on lighting which is made easier by MRT support:
http://www.beyond3d.com/articles/deflight/
As for the “speculation of the FX’s support for what will be in DX10”, please check the Microsoft Meltdown presentations – they list the specification requirements for Pixel Shader 3.0 compliance, which is already spec’ed within DX9, and the FX series falls short of those requirements.
“Part of the reason ATI is able to lead so well in performance is that they don't support many of these features.”
And NVIDIA don’t support some features that are also within DX9 – I’m not sure you can claim this is why ATI has better DX9 performance, as they both have elements that the other supports and doesn’t support.
BlackShrike - Friday, December 12, 2003 - link
I don't get it. You guys are content with sub-perfect products? When I buy something, I want it to be the best for my money. But you guys don't seem to care about the best image quality possible while getting high frame rates. Isn't that why you buy these high-end cards?
Another thing: how come AnandTech doesn't do an article finding the minimum frame rates? Remember how we found out ATI and Nvidia sometimes have high average frame rates, but very low minimum frame rates? That would be the deciding factor for me: image quality plus consistent frame rates.
Really guys, I expected more from Anandtech readers.
valnar - Friday, December 12, 2003 - link
I read the headline of this article and actually thought it was going to be about image quality. No shocker that it wasn't.
This article talked about rendering quality, not image quality. I'd really like somebody to bring out a color spectrum analyzer. I'd like to hear about text crispness and/or fuzziness when displaying text at 6 point. I'd like to hear about color accuracy and brightness/contrast/hue/saturation/gamma capabilities.
Oh well.
DerekWilson - Friday, December 12, 2003 - link
#23: The FX cards support full DX9. In fact, they support more than the minimum to be DX9 cards and support some features that people speculate will be in DX10 (fp32, longer shader programs, etc...).
Unfortunately, this doesn't make the FX cards worth any more.
Part of the reason ATI is able to lead so well in performance is that they don't support many of these features. And when games come out that can take advantage of them, it isn't likely that FX cards will run those features at very high framerates.
The only thing FX feature support does for NVIDIA is give them one more generation of experience in supporting those features. Of course, it remains to be seen what they will do with that.
ATI has proven that it can hit the nail on the head, actually leading the release of DX9 with full DX9 support. If they can do the same for DX10, they will be doing very well.
When DX10 emerges, things will get very interesting...
Pumpkinierre - Friday, December 12, 2003 - link
So does this mean NVidia FX series cards are now true DX9 cards, or DX9 compatible (whatever that means?), or partly DX9 but better at DX8.1 and hopefully DX10?
MOwings - Friday, December 12, 2003 - link
This article was excellent. The explanations of all the technologies used in these cards were very clear. I thought the NVidia screenshots were brighter (the flashlight pic in Halo, the first pic in UT on the front right of the scene). It seems NVidia is being more accurate in their methodologies. To me, correct lighting is more important than the antialiasing, so I would tend to prefer the NVidia. However, I doubt I would really notice any difference at full game speed, and so maybe it is better to get the card that is fastest (ATI), although both seem to be plenty fast enough with their current drivers. Tough call. If there are stability issues with ATI drivers that might swing me, although this is the first I have heard of problems with ATI's latest cards and drivers.
virtualgames0 - Thursday, December 11, 2003 - link
#17... I'm in the exact same shoes as you are. I went from an ATI 9700 Pro to an nvidia GeForce FX 5900 because of game incompatibility issues. Half my OpenGL games would crash. Tried every driver, only to find that other games crashed while my old game was fixed. Switched to nvidia, not one problem.
However, if you are lucky and do not have problems, I would agree the 9700 Pro had far superior AA quality, and takes less of a performance hit doing AA, but nvidia is good for me since 2xAA is all I need when I use 1600x1200 resolution.
DerekWilson - Thursday, December 11, 2003 - link
In response to the article, Scali over at Beyond3D put up a thread about the alpha blending issues we observed on the ATI cards:
http://www.beyond3d.com/forum/viewtopic.php?t=9421
He's written a program to test the accuracy of your card's alpha blending, which is kinda cool (and it also confirms that ATI is a little off in the calculation).
The theory Scali has is that the problem is due to ATI substituting in a couple of shifts to save a division.
We are definitely going to continue looking into the issue, and thanks, everyone, for your feedback.
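For anyone curious what that kind of shortcut looks like, here is a small, self-contained sketch (my own illustration; the exact shift sequence ATI's hardware uses is not public) comparing an exact divide-by-255 blend weight against a common shift-based approximation and counting how often they disagree:

```cpp
#include <cstdio>
#include <cstdlib>

// Exact 8-bit blend weighting: round(c * a / 255).
static int blendExact(int c, int a) {
    return (c * a + 127) / 255;
}

// A typical shift-based shortcut: approximate the division by 255 with two
// shifts (effectively multiplying by 257/65536). Cheaper than a true divide
// but not bit-exact.
static int blendShift(int c, int a) {
    int p = c * a;
    return (p + (p >> 8)) >> 8;
}

int main() {
    int maxError = 0, mismatches = 0;
    for (int c = 0; c < 256; ++c) {
        for (int a = 0; a < 256; ++a) {
            int d = std::abs(blendExact(c, a) - blendShift(c, a));
            if (d > 0) ++mismatches;
            if (d > maxError) maxError = d;
        }
    }
    // Shows how many of the 65,536 (colour, alpha) pairs come out slightly
    // different, and by how much: small per-channel errors of exactly the
    // kind an accuracy tester like Scali's can pick up.
    std::printf("mismatching pairs: %d of 65536, max error: %d LSB\n",
                mismatches, maxError);
}
```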
Shinei - Thursday, December 11, 2003 - link
Exactly, Araczynski. I mean, I love my UT2k3 with 4xAA/8xAF (64-tap), I really do, but I don't NEED it to enjoy the game. As long as it plays fast and looks better than the games before it, it doesn't matter if I have to dial back my AA/AF; hell, Halo won't even run above 10fps if I don't knock my filter down to bilinear, let alone whining about AF quality at 8x!! Sometimes people just need to realize that IQ is a secondary concern to getting the game to run in the first place; suffer in vain with a Ti4200 in Halo at 1024x768 and then tell me that the "lower" IQ of an FX makes it less worthwhile than a Ti or an ATI.
araczynski - Thursday, December 11, 2003 - link
Aside from the rest of the points, as far as I'm personally concerned, AA is a waste of time anyway. The only time I bother to use it is when I'm forced to run something at 800x600 or less (assuming it even supports D3D).
Other than that, I will always choose a higher resolution over an AA'd lower one. Personally I prefer the sharpness/clarity of higher resolutions to the look of AA, but that's my opinion and my taste; I know others think the opposite, which is fine by me.
Point is, I wish reviewers would stop focusing on AA quality/speed as if EVERYBODY gives a rat's hiney about it.
Ineptitude - Thursday, December 11, 2003 - link
Having owned a few ATI and Nvidia cards I can say that I agree with the article. Finding a difference in image quality between the current top models is mostly subjective. It is amazing to see the amount of bickering over the subject.
What I don't understand is the failure to mention all of the driver problems people have experienced with ATI cards. At this point I will probably never buy an ATI product again due to the poor drivers. Nvidia drivers don't crash my machine.
I've got a 9800 going cheap if anybody wants it.
tyski - Thursday, December 11, 2003 - link
Everybody here who says the article 'concluded suddenly and without concluding anything' could not have read the article. The whole point of the article is that the author provided enough information for the reader to come to his/her own conclusions.
The only conclusion that Derek made was that nVidia does more work to get approximately the same thing done. If you understand anything about real-time hardware, this is not a good thing.
Having read every article Derek has written so far, I think this is probably the best one. Unbiased throughout. And if you want a screenshot for every possible combination of AA, AF, and resolutions, then go buy the cards and see how long it really takes to perform this many benchmarks. There is such a thing as article deadlines.
Tyrel
nourdmrolNMT1 - Thursday, December 11, 2003 - link
I hate flame wars, but BlackShrike: there is hardly any difference between the images. And nvidia used to be way behind in that area, so they have caught up, and are actually in some instances doing more work to get the image to look a little different; or maybe they render everything that should be there, while ati doesn't (Halo 2).
MIKE
Icewind - Thursday, December 11, 2003 - link
I have no idea what you're bitching about, BlackShrike; the UT2k3 pics look exactly the same to me.
Perhaps you should go work for AT and run benchmarks how you want them done instead of whining like a damn 5-year-old... sheesh.
Shinei - Thursday, December 11, 2003 - link
Maybe I'm going blind at only 17, but I couldn't tell a difference between nVidia's and ATI's AF, and I even had a hard time seeing the effects of the AA. I agree with AT's conclusion, it's very hard to tell which one is better, and it's especially hard to tell the difference when you're in the middle of a firefight; yes, it's nice to have 16xAF and 6xAA, but is it NECESSARY if it looks pretty at 3 frames per second? I'm thinking "No"; performance > quality, that's why quality is called a LUXURY and not a requirement.
Now I imagine that since I didn't hop up and down and screech "omg nvidia is cheeting ATI owns nvidia" like a howler monkey on LSD, I'll be called an nVidia fanboy and/or told that A) I'm blind, B) my monitor sucks, C) I'm color blind, and D) my head is up my biased ass. Did I meet all the basic insults from ATI fanboys, or are there some creative souls out there who can top that comprehensive list? ;)
nastyemu25 - Thursday, December 11, 2003 - link
cheer up emo kid
BlackShrike - Thursday, December 11, 2003 - link
For my first line, I forgot to say blurry textures on the nvidia card. Sorry, I was frustrated with the article.
BlackShrike - Thursday, December 11, 2003 - link
Argh, this article concluded suddenly and without concluding anything. Not to mention, I saw definite blurry textures in UT 2003 and TRAOD. Not to mention the use of the D3D AF Tester seemed to imply a major problem with one or the other hardware, but they didn't use it at different levels of AF. I mean, I only use 4-8x AF; I'd like to see the difference.
AND ANOTHER THING. OKAY, THE TWO ARE ABOUT THE SAME IN AF, BUT IN AA ATI USUALLY WINS. SO, IF YOU CAN'T CONCLUDE ANYTHING, GIVE US A PERFORMANCE CONCLUSION, like which runs better with AA or AF? Which looks best with both settings enabled?
Oh, and AT: remember back to the very first ATI 9700 Pro review, you did tests with 6x AA and 16x AF. DO IT AGAIN. I want to see which is faster and better quality when the settings are absolutely maxed out, because I prefer playing at 1024*768 with 6x AA and 16x AF rather than 1600*1200 with 4x AA / 8x AF, since I have a small monitor.
I am VERY disappointed in this article. You say Nvidia has cleaned up their act, but you don't prove anything conclusive as to why. You say they are similar but don't say why. The D3D AF Tester was totally different for the different levels. WHAT DOES THIS MEAN? Come on, Anand, clean up this article; it's very poorly designed and concluded and is not at all like your other GPU articles.
retrospooty - Thursday, December 11, 2003 - link
Well a Sony G500 is pretty good in my book =)
Hanners - Thursday, December 11, 2003 - link
Not a bad article per se - Shame about the mistakes in the filtering section and the massive jumping to conclusions regarding Halo.gordon151 - Thursday, December 11, 2003 - link
I'm with #6 and the sucky monitor theory :P
Icewind - Thursday, December 11, 2003 - link
Or your monitor sucks, #5.
ATI wins either way.
retrospooty - Thursday, December 11, 2003 - link
I have been visiting Anandtech for well over 4 years, and I have often exclaimed to others how thorough, fair, and unbiased this site is...
This is the first article I have ever read here that I think is complete poop. I cannot believe that in any fair IQ test Nvidia came anywhere close to ATI. Either the author is not being honest, or is color blind. Anyone with eyeballs can compare the two and see that ATI is much sharper and more vibrant, especially with AA... Nvidia is WAY blurry.
I am very, VERY disappointed in this. :(
TheGoldenMenkey - Thursday, December 11, 2003 - link
Excellent article. I would much rather be taught why things are different than be shown some differences in rendering and then have someone declare which one is cheating. Thanks for teaching us enough to let us come to our own conclusions. Keep up the good work, AT.
tazdevl - Thursday, December 11, 2003 - link
Better look @ that... then we might have something to discuss:
http://www.anandtech.com/video/showdoc.html?i=1931...
dvinnen - Thursday, December 11, 2003 - link
Article seemed fair and unbiased to me. Your AA and AF question is obvious: look at the URL of the png file, it clearly states what is on.
It seems they have cleaned up their DX9 performance, but they still treat synthetic benchmarks badly. Most recently, the 3DMark03 patch a month ago and how they handled the media (PR on one side of the pond said one thing; on the other side, they said another).
tazdevl - Thursday, December 11, 2003 - link
So Derek, do you own stock in nVIDIA? Did Brian Burke write this for you?
Were AA and aniso used in all tests or just a few? Which ones? What modes are we comparing against which benchmarks?
Ever thought that BOTH nVIDIA and ATI could fix the outstanding issues, instead of just nVIDIA?
I swear, ever since Anand got caught up in the whole NV30 fiasco, the site's credibility is worth absolutely squat when it comes to nVIDIA.
I'm not saying ATI is without faults, but let's try to appear unbiased at a minimum in the article.