111 Comments
uturnsam - Friday, November 28, 2003 - link
#110 continued: Now I know why the guy behind the counter told me to steer clear of the ATI Radeon cards because of the known compatibility problems when running games.
(Computer sales guy thinking: I just read the article in the AnandTech post.)
Translated: I have a shit load of Nvidia cards, and if I don't lie my ass off to my customers it will be game over for me!!!
The only reason I started looking at ATI cards was that I decided to spend what I saved on the CRT monitor (over the $$ LCD) on a higher-performing card. Mr $Sales$ had me convinced I would be buying an inferior card with ATI. Worth shopping around and scouring reviews :O)
uturnsam - Friday, November 28, 2003 - link
I was going to buy a GeForce 5600 but looked at a 9600 Pro today; the thing is, I was wondering if I should really blow the budget and lash out on a 9800 Pro. I am so glad I came across this article. I will stick with the 9600 Pro, save some cash, sleep better at night, and know that when Half-Life 2 is released I will be getting the best performance for the outlay.
Anonymous User - Thursday, October 16, 2003 - link
You can count on your 9500 being in between the 9800 and the 9600, about a 30% higher frame rate than the 9600. The 4 pipelines will help.
Anonymous User - Tuesday, September 30, 2003 - link
I would like to see a test of the DX8 paths on some of the really older cards, for those of us who are too broke for these new ones!! For instance, I have a GeForce2 GTS that I love very much and that works just fine on everything else. I don't want to have to upgrade for one game.
Anonymous User - Sunday, September 21, 2003 - link
I would like to see how they compare with a 5900 using the Detonator 44.03 driver. Yes, I know it's an older driver, but in my tests it provided higher benchmark scores than the 45.23 driver. Has anybody else noticed this?
Anonymous User - Friday, September 19, 2003 - link
So actually Nvidia's shaders (16/32-bit) are not comparable with ATI's shaders (24-bit, the MS DX9 standard)! Too bad that one way or another they try to cheat again and again... Very bad idea!
Anonymous User - Tuesday, September 16, 2003 - link
#104, the benchmarks and Anand's analysis show that HL2 is GPU power limited, not memory/fillrate limited... the 9600 will be limited more by that than by memory or fillrate.
Anonymous User - Monday, September 15, 2003 - link
I think #84 mentioned this, but I didn't see a reply. In the benches, the 9600 Pro pulled the exact same (to within 0.1 fps, which could just be roundoff error) frame rates at 1024 and 1280.
I don't think I've ever seen a card bump up res without taking a measurable hit (unless it was CPU-limited). In every other game, the 9600 takes a hit going from 1024 to 1280. And the 9700 and 9800 slow down when the resolution goes up, even though they're basically the same architecture. Someone screwed up, either the benchmarks or the graphs.
Anonymous User - Monday, September 15, 2003 - link
#61 Did you take the time to see that Valve limited their testing? AnandTech had no say in the tests at all because they were very time limited. Also, try to make coherent sentences.
Anonymous User - Sunday, September 14, 2003 - link
It's not as if GIFs gobble bandwidth; I (as CAPTAIN DIALUP) don't even notice them loading. They're tiny. Even though I don't have trouble receiving this Flash stuff, it pisses me off, because sometimes the same scores will load for all the pages. Why not have a poll or something on this?
Anonymous User - Sunday, September 14, 2003 - link
Umm.. could you PLEASE not use Shockwave for those tables? Our firewalls & browser configs also won't let it through, so these reviews become pretty much useless to read.
Anonymous User - Sunday, September 14, 2003 - link
Where are the benchmarks comparing HL2 on different CPUs? I mean, I obviously know I'm gonna have to upgrade my GF3 to a new card (first game to make me even think of that... didn't care for UT2K3), but what about my venerable Athlon XP 1800+? :(
Anonymous User - Saturday, September 13, 2003 - link
#98, look at the 9700 Pro numbers, subtract 4-5%.
Still, if I were to see another set of benchmarks, I'd DEFINITELY want these:
GeForce4 MX440 OR GeForce2 Ti - As an example of how well GF2/GF4MX cards perform on low detail settings, being DX7 parts.
GeForce3 Ti200 OR GeForce3 Ti500 - It's a DX8 part, and still respectably fast; lots of people have Ti200s, anyway.
GeForce4 Ti4200 - This is an incredibly common and respectably fast card, tons of people would be interested in seeing the numbers for these.
GeForce FX 5600 Ultra - Obvious.
GeForce FX 5900 Ultra - Obvious.
Radeon 8500 - It's still a good card, you know.
Radeon 9500 Pro - Admit it, you're all interested.
Radeon 9600 Pro - Obvious.
Radeon 9700 vanilla - Because it would show how clock speed scales, and besides these (and softmodded 9500s) are quite common.
Radeon 9700 Pro - Obvious.
Radeon 9800 Pro - Obvious.
The GeForce FX 5200 and GeForce4 Ti4600 might be nice too, but the Radeons 9000 through 9200 would be irrelevant (R200-based).
Also, obviously, I'd like to see them on two or three different detail levels (preferably three), to show how well some of the slower ones run at low detail and see how scalable Source really is. Speaking of scalability, a CPU scaling test would be extremely useful as well, like AnandTech's UT2003 CPU scaling test.
This sort of thing would probably take a lot of time, but I'd love to see it, and I bet I'm not alone there. I think something like what AnandTech did with UT2003 would be great.
Just my ~$0.11.
clarkmo - Saturday, September 13, 2003 - link
I can't believe the Radeon 9500 hacked to a 9700 wasn't included in the benchmarks. What was he thinking? I guess Anand didn't have any luck scoring the right card. There are some still available, you know.
Anonymous User - Saturday, September 13, 2003 - link
Quote from the Iraqi information minister: "Nvidia is kicking ATI's butt. Their hardware is producing vastly superior numbers."
Anonymous User - Saturday, September 13, 2003 - link
Nvidia quote: "Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit."3Dfx said some years ago that no one ever would use or notice the benefits of 32 bit textures. Nvidia did and 3Dfx is gone. Will Nvidia follow the 3Dfx path?
Anonymous User - Saturday, September 13, 2003 - link
Anyone remember when ati.com sold rubber dog crap?
Pete - Saturday, September 13, 2003 - link
#74, straight from the horse's mouth: http://www.nvnews.net/#1063313306
"The GeForce FX is currently the fastest card we've benchmarked the Doom technology on and that's largely due to NVIDIA's close cooperation with us during the development of the algorithms that were used in Doom. They knew that the shadow rendering techniques we're using were going to be very important in a wide variety of games and they made some particular optimizations in their hardware strategy to take advantage of this and that served them well. --John Carmack"
Of course those D3 numbers were early (as are these HL2 ones), so things can change with updated drivers.
Anonymous User - Saturday, September 13, 2003 - link
I don't know if it has already been asked, but even if it has I ask again for emphasis.
Anand, it would be nice if you could add a 9600 non-Pro bench to the results. You mention raw GPU power being the determining factor now, and since the 9600 Pro's advantage over the non-Pro is more in memory clock than in engine clock, it would be interesting and informative to the budget/performance crowd to see the 9600 non-Pro's performance in HL2.
Thanks for all your informative, insightful, accurate, in-depth articles.
Anonymous User - Friday, September 12, 2003 - link
I always find it interesting how people say ATI is the "little guy" in this situation.
ATI has been a major player in the video card market since the eighties (I had a 16-bit VGA Wonder stuck in an 8-bit ISA slot in an 8088/2 system), and that didn't change much even when 3dfx came onto the scene. A lot of those Voodoo pass-through cards got their video from an ATI 3D Expression or some other cheap 2D card (Cirrus Logic or Trident, anyone?).
Nvidia and ATI have been at each other's throats ever since Nvidia sold its first video card on the OEM market. 3dfx was just a little blip to ATI; Nvidia stealing away a bunch of its OEM sales with a bad-2D/good-3D video card, on the other hand, well, that was personal.
I imagine someone at ATi saying something like this:
"All of you guys working on making faster DACs and better signal quality are being transferred to our new 3d department. Its sort of like 2d cept its got one more d, thats all we know for right now.".
ATI knows how to engineer and build a video card, they have been doing it for long enough. Same with Matrox (Matrox builds the Rolls Royce's of video cards for broadcast and video editing use), Nvidia on the other hand knew how to build 3d accelerators, and not much else. The 2d on any early Rage card slaughtered the early Nvidia cards.
Course, the 3d sucked balls, thats what a Canopus Pure 3d was for though.
Now ATI has the whole "3d" part of the chip figured out. The driver guys have their heads wrapped around the things as well (before 3d cards came around ATI's drivers were the envy of the industry). Its had many years of experience dealing with games companies, OS companies, standards, and customers. And its maturity is really starting to show after a few minor bumps and bruises.
ATI wants its market back, and after getting artx it has the means to do it. Of course, Nvidia is going to come out of this whole situation a lot more grown up as well. Both companies are going to have to fight blood tooth and nail to stay on top now. If they don't Matrox might just step up to the plate and bloody both of their noses. Or any of those "other" long forgotten video card companies that have some engineers stashed away working on DX 7 chips.
God knows what next month is going to bring.
Anyways, sorry for the rant..
Anonymous User - Friday, September 12, 2003 - link
Anand: When you re-test with the Det 50's, make sure you rename the HL2 exe!!! Gotta make the comparison as fair as possible...
Anonymous User - Friday, September 12, 2003 - link
#69 How does the 9500 not fully support DX9? It's EXACTLY the same core as the 9700.
Anonymous User - Friday, September 12, 2003 - link
#53 - So YOU'RE that bastard who's been lagging us out!!! Get out of the dark ages!
Anonymous User - Friday, September 12, 2003 - link
What kind of conclusion was that?
In terms of the performance of the cards you've seen here today, the standings shouldn't change by the time Half-Life 2 ships - although NVIDIA will undoubtedly have newer drivers to improve performance. Over the coming weeks we'll be digging even further into the NVIDIA performance mystery to see if our theories are correct; if they are, we may have to wait until NV4x before these issues get sorted out.
For now, Half-Life 2 """ SEEMS """ to be best paired with ATI hardware and as you've seen through our benchmarks, whether you have a Radeon 9600 Pro or a Radeon 9800 Pro you'll be running just fine. Things are finally """heating up""" and it's a good feeling to have back...
HL2 ""seems"" better on ATI??? , should be, HL2 looks way better and faster on ATI.
Things are finally """heating up""" ??? shoul have been , ATI's performance is killing Nvidia's FX.
The conclusion should have been :
Nvidia lied and sucks , Valva had to lower standard ( actually optimize (cheat) in favor of Nvidia) and make HL2 game look bad , just so you could play on your overpriced Nvidia Fx cards.
How about a word of apology from Anand to have induced readers in errors , and have told them to buy Nvidia Fx card's in is last Fx5900 review. ???
From a future ATI card owner, (bundled with HL2 of course)
Boy I'm pissed off!
Anonymous User - Friday, September 12, 2003 - link
82, those are 9600 regulars (!), click the links. Pricewatch has been fooled. A Pro isn't much more, though, just about $136.
I'd go with a 9500 over a 9600 any day. The 9500 can be softmodded to 9700 performance levels (about 50-70% of the time, IIRC, and it's actually a little cheaper than the 9600 Pro!). If the softmod doesn't work out, then you return it for a new one. Of course, not everyone wants to do this, and a 9600 Pro is a respectable and highly overclockable card.. but..
I'd still love to see 9500 Pros at lower prices, like they would have been if ATi had kept it out.. but whatever. If you don't know, the 9500 Pro is/was considerably faster than the 9600 Pro. Valve said that HL2 isn't memory-limited, so the 128-bit memory interface on the 9500 Pro (which never made a big difference vs. the 9700 anyway) shouldn't even be noticeable, and the fact that the Sapphire-made ones were just as overclockable as the 9500 regulars and 9700s (think up to 340 core, 350 if you're lucky) is going to make it one HELL of an HL2 card for the $175 most people paid.
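For readers wondering why the 9500 Pro's 128-bit bus mattered less than expected: peak memory bandwidth is just bus width times effective (DDR) memory clock. A rough Python sketch follows, using approximate retail clocks of the era (roughly 270 MHz DDR for the 9500 Pro and 9700, 310 MHz DDR for the 9700 Pro); treat the clocks as assumptions rather than verified specs.

```python
def peak_bandwidth_gbps(bus_width_bits: int, mem_clock_mhz: float, ddr: bool = True) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    bytes_per_transfer = bus_width_bits / 8
    effective_clock_hz = mem_clock_mhz * 1e6 * (2 if ddr else 1)
    return bytes_per_transfer * effective_clock_hz / 1e9

# Clock figures below are rough recollections of retail specs, not verified numbers.
cards = [
    ("Radeon 9500 Pro (128-bit bus)", 128, 270),
    ("Radeon 9700 (256-bit bus)", 256, 270),
    ("Radeon 9700 Pro (256-bit bus)", 256, 310),
]
for name, bus, clock in cards:
    print(f"{name}: ~{peak_bandwidth_gbps(bus, clock):.1f} GB/s")
```

If HL2 really is shader-limited rather than bandwidth-limited, as the comment above argues, the roughly 2x gap between ~8.6 GB/s and ~17 GB/s would matter less than raw pixel-shader throughput.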
Anonymous User - Friday, September 12, 2003 - link
Nvidia got schooled, but not on hardware or drivers. ATI locked this up long ago with their deal with Gabe and buddies.
Why is everyone just trying to keep a straight face about it? ATI paid handsomely for exactly what has happened to NVidia.
But as always happens, watch out when the tables turn, as they ALWAYS do, and Valve could be on the OUTSIDE of lots of other deals.
Anonymous User - Friday, September 12, 2003 - link
I am just glad there is finally a damn game that can stress out these video cards. Wonder if Bitboys Oy, or whatever their name is, will come out saying they have a new video card that will run Half-Life 2 at 100+ FPS :) What made me think of them I have no idea!
Anonymous User - Friday, September 12, 2003 - link
Not to detract from the main issue here, but #19 raises a good point. Why does the 9600 Pro lose only <1% performance going from 1024 to 1280? The 9800P and 9700P lose between 10-15%. The 5900U loses 30%, sometimes more. I wonder if the gap between the 9800P and 9600P shrinks even more at higher resolutions.
What aspect of the technology in the 9600 could possibly account for this?
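One way to frame the question: 1280x1024 has about 1.67x the pixels of 1024x768, so a card that is purely fill-rate or bandwidth limited should lose roughly 40% of its frame rate at the higher resolution. A flat result instead points to a bottleneck elsewhere (per-frame shader or CPU work), or to the graphing mix-up #84 suspected. A back-of-the-envelope sketch in Python, using a purely hypothetical 60 fps starting point:

```python
pixels_low = 1024 * 768
pixels_high = 1280 * 1024
ratio = pixels_high / pixels_low          # ~1.67x more pixels per frame

fps_low = 60.0                            # hypothetical frame rate at 1024x768
fps_if_fill_limited = fps_low / ratio     # ~36 fps if every pixel costs the same

print(f"pixel ratio: {ratio:.2f}x")
print(f"expected fps at 1280x1024 if purely fill/bandwidth limited: {fps_if_fill_limited:.1f}")
# A <1% drop, as reported for the 9600 Pro, is nowhere near this, which is why the
# flat result suggests a non-per-pixel bottleneck or a mix-up in the graphs.
```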
Anonymous User - Friday, September 12, 2003 - link
#81 You can find 9600 Pros for ~$160 from Newegg.
A couple of small webstores have a "Smart PC 9600" non-Pro 128 MB for <$100. But the Smart PC card is a cheap OEM unit... I'm not sure if it's as good as the more expensive 9600s.
Anonymous User - Friday, September 12, 2003 - link
Pricewatch:
$123 - RADEON 9600 Pro 256MB
$124 - RADEON 9600 Pro 128MB
atlr - Friday, September 12, 2003 - link
Quote from http://www.anandtech.com/video/showdoc.html?i=1863...
"The Radeon 9600 Pro manages to come within 4% of NVIDIA's flagship, not bad for a ~$100 card."
Anyone know where a ~$100 9600 Pro is sold? I thought this was a ~$200 card.
Anonymous User - Friday, September 12, 2003 - link
Time to load up on ATI stock :)
Anonymous User - Friday, September 12, 2003 - link
Quoted from Nvidia.com:"Microsoft® DirectX® 9.0 Optimizations and Support
Ensures the best performance and application compatibility for all DirectX 9 applications."
Oops, not this time around...
Anonymous User - Friday, September 12, 2003 - link
#74 - No, D3 isn't a DX9 game, it's OpenGL. What it shows is that the FX series isn't bad - they just don't do so well under DX9. If you stick primarily to OpenGL games and run your DX games under the 8.1 spec, the FX should perform fine. It's the DX9 code that the FXes seem to really struggle with.
Anonymous User - Friday, September 12, 2003 - link
#74: I have commonly heard this blamed on a bug in an older release of the CATALYST drivers that were used in the Doom3 benchmark. It is my understanding that if the benchmark were repeated with the 3.7 (RELEASED) drivers, the ATI cards would perform much better.
#75: I believe this goes back to prior instances where Nvidia has claimed that some new driver would increase performance dramatically to get it into a benchmark, and then never released the driver for public use. If this happened, the benchmark would be unreliable, as it could not be repeated by an end user with similar results.
Also, the Det50 drivers from Nvidia do not have a working fog system. It has been hinted that this could be intentional, to improve performance. Either way, I saw a benchmark today (forgot where) that compared the Det45s to the beta Det50s. The 50s did improve performance in 3DMark03, but nowhere near the 73% gap in performance seen in HL2.
Anonymous User - Friday, September 12, 2003 - link
Because Gabe controls how representative the HL2 beta is of the final HL2 product, but he cannot control how representative the Nvidia Det50 beta is of the final Det50s.
And besides that, there have been rumours of "optimisations" in the new Det50s.
Anonymous User - Friday, September 12, 2003 - link
How is it that Gabe can recommend not running benchmarks on a publicly unavailable driver or hardware, yet the game itself is unavailable? Seems a little hypocritical....
Anonymous User - Friday, September 12, 2003 - link
I didn't have time to look into this, but can someone enlighten me as to why the 5900 Ultra outperformed the 9800 PRO in the Doom 3 benchmarks we saw a while back... is that not using DX9 as well? If I am way off the mark here, or am even wrong on which outperformed which, go easy on the flames!
Thanks
Anonymous User - Friday, September 12, 2003 - link
"Not true, the 9500 is a true DX9 part. The 9600 does have faster shader units though."My bad, should have looked at ATI first. I guess I'm thinking about the 8500. Either way, I would still go 9600 Pro, especially given that it is cheaper than a 9500 non-pro.
Anonymous User - Friday, September 12, 2003 - link
"The 9600 fully supports DX9 whereas the 9500 does not."Not true, the 9500 is a true DX9 part. The 9600 does have faster shader units though.
Anonymous User - Friday, September 12, 2003 - link
I think the insinuation is clear from that nVidia email posted and Gabe's comments. Valve believed nVidia was trying to "cheat" with their D50s by intentionally having fog disabled, etc. Rather than toss around accusations, it was simpler for them to just require that the benchmarks at this point be run with released drivers and avoid the issue of currently bugged drivers with non-working features, whether the reason was accidental or intentional.
Considering that the FXes fared poorly with 3DMark and again with HL2 - both using DX9 implementations - I think it might be fair to say that the FXes aren't going to do too much better in the future. Especially considering the way they reacted to 3DMark 03 - fighting the benchmark rather than releasing drivers to remedy the performance issue.
I'd like to see how the FXes do running HL2 with pure DX8 rather than DX9 or a hybrid, as I think most people owning current nVidia cards are going to have to go that route to achieve the framerates desired.
Anonymous User - Friday, September 12, 2003 - link
I don't see how the minimum requirements set by Valve are going to play this game. 700 MHz and a TNT2? The FX5200s could barely keep up.
Anonymous User - Friday, September 12, 2003 - link
#68: 33 fps * 1.73 = 57.09 fps (the 1 accounts for the initial 33 fps baseline).
This doesn't quite work out based on the 57.3 score of the 9800 Pro, so the corrected score for the Nvidia card was probably closer to this:
57.3 / 1.73 = 33.12 fps
#69: I would definitely try to find a 9600 Pro before I bought a 9500 Pro. The 9600 fully supports DX9 whereas the 9500 does not.
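To make the percentage arithmetic from the preceding comment explicit, here is a small Python sketch; the 33 and 57.3 fps figures are the ones quoted in the comments, used purely for illustration:

```python
fx5900 = 33.0       # fps figure quoted above for the GeForce FX 5900
r9800_pro = 57.3    # fps figure quoted above for the Radeon 9800 Pro

# "X% faster": the gap measured against the slower card's score
lead_over_fx = (r9800_pro - fx5900) / fx5900          # ~0.74 -> "~73% faster"
# "X% slower": the same gap measured against the faster card's score
deficit_vs_r9800 = (r9800_pro - fx5900) / r9800_pro   # ~0.42 -> "~42% slower"

print(f"9800 Pro is {lead_over_fx:.1%} faster than the 5900")
print(f"5900 is {deficit_vs_r9800:.1%} slower than the 9800 Pro")
# Both numbers describe the same gap from different baselines, which is why
# different comments can arrive at different-looking percentages.
```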
Anonymous User - Friday, September 12, 2003 - link
Guess it's time to upgrade... Now where's my &*&%%'n wallet!!
Wonder where I'll be able to find a R9500Pro (Sapphire)
Anonymous User - Friday, September 12, 2003 - link
The performance increase between the FX5900 and Rad9800Pro is not 73%. Do the math correctly and it turns into a 36.5% lead. The article should be revised.
atlr - Friday, September 12, 2003 - link
If anyone sees benchmarks for 1 GHz computers, please post a URL. Thanks.
WooDaddy - Friday, September 12, 2003 - link
Hmmm... I understand that Nvidia would be upset. But it's not like ATI is using a special setting to run faster. They're using DX9.. Nvidia needs to get on the ball. I'm going to have to upgrade my video card since I have a now obsolete Ti4200 GF4.
GET IT TOGETHER NVIDIA!!! DON'T MAKE ME BUY ATI!
I might just sell my Nvidia stock while I'm at it. HL2 is a big mover and I believe can make or break the card on the consumer side.
Anonymous User - Friday, September 12, 2003 - link
I had just ordered a 5600 Ultra thinking it would be a great card. It's going back.
If I can get full DX 9 performance with a 9600 Pro for around $180, and that card's performance is better than the 5900 Ultra - then I'm game.
I bought a TNT when Nvidia was making a name for itself. I bought a GF2 GTS when Nvidia was destroying 3dfx - now Nvidia seems to have dropped the ball on DX9. I want to play HL2 on whatever card I buy. A 5600 Ultra doesn't seem like it will cut it. I know the 50's are out there, but I've seen the Aquamark comparison with the 45's and 50's and I'm not impressed.
I really wanted to buy Nvidia, but I cannot afford it.
Anonymous User - Friday, September 12, 2003 - link
#62: I do have the money but I choose to spend it elsewhere. FYI: I spent $164 US on my 2.4C and I'm running speeds faster than the system used for this benchmark.
"The Dell PCs we used were configured with Pentium 4 3.0C processors on 875P based motherboards with 1GB of memory. We were running Windows XP without any special modifications to the OS or other changes to the system."
Anand was using a single system to show what HL2 performance would be on video cards available on the market today. If he were to run benchmarks on different CPUs he would have to spend a tremendous amount more time doing so. In the interest of getting the info out as soon as possible, he limited himself to a single system.
I would deduce from the performance numbers of HL2 in Anand's benchmarks that unless you have a 9600 Pro/9800 Pro, your AMD will not be able to effectively run HL2.
Anonymous User - Friday, September 12, 2003 - link
Woohoooo!!! My ATI 9500@9700 128MB with 8 pixel pipelines and 256-bit memory access beats the crap out of any FX.
And it only cost me 190 euros/190 dollars.
Back to the drawing board NVidia.
Muahahahah!!!
Anonymous User - Friday, September 12, 2003 - link
#61.. I take it YOU have the money to shell out for top-of-the-line hardware????????? I sure as hell don't, but like #42 said, "more widely used comp". I myself am running a 1700+ at 2400+ speeds; no way in hell am I gonna go spend the 930 bucks (in CDN funds) on a 3.2C P4, and that's NOT including the mobo and RAM, and I'm also not gonna spend the 700 CDN on a Barton 3200+ either. For the price of the above P4 chip I can get a whole decent comp. It may not be able to run Half-Life at its fullest, but still. I'm not even interested in HL2, it's just not the kind of game I play, but if I was, what I typed above is still valid..
anand... RUN THESE HL2 BENCHES ON HARDWARE THE AVERAGE PERSON CAN AFFORD!!!!!!!!!!!!!!!!!!!!!!!! Not the spoiled rich kid crap.....
Anonymous User - Friday, September 12, 2003 - link
#42 "...should have benchmarked on a more widely used computer like a 2400 or 2500+ AMD...":The use of 'outdated' hardware such as your 2400 AMD would have increased the possibility of cpu limitations taking over the benchmark. Historically all video card benchmarks have used the fastest (or near fastest) GPU available to ensure the GPU is able to operate in the best possible scenario. If you want to know how your 2400 will work with HL2, wait and buy it when it comes out.
In reference to the 16/32 bit floating point shaders and how that applies to ATI's 24 bit shaders:
It was my understanding that this quote was referencing the need for Nvidia to use its 32-bit shaders, as future support for its 16-bit shaders would not exist. I don't see this quote pertaining to ATI's 24-bit shaders, as they meet the DX9 specs. The chance of future HL2 engine based games leaving ATI users out in the cold is somewhere between slim and none. For an example of how software vendors react to leaving out support for a particular line of video cards, simply look at how much work Valve put into making Nvidia's cards work. If it was feasible for a software vendor to leave out support for an entire line like you are referring to (ATI in your inference), we would have had HL2 shipping by now (for ATI only though...).
Anonymous User - Friday, September 12, 2003 - link
58, http://myweb.cableone.net/jrose/Jeremy/HL2.jpg
Anonymous User - Friday, September 12, 2003 - link
Are pixel shader operations anti-aliased on current generation video cards? I ask because in the latest Half Life 2 technology demo movie, anti-aliasing is enabled. Everything looks smooth except for the specular highlights on the roof and other areas, which are still full of shimmering effects. Just seems a little sore on the eyes.
Anonymous User - Friday, September 12, 2003 - link
An observation: Brian Burke = Iraqi Information Officer
I mean this guy rode 3dfx into the dirt nap and he's providing the same great service to Nvidia.
Note to self: Never buy anything from a company that has this guy spewing lies.
Anonymous User - Friday, September 12, 2003 - link
OK, this article was great. For us freaks, can you do a supplemental article? Do 1600x1200 benchmarks!!!
Things will probably crawl, but it would be nice to know that this should be the worst case at this resolution when ATI and NVidia come out with next gen cards.
Also, was any testing done to see if the benchmarks were CPU or GPU limited? Maybe use the CPU utilization monitor in Windows to see what the CPU thought. Maybe a 5.0 GHz processor down the road will solve some headaches. Doubtful, but maybe....
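One crude way to check, short of a profiler: run the same demo at two resolutions. A GPU-limited run slows down noticeably at the higher resolution, while a CPU-limited run stays flat. A minimal sketch of that rule of thumb in Python (the fps numbers and the 5% threshold here are made up purely for illustration):

def likely_bottleneck(fps_low_res, fps_high_res, tolerance=0.05):
    # If fps barely drops as resolution rises, the GPU probably isn't the limit.
    drop = (fps_low_res - fps_high_res) / fps_low_res
    return "CPU-limited (or otherwise capped)" if drop < tolerance else "GPU-limited"

print(likely_bottleneck(60.2, 59.8))  # flat across resolutions -> CPU-limited
print(likely_bottleneck(60.2, 41.5))  # big drop -> GPU-limited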
Anonymous User - Friday, September 12, 2003 - link
What's really funny is that Maximum PC magazine built an $11000 "Dream Machine" using a GeForce FX 5900, and I can build a machine for less than $2000 and beat it using a 9800 Pro.
Long live my 9500 Pro!
Anonymous User - Friday, September 12, 2003 - link
I can play Frozen Throne and I am doing so on a GeForce2 MX LOL (on a P2@400MHz).
Anonymous User - Friday, September 12, 2003 - link
Look at my #46 posting - I know it's different engines, different APIs, different driver revisions etc... but still it's interesting..
enigma
Anonymous User - Friday, September 12, 2003 - link
#52 Different engines, different results. HL2 is probably more shader limited than Doom 3. The 9600 Pro has strong shader performance, which narrows the gap in shader-limited situations such as HL2.
BTW, where did you get those Doom 3 results? The only Doom 3 benches I know about are based off the old alpha or that invalid test from back when the NV35 was launched...
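To put a rough number on the shader-limited point, here's a back-of-envelope sketch in Python; the 4 pipelines x 400 MHz figure is the 9600 Pro's theoretical peak fill rate and the overdraw factor is a guess, so treat it as illustration only, not a measurement:

def required_fill_mpixels(width, height, fps, overdraw=3.0):
    # Pixels that must be shaded per second, with a guessed overdraw factor.
    return width * height * fps * overdraw / 1e6

r9600_fill = 4 * 400  # Mpixels/s, theoretical peak (4 pipes x 400 MHz)

for w, h in [(1024, 768), (1280, 1024)]:
    need = required_fill_mpixels(w, h, 60)
    print(f"{w}x{h} @ 60 fps needs ~{need:.0f} of {r9600_fill} Mpixels/s available")

Both resolutions ask for only a fraction of the card's raw fill rate, which is consistent with long DX9 pixel shaders, rather than fill rate, being the limiter - and with the 9600 Pro's scores barely moving between 1024 and 1280.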
Anonymous User - Friday, September 12, 2003 - link
Another thing I just noticed looking at the Doom 3 and HL2 benchies: take a look at the performance of the 9800 Pro and 9600 Pro...
In HL2, the 9800 Pro is about 27% ahead of the 9600 Pro; in Doom 3 the 9800 Pro is nearly 50% faster than the 9600 Pro. The whole thing just feels weird.
enigma
Anonymous User - Friday, September 12, 2003 - link
I'm surprised that Anand mentioned nothing about the comparison between 4x2 and 8x1 pipelines. Does he even know that MS is working to include paired textures with simultaneous wait states for the nV architecture? You see, the DX9 SDK was developed with only one path in mind, and since each texture has a defined FIFO during the pass, the second pipe in the nV part is dormant until the first pipe's FIFO operation is complete; with paired textures in the pipe using synchronous wait states, this 'problem' will be greatly relieved.
Anonymous User - Friday, September 12, 2003 - link
It's fake.... HL2 tests are not ready today, great fake Anandtech :)
rogerw99 - Friday, September 12, 2003 - link
#28 Ooo Ooo Ooo... I know the answer to that one.
It was Mrs. White, but it wasn't with the gun, it was the lead pipe.
Anonymous User - Friday, September 12, 2003 - link
ATI The Way It Should Be Played
Anonymous User - Friday, September 12, 2003 - link
Quote: 'So why is it that in the age of incredibly fast, absurdly powerful DirectX 9 hardware do we find it necessary to bicker about everything but the hardware? Because, for the most part, we've had absolutely nothing better to do with this hardware.'
Don't we? Wrong!
http://www.cs.virginia.edu/~gfx/pubs/multigridGPU/
;)
Anonymous User - Friday, September 12, 2003 - link
One thing that I think is kinda interesting: check out this benchmark HardOCP did - FX5900 Ultra vs. Radeon 9800 Pro in Doom 3 (with help from id Software).
http://www.hardocp.com/article.html?art=NDc0LDE=
After reading this, read Carmack's Jan 03 .plan, where he states that under the default OpenGL codepath, the FX architecture is about half as fast as the R300 - something that is pretty much mirrored in the HL2 benchmarks. Furthermore, he states that using the default path the R300 is clearly superior (+100%), but when converting to vendor-specific codepaths, the FX series is the clear winner.
Conclusions? None, but some possibilities:
.) ATI is better in DirectX, nVidia in OpenGL
.) id can actually code, Valve cannot
.) and your usual conspiracy theories; feel free to use the one you specifically like
Bottom line: neither ATI nor nVidia cards are the "right ones" at the moment; wait for the next generation of video cards and upgrade THEN.
enigma
Anonymous User - Friday, September 12, 2003 - link
I'm so glad I converted to ATi, I have never regretted it & now it feels even better. ATi rules
notoriousformula - Friday, September 12, 2003 - link
I'm sure Nvidia will strike back.. prolly with DOOM III.. well, till then I'll enjoy my little army of ATI cards: ATI 9800NP>PRO, ATI 9700, ATI 9600PRO :P.. long live ATI!!! :D
Anonymous User - Friday, September 12, 2003 - link
Anand should have benchmarked on a more widely used computer like a 2400 or 2500+ AMD. Who here has the money to buy a P4 3GHz 800MHz FSB CPU?
Anonymous User - Friday, September 12, 2003 - link
==="full 32-bit would be required" not 24-bit. So that leaves all ATI cards out in the cold.===By the time full 32-bit becomes standard (probably with DX10 in 2-3 years) there will be NEW cards that make current cards look like sh!t. ATi will have DX10 cards for under $100, same as nVidia and their 5200. People have been upgrading their PC's for new games for YEARS! Only an [nv]IDIOT would attempt to use an old card for new games and software (TNT2 for Doom3? NOT!).
Anonymous User - Friday, September 12, 2003 - link
Funny that you guys think nVidia will still be "plugging along" with the GFFX if the DX spec changes to 32-bit... you _do_ know what happens to the GFFX when it's forced to run 32-bit precision, don't you? You'd get faster framerates by drawing each frame by hand on your monitor with a sharpie.
Pete - Friday, September 12, 2003 - link
#23, the second quote in the first post here may be of interest: http://www.beyond3d.com/forum/viewtopic.php?t=7839... Note the last sentence, which I surrounded by ***'s.
"nVidia has released the response as seen in the link. Particularly interesting, however, is this part of the e-mail sent to certain nVidia employees (this was not posted at the given link):
'We have been working very closely with Valve on the development of Half Life 2 and tuning for NVIDIA GPU's. And until a week ago had been in close contact with their technical team. It appears that, in preparation for ATI's Shader Days conference, they have misinterpreted bugs associated with a beta version of our release 50 driver.
You also may have heard that Valve has closed a multi-million dollar marketing deal with ATI. Valve invited us to bid on an exclusive marketing arrangement but we felt the price tag was far too high. We elected not to participate. ***We have no evidence or reason to believe that Valve's presentation yesterday was influenced by their marketing relationship with ATI.***'"
If this document is indeed real, nV themselves told their own employees Gabe's presentation wasn't skewed by Valve's marketing relationship with ATi.
Anonymous User - Friday, September 12, 2003 - link
Link please #38
Anonymous User - Friday, September 12, 2003 - link
LOL! #19, I saw that too. Looks like I'll be replacing my nVidia 'the way it's meant to be played in DX8 because our DX9 runs like ass, and we still sell it for $500+ to uninformed customers' card with an ATi Radeon. Thanks for the review, Anand; it will be interesting to see the AA/AF benchmarks, but I have a pretty good idea of who will win those as well.
Anonymous User - Friday, September 12, 2003 - link
>>>>>>>ANYONE ELSE CATCH THE FOLLOWING IN THE ARTICLE<<<<<<<<<<<<<<<
"One thing that is also worth noting is that the shader-specific workarounds for NVIDIA that were implemented by Valve, will not immediately translate to all other games that are based off of Half-Life 2's Source engine. Remember that these restructured shaders are specific to the shaders used in Half-Life 2, which won't necessarily be the shaders used in a different game based off of the same engine."
So I guess the nvidia fan boys won't be able to run their $500 POS cards with Counterstrike 2 since it will probably be based on the HL2 engine.
buhahahaha
>>>>>>>>>>>>>>>>>>>>>>>>><<<<<<<<<<<<<<<<<<<
Anonymous User - Friday, September 12, 2003 - link
Valve specifically said "full 32-bit would be required" not 24-bit. So that leaves all ATI cards out in the cold.
Pete - Friday, September 12, 2003 - link
#23, I believe you're inferring far too much from ATi's HL2 bundling. Check TechReport's article on Gabe's presentation, in which Gabe is noted as saying Valve chose ATi (in the bidding war to bundle HL2) because their cards quite obviously performed so much better (and look better doing it--keep in mind, as Anand said, all those nVidia mixed modes look worse than pure DX9).
In short, Valve doesn't need to do much to please others, as they're the one being chased for the potentially huge-selling Half-Life 2. Everyone will be sucking up to them, not the other way around. And it wouldn't do for Valve to offer nV the bundle exclusive, have consumers expect brilliant performance from the bundled FX cards, and get 12fps in DX9 on their DX9 FX card or 30fps on their $400+ 5900U. That would result in a lot of angry customers for Valve, which is a decidedly bad business move.
People will buy HL2 regardless. Valve's bundling of HL2 with new cards is just an extra source of income for them, and not vital to the success of HL2 in any way. Bundling HL2 will be a big coup for an IHV like ATi, which requires boundary-pushing games like HL2 to drive hardware sales. Think of the relationship in this way: it's not that ATi won the bidding war to bundle HL2, but that Valve *allowed* ATi to win. Valve was going to get beaucoup bucks for marketing tie-ins with HL2 either way, so it's in their best interests to find sponsorships that present HL2 in the best light (thus apparently HL2 will be bundled with ATi DX9 cards, not their DX8 ones).
You should read page 3 of Anand's article more closely, IMO. Valve coded not to a specific hardware standard, but to the DX9 standard. ATi cards run standard DX9 code much better than nV. Valve had to work extra hard to try to find custom paths to allow for the FX's weaknesses, but even that doesn't bring nV even with ATi in terms of performance. So ATi's current DX9 line-up is the poster-child for HL2 almost by default.
We'll see what the Det50's do for nV's scores and IQ soon enough, and that should indicate whether Gabe was being mean or just frank.
Anonymous User - Friday, September 12, 2003 - link
#33 To be pedantic, the spec for DX9 is a 24-bit minimum; it has never been said by Microsoft that it was 24-bit and nothing else, 24-bit is just a minimum. Just as 640x480 is a minimum. That doesn't make 1024x768 non-standard.
But supposing you are right, and 24-bit is a rock solid standard, doesn't that mean that Valve in the future will violate the DX9 spec in your eyes? Does that not mean that ATI cards will be left high and dry, in the future? After all, there will be no optimizations allowed/able?
32bit is the future, according to Valve after all.
Nvidia may suck at doing it, but at least they can do it.
XPgeek - Friday, September 12, 2003 - link
edit, post #32-should read, "my ATi is so faster than YOUR nVidia"
dvinnen - Friday, September 12, 2003 - link
#31: I know what I said. DX9 doesn't require 32-bit. It's not in the spec, so you couldn't write a shader that uses more than 24-bit precision.
XPgeek - Friday, September 12, 2003 - link
Well #26, if the next gen of games do need 32-bit precision, then the tides will once again be turned, and all these "my ATi is so faster than for nVidia" folks will have to just suck it up and buy another new card, whereas the GFFXs will still be plugging along. By then, who knows, maybe DX10 will support 32-bit precision on the nVidia cards better...
BTW, I'm still loading down my GF3 Ti500, so regardless, I will have crappy perf. But I also buy cards from the company I like, that being Gainward/Cardex nVidia-based boards. No ATi for me, also no Intel for me. Why? Bcuz it's my choice. So it may be slower, whoopty-doo!
For all I know, HL2 could run for crap on AMD CPUs as well. So I'll be in good shape then with my XP2400+ and GF3.
Sorry, I know my opinions don't matter, but I put 'em here anyhow.
Buy what you like, don't just follow the herd... unless you like having your face in everyone's ass.
Anonymous User - Friday, September 12, 2003 - link
#28 Not 24-bit, 32-bit.
Anonymous User - Friday, September 12, 2003 - link
Yeah, like mentioned above, what about whether or not AA and AF were turned on in these tests? Do you talk about it somewhere in your article?
I can't believe it's not mentioned, since this site was the one that made a detailed (and excellent) presentation of the differences b/w ATI and nVidia's AA and AF back in the day.
Strange your benchmarks appear to be silent on the matter. I assume they were both turned off.
Anonymous User - Friday, September 12, 2003 - link
>>thus need full 32-bit precision."<<
Huh? Wha?
This is an interesting can of worms. So some months down the road, if ATI sticks to 24-bit, or cannot develop 32-bit precision, the tables will have turned on the current situation - but even more so, because there would not be a workaround (or optimization).
Will ATI users in the future accuse Valve of sleeping with Nvidia because their cards cannot shade with 32-bit precision?
Will Nvidia users claim that ATI users are "non-compliant with DirectX 9"? Will ATI users respond that 24-bit precision is the only acceptable DirectX 9 standard, and that Valve are traitors?
Will Microsoft actually force manufacturers to bloody well wait and force them to follow the standard?
And finally, who did shoot Colonel Mustard in the Dining Room?
Questions, Questions.
dvinnen - Friday, September 12, 2003 - link
#26: It means it can't cheat and use 16-bit registers to do it and needs a full 24 bits. So it would waste the rest of the register.
Anonymous User - Friday, September 12, 2003 - link
#26 That was in reference to the FX cards. They can do 16- or 32-bit precision. ATI cards do 24-bit precision, which is the DX9 standard.
24-bit is the DX9 standard because it's "good enough." It's much faster than 32-bit, and much better looking than 16-bit. So 16-bit will wear out sooner. Of course, someday 24-bit won't be enough either, but there's no way of knowing when that'll be.
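To put some rough numbers on that, here's a minimal sketch in Python assuming the commonly cited mantissa widths (10 bits for FP16, 16 for ATI's FP24, 23 for FP32 - the exact layouts are my assumption, not something from the article or the comments):

formats = {"FP16": 10, "FP24": 16, "FP32": 23}

for name, mantissa_bits in formats.items():
    # Relative precision is roughly one part in 2^mantissa_bits.
    steps = 2 ** mantissa_bits
    print(f"{name}: about 1 part in {steps:,} (relative error ~{1.0 / steps:.1e})")

Each extra mantissa bit doubles the number of distinguishable values, which is roughly why FP16 starts to show banding in long shader chains well before FP24 or FP32 do.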
Anonymous User - Friday, September 12, 2003 - link
Valve says no benchmarks on Athlon 64! :-/ Booo!
Quote:
http://www.tomshardware.com/business/20030911/inde...
"Valve was able to heavily increase the performance of the NVIDIA cards with the optimized path but Valve warns that such optimizations won't be possible in future titles, because future shaders will be more complex and will thus need full 32-bit precision."
The new ATI cards only have 24bit shaders!
So would that make ALL current ATI cards without any way to run future Valve titles?
Perhaps I do not understand the technology fully, can someone elaborate on this?
Anonymous User - Friday, September 12, 2003 - link
I agree with #23; in terms of money-making power the ATI/Valve combo is astounding. ATI's design is superior as we can see, but the point is that ATI is going to get truckloads of money and recognition for this. It's a good day to have stock in ATI; let's all thank them for buying ArtX!
Anonymous User - Friday, September 12, 2003 - link
I emailed Gabe about my 9600 Pro, but he didn't have to do all this just for me :D
I love it.
Anonymous User - Friday, September 12, 2003 - link
I previously posted this in the wrong place, so let me just shamelessly repost it here:
Let me just get my little disclaimer out before I dive into being a devil's advocate - I own both a 9800 Pro and an FX5900nu and am not biased toward either ATi or nVidia.
With that being said, let me take a shot at what Anand opted not to speculate about, and that is the question of the ATi/Valve collaboration and their present and future relationship.
First of all, FX's architecture is obviously inferior to R3x0 in terms of native DX9, and that is not going to be my focus. I would rather debate a little about the business/financial side of the ATi/Valve relationship. That's the area of my expertise, and looking at this situation from a financial angle might add another twist to this.
What got my attention are Gabe Newell's presentation slides, which have omitted small but significant things like "Pro" behind R9600, and his statement of "optimizations going too far" without actually going into specifics, other than that the new Detonators don't render fog. Those are small but significant details that add a little oil to a very hot issue of "cheating" in regards to nVidia's "optimizations". But I spoke of the financial side of things, so let me get back to it. After clearly stating how superior ATi's hardware is to the FX, and stating how much effort they have invested to make the game work on the FX (which is absolutely commendable), I cannot help but notice that all this perfectly leads into the next great thing. HL2 will be bundled with a new line of ATi cards (or vice versa), and ATi is just getting ready to offer a value DX9 line. Remember how it was the only area they had not covered, and nVidia was selling truckloads of FX 5200s in the meantime. After they have demonstrated how poorly the FX flagship performs, let alone the value parts, isn't it a perfect lead into selling shiploads of those bundled cards (and games)? Add to that Gabe's shooting down of any optimization efforts on nVidia's part (simply insinuating "cheats") and things are slowly moving in the right direction. And to top it all off, Valve explicitly said that future additions will not be done for DX8 or the so-called mixed class but exclusively DX9. What is Joe Consumer to do then? The only logical thing - get him/herself one of those bundles.
That concludes my observations on this angle of this newly emerged attraction and I see only good things on the horizon for ATi stockholders.
Feel free to debate, disagree and criticize, but keep in mind that I am not defending or bashing anybody, just offering my opinion on the part I considered just as interesting as the hardware performance.
Anonymous User - Friday, September 12, 2003 - link
Wow... I buy a new video card every 3 years or so.. my last one was a GF2PRO.... hehe... I'm so glad to have a 9800PRO right now. Snif.. I'm proud to be Canadian ;-)
Anonymous User - Friday, September 12, 2003 - link
How come the 9600 Pro hardly loses any performance going from 1024 to 1280? Shouldn't it be affected by only having 4 pipelines?
Anonymous User - Friday, September 12, 2003 - link
MUHAHAHA!!! Go the 9600 Pros! I'd like to bitch slap my friends for telling me the 9600s will not run Half-Life 2. I guess I can now purchase an All-In-Wonder 9600 Pro.
Anonymous User - Friday, September 12, 2003 - link
Man, I burst into a coughing/laughing spree when I saw an ad using nVidia's "The way it's meant to be played" slogan. Funny thing is, I first noticed the ad on the page titled "What's Wrong with Nvidia?"
Anonymous User - Friday, September 12, 2003 - link
Booyah, I hope my Ti4200 can hold me over at 800x600 until I can switch to ATI! Big up Canada
Anonymous User - Friday, September 12, 2003 - link
You can bet your house nVidia's 50 drivers will get closer performance, but they're STILL thoroughly bitchslapped... Ppl will be buying R9x00's by the ton. Nvidia better watch out, or they'll go down like, whatwassitsname, 3dfx?
dvinnen - Friday, September 12, 2003 - link
Hehe, I concur. Seeing a 9500 on there would have been nice. But what I really want to see is some AF turned on. I can live with no AA (ok, 2x AA) but I'll be damned if AF isn't going to be on.
Anand, you guys rock. It's because of your in-depth reviews that I purchased the Radeon 9500 Pro. I noticed the oddity mentioned of the small performance gap between the 9700 Pro and the 9600 Pro at 1280x1024. I would really like to see how the 9500 Pro is affected by this (and all the other benchmarks). If you have a chance, could you run a comparison between the 9500 Pro and the 9600 Pro? (I guess what I really want to know is if my 9500 Pro is better than a 9600 Pro for this game.)
Arigato,
The Internal
Pete - Friday, September 12, 2003 - link
(Whoops, that was me above (lucky #13)--entered the wrong p/w.)
Anonymous User - Friday, September 12, 2003 - link
Anand, when using the Print Article feature in Mozilla 1.4, I was shown only graphs from one map throughout. For instance, after clicking Print Article, all graphs were of the bug level. Hitting F5 showed them all to be of techdemo. In both cases, some graphs didn't correspond to your comments.
This may be b/c the article was just posted, but thought I'd give you a heads-up anyway.
Thanks for the interesting read, and hopefully we'll see screenshots of the differences between the DX8.0, 8.1, 8.2, NV3x, and DX9 modes soon (the only thing lacking from this article, IMO)!
Anonymous User - Friday, September 12, 2003 - link
.. goddammit, all the flashes are arranged improperly. (Techdemo on bugbait pages, city on techdemo...) FIX IT.
Anonymous User - Friday, September 12, 2003 - link
I was hoping Anand would compare a 128MB 9800 Pro to a 256MB one, guess I'll still have to wait =(
Anonymous User - Friday, September 12, 2003 - link
Hey Anand, you have a 9500 Pro lying around?
Eh, well, it doesn't need to be included anyway. We all know how it would do: 5% worse than the 9700 Pro.
Anonymous User - Friday, September 12, 2003 - link
#5 & #6 : +1I ll keep my G4 Ti 4200@300/600.
I m sure HL² will still rocks in DX 8.1
Anonymous User - Friday, September 12, 2003 - link
Where are the numbers with AA/AF enabled? I know the article intimates that there's a negligible performance hit, but I'd still like to see the numbers.Anonymous User - Friday, September 12, 2003 - link
Man, the Ti series has been doing this for a while!
http://www.amdmb.com/article-display.php?ArticleID...
Anonymous User - Friday, September 12, 2003 - link
I feel the same way about the GF4Ti series. Never did like the FXes much...Anonymous User - Friday, September 12, 2003 - link
Hahahahaha. Go you Ti4600, GO! I BELIEVE IN THE Ti4600!
If all I am going to lose is a bit of image quality, then no great loss. At least it isn't back to 640x480!
Anonymous User - Friday, September 12, 2003 - link
Wow, the 9800 Pro barely edges out the 9700 Pro. The 9600 Pro seems to be the best deal if people are still waiting to upgrade.
Obviously Nvidia lost this round with NV30 and NV35.
Anonymous User - Friday, September 12, 2003 - link
Big time...Nvidia = Bombaclaats