53 Comments
T8000 - Monday, April 12, 2004 - link
I think it is impossible for ATI to "add" more pipelines without doing a new tape-out, so unless these pipes were hidden before, R420 will not make it to market with 16 pipelines until December.

And just for the record, pipelines are parallel paths in a GPU that produce rendered pixels, so doubling the pipelines means doubling the fill rate; but because they may have to share shaders and texture units, more pipelines do not always mean more performance.
Also note that you have to multiply the number of pipelines by the clock speed to calculate the theoretical fill rate.
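To make that formula concrete, here is a minimal sketch; the first line uses the Radeon 9800 XT's published 8-pipeline, 412 MHz figures, while the 16-pipeline, 500 MHz line is a purely hypothetical example, not a confirmed R420 spec:

```python
# Theoretical pixel fill rate = number of pipelines x core clock.
def fill_rate_mpixels(pipelines: int, core_clock_mhz: float) -> float:
    """Theoretical fill rate in megapixels per second."""
    return pipelines * core_clock_mhz

print(fill_rate_mpixels(8, 412))   # Radeon 9800 XT class: 3296 MPixels/s
print(fill_rate_mpixels(16, 500))  # hypothetical 16-pipe part: 8000 MPixels/s
```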
For those asking whether chips could be softmodded: it is very likely, because the real money is made in the lower mid-range thanks to higher sales volumes. So if they can add options that boost sales without spending more on chips, they will most likely do it.

But remember that enabling top-end options or speeds may not work on all chips, because top models are often hand-picked; consumers want "their" brand to make the fastest hardware so they can impress others with the brand name.

So, since a manufacturer can sell more midrange models when it manages to make its top-end model win the benchmarks (even if its midrange card is slower), there is a very good reason to hand-pick top-end chips.
darkjedi1066 - Sunday, April 11, 2004 - link
Some of you think the names are too long now? Wait till Leadtek gets their hands on these chipsets... imagine a Leadtek WinFast A420-16 X800 Golden Sample, yadda yadda yadda... assuming, of course, Leadtek is wise enough to branch out to ATI chipsets.

spyhalfer - Sunday, April 11, 2004 - link
Why on earth would ATI release a card with 12 pipelines that can be made into 16 before they release a 16-pipeline part?

AsiLuc - Sunday, April 11, 2004 - link
Stop guessing and start benchmarking! Please!

I can't stand this, I need facts! This way you're giving me a nervous breakdown...
AtaStrumf - Saturday, April 10, 2004 - link
Well Pete, we essentially agree. It's all one and the same core, with parts of it disabled for branding and yield-improving purposes.

tazdevl - Friday, April 9, 2004 - link
#42, don't believe everything you read. The "scoop" you are referring to is about as reliable as anything you come across on The Inq.

Pete - Friday, April 9, 2004 - link
#45 AtaStrumf, R420 is the name of the GPU, not the card. From what we're hearing, the GPU itself has four quads constituting 16 potential pipelines, and ATi seems to have the ability to disable a quad if it is imperfect in some way. This is the same way Intel sells Celerons as crippled Pentiums: by disabling the half of the L2 cache that contains errors. Manufacturing a GPU is all about yields, or how many working GPUs you can get out of a silicon wafer. By disabling the parts of a GPU that contain errors and selling those imperfect chips as fully functional but lower-performing parts, ATi is essentially raising its R420 yields (because it throws fewer GPUs away).

This is assuming AnandTech is right and R420 can disable faulty quads. The same rumor made the rounds at B3D a few weeks ago, too. It seems an almost perfect solution for increasing yields and thus lowering cost, but ATi may only do this at the beginning. They may drop eight-pipe R420s in favor of a dedicated mid-range eight-pipe GPU, as they did when moving from the stop-gap 9500 Pro (built on the high-end R300 GPU) to the 9600 Pro (built on the smaller and cheaper RV350 GPU).
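A toy calculation shows why salvaging dies with one bad quad helps so much; the per-quad defect rate and die count below are invented purely for illustration, not real manufacturing figures:

```python
# Toy yield model: a die with one defective quad can still be sold as a
# 12-pipe part if that quad can be disabled. All numbers are made up.
import random

DIES_PER_WAFER = 200
P_QUAD_DEFECT = 0.10  # assumed chance that any one quad is faulty

random.seed(0)
full, salvaged = 0, 0
for _ in range(DIES_PER_WAFER):
    bad_quads = sum(random.random() < P_QUAD_DEFECT for _ in range(4))
    if bad_quads == 0:
        full += 1        # sell as a full 16-pipe part
    elif bad_quads == 1:
        salvaged += 1    # disable the bad quad, sell as a 12-pipe part

print(f"16-pipe dies: {full}, salvaged 12-pipe dies: {salvaged}")
print(f"Sellable dies: {full + salvaged} / {DIES_PER_WAFER}")
```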
tazdevl - Friday, April 9, 2004 - link
Wrong... the Pro will run on 16 as well.

AtaStrumf - Friday, April 9, 2004 - link
How can the X800 XT be 16 pipelines and still be called R420? That's kind of weird.

And the X800 Pro is more than a little later than would normally be expected, so if you put two and two together you see that perhaps it's all the same core, and soft/hard modding these cards could be a real possibility. Maybe we will see an X800 XT even before May 31st. :)
KristopherKubicki - Friday, April 9, 2004 - link
Phiro: Totally understandable. Here is my philosophy on the NV thing as of right now, though:

Had this been something like "NV40 is not compatible with all AMD mobos", that would be a battle worth fighting. As for the GeForce 6800 name and April 13 release date: if you look hard enough you can still find that info (comments, forum posts, etc.). I was only told to remove it from the article.

The timing also has a lot to do with it - I am glad I got two days out of the post before someone told me to remove it. Had it been two hours after I posted, I most likely would have stood my ground. You can also bet that as soon as the NDA lifts I will restore the original content for archival purposes.
Hope that helps :)
Kristopher
Phiro - Friday, April 9, 2004 - link
While I'm sad you guys pulled the Nvidia part of this, and sad that someone had to notice it and speak up before you said anything, I'm at least glad you told us.

You do have to pick your battles, but Nvidia needs to understand that coming down like this will almost always generate ill will. Marketing has never been the home of brainiacs - it's full of scumbags who can't look three inches past their 401k, their stock options, and their own twisted little pretend world.

When you have to pull content and they *don't* want you to acknowledge it, that's when the bells should be going off in your head. You dig your heels in at that point - sometimes the tree bends with the wind to survive; other times it has to stand tall, because otherwise it doesn't get the sunlight it needs to survive.
thatsright - Friday, April 9, 2004 - link
For once AT scoops the online mag The Inquirer. Kudos!! I guess AT has some really close contacts inside ATI. With this major hardware release, The Inquirer wasn't getting the usual info weeks before the launch as they usually do, and what they have posted on The Inq has been even more vague than usual. Most of it is downright incorrect.

A big thumbs up to AnandTech.
TrogdorJW - Wednesday, April 7, 2004 - link
Because they feel it is against their NDA. I'm sure AT already has cards in hand for the April 13th "launch", but they're not allowed to comment on them. Even when they hear rumors elsewhere about what the cards will be called, sometimes big companies get bent out of shape. I don't know why there's any fuss, though... as if knowing the names one week earlier is really going to affect anything.

Da3dalus - Wednesday, April 7, 2004 - link
Why would nVidia want the info pulled?

mmp121 - Wednesday, April 7, 2004 - link
nVidia of course, who else would want the info pulled?

DerekBaker - Wednesday, April 7, 2004 - link
Told by whom?

Derek
KristopherKubicki - Wednesday, April 7, 2004 - link
Nah, nothing fishy, I was just told to remove it. Have to choose your battles wisely sometimes; enough people got the message, it seems.

Kristopher
Warder45 - Wednesday, April 7, 2004 - link
lol, looks like most of the Nvidia stuff was removed.

AtaStrumf - Wednesday, April 7, 2004 - link
Hey!!! Where did the GeForce 6800 / April 13th part of the article go?

Something fishy's going on!
PrinceGaz - Wednesday, April 7, 2004 - link
I think it's likely the GeForce 6800 could be over twice as fast as the FX 5950 Ultra. There are plenty of rumors floating around that it has a sixteen-pipeline core - I'm not sure if it's 16x1 or 16x2 - but that alone would give double the power of the current 8x1 (and I'm not going to get into arguments over whether the 59xx is 8x1 or 8x2). Once you add various other architectural improvements and a higher core clock speed, you'll easily have over twice the power.

Despite using higher-speed memory (1.2GHz effective has been suggested), it won't have an equivalent increase in memory bandwidth unless they shift to a 512-bit interface (which is unlikely), but DX9 performance depends more on the core than on memory speed anyway, so it might not matter much for overall performance.
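For a sense of scale, here is the memory-bandwidth arithmetic behind that paragraph; the 256-bit bus and 1.2GHz effective clock are the rumored figures from the comment, and the 512-bit row is purely hypothetical:

```python
# Peak memory bandwidth = (bus width in bits / 8) x effective memory clock.
def bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_clock_mhz / 1000

print(bandwidth_gb_s(256, 950))   # FX 5950 Ultra (950MHz effective): ~30.4 GB/s
print(bandwidth_gb_s(256, 1200))  # rumored 1.2GHz effective:         ~38.4 GB/s
print(bandwidth_gb_s(512, 1200))  # hypothetical 512-bit interface:   ~76.8 GB/s
```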
I think the names are fine. Dropping the FX was probably a good idea for nVidia, to leave behind the NV3x generation, which was always second place behind the R3xx. ATI could have called their new card the Radeon 10800, but X800 sounds much better. Not sure what they'll call the generation after this, though; X2800 etc. would make sense, with the X becoming as much a part of the brand as Radeon is now.
TauCeti - Wednesday, April 7, 2004 - link
> Trogdor: Personally, I think the Radeon cards should have been called Radeon A800 and A600

Come on. That would translate to a Radeon 43008 and a 42496 if you start using base 16. :))
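TauCeti's arithmetic checks out; a quick sanity check in Python:

```python
# The suggested names, read as hexadecimal numbers.
print(int("A800", 16))  # 43008
print(int("A600", 16))  # 42496
```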
Warder45 - Wednesday, April 7, 2004 - link
The guy in the article on the front page likened it to when the first GeForce card came out.

TrogdorJW - Tuesday, April 6, 2004 - link
Get real, AW. What do you want them to do? Release two cards with equivalent performance, except one is PCIe and the other is AGP, and call them different names? Like that's any better. The real name will be Radeon X800 Pro, and on the box will be a sticker saying whether it is PCIe or AGP. GeForce 6800 is also relatively reasonable, although I'm sure a GeForce 6850 Ultra or something will come out not too long after. Again, Nvidia cards will probably have a sticker or something stating whether the card is AGP or PCIe.

Personally, I think the Radeon cards should have been called Radeon A800 and A600 (for the version with half as many pipelines). The next cards could be the B800 and B600... That would get you all the way up to F800 and F600 before you ran out of four-digit "numbers". Of course, some of you are probably going to complain about hexadecimal not being a well-known number system among the commoners... ;)

Final thought: someone commented on how big a jump it was from an 8500 to a 9700. Well, it wouldn't have been quite as impressive if you had gone from a GF4 Ti to a 9700, but the possibility that the 6800 cards will actually be almost twice as fast as the current top cards is real. Rumors say 3DMark03 scores of 12,500+ are being hit with the GeForce 6800. It's not the be-all, end-all benchmark, but if a card can score 12,500 at standard settings in 3DMark03, it will be a pretty massive step up from the current cards. Then your CPU will be the major bottleneck again. :p
bobsmith1492 - Tuesday, April 6, 2004 - link
(did have space) that is...

bobsmith1492 - Tuesday, April 6, 2004 - link
Great... now video cards have more than twice the RAM my first hard drive had! (:

AtaStrumf - Tuesday, April 6, 2004 - link
So nVidia finally got rid of the bad omen that the FX suffix was. Good for them! ATi's new name isn't bad either; it sounds fresh. Let's just hope the performance of the two new cards will be as refreshing, rather than depressing like the last two generations.

I must admit I'm a bit surprised to see NV40 before R420, but then again they're both late this year.

As for 512 MB of on-board memory: for now it's definitely hideous overkill, but not-too-distant future DX9 titles are supposed to be able to use that kind of capacity. Just hope the price doesn't go through the roof.
aw - Tuesday, April 6, 2004 - link
Come on... are you serious? I was obviously exaggerating. Imagine for one second that you aren't the uber nerd that you are and you were Joe Schmoe at Best Buy. Which one sounds better to you? (Please note that neither name is descriptive in the least bit.)

ATI Radeon 9800 XT
ATI Radeon AGP X800 Pro
ATI Radeon PCX X800 Pro
Nvidia GeForce 6800 AGP
Nvidia GeForce 6800 PCX
You can try and say it makes sense... but it doesn't. It's retarded. What does the X mean anyway... 10? So it's a 10800? You are right, they have simplified some... woohoo. Nvidia is definitely better than ATI here. But still, why did FX ever exist in the first place? It took them a long time to realize it was stupid. SE, XT, and Pro make a little sense if they are used consistently, which obviously isn't happening. So ATI has gone full circle and reached 10000, so now they have to go back down. And since stupid Nvidia thought they had to match ATI's four-digit numbers after the GeForce 4... my god, what are they gonna do? God knows X800 Pro sounds better than GeForce 6800... maybe??? It cracks me up. Both sides have backed themselves into a completely incomprehensible naming corner and I love it! Consumer confusion is good - at least for AnandTech, since more people will come here to figure it out, which is fine with me. The whole thing just makes me laugh.
Staples - Tuesday, April 6, 2004 - link
And another comment: I doubt either of these cards will have the performance jump of the R200 to the R300. I'll probably just stick with my 9700NP, because the price of these cards might actually be more than $500 (for the high-end ones). I had an R8500LE before the 9700NP, and I waited 8 months for the price of the 9700NP to come down to $200. Unless there is a monstrous performance gain, I doubt the price premium is worth the purchase.

Staples - Tuesday, April 6, 2004 - link
I hate all these naming conventions. They are used to confuse the customer, and that they do well.

Icewind - Tuesday, April 6, 2004 - link
I've got my $500 raring to go for whoever the winner is!!

spite - Tuesday, April 6, 2004 - link
You have to be kidding me. Complaining about the naming schemes? Read more closely. The Nvidia names are getting shorter, while the ATI names are staying the same length. GeForce 6800, Radeon X800. WHAT COULD BE SIMPLER? No more FX, which wasn't any more complicated than the old GeForce 4 etc. system, but seemed to get people so upset. Maybe you want names like "Bob" or "Martha". Or maybe just "Q". Find another bitching bandwagon to jump on.

Da3dalus - Tuesday, April 6, 2004 - link
/me awaits the benchmarking frenzy...:D
SubKamran - Tuesday, April 6, 2004 - link
EXCELLENT. If they come out quickly, it'll push down the 9600XT for my brother! LESS MONEY TO SPEND!

Then next year I'll buy the X series... :P
aw - Tuesday, April 6, 2004 - link
Nvidia's and ATI's marketing departments must really suck. They should all be fired! The naming schemes are bordering on ridiculous. No, scratch that... they are ridiculous. They serve no purpose but to create confusion. Finally someone at Intel realized how stupid their naming scheme was, and they are simplifying it. One can only hope that Nvidia and ATI will get a clue and follow their lead.

Judging by these names, they aren't "ExTReMe" enough. I am going to wait for the
Nvidia InFiNItY FX(XX) GeForce Triple eXXXtreme NV4658700000 with PCI XXXpress and 1,000 MB of 4x DDR2 Cas2 Memory...;-)
Plus hopefully they will get a clue that most people aren't going to spend more than it costs to build a complete computer on a video card (unless they throw in a 19" LCD with it).
Usually AnandTech comments on stupid naming schemes. This time they didn't. Too bad...
Nvidia and ATI... Keep It Simple, Stupid! It always works.
dgrady76 - Tuesday, April 6, 2004 - link
I have a feeling that Doom 3 and Half-Life 2 will have NO problem whatsoever taking advantage of extra memory and horsepower. Anti-aliasing, aniso, real-time lighting, huge textures, etc....

Revolutionary steps CAN happen overnight - why, it seems like just yesterday I had to pick my jaw up off the floor after witnessing GLQuake for the first time. I definitely didn't see that one coming.
I'll never pay more than $250 for a video card, but anyone who questions bigger, better, faster, more when it comes to computers is always proved wrong in the end.
PrinceGaz - Tuesday, April 6, 2004 - link
It's nice to have some names to throw around; can't wait for the benchmarks next week. :)

Does the NDA for the Radeon X800 expire the same day as the GeForce 6800's, so we get all the results in one big article, or will we have to wait a little longer for it?
Marsumane - Tuesday, April 6, 2004 - link
I might be wrong on my numbers, but from what I remember reading, ATI's solution allows data to transfer at 4GB/s in BOTH directions, allowing a potential 8GB/s max for data flowing to and from the video card. Nvidia's is equal to AGP 16x, minus the latencies added by the bridge. That would allow 4GB/s in only one direction at a time, minus the additional latency. Seeing as a graphics card's traffic mostly flows in one direction, and doesn't even use that much bandwidth anyway, Nvidia's solution should yield almost the same results - provided the latencies are marginal rather than significant. This should save Nvidia a buttload of money. Personally, I would have bridged the PCI-E version to work with AGP, and not vice versa, since the PCI-E card will most likely be the one that benches higher of the two and has a better shot at the top spot.

And I don't see the "X" as being ridiculous in this case, since it means "10" and not "Extreme" or "Xtreme". I just want to know what they will do when they hit "11". XI800!? lol

grim122 - Tuesday, April 6, 2004 - link
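For reference, a sketch of the raw per-direction figures being compared here (peak theoretical numbers for PCIe 1.0 x16 and AGP 8x; the bridge's added latency isn't modeled, so this is only illustrative):

```python
# Peak theoretical bandwidth figures, in GB/s.
PCIE_1_0_PER_LANE = 0.25                      # 250 MB/s per lane, per direction

pcie_x16_each_way = 16 * PCIE_1_0_PER_LANE    # 4.0 GB/s each direction (full duplex)
pcie_x16_aggregate = 2 * pcie_x16_each_way    # 8.0 GB/s total, both directions at once
agp_8x = 2.1                                  # ~2.1 GB/s, one direction at a time

print(pcie_x16_each_way, pcie_x16_aggregate, agp_8x)
```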
ViRGE - Tuesday, April 6, 2004 - link
#12, in PCI-E mode the NV40 will have just as much bandwidth as PCI-E 16x (thanks to an overclocked internal AGP bus); the difference is that Nvidia's solution is half duplex, whereas a native PCI-E solution is full duplex. You are right, however, that neither is significant at this point; aside from a few fringe apps, nothing comes close to overwhelming 8x.

Cat - Tuesday, April 6, 2004 - link
Huge amounts of video memory allow for much higher-resolution shadow maps, for one thing. These are easily created on the fly, rather than taking artists' time as normal textures do. I can see the immediate impact of having more video memory there, since aliased shadow maps are a real 'suspension of disbelief' breaker.

Also remember that not just textures are stored on the card, but geometry as well.
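A rough feel for how quickly shadow maps eat video memory; the resolutions and the 4-byte-per-texel depth format below are illustrative assumptions, not tied to any particular game:

```python
# Memory used by one square shadow map, assuming a 4-byte depth texel.
def shadow_map_mb(resolution: int, bytes_per_texel: int = 4) -> float:
    return resolution * resolution * bytes_per_texel / (1024 * 1024)

for res in (1024, 2048, 4096):
    print(f"{res}x{res}: {shadow_map_mb(res):.0f} MB")
# 1024x1024 -> 4 MB, 2048x2048 -> 16 MB, 4096x4096 -> 64 MB
```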
Baldurga - Tuesday, April 6, 2004 - link
Icewind, wait until April 13th and we will know! ;-)

Icewind - Tuesday, April 6, 2004 - link
Now comes the big question: how WELL do they perform?

Calin - Tuesday, April 6, 2004 - link
The bridge will introduce additional delays in the communication between the video card and everything else. And the bandwidth will be limited to what the AGP 8x interface is capable of. Neither one is (I suppose) significant at the moment.

Calin
Brickster - Tuesday, April 6, 2004 - link
Reply to my post #10...

Looks like I got a little confused with all those numbers, as the difference is the company that makes them, but let me rephrase the question:
Will the "bridged" PCI-Express card from NVIDIA be any different in concept than ATI PCI-Express released in June? Just trying to understand the significance of the "bridge".
Thanks!
Brickster - Tuesday, April 6, 2004 - link
"NVIDIA's bridged PCI-Express cards will launch April 13th along with the AGP versions."Call me ignorant, but what is the difference between the June PCI-Express version and a the "bridged" April 13th version?
Verdant - Tuesday, April 6, 2004 - link
I think I will have to wait until the XXX edition cards come out :shiftyeyes:

sandorski - Tuesday, April 6, 2004 - link
X = 10, 10 is +1 from the current 9, 10 is the next generation; X simplifies the name and looks cooler.

ZobarStyl - Tuesday, April 6, 2004 - link
Oops, sign-in post... yeah, that '512 MB revision' gave me a good laugh, considering that 128 MB cards still haven't been pushed to their limit. Plus, by the time we get 512 MB cards, won't we have PCI-E and be moving data to the cards twice as fast? That plus GDDR3 - it seems like they are really pushing every aspect of memory these days, even though I'd rather we got more frequent processor revisions. =)

grim122 - Tuesday, April 6, 2004 - link
Anyone know if these are actual release dates or just paper releases?

Praeludium - Tuesday, April 6, 2004 - link
Yeah, I agree with you, Spearhawk. If somebody says Xtreme to me one more time, I'm going to snap and bludgeon a marketer with a well-named product for once.

The thing that jumped out at me immediately was "before the end of the year we also anticipate a 512MB revision." I may be incorrect, but aren't games only now *beginning* to fill up 128 MB of video memory? Why the need for 512 right now? I certainly see the need down the road, but it seems a pointless waste of money for bragging rights, if such a card does come to be.
Thanks for the update, Kristopher.
Spearhawk - Tuesday, April 6, 2004 - link
Anyone else starting to think that using X in the names is getting ridiculous?

Regs - Tuesday, April 6, 2004 - link
And wow, 256 MB of GDDR3 memory. These things are going to be 500 dollars when they hit retail.

Regs - Tuesday, April 6, 2004 - link
ATi ran out of numbers and added an X. They both want to name the cards differently because they want people to know that these are different from their predecessors. Basically a clean slate.

amdfanboy - Tuesday, April 6, 2004 - link
Is anyone else as confused about these names as I am? :confused: