22 Comments
flyingalan - Monday, February 11, 2008 - link
As a member of a large flightsim community, may I thank the testers for including a flight sim among the games tested on this graphics card.

I would plead that it be included as one of the tests in any future CPU/graphics reviews; not all of us are fans of war/shooter games, and some variety would be nice.
CDex - Thursday, February 7, 2008 - link
"its sad that it takes 2 of their cards to beat 1 of nvidias"

It's sad that the fan~boi mind cannot appreciate cool technology without feeling insecure about the manufacturer of the hardware in their own precious machine. Do you think the corporate entity that you seem to love so much would show you the same loyalty?
Fan~boi~ism is psychologically no different than the schoolyard clique mentality, which stems from insecurity and immaturity. Grow up and be an independent thinker. It will benefit both your mind and your wallet.
diablosinc - Thursday, February 7, 2008 - link
i'll build the chapel, brother...and you preach it!

knowom - Thursday, February 7, 2008 - link
I realize they're different markets, but it's ironic to me that AMD gave Intel flak about its dual-die approach for CPUs and then came out with a dual-chip approach for GPUs. I think Nvidia should give them shit about it and then just come out with a dual-die approach, haha.

MrKaz - Thursday, February 7, 2008 - link
Nope. You are mixing things up.

AMD did the dual-CPU thing first with 4x4/QuadFX; Intel followed with an equally worthless implementation called Skulltrail.

About dual die, who knows, maybe ATI will do that first too, since they already have the 512-bit memory interface from the R600 design (Nvidia doesn't have it): just plug in two 256-bit RV670s glued together ;) and there you have a much more elegant and cheaper 3870 X2.

I think you all whine whether AMD delivers or not. I smell fanatic.
DigitalFreak - Thursday, February 7, 2008 - link
Dual die does not equal dual CPU, smacktard. 4x4/Skulltrail has nothing to do with what he was talking about.

diablosinc - Thursday, February 7, 2008 - link
ahh, but it all has to do with all of our manufacturers, when we allow our threads to degenerate into fanboi p**sing matches. but i think i've hit upon a formula that explains it.

intel = 5 letters, amd = 3. intel wins.
nvidia = 6 letters, ati = 3. nvidia has twice the performance (unless you want to argue that it's amd/ati = 6 letters, one punctuation mark, giving them a slight edge).
what? then you tell me what everyone is on about!! all i'm saying is that, compared with the logic behind many arguments which spring up when these threads go fanboi...my formula almost makes sense!
remember to read it all and laugh, folks. ultimately...you're right about what you want. that's what it all comes down to...you are right. whatever your opinion, whatever your rationale...you are right. not these other people who want to tell you what and how to think.
amd/ati, intel, nvidia = at the buyer's discretion, based on his/her needs.
okay? okay...
now can we move along to some real issues here?
michal1980 - Wednesday, February 6, 2008 - link
Unfair to compare a new solution to a card that by nVidia standards is now nearly 3 generations old?

And while I understand the need to praise AMD for finally getting performance back, it's sad that it takes 2 of their cards to beat 1 of nVidia's.
Amiga500 - Thursday, February 7, 2008 - link
"its sad that it takes 2 of their cards to beat 1 of nvidias"

It's two chips on one card...

Seriously, how many people are choosing to see it as two cards? The PC sees it as one card; no Crossfire drivers within the OS are necessary.
bigboxes - Wednesday, February 6, 2008 - link
You sound like a fanboy with your comments. Does it really matter if this is company X? Judge it on its merits alone. Most of us will buy the card that performs the best for the $$.

This product is a successful implementation of dual cores in a single-card solution. nVidia's attempt at such a card was poorly executed. This review is just comparing the last attempt by nVidia at a dual-GPU card. Yes, it took two AMD cores to outperform one of nVidia's latest and greatest, but it's in a single card. Very promising without breaking the bank, especially if one does not have two PCI-Express slots or the skills to set up SLI/Crossfire properly.

It also looks like AMD's new card does this all without much of a hiccup. This is good news for us all. Don't worry, all of you who care which maker is on top: nVidia will come out with their entry in due time. We will have to wait and see if their entry will be as seamless as this AMD setup. Maybe, just maybe, AMD can do something right now and then.
SoCalBoomer - Wednesday, February 6, 2008 - link
It is a valid observation - the 7950 dual-GPU-on-a-card solution is old. It's what I was thinking. At $400+, the card is a high-end card and really doesn't do $150 worth more than, say, the 8800GT - or does it? It's an opinion thing, I guess.

That's not to say that this is not a good card - on the contrary, it's WONDERFUL to see AMD/ATI (both of whom I'm a huge fan) coming out with products that are competitive, smooth, and quality!

It's also great to see AMD come out with something that has good and smooth driver support - that's something I've always been frustrated about with the older ATI cards...loved what they did and loved them after I got them configured the way I wanted - hated the road to get there, though.
Viditor - Thursday, February 7, 2008 - link
"the 7950 dual GPU on a card solution is old"

But a smooth and effective implementation of it like this one is quite new...
"the card is a high-end card and really doesn't do $150 worth more than, say, the 8800gt - or does it?"
Interesting question...that $150 extra represents a 37.5% price premium. Plus, of course, there's the e-penis bragging rights of having the fastest card made (though I never really understood that one).

To put that in perspective, though, the 8800GT is 32% more expensive than the HD 3850...

I guess if you're a gamer, the X2 makes perfect sense, but if you don't need much graphics power, the 3850 makes more sense. In between the two is the 8800GT, so it will depend on what you do.
ChronoReverse - Wednesday, February 6, 2008 - link
By that argument, neither does the 8800 Ultra, which is MORE expensive...

SoCalBoomer - Thursday, February 7, 2008 - link
Out of curiosity, I looked to see where ANYONE mentioned an 8800 Ultra in this article...
guess what, nobody did. Hmmm, fancy that.

The topic here was comparing an unavailable, previous-generation nVidia card to a top-of-the-line, just-released ATI card.

THAT is not a valid comparison.
Gilgamesj - Wednesday, February 13, 2008 - link
Neither did anyone mention an 8800GT.

Point is that it's up to the buyer to decide whether this is worth it. Multi-GPU solutions have always been a questionable and subjective thing. The important objective thing that can be said about this card is that it takes one disadvantage away: there's no need for a Crossfire motherboard.

I think ATI put something interesting on the market.
Ryanman - Sunday, February 10, 2008 - link
It wasn't comparing it in terms of performance, bro. It was comparing it in terms of driver support and form factor. That type of comparison is legit no matter how old a card is, and you know it.

In terms of performance, this was tested against an 8800GT SLI config ($500+) and performed just under it (at $400-450), and it outperforms the Ultra for obvious reasons. THAT'S the performance comparison. Quit whining and being a fanboy and think before you post.
kextyn - Friday, February 8, 2008 - link
"It is a valid observation - the 7950 dual GPU on a card solution is old. It's what I was thinking. At $400+, the card is a high-end card and really doesn't do $150 worth more than, say, the 8800gt - or does it? It's an opinion thing I guess."

"By that argument, neither does the 8800 Ultra which is MORE expensive..."
"Out of curiosity, I looked to see where ANYONE mentioned an 8800 Ultra in this article. . .
guess what, nobody did. Hmmmm, fancy that.
Topic of this was comparing a non-available previous-generation nVidia card to a top-of-the-line, just-released ATI.
THAT is not a valid comparision. "
So why did you mention the 8800GT in your other post, hypocrite?
ChronoReverse's comparison was just as valid as yours.
schmunk - Wednesday, February 6, 2008 - link
I think it is fair. The article's focus is showing AMD's implementation of the dual-GPU card, and how seamlessly it can be done as far as settings, dual screens, and drivers. I don't think the point is that nVidia did it first or faster. I think we all benefit from something like this.

schmunk - Wednesday, February 6, 2008 - link
"AMD has pushed the fact that their new hardware is capable of fully accelerating windowed 3D based on how it manages clock speed with respect to work load, so we aren't quite sure why we are seeing this behavior."

Could it be that in windowed mode the desktop is showing and Aero Glass is a hog? :)
schmunk - Wednesday, February 6, 2008 - link
Meant to say Aero. Seriously though, doesn't it have some sort of virtual memory footprint for the Aero interface?

schmunk - Wednesday, February 6, 2008 - link
Sorry again - doesn't the Windows Vista Display Driver Model (WDDM) take up these resources?

Goty - Wednesday, February 6, 2008 - link
Aero isn't running in that screenshot.