
X1650 Pro and X1950 XTX

Last response: in Graphics & Displays
July 24, 2006 3:13:50 PM

Quote:
The X1950 XTX running the R580+ core will still be slower than the doubled-up GX2, but we are sure that two X1950 XTXes will beat the single 7950 GX2. The cards will surface in late August and should be available from launch. We know that they will use GDDR4, that they will end up seven percent faster than the current X1900 XTX cards, and that they will retail initially for 419 Euros, soon dropping to 399 Euros.


It refers to the Inq, but it was still interesting. Not sure why they would release something like this if it can't beat the 7950 GX2; all they're doing is competing with their own product, in my mind. Maybe it's just because they want something new on the market to keep people interested. Also, with DX10 coming closer, I'm not sure how popular this card will be...


July 24, 2006 3:17:20 PM

Interesting.
July 24, 2006 3:25:09 PM

Meh... I don't think they would sell very well personally, but you never know. Plus it is an article from the Inquirer, so you get my drift... but they were right on the AMD-ATI merger, so you never know.
July 24, 2006 3:33:19 PM

It sounds like an enthusiast level card to keep the enthusiasts happy until R600 in a few months.
July 24, 2006 3:38:41 PM

I think they will sell well. Shoot, with a $449 list price, it's faster, cheaper, and has nicer cooling than the well-selling X1900XTX. If street prices come in way below the 7950GX2, they will sell for sure.
July 24, 2006 3:40:12 PM

GDDR4 memory?! Nice.
July 24, 2006 3:41:15 PM

Quote:
It sounds like an enthusiast level card to keep the enthusiasts happy until R600 in a few months.


My thoughts exactly, but any computer enthusiast would know not to be stupid enough to buy one, when most likely their 7900GT or X1800XT or higher gfx card will be fine until DX10 cards come out... I'm guessing people like Alienware will make use of it...

But like I said, I think it's just to keep people interested in ATi, keep people reading about ATi. Since nVidia has basically gotten ALL the press lately with its 7950GX2, and sales have been great from what I've heard, I'm sure ATi is trying to steal some of the thunder..
July 24, 2006 5:58:22 PM

Quote:
My thoughts exactly, but any computer enthusiast would know not to be stupid enough to buy one, when most likely their 7900GT or X1800XT or higher gfx card will be fine until DX10 cards come out... I'm guessing people like Alienware will make use of it...

But like I said, I think it's just to keep people interested in ATi, keep people reading about ATi. Since nVidia has basically gotten ALL the press lately with its 7950GX2, and sales have been great from what I've heard, I'm sure ATi is trying to steal some of the thunder..

You just answered your own question????
July 24, 2006 7:45:12 PM

What question did I answer? I never had one in the paragraph you quoted me on... I just wanted to bring this here for some open discussion, so when someone else brings something to the table I'll be able to take that into account. I'm just bringing to the table what I have deduced from information; I'm interested in others' opinions and/or information they have also heard..

Edit: if you're saying that the X1950XTX would sell well because of the success of the 7950GX2, I believe you're comparing apples to oranges...

When you slap 2 gfx cards together, you spark an interest in people that one very good card such as the X1900XTX just wouldn't, and no matter how good the X1950XTX will be, it will never be 2 cards in 1, therefore never sparking that interest... Plus, the 7950GX2 is the fastest single card out. If the X1950XTX comes out and doesn't regain the crown, it will be a big disappointment; all it will be is an overclocked X1900XTX, which is never a good thing, as you'll notice with 7900GTs.
July 24, 2006 7:52:57 PM

I like the big red cooler. Looks nice. (and it looks suspiciously like the ATI Silencer... :? )

Here's the link to the pic, for those who are too lazy to google a little bit... :wink:
July 24, 2006 8:05:49 PM

Quote:
I like the big red cooler. Looks nice. (and it looks suspiciously like the ATI Silencer... :? )

Here's the link to the pic, for those who are too lazy to google a little bit... :wink:


i think it looks a lot like the IceQ from HIS...check that HERE

Edit: now that I've seen all 3, it looks like they all employ the exact same design...
July 24, 2006 8:09:11 PM

find some more info i guess or supplemental

July 24, 2006 8:13:40 PM

I wonder which one of them has the patent... :lol: 
July 24, 2006 8:14:49 PM

I'm definitely going to purchase it... it looks better than the X1900XT by far, and is not that much more expensive; cooler, much quieter, and looks freaking sweet!


if you want a really good look at the card, here is a good link
x1950xtx
July 24, 2006 8:48:47 PM

That's fake; the X1950XTX will still be built off the 90nm process. Even the R600 will be released as a 90nm part and the R600 refresh will be the first ATI 80nm GPU.
July 24, 2006 9:10:35 PM

yup, that pic was circulating around quite a few months ago man. Even then ppl saw that it was fake...
July 24, 2006 9:41:44 PM

Not to mention I have heard that the memory will run around 2GHz rather than the 1.8GHz stated in the slide above.... oh yeah, and the whole using-GDDR3 bit lol.
July 25, 2006 12:21:55 AM

Quote:
Not to mention I have heard that the memory will run around 2GHz rather than the 1.8GHz stated in the slide above.... oh yeah, and the whole using-GDDR3 bit lol.


2 GHz?? Maybe people will start confusing RAM clocks with CPU clocks, with FSB clocks, with... well, almost everything else. :tongue:
July 25, 2006 1:02:20 AM

Quote:
Not to mention I have heard that the memory will run around 2GHz rather than the 1.8GHz stated in the slide above.... oh yeah, and the whole using-GDDR3 bit lol.


2GHz GDDR4? In that case, it might actually stand a chance against the 7950GX2, whose memory runs at a mere 1.2GHz. I find it funny how CPUs broke the 1GHz mark 6 years ago, and we're already talking about VRAM at 2GHz. It just goes to show how fast technology advances.
July 25, 2006 3:03:18 AM

Quote:
I think they will sell well. Shoot, with a $449 list price, it's faster, cheaper, and has nicer cooling than the well-selling X1900XTX. If street prices come in way below the 7950GX2, they will sell for sure.


I think a lot of people here are forgetting the BIG reason why the GF7900 was scary (a cheaper chip that might outperform a more expensive chip). A GF7950 is 2 of those chips (now no longer cheaper), so being right up against them with one chip is a scary thing the other way around now.

I agree it's a 'holding pattern product', but it's likely to be cheaper than its competition to make, and despite its 'not top' spot, I'd be surprised if it didn't outsell the GX2, just because of most people's reluctance to buy cards like that. But don't get me wrong, I think the X1900XT and GF7900GT and X1900XTX and GF7900GTX will outsell them both by a heck of a lot.

It'll be nice to see something new, but personally I don't care about the X1950XTX as much as what they've done to the X16xx series to make it not suck so much (it's 'ok for the price'; it's no GF7600GT).

I'm disappointed in not hearing more X1700 news yet.
July 25, 2006 3:20:15 AM

Quote:
I'm disappointed in not hearing more X1700 news yet.


Well, I'm not. Where would the X1700 go? I mean, against which Nvidia card?
The X1800GTO is against the 7600GT... So, the X1700 would go against the... 7600GS? 7300GT? :?
I think there's no point in releasing an X1700 card. Maybe with a very low price... :?
July 25, 2006 4:16:58 AM

I am w/ grapeApe on this one. The 1600 is kinda the red-headed step child. Not really good at anything, merely ok. A better card is sorely needed in that low/mid-range lineup.

the 1700 (or 1650 or whatever they will call it) would obviously (to me) replace that 1800gto. the 1800 is the "older" part while the 1700/1650 would be "newer" to match w/ the 1900 parts that are trying to supplant them. While the 1900gto is not exactly a barn-burner, it is still the part slated to replace the 1800xt (i think) so it seems only "natural" to drop the 1800gto if the newer part can truly take its place.

Of course, all this is conjecture as ati (and later amd?) may have a totally diff viewpoint on how all of this should go. But just like car companies, ati and Nv are always trying to keep the sku's at a minimum while still filling the proper segments.

Hopefully it can stand toe to toe w/ Nv's best at that level.
July 25, 2006 4:23:23 AM

Yeah, now that I think this through... It makes sense.
I mean, the X1600 (Pro, not XT), in raw performance, is almost head-to-head with the 6600GT. It's almost ridiculous.
Is the X1800GTO so bad that it's worth replacing? :?

On another note... I would've loved to see what the name for the next-gen ATI cards was going to be... X2000? Sounds like a 1950's "supercomputer". :lol:  :lol: 
July 25, 2006 4:44:52 AM

not at all, I dont think the 1800gto is bad at all. Neither was the 1800xt, or the 7800gtx... but new chips trickle down and replace "old" ones that are no longer being produced.

Ya, they could still make 'em... but then why make new ones at all?

As is the case throughout history though, "new" is not always "better"... the 1900GTO is not really better than the 1800XT in all benches but Oblivion, and even that is a very close margin. ATI just needs to do something w/ the chips that do not make the cut for XT/XTX variants, so they trim them down to GTOs. Better than throwing them away...
July 25, 2006 5:26:04 AM

First of all, this has less to do with the F'ed-up desktop section, where I'm not as interested in seeing an X1700, than with the notebook/laptop segment, which needs a chip like the MRX1700.

(EDIT, PS: I just looked at that sentence and it seems like I'm mad at you, which is definitely not the case. No, it's just sort of a frustration with the lack of product in the mobile market despite a long, LONG wait. Hope you didn't see that as me getting cross; not my intention.)

Quote:
Well, I'm not. Where would the X1700 go? I mean, against which Nvidia card?


It would go as slightly better than the GF7600GT, but be a cheaper chip to produce, and not be the crippled, more expensive chip that the X1800GTO and X1900GT are. Essentially a refresh of the X1900GT is where they're now targeting/naming it, but it'll still be the X1700 to me.

Quote:
I think there's no point in releasing an X1700 card. Maybe with a very low price... :?


You're missing the benefits other than price: power consumption and heat. Since my concern is for laptops, that's why a crippled X1800/1900 isn't as attractive as a purpose-built, full-on smaller chip that performs better than the X1600 but doesn't consume as much power as the GF7600, X1800 and other solutions.
July 25, 2006 5:36:20 AM

Quote:

On another side... I would've love to see which was going to be the name for the next-gen ATI cards... X2000? Sounds like a 1950's "supercomputer". :lol:  :lol: 


Well now that they merged with AMD of course it will be the AMD XG2000+ or the X2kG line. :twisted:
July 25, 2006 6:14:25 AM

Quote:

On another side... I would've love to see which was going to be the name for the next-gen ATI cards... X2000? Sounds like a 1950's "supercomputer". :lol:  :lol: 


Well now that they merged with AMD of course it will be the AMD XG2000+ or the X2kG line. :twisted:

Link to where you get that name.
July 25, 2006 12:27:53 PM

Quote:
Link to where you get that name.

:roll: , did you see the :twisted: ?

It raises a good point. Whatever the planned name for R600 (my obvious guess being X2800XTX) might have been, AMD could very well say "nope, that blows, let's go with the _________ name."
July 25, 2006 12:32:19 PM

Do you guys really see AMD putting that amount of control on ATI? I don't think they will.
July 25, 2006 12:32:49 PM

Quote:
I'm dissapointed in not hearing more X1700 news yet.

Considering how long we have known about it (and you've had interest in it), it's hard to imagine not knowing more details.
July 25, 2006 12:55:53 PM

Well, I think I'd be more impressed if DX10 wasn't going to be available for some time. Considering the fact that by this time next year, when I upgrade my video card, the next-gen cards will be out, I don't see these cards as necessary. Then again, I'm not an uber-enthusiast. If you absolutely have to have the best in everything, two of those cards in X-Fire should be a hair faster than a 7900GTX system - I emphasize the word hair.

And...

THREAD HIJACK FTW!

I'm waiting to see how the AMD-ATI merger will take shape. But I think this could be bad for people like me who generally prefer AMD and nVidia, and worse for those who don't buy anything else (BM...MMM...etc). I distrust ATI's drivers, even more so under Linux. I prefer nVidia's software support to ATI's... and I also prefer green to red... :roll:.

I don't like mixing manufacturers' drivers on my system. If I want to buy AMD for my next computer, and I have to have an ATIMD chipset, and I want an nVidia card, I probably won't be as happy as I could be. I hope AMD gives ATI a hand with their drivers, because they suck, according to my experience. If I am required to have both ATI and nVidia drivers on my system to buy the hardware I want, I won't like it.

Yes, I know there won't be any performance problems, but there's more to computing than performance. Stability is what I'm after, and it's impossible to tell that within three months some dll will get mixed up, and I won't be able to boot into Windows.

Paranoid? Yes. Neurotic? Yep. Wrong? Who can say...
July 25, 2006 7:40:58 PM

Quote:
What question did I answer? I never had one in the paragraph you quoted me on... I just wanted to bring this here for some open discussion, so when someone else brings something to the table I'll be able to take that into account. I'm just bringing to the table what I have deduced from information; I'm interested in others' opinions and/or information they have also heard..

Edit: if you're saying that the X1950XTX would sell well because of the success of the 7950GX2, I believe you're comparing apples to oranges...

When you slap 2 gfx cards together, you spark an interest in people that one very good card such as the X1900XTX just wouldn't, and no matter how good the X1950XTX will be, it will never be 2 cards in 1, therefore never sparking that interest... Plus, the 7950GX2 is the fastest single card out. If the X1950XTX comes out and doesn't regain the crown, it will be a big disappointment; all it will be is an overclocked X1900XTX, which is never a good thing, as you'll notice with 7900GTs.

My statement was more one of confusion, hence the extra questions marks at the end of it. I apologize for the confusion.

The statements I quoted from you contradict each other.
July 25, 2006 9:07:37 PM

Quote:

It raises a good point. Whatever the planned name for R600 (my obvious guess being X2800XTX) might have been, AMD could very well say "nope, that blows, let's go with the _________ name."


Yes the smiley face of evil shizzle-disturbing. :twisted:

REALLY, IMO AMD/ATi should start a new number combo, still call it the Radeon, like the Radeon AA100, thus giving us the DX10 = #10x.
It also speaks to AMD/ATi and ATi's dominance of AA. :wink:

You could even make it so that AA is the enthusiast line, AB is mid-range, and AC is entry level, that kind of thing; easy for all the X1K and GF7xxx haters out there. It's the AMD 1.

Why 1? Because it's our refresh; next year we launch the new cards, the 2, and then its refresh "e", not quite three but more than 2. :twisted:

If anytime were the opportunity to jump away from the 100000,00000000..... number scheme, now would be the time to start because now you have an excuse for a rename.
July 25, 2006 9:38:16 PM

yup, I liked the era of 7x00, 8x00, 9x00 b/c it was centered around what dx rev it natively supported. (minus the 9000 re-works of the 8000 cards of course ;)  )

once ati went to the 'x' pre-character (x800) it became arbitrary. At least Nv still keeps to the basic idea that the 7900 is still gen 7 of the "geforce" cards.

reading many statements from amd/ati dudes it seems they will keep "radeon", so "amd1" is probably out no matter how cool it would be ;) 

but radeon 10x would be cool. so would radeon AA100/AB100/AC50. simple and effective.

I agree though, do it now. They need a new number scheme no matter what, so they might as well do it now.
July 25, 2006 10:51:04 PM

The X1950XTX not going to best the 7950GX2? Like that's supposed to be news?

The real news, I think, would be that the R580+ would come dangerously close to the power of the 7950GX2 while absolutely crushing the 7900GTX. This is big news, after all, since, as many people are forgetting, the 7950GX2 is not one, but two GPUs on one card. And that means it's mighty expensive. (~$600US, I believe) If you can promise something that's just a bit less powerful, but runs cooler, quieter, doesn't require a ridiculous power supply, has more features, and costs 25% less right off the bat, I think you've got a product going.

And not all of the performance comparisons would still go to the 7950GX2; checking in on Oblivion, the "SLi on a card" setup just barely manages to edge out the X1900XTX overall; it does lose a few tests! The increase to the R580+ would almost certainly plant all of those benchmarks into the hands of ATi.
Quote:
find some more info i guess or supplemental

Clearly fake. (Reminds me of the Nintendo "Nexus" as opposed to what is now known as the "Wii") It's well known that the code name for the core is "R580+", not "R590." Oh, and it'll be using GDDR4, not GDDR3.

Quote:
That's fake; the X1950XTX will still be built off the 90nm process. Even the R600 will be released as a 90nm part and the R600 refresh will be the first ATI 80nm GPU.

Actually, I thought that the RV575 (or whatever the X1700 would be) was going to be ATi's first 80nm part; their newest process/design always tends to be tested out on the mid-range first, just like the X700 was the first 110nm part, and the X1600 was the first card to go with the asymmetric "pipeline-free" design.

The X1700s would likely arrive later this year, after the X1950s, but before any DirectX 10 cards.

Quote:
Not to mention I have heard that the memory will run around 2GHz rather than the 1.8GHz stated in the slide above.... oh yeah, and the whole using-GDDR3 bit lol.

I doubt it will be 2GHz. Since it appears that some say the increase will be around 10%, I'd guess it'll be around 1.7GHz. I'm not sure if I can trust The Inquirer on this issue, particularly since ATi seems to have learned their lesson about trying to use memory too new to use in bulk. (remember the X800XT PE shortages, anyone?) Even if it is GDDR4.

Quote:
I think a lot of people here are forgetting the BIG reason why the GF7900 was scary (a cheaper chip that might outperform a more expensive chip). A GF7950 is 2 of those chips (now no longer cheaper), so being right up against them with one chip is a scary thing the other way around now.

I agree it's a 'holding pattern product', but it's likely to be cheaper than its competition to make, and despite its 'not top' spot, I'd be surprised if it didn't outsell the GX2, just because of most people's reluctance to buy cards like that. But don't get me wrong, I think the X1900XT and GF7900GT and X1900XTX and GF7900GTX will outsell them both by a heck of a lot.

It'll be nice to see something new, but personally I don't care about the X1950XTX as much as what they've done to the X16xx series to make it not suck so much (it's 'ok for the price'; it's no GF7600GT).

I'm disappointed in not hearing more X1700 news yet.

Indeed, tons of people forget that the 7950GX2 is actually two graphics cards sold as one; the only advantage is that you can use two together. And although I'm not certain, I wouldn't be surprised if an SLi-capable chipset was required to use even just one of those cards.

It's a shame that most people think of the graphics war's winner as solely the winner of the "my ship has the biggest guns" battle. This isn't a benchmark-driven economy, after all.

Oh, and I can tell how pained you are, wondering when that Conroe/X1700 laptop you've been dreaming about will finally materialize. The thought of such a machine has even had me considering that spending the money on a new laptop, rather than a used one (I have no such machine right now), might not be such a bad idea after all.

Quote:
Well, I'm not. Where would the X1700 go? I mean, against which Nvidia card?
The X1800GTO is against the 7600GT... So, the X1700 would go against the... 7600GS? 7300GT? :?
I think there's no point in releasing an X1700 card. Maybe with a very low price... :?

It would likely best the 7600GT, and possibly even give the 7900GT a run for its money, at least on the price/power curve.

It would replace the X1800GTO, most likely, and possibly even the X1800XT; ATi has made some claims that the X1700XT would be more powerful than the X1800. The main reason is that it would, for one, be the first GPU designed as mid-range to sport a 256-bit memory interface, and possibly be using GDDR4 memory, that would give the card memory bandwidth comparable to the top-notch cards seen today. The core itself would be like half of an X1900, with 8 texture units and ROPs, 4 (or likely 6) vertex shaders, and a whopping 24 pixel shaders.

All put together, and you've got a platform that could aggressively attack the 7600GT, which is both limited by only having 12 of each pixel unit (and 4 vertex shaders) as well as being crippled by having only 21.4GB/sec of memory bandwidth.

And indeed, the scariest part about it is that, quite possibly, the RV575 would, indeed, be a 80nm part. (ATi, as we all know, likes to try their new things on mid-range parts first) That would mean that an X1700XT would almost certainly ring in under $200US, and possibly be close to the $150US mark. For something that would thoroughly thrash the 7600GT, and even contest the 7900GT... It would be a rather effective attack on what appears to consistently be nVidia's highest-earnings area.
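All the bandwidth figures being thrown around here (the 7600GT's ~21.4GB/sec, GDDR4 at 2GHz effective) come from the same simple arithmetic: bus width times effective transfer rate. A minimal sketch, where the 1400MHz effective GDDR3 clock for the 7600GT is my own assumption for illustration (quoted figures vary slightly):

```python
# Sketch: deriving GDDR memory bandwidth from bus width and effective clock.
# Assumption: the "effective" clock already includes the double-data-rate factor.
def memory_bandwidth_gb_s(effective_clock_mhz: float, bus_width_bits: int) -> float:
    # bytes/sec = transfers/sec * bytes per transfer; using 1 GB = 10^9 bytes
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# GeForce 7600GT: 128-bit bus, ~1400 MHz effective GDDR3 (assumed)
print(memory_bandwidth_gb_s(1400, 128))  # 22.4
```

Plug in a 256-bit bus and a higher effective clock and it's easy to see why a mid-range part with top-end memory specs would be such a jump.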

Quote:
Yeah, now that I think this through... It makes sense.
I mean, the X1600 (Pro, not XT), in raw performance, is almost head-to-head with the 6600GT. It's almost ridiculous.
Is the X1800GTO so bad that it's worth replacing? :?

On another note... I would've loved to see what the name for the next-gen ATI cards was going to be... X2000? Sounds like a 1950's "supercomputer". :lol:  :lol: 

As sojner said, old chips simply need to be replaced; the R520 is an expensive chip to produce, and takes up a lot of resources. Given that the RV575 could be substantially smaller in size, yields would be better, more could be produced per wafer, and all around, it'd be cheaper, and quicker, to produce.

Quote:

It raises a good point. Whatever the planned name for R600 (my obvious guess being X2800XTX) might have been, AMD could very well say nope, that blows lets go with _________ name.


Yes the smiley face of evil shizzle-disturbing. :twisted:

REALLY, IMO AMD/ATi should start a new number combo, still call it the Radeon, like the Radeon AA100, thus giving us the DX10 = #10x.
It also speaks to AMD/ATi and ATi's dominance of AA. :wink:

You could even make it so that AA is the enthusiast line, AB is mid-range, and AC is entry level, that kind of thing; easy for all the X1K and GF7xxx haters out there. It's the AMD 1.

Why 1? Because it's our refresh; next year we launch the new cards, the 2, and then its refresh "e", not quite three but more than 2. :twisted:

If anytime were the opportunity to jump away from the 100000,00000000..... number scheme, now would be the time to start because now you have an excuse for a rename.
July 25, 2006 11:39:02 PM

Quote:
If you can promise something that's just a bit less powerful, but runs cooler, quieter, doesn't require a ridiculous power supply, has more features, and costs 25% less right off the bat, I think you've got a product going.


unless you've forgotten this last generation of cards from ATi has been known for a few things in comparison to the GF7 series...

1) Run Hotter
2) Require more power
3) Louder
4) don't OC very well...(besides GTO versions, which are usually medium range models..)

Taking that all into account, I don't exactly find this card very impressive. How much cooler, quieter and more powerful it is in relation to a regular X1900XTX will determine everything, I guess... The 7900GTX and the X1900XT traded blows depending on the game, but the 7900 was cooler and quieter and drew less power, so it's about time ATi got its act together to make a cooler, quieter, less power-hungry card, IMO.

http://www.anandtech.com/video/showdoc.aspx?i=2717&p=4

power loads of the 7900GTX and X1900XT and XTX

and to make things clear here, any psu that could run a x1900xtx or a x1950xtx for that matter would run the 7950gx2...

And have you heard how well the 7950GX2 is actually overclocking? Pretty well, from what some people are saying...


Edit: after doing some research I've found that the 7950GX2 draws 143W, while the X1900XTX draws 150W... now who needs the ridiculous PSU? And the only time the 7950GX2 loses is at low resolutions; even then the fps aren't bad, just a few lower than the X1900XTX's, but when the resolution is cranked and the eye candy full, the 7950GX2 is a hands-down winner...
July 26, 2006 12:28:19 AM

Quote:
yup, I liked the era of 7x00, 8x00, 9x00 b/c it was centered around what dx rev it natively supported. (minus the 9000 re-works of the 8000 cards of course ;)  )

once ati went to the 'x' pre-character (x800) it became arbitrary. At least Nv still keeps to the basic idea that the 7900 is still gen 7 of the "geforce" cards.

reading many statements from amd/ati dudes it seems they will keep "radeon", so "amd1" is probably out no matter how cool it would be ;) 

but radeon 10x would be cool. so would radeon AA100/AB100/AC50. simple and effective.

I agree though, do it now. Need a new number sceme no matter what, might as well do it now.


X is the Roman numeral for 10, so X1000 is 11,000.

Either way, when this new card comes out, if I were ATI, I would phase out the last generation of high-end coolers. As much as I love ATI, their coolers suck something alright, but what exactly is better left unmentioned. To be honest, the most important thing is the price/performance ratio, like how AMD cut prices to be competitive with Intel. Anyone can release a $500 card. Not everyone can release a $500 card that's worth buying.
July 26, 2006 12:44:31 AM

Quote:
unless you've forgotten this last generation of cards from ATi has been known for a few things in comparison to the GF7 series...

1) Run Hotter
2) Require more power
3) Louder
4) don't OC very well...(besides GTO versions, which are usually medium range models..)

Taking that all into account, I don't exactly find this card very impressive. How much cooler, quieter and more powerful it is in relation to a regular X1900XTX will determine everything, I guess... The 7900GTX and the X1900XT traded blows depending on the game, but the 7900 was cooler and quieter and drew less power, so it's about time ATi got its act together to make a cooler, quieter, less power-hungry card, IMO.

http://www.anandtech.com/video/showdoc.aspx?i=2717&p=4

power loads of the 7900GTX and X1900XT and XTX

and to make things clear here, any psu that could run a x1900xtx or a x1950xtx for that matter would run the 7950gx2...

and have you heard how well the 7950gx2 is actually overclocking ,pretty well from what some people are saying...


Edit: after doing some research I've found that the 7950GX2 draws 143W, while the X1900XTX draws 150W... now who needs the ridiculous PSU? And the only time the 7950GX2 loses is at low resolutions; even then the fps aren't bad, just a few lower than the X1900XTX's, but when the resolution is cranked and the eye candy full, the 7950GX2 is a hands-down winner...

hmm... I can bring in charts too:
different power draw
It shows the GX2 tanking at idle. Yes, the 1900XTX jumps up under load, but considering that a card is idle more than loaded, it seems the power difference goes in favor of anything but the GX2...

just thought I would give another opinion on this stuff.

Another counter is that I have oc'd my xt rather well on tests... brought it above xtx speeds w/ no issues, and higher... all with stock cooling. I have seen that w/ aftermarket cooling it rocks. Nv is no different there, and if much of the 7900 oc issues at launch are to be believed then they may be worse off.

finally, when looking at the results from Tom's (and other) reviews the gx2 does not even begin to show where the extra $ and chip are going until it is above 1600x1200... and I mean like 2560x1600...

Granted, most ppl that get a GX2 would be running that res anyway, but I still don't think the performance delta is worth the $ or that extra power bill at idle... which would be large. ;) 
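Since the idle-power bill keeps coming up in this thread, here's a rough way to ballpark what a power-draw difference actually costs. A sketch with made-up inputs (the 30W delta, 8 hours/day and $0.10/kWh rate are hypothetical, not measured figures):

```python
# Sketch: rough yearly electricity cost of an idle power-draw difference.
# All inputs are hypothetical; plug in your own wattage delta and local rate.
def yearly_cost_usd(delta_watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    kwh_per_year = delta_watts * hours_per_day * 365 / 1000.0
    return kwh_per_year * usd_per_kwh

print(round(yearly_cost_usd(30, 8, 0.10), 2))  # 8.76
```

So a few tens of watts at idle works out to single-digit dollars a year at those assumed rates; whether that's "large" is a matter of taste.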
July 26, 2006 12:55:48 AM

Quote:
X is the Roman numeral for 10, so X1000 is 11,000.


:?: Huh? ATI used X as a moniker, and in graphics parlance, 9x00 means the family of cards consisting of the 9600, 9700, 9800, etc. Not sure if you were being sarcastic or actually telling me I was using the 'x' wrong... :?:

Quote:
Either way, when this new card comes out, if I were ATI, I would phase out the last generation of high-end coolers. As much as I love ATI, their coolers suck something alright, but what exactly is better left unmentioned. To be honest, the most important thing is the price/performance ratio, like how AMD cut prices to be competitive with Intel. Anyone can release a $500 card. Not everyone can release a $500 card that's worth buying.


Hmm... I actually like the fact that the cooler on mine exhausts all the heat out of the case. It works great, and IMO it is better than the top-end Nv coolers. And frankly, even with the current expensive hog, the GX2 that we are debating over, whatever the top-end card(s) are, there is a premium for that last 5% of performance. So even in the case of a $400/500/600 card, you pay to play. Or you wait until the price drop and then you pay less to play second best (or whatever).

JMO of course...
July 26, 2006 2:06:09 AM

What he is saying is, X800 stands for 10800, which came 1 generation after the 9800's. Then came the X1800 or the 11800. X stands for 10 in ATI's naming scheme. ATI thought replacing 10 with X was a cooler sounding name.
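A toy sketch of that reading, where the "X" prefix stands in for the Roman numeral 10 (the helper name is mine, purely for illustration):

```python
# Toy sketch of the naming scheme described above: "X" reads as 10,
# so X800 -> 10800 and X1800 -> 11800 (one generation after the 9800s).
def expand_radeon_name(name: str) -> int:
    if name.startswith("X"):
        # "X" contributes the 10(,000) prefix; the rest is the model number
        return 10000 + int(name[1:])
    return int(name)

print(expand_radeon_name("X800"))   # 10800
print(expand_radeon_name("X1800"))  # 11800
```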
July 26, 2006 2:24:04 AM

Quote:
What he is saying is, X800 stands for 10800, which came 1 generation after the 9800's. Then came the X1800 or the 11800. X stands for 10 in ATI's naming scheme. ATI thought replacing 10 with X was a cooler sounding name.


That's interesting.
July 26, 2006 2:27:43 AM

Quote:
What he is saying is, X800 stands for 10800, which came 1 generation after the 9800's. Then came the X1800 or the 11800. X stands for 10 in ATI's naming scheme. ATI thought replacing 10 with X was a cooler sounding name.


That's interesting.

Interesting??? 8O 8O
I want to believe you already knew that.... Because it's so obvious....
The "X" means "10" in Roman numerals.
July 26, 2006 3:34:21 AM

lol... duh. I totally took it all wrong, my bad. :oops: 

I was thinking of the 'x' i used in the 9x00 lists i made...
July 26, 2006 3:39:38 AM

AH, gotcha. And x Like we use in talking about the X8xx series cards. No biggie.
July 26, 2006 5:35:30 AM

Quote:

unless you've forgotten this last generation of cards from ATi has been known for a few things in comparison to the GF7 series...

1) Run Hotter


Yet it was the GF7900GT and GTX that were crapping out due to overheating and poor handling of OC'ing. Guess it doesn't matter if your part is 10°C hotter as long as it runs. :twisted:

Also, the GX2 runs hotter than the XTX, and most of the GX2's heat is pumped back into your case, while most of the XTX's heat is sent out of the case, away from other components.

Quote:
2) Require more power


Which only matters if you can afford a $400 card but not a $50 PSU. :roll:
If this were a laptop using a battery then maybe it'd matter more.

Quote:
3) Louder


Funny how it's known for that, yet nV seems to be making the loudest card out there with the GF7900GT;
http://www.xbitlabs.com/articles/video/display/powercol...

Quote:
4) don't OC very well...(besides GTO versions, which are usually medium range models..)


Compared to what? At least they didn't have trouble running at stock speeds. :twisted:

Both companies have good and bad overclockers.

Quote:
taking all that into account, I don't find this card very impressive; how much cooler, quieter, and more powerful it is relative to a regular X1900 XTX will determine everything. The 7900GTX and the X1900XT traded blows depending on the game, but the 7900 was cooler and quieter and drew less power, so it's about time ATI got its act together and made a cooler, quieter, less power-hungry card IMO


The X1900 with that same HSF is cooler & quieter, but the power-hungry argument is pretty funny for a top card. I can just see it now;

"Well, I want the extra 20% performance, but the $50 more for a power supply and the $1-a-year difference in my power bill scared me off. So I decided screw it all, I'll buy a passive VIA Chrome VPU with an Eden CPU, and now I can game with less power than my LCD monitor draws. Of course I'm playing the original Doom and Quake, but ooohh, I'm cool, quiet and efficient." :roll:

Quote:
and have you heard how well the 7950gx2 is actually overclocking ,pretty well from what some people are saying...


Actually, it's not as good as the others; OK core but bad memory, according to Xbit;
http://www.xbitlabs.com/articles/video/display/nvidia-g...

Quote:
but when the resolution is cranked and the eye candy full the 7950GX2 is a hands down winner...


Perhaps as a single-slot solution, but as a dual-PCB/VPU solution it often loses to an X1900XT in Xfire, especially at very high levels of AA. So when you really crank the AA to 14-16X, the tables turn back again, even with the GX2 in SLI.

So looks like BOTH have something to work on, eh!?!
July 26, 2006 6:25:21 AM

Quote:

Oh, and I can tell how pained you are, wondering when that Conroe/X1700 laptop you've been dreaming about will finally materialize. The thought of such a machine has even had me considering that spending the money on a new laptop, rather than a used one (I have no such machine right now), might not be such a bad idea after all.


Well, I did get an encouraging, although not completely reassuring, e-mail yesterday;

http://forumz.tomshardware.com/hardware/Intel-Crossfire...

"Thanks for the note. Our merger with AMD is meant to open doors for both companies while at the same time giving our customers more choice. Our graphics products are currently being used almost exclusively by Intel to demonstrate their new Conroe and Merom processors. Our notebook partners make the choice of what combinations they offer and given our continued dominance of the discrete notebook market (70% of high-end notebooks ship with ATI graphics), there is every reason to believe the combination you mentioned will be available."

So there's hope yet. 8)

And hopefully by the time the licensing agreements do expire AMD will have a stronger competitor in the mobile segment.
July 26, 2006 12:08:51 PM

Quote:
(EDIT, PS: I just looked at that sentence and it seems like I'm mad at you, which is definitely not the case. No, it's just sort of a frustration with the lack of product in the mobile market despite a long, LONG wait. Hope you didn't see that as me getting cross; not my intention.)


Not at all. It's ok, Ape. :wink:

Quote:
I don't like mixing manufacturer's drivers on my system. If I want to buy AMD for my next computer, and I have to have an ATIMD chipset, and I want an nVidia card, I probably won't be as happy as I could be. I hope AMD gives ATI a hand with their drivers, because they suck, according to my experiences. If I am required to have both ATI and nVidia drivers on my system to buy the hardware I want, I won't like it.


My thoughts exactly. That's why I always buy NVidia cards. I like nForce chipsets, and, those "have" to go with GeForce cards. :D 

Quote:
Yes, I know there won't be any performance problems, but there's more to computing than performance. Stability is what I'm after, and it's impossible to tell that within three months some dll will get mixed up, and I won't be able to boot into Windows.


I don't know about stability. Probably there won't be any issues, but... There's always a "but". 8)

Quote:
Paranoid? Yes. Neurotic? Yep. Wrong? Who can say...


Well, at least you're not alone! :wink:
July 26, 2006 12:44:27 PM

Well, consider that the GX2 consists of two distinct cards in SLI mode. If a single card (X1900XT/X) can draw more power than two cards (GX2), the company with the power hog needs to do something.

I'd say nVidia has done pretty well getting two cards down to the relative power consumption of a single card. A more effective cooling system may be in order, but that will all come in time.

And there could be a worse setup. Remember that Asus card with two 7800GTs? It sounded good on paper and looked monstrous in the case, but didn't quite perform well enough to sell well. A prime example of a good basic idea with bad implementation.

And I agree with Grape on the power supply issue. If you're going to buy a $400 video card, you'd better be able to spend some cash on what's going to back it.

It's like those people buying PD 805s with cheap RAM, motherboards, and PSUs, expecting to hit 4GHz on stock cooling, then coming here to cry when they get stuck with a machine that won't boot.
July 26, 2006 1:17:45 PM

It's interesting only because it's so uninteresting... didn't they say it'd be 7% faster than the previous card? That's hardly worth getting excited over. I'm with you guys... no one in their right mind would drop that kind of money on a card like that at this point (pre-DX10), and if you were willing to spend mad money on the fastest card, you'd just get the 7950.
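For perspective, here's the back-of-the-envelope math on that quoted 7% figure. The 60 fps baseline is a made-up example number, not a benchmark result:

```python
# Rough arithmetic on the rumored 7% uplift over the X1900 XTX.
baseline_fps = 60.0                 # hypothetical X1900 XTX frame rate
uplift = 0.07                       # the ~7% figure quoted from the Inquirer
new_fps = baseline_fps * (1 + uplift)
print(f"{new_fps:.1f} fps")         # 64.2 fps -- about 4 extra frames
```

Four extra frames on a 60 fps baseline is the kind of gap you'd never notice in play, which is why the card reads more like a stopgap than an upgrade.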