
X1950 benchmarks

August 12, 2006 1:51:38 AM

Dailytech benches

Don't know if you've seen this, but I'll post it anyway. The X1950s do pretty well, but considering the low detail levels and the fact that quad SLI isn't even doing as well as normal SLI with dual 7900GTXs, do you think these cards will be as good as they look? I was going to get an X1900XT for $500 CAD, but the price dropped today to $416, and supposedly the X1950XT will take that price point. Will the X1950XT perform well enough to be worth the extra?


August 12, 2006 2:09:42 AM

Well, there's no doubt that it will be at least slightly better, considering the improved memory bandwidth, but is it worth the extra price? Who knows.
August 12, 2006 2:52:20 AM

This thing looks amazing.

What will the price be for the X1950XT ($400 or $450)?

Will I be able to go onto Newegg on the 23rd of August and pick one of these up? Or is it not going to arrive until weeks later, with the price jacked up (remind you of something...)?
August 12, 2006 3:27:18 AM

Quote:
This thing looks amazing.

What will the price be for the X1950XT ($400 or $450)?

Will I be able to go onto Newegg on the 23rd of August and pick one of these up? Or is it not going to arrive until weeks later, with the price jacked up (remind you of something...)?



[CON]roe


:D 
August 12, 2006 5:44:42 AM

More like the 6800U or X800XTPE on launch ......
August 12, 2006 6:01:03 AM

Quote:
More like the 6800U or X800XTPE on launch ......



That too.
August 12, 2006 10:11:41 AM

I was thinking Conroe, as I am still waiting for my E6600 and P5W from Buy.com. But getting back on topic, what will the price for the X1950XTX be?
August 12, 2006 11:25:13 AM

The Inq says a single X1950XTX beats the 7950GX2, but their VR-Zone link goes to a review without any visible data. Even with killer memory clocks, I don't believe the X1950XTX could take the crown from the GX2.

http://www.theinquirer.net/default.aspx?article=33643
http://www.vr-zone.com/?i=3885

http://www.hardocp.com/
Bummer, the review has been taken down and I am unable to find anyone who cached the article and posted the performance charts. [H] mentions the article on their homepage. Guess I shouldn't have slept in till 6:30 AM today. :roll:
August 12, 2006 12:37:17 PM

Alright, found a link to the review at B3D. They used the highest quality settings in the drivers for the game tests, not the defaults. Not a bad idea, especially given the texture shimmering issues [H] and others talk about with NV's default IQ settings. Anyway, that's a whole different argument as to what settings should be used in a review, but one thing is for sure: most sites will run default settings, probably boosting the GX2's performance quite a bit. They mention that the Cat 6.7 beta gave a good boost in performance, but we shall soon see how much of the gains the X1900XTX made on the 7950GX2 actually came from setting max quality settings rather than from a driver update.

But even if you just look at X1950XTX vs X1900XTX and throw out comparisons to the GX2, you see some decent boosts in performance:

24% in 3dmark05
33% in 3dmark06
19-20% average boost in the 6 games.


Enough blabbering, here's the link:
http://resources.vr-zone.com.sg/Shamino/1950/



Power consumption increase or not, I'd take that nice cooler and the performance boost over buying an X1900. Wonder what the street pricing will be like though. Who knows how it will OC. They used Overdrive and had no problem running at Overdrive's max available OC of 700/1100 (2200).
August 12, 2006 4:22:06 PM

Yeah, those are good results, but I was annoyed at this guy on another forum saying that X1950XTXs in CF beat quad SLI by 100%. Plus he seems to think that two 7950GX2s are equivalent to FOUR 7900GTXs in SLI. He's clueless. But back to price: the X1900XTs dropped from $500 CAD to $415 CAD, so maybe the X1950XT will take the $500 or $550 price point. If that's the case I might just get it instead of the X1900XT. It also depends when it comes out, because if it doesn't come out before the end of the month I won't get it. :cry:
August 12, 2006 4:56:58 PM

Here are the price drops for the current X1900 line.

http://www.vr-zone.com/?i=3881

Supposedly the X1950XTX will be inserted in the $500 slot, so the 256MB X1950XT would probably be in the $370 slot, although it really should be a bit higher.

I still can't believe that ATI left the core clock of the X1950XTX at 650MHz. With the better cooler, the refined core, and the mature process, I'd think a 25MHz bump should have been easy without reducing yields or increasing power or temperature significantly. A 700MHz clock would have been perfect in light of all that extra bandwidth available. It might not leave much more overclocking room, but the stock performance would be worth it, and it'll definitely be needed as a holdover against the G80s. At least the CrossFire Edition is clocked at XTX speeds now, so that provides a bump there.
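(A quick back-of-the-envelope sketch of where that extra bandwidth comes from. The 256-bit bus and the 1.55GHz/2.0GHz effective memory clocks are the commonly quoted specs rather than figures from the review, so treat this as an illustration only.)

# Memory bandwidth = effective clock (MHz) x bus width (bits) / 8, in GB/s.
# Clocks below are the commonly quoted specs, not review data.
def mem_bandwidth_gb_s(effective_clock_mhz, bus_width_bits=256):
    return effective_clock_mhz * 1e6 * bus_width_bits / 8 / 1e9

x1900xtx = mem_bandwidth_gb_s(1550)   # 775MHz GDDR3, double data rate
x1950xtx = mem_bandwidth_gb_s(2000)   # 1GHz GDDR4, double data rate
print(x1900xtx, x1950xtx)               # ~49.6 GB/s vs ~64.0 GB/s
print((x1950xtx / x1900xtx - 1) * 100)  # roughly 29% more bandwidth, core still at 650MHz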
August 12, 2006 5:43:06 PM

So do you think the new cards will just replace the current cards at the price points they were at a couple of days ago, or at the price points those cards started out at?

In Canada the X1900XTX started at $750 and the XT at $650. Last week they were $570 and $500 respectively. Now they are $500 and $415. So do you think the new 512MB X1950XT will be $650 or $500 CAD? Or maybe somewhere in between?
August 12, 2006 6:45:50 PM

Something like $570 CAD for the X1950XTX and $500 for the X1950XT sounds about right. It'll obviously be a bit higher until things cool down, but those are probably the target prices. They're targeting the X1950XTX to be priced below the 7950GX2, so that looks about right. The thing is, there was no X1950XT until recently, and this new XT looks to have only 256MB of RAM. However, the advantage is that supposedly its RAM will also be clocked at 2GHz; no word on the core though. Since 512MB doesn't generally offer much advantage over 256MB anyway, it'll probably work out for the best in the end.

http://www.theinquirer.net/default.aspx?article=33368

The X1900s themselves look to stick around for a while, replacing the X1800s in the $300-$400 price range. The RV570XT will replace the X1900GT at the $280 price point, the RV570XL at $250, and the RV560 at $200, although it's looking like the RV570XL and RV560 will both slip a bit since the RV535 (80nm X1600Pro) looks to become an X1300XT instead of an X1650. The RV535 (80nm X1600XT) is probably going to be discontinued completely in favour of the RV560. The RV516Pro and RV516LE look to directly replace the X1300LE and X1300Pro, and the new RV505CE should finally replace the X300.
August 12, 2006 7:00:03 PM

With that nice quiet performance cooler and a 20+% performance increase over the X1900XTX, the X1950XTX would be a bargain if they actually do retail for $399 USD with street prices at or below that. I'd have to think X1900 prices would drop quite a bit if this is the case. Shoot, I'd consider the X1950XTX the first time I see one for $350 or less, especially if I have a buyer for my X1800XT. Above $350 I'll wait for G80/R600.
August 12, 2006 7:05:03 PM



How about a nail cut for that guy?
August 12, 2006 7:17:27 PM

It sure could be $499 but it's not going to sell at 7950GX2 prices IMO. I think it needs to list under GX2 street prices, or forget it. The INQ has been pushing the $399 price, and since they are never wrong (and I am cheap) I'll go with that one. :D 
August 12, 2006 7:21:12 PM

Quote:
How about a nail cut for that guy?


Wow, that is downright disgusting.

I read the temp gauge but never considered checking out the guy's hands. :p 
August 12, 2006 7:28:00 PM

The Inquirer hasn't been pushing a $399 USD price for the X1950XTX; they've been talking about €399, which is about $500.

http://www.theinquirer.net/default.aspx?article=33368

Quote:
projected €399 to €415 for X1950 XTX card

Besides, the 7950GX2 usually sells for around $550 USD, so if the X1950XTX really is faster than it, as VR-Zone and ATI's own benchmarks seem to indicate, a $499 USD price should be adequate. GDDR4 probably isn't cheap right now, so they really can't cut prices that drastically. I'd still want to see a bump to at least a 675MHz core though.
August 12, 2006 7:40:22 PM

Quote:


How about a nail cut for that guy?


WOW, that's gross, I didn't even notice the guy's nails till Paul just said something. MAN, get some nail clippers, buddy :twisted:
August 12, 2006 8:05:40 PM

I could be wrong, but I am pretty sure somewhere they said $399 USD, which looked like a bargain compared to Europe. It may have been one of their links that said $399, not them directly. I couldn't find it just now, but I'll look some more.

Here they say the master card with the same clocks is $449, so the slave XTX would have to cost less. I've been under the impression they think it's $399 for the XTX and $449 for the Master.

http://www.theinquirer.net/default.aspx?article=33147


And with $499 GX2s, they need to be under that price.
http://www.newegg.com/Product/Product.asp?Item=N82E1681...
August 12, 2006 8:09:55 PM

Yeah, I'm going with $450 on the release date, and like the GX2s did, it will drop 50 or so dollars within a few weeks.

It HAS to be under $500, because that is what the 7950GX2 is at.
August 13, 2006 5:55:40 PM

Quote:
I'd still want to see a bump to at least 675MHz core though.



Agreed. That's my gripe with the new lineup. The 550MHz memory boost is quite nice, but seeing the core remain unchanged is just... lazy.

Then again, like I said, have the clock speeds officially been confirmed?
A 25-50MHz core bump would sure be generous of ATI, not to mention smart.
August 13, 2006 6:20:59 PM

How do those benchmarks make sense?


There is no way a single GPU can beat dual GPUs. Those benchmarks are inflated; it's to get hype going for the X1950XT, a very good marketing technique.
August 13, 2006 6:54:30 PM

Quote:
How do those benchmarks make sense?


There is no way a single GPU can beat dual GPUs. Those benchmarks are inflated; it's to get hype going for the X1950XT, a very good marketing technique.


Some more reviews would be nice, but VR-Zone is a reliable website in my eyes. Now correct me if I'm wrong (though I believe I'm right): the problem is that the 7950GX2 and the 7900GX2 (think XPS600 Renegade) have sluggish and downright pitiful memory speeds, respectively. That's why the 7950 seems worse in these benchmarks, and why the 7900GX2 was terrible (not counting the core speed, which was also bad, and the 32xAA/16xAF mode, which not only didn't work but also lacked the horsepower to be usable; luckily, NVIDIA seems to have fixed those problems with the 7950).

Edit 1: Typo
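(To put some numbers behind that memory-speed point: the same clock-times-bus-width arithmetic shows how far behind the GX2's memory sits per GPU. The 1.2GHz and 1.6GHz effective clocks below are the commonly quoted specs for the 7950GX2 and 7900GTX, not figures from this thread, so take this as a rough illustration.)

# Each GX2 GPU has its own 256-bit bus and 512MB, but in SLI the working set is
# essentially duplicated, so the per-GPU figure is what each core works with.
gx2_per_gpu = 1200e6 * 256 / 8 / 1e9   # 600MHz GDDR3 (1.2GHz effective) -> ~38.4 GB/s per GPU
gtx_single  = 1600e6 * 256 / 8 / 1e9   # 800MHz GDDR3 (1.6GHz effective) -> ~51.2 GB/s
print(gx2_per_gpu, gtx_single)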
August 13, 2006 7:01:59 PM

Quote:
How do those benchmarks make sense?


There is no way a single GPU can beat dual GPUs. Those benchmarks are inflated; it's to get hype going for the X1950XT, a very good marketing technique.

Sure, sometimes one 7900GT can beat two in SLI, when SLI isn't working in the app. :tongue:

But in reference to this review, there are two possibilities:

First, his GX2 benchmarks may be off, although in the forums he says he retested the 7950GX2 3 times with the same results.

Second, he tested with the drivers at max quality settings. Many people would argue that this is how all high-end cards should be tested. At default driver settings, NV has a lot of optimizations going on. The problem is they reduce the IQ noticeably, as sites like [H] and others point out; texture shimmering, for one example. And setting NV's drivers to max quality (turning off these optimizations) does give a big performance hit.

Again, this is a whole different argument about how these cards should be tested. Both companies optimize, but it seems NV's IQ drop at default settings is worse than ATI's. [H] says they will test at default settings, but will continue to point out shimmering and other issues as long as the companies' default settings produce noticeable IQ degradation/shimmering. Other sites like VR-Zone will test games at the high/max quality driver settings to even out IQ between the cards as well as provide the best quality that high-end gamers want. And still other sites leave things at defaults and make no mention of IQ. Even if screenshots are given for the reader to make up his own mind, shimmering can't be noticed in a screenshot. To me this last group is careless to some extent. Many enthusiasts know better, but others are misled into assuming the IQ is equal. Like I said, it's a whole other argument, but the question of what settings should be used for testing isn't cut and dried. Personally, I usually test at default driver settings for easy comparison's sake, but I try to mention the settings when sharing results with others. The problem is the majority of people wouldn't know the difference the settings tab makes.

IF this review's results are accurate, then the X1900XTX is a lot closer to 7950GX2 performance at high quality (some would say equal) settings. But at default driver settings, the 7950GX2 easily wins. Shoot (just a wild thought), we could even assume there's a possibility that ATI wanted VR-Zone to test using these settings to make the X1950 look better; they'd probably get in a lot less trouble for leaking early results than if they had used default settings and the GX2 had easily won. Anyway, that's just something to think about; I'm not making accusations, so don't get worked up about it, anyone :) , it's most likely far from the truth. :twisted:

We shall soon be able to compile results from many sites and see just where the GX2 and X1950 match up. By the time we get it figured out, G80 will be released. :roll:
August 13, 2006 7:21:45 PM

Anyway, without digging into the past, here is a recent [H] article discussing IQ comparisons.

http://enthusiast.hardocp.com/article.html?art=MTA4Mywx...


Quote:
"Texture crawling and moiré are not something you can see in a screenshot, it can only be represented with movement. In the above screenshot of Half Life 2: Episode 1 we out outlined a portion of road in this map. This is one area we saw texture crawling and moiré as we moved down the road. It was visible on both ATI and NVIDIA hardware, but worse on NV hardware. Any places with detailed textures like this road you can spot texture crawling.

The great news is that this can be reduced on NVIDIA hardware, the bad news is that it takes a performance hit to do so. You can manually turn off all the filtering options in the advanced driver control panel as well as clamp the LOD bias. This greatly reduces it, but it doesn’t entirely do away with it, and it also takes a performance hit to do so. Still, if you want the best texture quality you will have no choice but to take the hit.

Overall ATI has the better texture which by default has less texture crawling and allows the “High Quality” AF option.
"
August 13, 2006 7:54:21 PM

Quote:
How do those benchmarks make sense?


There is no way a single GPU can beat dual GPUs. Those benchmarks are inflated; it's to get hype going for the X1950XT, a very good marketing technique.


If you bother to read the whole article, it's X1950XTX CrossFire against quad SLI 7950GX2s. And you know that quad SLI is not optimized yet; Tom's review clearly showed that.

Also, FYI, don't get too hyped up about SLI or CrossFire, for they have only proved superior at extreme resolutions. Why? Because until now, when you do a dual-GPU setup, the PCIe x16 is chopped in half (x8 times two), so the bandwidth is not increased. So only at extreme resolutions (1920x1200 with full AA and AF) will it prove superior to a single-GPU setup, because each card does half the screen. At normal resolutions, the single highest-end GPU still beats dual near-highest-end GPUs (the 7950GX2 is dual 7900GTs, not dual 7900GTXs).

However, Asus just released a new mobo (Crosshair) with full SLI PCIe x32 (x16 times two), on sale at Newegg.com, so the results may turn out differently with a dual-GPU setup, because the bandwidth is actually doubled.

http://www.newegg.com/Product/Product.asp?Item=N82E1681...
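(For reference, here's the lane math behind that x8 vs x16 point, using the commonly cited PCIe 1.x figure of roughly 250 MB/s per lane per direction; these are my own numbers, not from the post, and whether that link bandwidth actually bottlenecks games at normal resolutions is a separate question.)

# PCIe 1.x host link bandwidth: ~250 MB/s per lane, per direction.
def pcie1_link_gb_s(lanes):
    return lanes * 250 / 1000

print(pcie1_link_gb_s(16))  # 4.0 GB/s per direction - a full x16 slot
print(pcie1_link_gb_s(8))   # 2.0 GB/s per direction - each slot when x16 is split x8/x8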
August 14, 2006 9:23:35 PM

I see both my VR-Zone links are dead now.


Anyway, just wanted to report that the Inq says the X1950XTX is pushed off until Sept 14th due to a large supply of X1900s.

http://www.theinquirer.net/default.aspx?article=33677

Bummer, I guess we have to add three more weeks before seeing all the full reviews and how they compare to VR-Zone's HQ results.
August 14, 2006 9:28:46 PM

Quote:
I see both my VR-Zone links are dead now.


Anyway, just wanted to report that the Inq says the X1950XTX is pushed off until Sept 14th due to a large supply of X1900s.

http://www.theinquirer.net/default.aspx?article=33677

Bummer, I guess we have to add three more weeks before seeing all the full reviews and how they compare to VR-Zone's HQ results.


CRAP... man, WTF. Oh well, I guess it's good they lowered the prices before it comes out, so I'm just gonna get an X1900XT :wink:
August 14, 2006 11:29:48 PM

Quote:

There is no way a single GPU can beat dual GPUs.


Well, history has proven that that can happen. And if you think about it logically, those two GPUs are slower than GTXs, so when added together they are slower than two GTXs in SLI. The GX2's ~50-70% SLI boost is on top of a card that is clocked to perform at a fraction of the current NV single-card leader, which itself is slightly behind the X1900XTX. So if the X1950XTX brings a significant core and, more importantly in this case, memory boost, then it's quite possible that a single X1950XTX VPU could beat two essentially crippled GPUs running on less-than-perfect SLI (not saying worse than Xfire, just below 100% efficiency).
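(To put rough numbers on that chain of reasoning, here's a quick sketch. Every scaling factor in it is an assumption picked purely to illustrate the argument, not a measured result.)

# Hypothetical scaling factors, all relative to a single 7900GTX = 1.0.
gx2_gpu_vs_gtx  = 0.85  # assume each underclocked GX2 GPU is ~85% of a 7900GTX
sli_scaling     = 1.60  # assume SLI adds ~60% on top of a single GPU
x1900xtx_vs_gtx = 1.05  # assume the X1900XTX is slightly ahead of a 7900GTX
x1950_gain      = 1.20  # ~20% uplift from the GDDR4 bandwidth (per the VR-Zone numbers)

gx2_total = gx2_gpu_vs_gtx * sli_scaling    # ~1.36x a 7900GTX
x1950xtx  = x1900xtx_vs_gtx * x1950_gain    # ~1.26x a 7900GTX
print(gx2_total, x1950xtx)
# With assumptions in that ballpark the two land close enough to trade blows,
# especially where SLI scales poorly or memory bandwidth is the limiter.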

Quote:
Those benchmarks are inflated; it's to get hype going for the X1950XT, a very good marketing technique.


The ATI-provided benchies are almost assuredly the 'pick of the litter', where they found the results that put them in the best light, but as for the VR-Zone results, there's no benefit for them to falsify those.

I would think there will be many cases where they trade blows, and the biggest drawback for the GX2 will be the fact that its memory bandwidth is tiny in comparison: while SLI essentially doubles its core power, the memory size and bandwidth see no improvement from SLI, and therefore at certain choke points the GDDR4 on the X1950 might be handier than having twice as many shader units.

Something to consider is the situations where you see little difference between a GF7900GT and the GF7950GX2; IMO those are likely the areas (barring issues with SLI, of course) where the memory is holding that whole series back more than anything else.

In any case it does look to be an interesting showdown over the next few weeks in anticipation of the clash of the new Titans in the fall.
August 14, 2006 11:40:39 PM

3DCenter has a summary of VR-Zone's and HardwareZone's results:

http://www.3dcenter.de/artikel/2006/08-13.php

Nice little graph with percentages and everything.

You will notice that there are some similarities and dissimilarities, with the VR-Zone benchies generally being more favourable to the new contender.

PS, yeah it looks like the dongle-less Xfire is for the RV570/X1950Pro not the R580+/X1950XTX.
August 14, 2006 11:51:40 PM

Hey, thanks for the link.

Interesting to once again have the question of what settings to bench at, don't you think? I'm sure the discrepancies can mostly be explained by VR-Zone's high quality driver settings in their game (not 3DMark) tests. I was shocked to see that the discussion on Anand's forums was overwhelmingly in favor of VR-Zone's HQ driver settings as opposed to defaults. I'd imagine max IQ settings in the drivers don't favor the GX2; maybe there are fewer NVidiots at Atech than I thought.
August 15, 2006 12:00:49 AM

@ Pauldh

Nice card, but 700?? Bah...

My X1800XT made 700 on stock air (64% constant fan speed) without a voltage increase on the core...

Core clock needs to go higher; then again, this is my twisted view on cards.... :twisted: :twisted: :twisted:
August 15, 2006 12:18:29 AM

Well, I think the message is finally getting out there that if you are paying a premium, you don't want corners cut, regardless of what we're talking about.

It's about the IQ at this level, because no one cares about 1024x768 with 4xAA and 8xAF; no, if you're gonna run anything less than 20x15, then it had better be at HQ AF with good AA.

Seriously, what it comes down to is what's more important to the individual, more pixels or better AF, and really those should be the two tests for solutions of this calibre, not every resolution from 8x6 to 24x16; those tests are for later, for interest's sake, to see how the architecture scales. Comparing 16x/14x AA @ 20x15 isn't realistic for playability, but it is insightful for core and memory performance under extreme loads.

I just don't see the need for low-res, low/no AA+AF benchies for these cards; for the low-to-mid range or laptops, sure, it makes sense, but when you're talking about 1-4 VPUs in tests, we're beyond the realm of 1024x768 with no AA/AF. :roll:
August 15, 2006 1:10:23 AM

Some people cling to the past.

Dear god, I haven't played @ 1024x768 since I had my 8500...


The only way I can see these benchies having validity would be starting @ 1280x1024, only because a good portion of (average) gamers use LCDs with that native res. However, to an extent I disagree with my own reasoning; nobody in their right mind should buy a high-end GPU to game at low resolutions. It's like buying the cow when you're already getting the milk for free :!: :!: :!: :!:

The only exception to this rule being Oblivion.