sine_1 :
Yes, I am aware that the GeForce 7950GX2 is not going to be as compatible with the latest games as the 8 series is. Personally speaking, the 8800GTS 320MB is a piece of crap due to its low amount of memory. In pretty much every game the 7950GX2 runs supreme over it. Tom's Hardware has tested every card with a good balance of the latest games and fairly old ones.
It's not the best card, but it's still pretty good and by all means competes with the 8800 series. If you do the calculation from Tom's Hardware's charts, the 8800 Ultra is 19.4% faster than the 7950GX2. Do it yourself then, but yes, I do agree that it's not entirely accurate; at least it gives you some insight into how fast the card is.
I didn't start this thread to find out how good my card is, merely to find out how to fix the issue. Now, if you people with 8800 series cards want to shrug me off, well, don't bother posting here.
Yes, I will be getting a GeForce 9900GTX when it comes out, that's definite.
Ugh, there you go again. Let me try to explain the flaw in your logic with that 19.4% you keep mentioning. You are doing yourself a great disservice misusing and trusting the charts like that. That 19.4% calculation is so off base and useless that you simply must take the time to understand this. It adds up all games and all resolutions. Who cares if two cards are similar at 10x7 resolution when both are limited by the CPU? But a 15 fps difference is huge when it is 15 fps vs 30 fps, yet it doesn't bring the average up much next to those 100+ fps low-res benchies. Seriously, the charts are terribly misleading when misused the way you are doing. And that overall game fps total, sorry to say, is a dumb idea which they should remove, because it is not being used as Tom's intended. People keep quoting it like it represents all gaming and is a definitive number to go by. It is far from it: it covers old games, and the low-res ties cancel out the meaningful high-res scores.
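To make the averaging flaw concrete, here is a toy sketch in Python. The numbers are made up purely for illustration (two cards tied at CPU-limited low resolutions, but 2x apart at 19x12), in the spirit of the 15 fps vs 30 fps example above:

```python
# Hypothetical fps results: tied at CPU-limited low res, 2x apart at high res.
card_a = {"10x7": 110, "12x10": 105, "19x12": 15}   # older card
card_b = {"10x7": 112, "12x10": 108, "19x12": 30}   # newer card

avg_a = sum(card_a.values()) / len(card_a)
avg_b = sum(card_b.values()) / len(card_b)

# The overall average hides the high-res gap behind the low-res ties.
overall_gap = (avg_b - avg_a) / avg_a * 100
high_res_gap = (card_b["19x12"] - card_a["19x12"]) / card_a["19x12"] * 100

print(f"overall average gap: {overall_gap:.1f}%")  # prints 8.7%
print(f"19x12 gap: {high_res_gap:.1f}%")           # prints 100.0%
```

A chart reader looking only at the overall total would call these cards roughly 9% apart, when at the resolution that actually stresses the GPU one is literally twice as fast.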
And they do not test new games. Those charts are a summary of 12 months of reviews, so how can they include the latest games? Where are UT3, COD4, Crysis, etc.? The games they test are from 2004-2006. Oblivion is the best one there for judging GPU performance in modern games. So let's dissect your 19% and look specifically at the best example game they bench.
At high res, 19x12, the 7950GX2 averages 15.6 fps while the 8800 Ultra averages 30.4 fps. The 320MB 8800GTS averaged 20.9 fps, the same as the 640MB version. So much for the "crippled crap" 320MB GTS.
http://www.tomshardware.com/charts/desktop-vga-charts/oblivion-the-elder-scrolls-4-outdoor,571.html?p=1613%2C1591%2C1614%2C1601%2C1596%2C1598%2C1597%2C1602%2C1600%2C1590%2C1616%2C1612%2C1599%2C1579%2C1615%2C1594%2C1574%2C1595%2C1581%2C1643%2C1589%2C1577%2C1593%2C1637%2C1583%2C1592%2C1582%2C1635%2C1573%2C1586%2C1641%2C1570%2C1642%2C1636%2C1578%2C1640%2C1608%2C1634%2C1568%2C1572%2C1609%2C1588%2C1587%2C1580%2C1639%2C1610%2C1576%2C1611%2C1562%2C1633%2C1632%2C1638%2C1631%2C1604%2C1607%2C1560%2C1571%2C1628%2C1575%2C1630%2C1566%2C1569%2C1626%2C1624%2C1627%2C1620%2C1629%2C1585%2C1567%2C1558%2C1622%2C1625%2C1561%2C1623%2C1619%2C1606%2C1559%2C1603%2C1621%2C1565%2C1605%2C1557%2C1618%2C1564%2C1584%2C1617%2C1563
Same game at 12x10: the 7950GX2 averaged 22.4 fps, the 320MB 8800GTS averaged 31.2 fps, and the 8800 Ultra averaged 45.4 fps. Now, I guarantee you that the GF8 series sees higher minimum fps too. The GF7 series had significantly lower minimum fps than the X19xx series and GF8 series in Oblivion. Average is only part of the story.
http://www.tomshardware.com/charts/desktop-vga-charts/oblivion-the-elder-scrolls-4-outdoor,569.html?p=1591%2C1601%2C1596%2C1613%2C1614%2C1598%2C1597%2C1643%2C1600%2C1590%2C1602%2C1599%2C1637%2C1635%2C1595%2C1616%2C1594%2C1641%2C1583%2C1615%2C1612%2C1593%2C1592%2C1579%2C1589%2C1574%2C1581%2C1636%2C1642%2C1577%2C1611%2C1582%2C1640%2C1586%2C1634%2C1628%2C1632%2C1573%2C1631%2C1608%2C1639%2C1633%2C1570%2C1630%2C1578%2C1638%2C1588%2C1609%2C1587%2C1610%2C1568%2C1572%2C1580%2C1629%2C1576%2C1604%2C1627%2C1607%2C1624%2C1571%2C1620%2C1626%2C1566%2C1562%2C1585%2C1575%2C1622%2C1569%2C1560%2C1567%2C1625%2C1623%2C1603%2C1619%2C1605%2C1561%2C1565%2C1584%2C1621%2C1559%2C1606%2C1618%2C1564%2C1563%2C1617%2C%2C
So much for 19% compared to an Ultra when the 320MB GTS beats the GX2 by some 35-40% in those same runs. It's more like a 100+% lead for the Ultra in the game that best represents modern shader-heavy games. And like it or not, the 320MB 8800GTS or even the $100 8800GS is better now than the 7950GX2.
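For anyone who wants to check the arithmetic, here is a quick Python sketch turning the Oblivion averages quoted above into per-resolution percentage leads:

```python
def lead(faster, slower):
    """Percentage lead of the faster card over the slower one."""
    return (faster - slower) / slower * 100

# Averages quoted above:
# 19x12: 7950GX2 = 15.6, 8800GTS 320MB = 20.9, 8800 Ultra = 30.4
# 12x10: 7950GX2 = 22.4, 8800GTS 320MB = 31.2, 8800 Ultra = 45.4
print(f"GTS 320MB vs GX2 @19x12: {lead(20.9, 15.6):.0f}%")   # prints 34%
print(f"GTS 320MB vs GX2 @12x10: {lead(31.2, 22.4):.0f}%")   # prints 39%
print(f"Ultra vs GX2 @19x12:     {lead(30.4, 15.6):.0f}%")   # prints 95%
print(f"Ultra vs GX2 @12x10:     {lead(45.4, 22.4):.0f}%")   # prints 103%
```

Compare those 95-103% Ultra leads in the one shader-heavy game against the 19.4% "overall" figure and the averaging problem is obvious.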
Anyway, sorry to nitpick just this one point, as I know it's not the focus of your thread. But I think it's important that you understand where the 7950GX2 sits in current gaming and how flawed your use of Tom's charts is. If not for you, then at least other people will see this and hopefully learn from it. So many people keep doing this with the charts, and the overall picture they get of how the cards stack up is far from reality. Anyway, I'm done. Take it or leave it, but if you want a clearer picture, you'll take it and stop diluting meaningful victories with high-fps, low-res, CPU-limited results from old games.
Another example. The overall game score puts the X1950XTX, 7900GTX, and HD3850 very close together, in that order. That is ridiculous. The 7900GTX is far worse than the other two NOW, and the leader of the three in current games is the HD3850. Yet people will point to the chart and say, hey, the 7900GTX is better than the HD3850, look at Tom's charts.
Then they go to play a new game like UT3 and see how the 7900GTX actually stacks up against other cards:
http://www.anandtech.com/video/showdoc.aspx?i=3127&p=8
Another view of the HD3850 that begs to differ with Tom's Charts' overall game fps ranking:
http://www.anandtech.com/video/showdoc.aspx?i=3151&p=9