ATI 2950XT to beat the 8800GTX?
Guys, here is what I think. With the 2950XT moving to a 55nm process, we are going to see something unbelievably great. The core clock of the 2950XT was said to be 825 stock; that's already 75MHz more than the 2900XT. Since it's 55nm, I predict we can push it to around 925 on stock cooling, and if we increase the voltage and replace the stock fan we could possibly reach a 1000MHz core clock. My prediction is that the 2950XT will beat the 8800GTX in the majority of tests, including 3DMark06. Nvidia will still remain on top with the 8800 Ultra, BUT two months later ATI will launch the R670, which is basically two 2950XT chips on one PCB, and it will BEAT the 8800 Ultra. Nvidia won't launch the 9800GTX until Q2 2008, which gives ATI time to prepare its R700 board. So, can anyone see ATI actually regaining the gfx crown?
WAS SAID / PREDICT / PREDICTION...
All of these are guesses. No one knows when exactly the 9800GTX/2950XTX will be out, and no one knows how they will perform.
I am not defending Nvidia, but remember the days before the 2900XT came out: many said it would beat the 8800GTX and be the new king...
In the short term, yes, I think the 2950XT will beat out a GTX, or be very very close to it in performance... It should also be a heck of a lot cheaper, and, like you say, overclock nicely.
However, I don't have a ton of faith in R700 yet (haven't seen enough info). I personally think that the 9800GTX will be king for some time after it comes out, provided the leaked specs are real.... However, like 8800GTX, the 9800 will be at a price that 95% of us gamers can't afford anyways lol =D
The cheaper 9800GTS on the other hand...... Well, more info is needed =-?
devilray26 said: spuddyt, what do you mean by stupid laws? You mean the law that you have to immediately expense your R&D instead of amortizing the costs over time?
The law is that the company works for the shareholders, not for the customers. If it can make the customers pay a lot, the company has to do it because that's what the shareholders want and they get to decide. That's how it works.
It's called capitalism. It's a horrible system, but still the best invented so far.
It all comes down to how much these 55nm chips cost. They need to release a very strong mainstream (~$200) board that beats the 8800GTS. The keys are cheap chips, low heat, less intricate cooling solutions, and a 256 bit memory bus. If they can do that and turn a profit at $180 a pop, nVidia will lose a ton of market share.
I believe I read in an article that the 2950 XT will achieve around 11,000 points, compared to about 12,000 with an 8800 Ultra in the same system. However, AMD is supposedly selling the 2950 XT at a sub-$400 price.
Edit: Found the article. Here it is:
By Theo Valich: Friday, 05 October 2007, 9:39 AM
IN THE FAR EAST, 3DMark is everything. You can say whatever you want about canned benchmarks, but nobody can dodge the influence of the 3DMark06 benchmark.
It's been the same story with previous iterations of graphics cards, and the same will happen with the next, DX10-only workout. When that is coming out, only Futuremark knows.
But something very significant happened in this round of the war, at least according to our highly-ranked sources.
This time around, Nvidia did not tout its G92_200 series as the fastest thing since Niki Lauda, but rather admitted defeat in this all-popular synthetic benchmark at the hands of a yet-unnamed Radeon HD part.
A reference board from Nvidia is capable of scoring 10,800 3DMarks, while a reference board from ATI will score around 11,400 3DMarks, or a clear 550-600 points advantage.
This is a massive leap over previous-gen parts. The current generation's high-end performer from Nvidia, the 8800Ultra scores 12,500 points. Seeing a mainstream, $250 part scoring barely a thousand less than a current $599 card only makes us wonder how those owners that coughed up so much will feel.
When it comes to ATI's part, you know what to expect in this synthetic benchmark - outscoring Radeon HD 2900XT is a default mode of operation for RV670XT. At least in lower resolutions.
Partners are less than happy with Nvidia board politics as well, but this is a subject of another story. µ
hannibal said: Well, the 2950XT is supposed to be the next mid-range card, so the 8800GTX has nothing to worry about from that direction. The 8600 products are...
Exactly... the 8800GTX is gonna beat any ATI card for a year (FPS-wise, anyway). It sounds like since ATI was bought out, they decided to drop the frame rate war.
I am an ATI fan. I used to be an Nvidia fan. I still like both, but Nvidia, like Intel, has the upper hand/FUNDING!!! They are the new kings. Intel and Nvidia have finally jumped higher than AMD/ATI can reach. The architecture of the ATI card is better, though. Frame rates are no longer the WAR; now it's quality image with playable frame rates. So as far as the general public thinks, Nvidia is better because of the faster FPS. That's what they wanted the whole time ATI was far faster than Nvidia: more frames per second. ATI will make a game look good and play well no matter what the card is. Once a game is made that uses up 320 pixel shaders... which will look better, the card with only 128 pixel shaders or the one with 320+?
I welcome the HD2950PRO 512MB. Cheaper than the XT, and it can overclock to match. THERE'S your $200(+/-) mainstream board.
I don't think it has anything to do with being bought out; I am pretty sure it has to do with the uarch. ATI made a card that had problems with power leakage and therefore heat. Once they get down to the 55nm node they should be able to compete in the high-end arena.
How would ATI compete against the 8800GTX with what they have now without going down to a lower node and bumping the core clock up? They have been increasing the driver performance, not sure they can really do much else.
Here's the thing.
The 2950XT/Pro probably won't do any better than their 2900 counterparts in real performance, but that doesn't matter: it's 55nm, so each chip is much cheaper to make, it will only need a single-slot cooler so that's cheaper too, and it's a much smaller PCB, so more money saved.
What does this mean? It can beat the GTX without even beating it in actual stats. How, you say? Would you buy a GTX if you could get a card that does 80% as well for 60% of the price, using half the space and power consumption?
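The "80% as good for 60% of the price" argument is just a value ratio, and it can be sketched in a couple of lines. Note the performance index and prices below are purely illustrative assumptions matching the post's hypothetical, not real benchmarks or street prices:

```python
# Back-of-the-envelope price/performance comparison.
# All numbers are illustrative assumptions, not measured benchmarks or real prices.
cards = {
    "8800GTX":  {"perf": 100, "price": 599},  # baseline performance index
    "HD2950XT": {"perf": 80,  "price": 359},  # ~80% of GTX perf for ~60% of the price
}

for name, c in cards.items():
    ratio = c["perf"] / c["price"]  # performance per dollar
    print(f"{name}: {c['perf']} perf @ ${c['price']} -> {ratio:.3f} perf/$")
```

Under those assumptions, the cheaper card comes out well ahead on performance per dollar even while losing outright.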
And the thing is, ATI can drop prices MUCH lower than Nvidia, because Nvidia uses 90nm on the GTX, which is damn expensive.
I'm not buying a card until 2 or 3 months into the new year now. Why? Because the GTX and its rivals are going to be quickly phased out for smarter, cooler parts, which will also likely be cheaper due to the new competition. The 2950X2X, or whatever it's called, will whip an Ultra, and still be a lot cheaper to make, I expect.
For these new tri-SLI or quad/tri Crossfire setups, or whatever, there are going to be a lot of single-slot products coming, even for the high end like the RV670XT, so you can have 3 GPUs and not lose valuable PCI-E slots, unlike tri-SLI with the GTX/Ultra until new ones come out.
One thing's for certain, though: ATI seems to have a huge advantage at the moment.
Still nothing more than Bungholiomarks.
The GF8600 and HD2600 beat the X1950 in Bungholiomarks, and what has that gotten us?
I wouldn't predict the fall of the GTX just yet. The only thing the HD2950XT threatens the GTX with is value, but then again, the GTS and HD2900XT already did that. So until they actually sell at a good price, it doesn't matter what they get in synthetics, or even the promise that there'll potentially be a dual-chip card version someday.
It's good to hear about new hardware, but I wouldn't count on it until we have professional reviews from people we trust to do a thorough job.
I'm pretty sure the potential is there to take the market from Nvidia, at least based on what's been released at the moment. As long as ATI puts the pricing of their new cards at around the 8800GTS level, they'll be a huge hit against the GTX. Causing Nvidia to lower the price of the GTX would be great.
Oh I don't deny it's a market threat, but I doubt the HD2950XT is a performance threat in games compared to the GTX.
The HD2950XT will likely cannibalize a lot of HD2900XT sales, some GTS sales, and a few GTX sales (if you can convince people DX10.1 means anything). But I doubt you'll see many more games where an HD29xx card outperforms the GTX than is currently the case. Maybe in the games where they are within 5%, the HD2950XT could push over the top and yield a 0-5% victory. But considering the vast majority of games show a gulf between the XT and GTX larger than that, it's unlikely a mid-range card will close that gap; I think even 10% is too much to expect.
So without a radical change to either the TMU layout or the ROP composition, I don't think the HD2950XT will challenge the GTX for anything other than price/performance, and maybe Bungholiomark scores on the Best Buy wall.
Price/performance is enough, though, if it's a good enough ratio; I think you'll agree.
I'm just going to assume they'll release a 2950XTX or a dual one quite soon, or maybe they plan on ramping up clocks on this card with a dual-slot cooler to make the XTX, who knows. With 55nm and a 100W TDP, so many things are possible hehe.
I think the OP is suggesting that what happened with the Nvidia 7 series and the ATI X1000 series will happen again. When the 7800 GTX came out it beat ATI to the punch, and when they finally released the X1800 it wasn't really a match for it... and then the 7900 GTX came out, again followed by an ATI card, the X1900 series... and I think most people agree that the ATI card is the better of those two.
Yeah, I think it's enough to threaten GTX sales. Threatening its performance is what needs something radical, and really something more than just boosting clock rates, since even a 25% boost to 1GHz would still leave the R6xx design weak in fill rate and DX9-class AA performance. To compete with the GTX, they would need to increase the number of TMUs or improve their layout, and/or make the ROPs either more plentiful or better-featured, including hardware AA. Without that, the weak areas of the R6xx design remain weak, and you would need a super-fast RV670 to make up for it. The R600 isn't memory bandwidth limited, and it's not shader limited in the traditional sense (improving scheduler usage would be better than improving just SPU speed).
I think the HD2950XT will be a great card, just like the X1950Pro, but like the X1950Pro, it won't outperform its opposition's GTX model, just their lower models.
And it doesn't need to beat the GTX to be successful; all it has to do is satisfy the needs of people who were looking for a less power-hungry card with the performance and value of the GTS-320. So if it can do that and not be overshadowed by another new competing product, then it will be very successful, regardless of whether it's #1, 2 or 3.
justinmcg67 said: The ONLY way, in my opinion, for the HD2950XT to beat an 8800GTX is to utilize the PCI-Express 2.0 port, have memory clocks fast enough to utilize a full 1GB of memory, and have great driver support. If it doesn't have those three things, then no, I do not believe it will win.
The only part of that equation that 'may' be holding the HD2900 back would be the drivers. It's not memory bandwidth limited, or PCIe bandwidth or latency limited, so why would improving either of those things make the HD2950XT something that the HD2900 is not?
TheGreatGrapeApe said: The only part of that equation that 'may' be holding the HD2900 back would be the drivers. It's not memory bandwidth limited, or PCIe bandwidth or latency limited, so why would improving either of those things make the HD2950XT something that the HD2900 is not?
DirectX 10.1, perhaps? As far as the HD2900 series and HD2950 are concerned, I see NO differences beyond DX10.1 and PCI-Express 2.0. And that, in my opinion, is NOT going to be enough to beat an 8800GTX.
On a side note, I just purchased the HD2900 Pro, and I'm wondering, how do you "flash" it to an HD2900XT? I don't do video card flashing... I imagine it's something like flashing a BIOS? PM please.
Using the shaders to do AA rather than the ROPs would be a start, since there are so many that they can't possibly all be used at max all the time. I thought CFAA did that, but I haven't seen any performance/picture-quality comparisons of it against normal MSAA/SSAA.
One huge thing that holds ATI cards back is the shader clock. I wish they'd just change it like Nvidia did; guess they just can't.
The thing about it is, you could possibly flash it, but right now I wouldn't recommend it, not until the boys at places like TechPowerUp, XtremeSystems, etc. have pounded them into submission and done enough attempts with various BIOSes that you know what the best solution is and which cards work better.
I don't give instructions on flashing cards; there are a lot of resources out there on the net to help anyone interested in learning.
The only advice I do give on flashing cards is a warning: get all your stuff prepared and do all the reading you can before you attempt it. Improperly flashing your card is one of the easiest ways to permanently damage it.
3DMark06 is a horribly inaccurate scale of systems. An 8600GT can score more points than a 7900-series card, yet deliver the same or worse real performance. I don't think it's very "real-world" in a lot of aspects. But that's my own preference.
As for my HD2900 Pro, it should be arriving next Wednesday. =D Really excited for that. First package I've ever had shipped from Newegg's Memphis warehouse. =)
I am having a dilemma with clock speeds.
Do they matter all that much, or does the platform matter?
Here is why I am asking:
CARD               CORE CLOCK
Nvidia 8800 GTX    576MHz
Nvidia 8800 GTS    500MHz
Nvidia 7900 GTX    650MHz
Nvidia 7950 GT     550MHz
ATI HD2900 XT      743MHz
ATI X1950 XTX      650MHz
ATI X1950 XT       648MHz
ATI X1900 XT       621MHz
If the top cards of today are in the 500-600MHz range, and the cards that aren't the top performance cards of the day are 600MHz+, why does core clock speed necessarily matter?
If what I am seeing is correct, then the architecture the card is built on is more important than the core clock speed.
IF I AM WRONG, PLEASE INFORM ME ON HOW THIS ALL WORKS.
(Information acquired from the VGA charts.)
When comparing cards that use the same architecture/chip, you can compare using the clock speed. However, you cannot compare clock speeds of two different chips; since they are different, one might outperform the other even at a lower clock. For example, a 3GHz Athlon X2 is about as fast as a 2.4GHz Core 2 Duo.
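The point above can be sketched in a couple of lines: effective speed is clock times work-per-clock (often called IPC), so a slower-clocked chip with more work per clock can match a faster-clocked one. The IPC figures below are made-up values chosen only so the Athlon X2 vs Core 2 example works out; they are not measured numbers:

```python
# Effective speed = clock * work-per-clock (IPC).
# IPC values here are illustrative assumptions, picked so that the post's
# example holds: a 2.4GHz Core 2 Duo roughly matches a 3.0GHz Athlon X2.
def effective_speed(clock_ghz, ipc):
    return clock_ghz * ipc

athlon = effective_speed(3.0, 1.0)    # baseline IPC
core2  = effective_speed(2.4, 1.25)   # ~25% more work per clock

# Both land on ~3.0 "speed units" despite very different clocks,
# which is why clock speed alone can't compare different architectures.
print(athlon, core2)
```

The same logic applies to GPUs: a 743MHz chip with fewer texture units can be slower in practice than a 576MHz chip with more of them.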
Well, what exactly does the HD2900 do wrong that makes it worse than the 8800GTX? If it's leakage problems, as someone said before, couldn't a revision and die shrink fix that? Or if it's driver problems, revised drivers could fix that? It can't be straight hardware limitations, can it? Maybe there's something I'm not seeing here?
Look at the number of texture units and ROPs.
Yeah, I see what you mean: the HD2900 is severely hampered by its lack of texture units/ROPs compared to the GTX's beastly 32/24 respectively.
Also, even with a quick overclock to 825 it still doesn't beat the GTX in shading and texture fill rate. Huge memory bandwidth, though... wonder if that might help? If they're built almost exactly the same, it's not looking good that it'll beat the GTX. Oh well, there's always the 9800GTX.
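For a rough sense of the gap being discussed, theoretical fill rate is just core clock times unit count. The clocks and unit counts below are the commonly cited figures for these cards (treat them as approximate); the point is that a clock bump alone can't make up for having half the texture units:

```python
# Theoretical fill rates: core clock (GHz) times unit count,
# giving GTexels/s (TMUs) and GPixels/s (ROPs).
# Specs are the commonly cited figures for these cards; treat as approximate.
def fill_rates(clock_mhz, tmus, rops):
    ghz = clock_mhz / 1000
    return ghz * tmus, ghz * rops  # (texel fill, pixel fill)

gtx_tex, gtx_pix = fill_rates(575, 32, 24)   # 8800GTX
xt_tex,  xt_pix  = fill_rates(743, 16, 16)   # HD2900XT at stock
oc_tex,  oc_pix  = fill_rates(825, 16, 16)   # HD2900XT overclocked to 825MHz

print(f"8800GTX:       {gtx_tex:.1f} GT/s, {gtx_pix:.1f} GP/s")
print(f"HD2900XT:      {xt_tex:.1f} GT/s, {xt_pix:.1f} GP/s")
print(f"HD2900XT @825: {oc_tex:.1f} GT/s, {oc_pix:.1f} GP/s")
```

Even at 825MHz the 16-TMU part sits around 13.2 GT/s against the GTX's roughly 18.4 GT/s, which is the fill-rate wall the posts above keep pointing at.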
The RV670 will be coming as the 2900Pro and will be about the same performance as the 2900XT.
There will also be an XT version, which will be faster.
In Q1 2008 there will be the R680, which will probably be a dual-GPU card with twice the performance of the 2900XT!
All of them will have DX10.1, 55nm, and UVD, all of which Nvidia lacks at the moment and in the near future.
According to the latest news, the G92, aka 8800GT, will not include DX10.1.
Personally, I don't see any real performance gains until Q1 2008, so I just ordered a 2900Pro and will clock the hell out of it until then.
The Radeon HD 2950X2X is the card to get with your next system.
I'm building a new system now and have everything but the psu, cpu, video card, memory, and motherboard.
(My last system was a Socket 939 AMD 64 4000+ with an AGP 9700 Pro.)
Looking to pick up the parts below for the build; it will be a killer system.
ATI Radeon HD 2950X2X (2x crossfire)
However, not sure if DDR2 or DDR3 version. Will have to wait and see if the DDR3 ram prices go down.
A PSU that can support Crossfire might need to be 1000W or so.
The 2950X2X is listed as Q1 next year, any chance that would be January?
What you guys also need to see is that the HD2950 is going to be DirectX 10.1, which, as far as anyone knows, could really offer improvements. On top of that, it's going to use the PCI-E 2.0 bus, which from what I last read was going to be a good improvement. So, as long as there aren't really any kinks in the DX10.1 API and the PCI-E 2.0 bus, I think MAYBE the HD2950 will do better, but it's a high hope... as I stated before.
Probably not January, and probably not going to be Q1.
Yeah, I kind of doubt it would be Q1 too. But you never know.
Would be awesome if they go on sale around the same time as the Intel chips in January though.
Also, here is some great info on the new ATI HD2950/2950X2X video cards.