Yes, it's codenamed the RV670, and the time frame is anything from July to December, with August/September being more likely IMO.

Main rumour was 65nm with 256-bit memory, but there are other rumours out there that it's 55nm.

The 65nm makes sense for an earlier product; the 55nm would make sense for a product later in 2007.
 

shargrath

Yes, there is an HD 2900 Pro. It's already out.

It's called the 8800 GTS 320.
Go away, troll fanboy. Are you enjoying your FX5200? You should see his posts on the GPUreview forums; he got banned for being a massive troll who seldom adds value to the forum. He's the 9-inch and BM of the GPU world, only smarter. (I'm sleepy and just lashing out because I feel tired and pissed off.)
 
Have any analysts predicted what level of performance the HD2900XT will offer? And how many stream shaders and how much onboard RAM it will have?

I would think, logically, the HD2900XT would offer performance equal to or slightly less than a 320MB GeForce 8800.

I am kind of interested in the 2900XT, but I am questioning why I should buy it when it comes out in September when Nvidia will have their GeForce 9800 out two months later in November.

And when the 9800 is released, prices on the 768MB 8800GTX will probably get slashed and you could probably purchase one for $300.

I love ATI. I think they have a lot better color quality than Nvidia, and their video is brighter and more vibrant. But I think ATI is really behind the 8-ball right now. They are almost a generation behind Nvidia.

By the time ATI catches up to Nvidia by releasing their entire 2900 line of cards, Nvidia will have released their next-generation flagship card, the 9800.

And just like last year, Nvidia will rake in all of the money from Christmas sales because they will have no competition from ATI. To the best of my knowledge, ATI has made no statements about a 2950XT or a 3900XT card planned for Christmas.

I hate to say it, but just due to lower power consumption and heat concerns, I think my next video card may be made by Nvidia.
 
http://www.vr-zone.com/?i=5078

I'm pretty sure it's right.

What? Am I missing something?

Yeah, I didn't get it either; I think he pasted the wrong link.

Anyways, he probably wanted to link to VR-Zone's mention of the 65nm solutions that other sites have mentioned, but then there are also others like OCWorkBench that have the cards as 55nm solutions.
 
Have any analysts predicted what level of performance the HD2900XT will offer?

I think throughout your post you meant to write HD2900PRO instead of XT, so we'll go on that assumption for my answer.

And how many stream shaders and how much onboard RAM it will have?

No one knows, but there are a lot of guesses. So far the educated guesses put it at the halfway point between the XTs and the HD2600 series, so around 240 shaders. Memory size would likely be variable like now, and the logical options would be 256MB and 512MB.

I would think, logically, the HD2900XT would offer performance equal to or slightly less than a 320MB GeForce 8800.

If they beef up the ratio of TUs and ROPs to make the back end equal to the HD2900XT's, then that's a good possibility.

And when the 9800 is released, prices on the 768MB 8800GTX will probably get slashed and you could probably purchase one for $300.

Probably not. Expect production of the GTX to stop well in advance, so that removing them from the market is the marketing option. Of course, if you don't mind second hand, by then you may see some GTXs reach that level on eBay, but even that is optimistic IMO.

I think they have a lot better color quality than Nvidia, and their video is brighter and more vibrant. But I think ATI is really behind the 8-ball right now. They are almost a generation behind Nvidia.

I don't agree with either of those statements. Would you mind explaining the first one? It doesn't match the changes nV made to address their weakness in that area.

By the time ATI catches up to Nvidia by releasing their entire 2900 line of cards, Nvidia will have released their next-generation flagship card, the 9800.

The HD2900Pro has nothing to do with the GF9800, different products will compete in that market segment.

And just like last year, Nvidia will rake in all of the money from Christmas sales because they will have no competition from ATI.

Except that they DIDN'T rake in money from Christmas sales. They had a new product and sales went down both year over year and compared to the previous quarter. So that theory was what many expected, but it never materialized.

I hate to say it, but just due to lower power consumption and heat concerns, I think my next video card may be made by Nvidia.

Well, I wouldn't pre-order anything yet, since both would be complete redesigns on new fabs. And since you know so little about the PRO, I doubt you have any worthwhile information for determining what's what among the next high-end cards.
 
I think throughout your post you meant to write HD2900PRO instead of XT, so we'll go on that assumption for my answer.

Yes, you are correct. I did make a mistake and out of habit put down the HD2900XT instead of HD2900PRO.

Except that they DIDN'T rake in money from Christmas sales. They had a new product and sales went down both year over year and compared to the previous quarter. So that theory was what many expected, but it never materialized.

Yes, it did run over into the next year, but I think coming out before Christmas was the catalyst that got it off to such a great start. Yes, being first to market was HUGE, but I think the timing of the release just helped them that much more.

ATI missed out on 4 opportunities:
1. First to Market
2. The Christmas Season
3. The Post Christmas Sales Rush
4. Tax Refund Season.

4 HUGE sales opportunities, and they blew all 4 of them. I wonder how much money that cost them.

The fact that Nvidia capitalized on all 4 of those factors is what got sales jump-started and then kept them going. I think what is driving sales now is the reputation of the 8800 by online reviews and word of mouth and the fact that the drivers have been greatly improved.

Plus, some people may not be willing to invest in a stronger power supply or better cooling for their case just to buy a 2900XT.
 

tehrobzorz

Yes, actually I was trying to get to a graph which (I believe) is a fairly reasonable roadmap for ATI.

If I could just add a few words:

I do not believe that ATi will surge the market with the 2900Pro, especially at 55nm. I think that the 2600XT was a failure and a total misrepresentation of that midrange "performer". I've heard that it performs less than or equal to an 8600GT? (Feel free to correct me.) And that the 8600GT is in no way better than the old 7900GS, which is the current "6600GT" of its lifetime.

Now, I think where the 2900Pro really needs to excel and take back what BOTH ATi and nVidia failed at is the midrange performance segment. Over here (I don't know about the US) the 8600GTS sells just touching the $300 mark, making it impossible for those who want to play well... to play well without stepping up to the 8800GTS 320 mark ($400). Remember the 7900 series? I think they were by far the best value for money at the time (except for the GTX, as usual). Nowadays we can't expect our "performer" cards to do what they used to do best. Many of you will probably say that they are aimed at different market segments, but there are only a few segments: the enthusiast (AUD $500+), the performance/overclocker (AUD $300-$400), and the budget (AUD $200).

The 2900Pro needs to:

be equal to or more or less better than the 79x0GT/GS,
at around AUD $350,
and also be multi-GPU friendly, without too much heat and power.

I should also let you know that many of these should come on PCI-E 2.0, which would fool most of us ("Hey, only one power connector!! ^^"). No: I've also read that the new standard (most likely) will not only provide more bandwidth, but also double the power output, leading to that misconception.

PCIe 1.x = 75W
PCIe 2.0 = 150W

And I think the 6-pin PCIe power connector provides an additional 75W or something? Feel free to correct me. So yeah, these things could be power hungry. I'm not saying these cards will use 225W, but you never know.
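Here's a rough sketch of that arithmetic, just using the wattage figures quoted above (rumoured/rounded numbers, not confirmed spec values):

```python
# Toy power-budget check using the figures quoted in the post above;
# these are rumoured/rounded numbers, not confirmed PCIe spec values.

PCIE1_SLOT_W = 75    # claimed power available from a PCIe 1.x x16 slot
PCIE2_SLOT_W = 150   # claimed power available from a PCIe 2.0 x16 slot (per the rumour)
SIX_PIN_W = 75       # commonly quoted rating of one 6-pin PCIe power connector

def max_board_power(slot_watts: int, six_pin_connectors: int) -> int:
    """Upper bound on what a card could draw from the slot plus its aux connectors."""
    return slot_watts + six_pin_connectors * SIX_PIN_W

print(max_board_power(PCIE1_SLOT_W, 2))  # 225 W: PCIe 1.x slot plus two 6-pin plugs
print(max_board_power(PCIE2_SLOT_W, 1))  # 225 W: the "hey, only one power connector!" case
```

Either way, the 225W figure is just where those numbers top out, not a claim about what the cards will actually draw.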


I'm going to bed, cya guys. Please let me know which parts of this may be incorrect or misleading.


2 GB OCZ FTW ^^
 
Quote:
I think they have a lot better color quality than Nvidia, and their video is brighter and more vibrant. But I think ATI is really behind the 8-ball right now. They are almost a generation behind Nvidia.



I don't agree with either of those statements. Would you mind explaining the first one? It doesn't match the changes nV made to address their weakness in that area.

When did Nvidia upgrade their video quality to match ATI's? With the 8800 Series? I have not seen a computer in person with an 8800 card in it, but I have seen a computer with a 7-series card, and the picture is not nearly as good as ATI's. I still do not think Nvidia is on par with ATI's video quality. I have not seen an Nvidia card yet that has more pixel shaders than its competing ATI card.

I can see a very noticeable difference between the PS3 (Nvidia-based) and XBOX 360 (ATI-based R500). The PS3 graphics seem darker, and the colors do not look nearly as full and as bright as the XBOX 360's.
 
Yes, it did run over into the next year, but I think coming out before Christmas was the catalyst that got it off to such a great start.

I wouldn't say it was a 'great start', since it was less than both the GF6800 and GF7800 launches. The GF8800 launched into a lot of uncertainty about the new cards, DX10 itself, and Vista, which didn't improve until well into 2007.

4 HUGE sales opportunities, and they blew all 4 of them. I wonder how much money that cost them.

None of them would've mattered had they come out with a competitive part. And all of those 4 opportunities (especially the last 2) will pale in comparison to the effect of games like UT3 and Crysis. The main battle now will be the low and mid-range; that's where the money will be won or lost.

I think what is driving sales now is the reputation of the 8800 by online reviews and word of mouth and the fact that the drivers have been greatly improved.

That, and more importantly the low prices of the GTSs. Pretty much everybody holding their breath for the G80 vs R600 showdown waited not only for the HD2900, but also for the price drops that came with it. Many people were hoping for the GTX to come down, but most were waiting for cheaper GTS cards (many were disappointed the GTS-320 didn't move; some of those have now decided to wait for the HD2900Pro and its nV equivalent).

Plus, some people may not be willing to invest in a stronger power supply or better cooling for their case just to buy a 2900XT.

Why would you need better cooling? The HD2900XT exhausts all its heat out of the case; it's the GF8800GTX and GTS that exhaust a fraction of theirs back into the case, and the GTS and GTX can be just as warm.

As for the power supply, only someone on the edge of the power envelope would be forced to pick the GF8800 over the HD2900 out of power concerns. Maybe there are some PSUs out there with dual 6-pin power connectors that wouldn't handle the load, but those would be pretty crap PSUs to begin with. Then again, there are Ultra PSU owners out there, so there's no accounting for quality with some people.
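To put the "edge of the power envelope" point in perspective, here's a toy headroom check; the card and system wattages below are made-up placeholder figures for illustration, not measured draws or official TDPs:

```python
# Toy 12V headroom check; all wattages below are placeholder assumptions
# for illustration, not measured draws or official TDPs.

def psu_headroom(psu_12v_watts: float, system_draw: float, card_draw: float) -> float:
    """Watts left on the 12V rail(s) after the rest of the system and the card."""
    return psu_12v_watts - (system_draw + card_draw)

SYSTEM_12V_DRAW = 150.0  # assumed CPU/board/drives load on 12V

for card, draw in [("hypothetical GF8800-class card", 150.0),
                   ("hypothetical HD2900-class card", 215.0)]:
    margin = psu_headroom(psu_12v_watts=400.0, system_draw=SYSTEM_12V_DRAW, card_draw=draw)
    print(f"{card}: {margin:+.0f} W of headroom")
```

With a decent PSU both cases leave margin; it's only when the 12V capacity is already marginal that the difference between the two cards decides anything.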

When did Nvidia upgrade their video quality to match ATI's? With the 8800 Series?

Yes, and it's a pretty well-known benefit of the new GF8s.

I still do not think Nvidia is on par with ATI's video quality. I have not seen an Nvidia card yet that has more pixel shaders than its competing ATI card.

Pixel shader count doesn't have anything to do with video quality; it has to do with performance. And while that's your opinion on their quality, like the other statement I don't agree with it, and neither do most reviewers. Considering you haven't seen a GF8800, it's understandable you would still be going on old information, just like those who have driver misconceptions based on their experiences from 4-5 years ago.

I can see a very noticeable difference between the PS3 (Nvidia-based) and XBOX 360 (ATI-based R500). The PS3 graphics seem darker, and the colors do not look nearly as full and as bright as the XBOX 360's.

Two totally different systems. The RSX isn't like the GF8800; it's like a GF7900GT on a 128-bit memory bus. And the Xenos isn't even like the HD2900, since it has a lot of very different design points.
I haven't played the same game on both machines, but I did like both Gears of War and Resistance; no major anomalies or issues (hey, Gears can be a dark game). The difference is the general tendency for ATi/AMD cards to render slightly lighter/brighter than nV, but it doesn't always look better, and it's not always the case; see the PS3 vs X360 comparison for COD.

BTW, I prefer the Wii out of all 3.
 
I think that the 2600XT was a failure and a total misrepresentation of that midrange "performer".

It's not even launched yet. :|

And that the 8600GT is in no way better than the old 7900GS, which is the current "6600GT" of its lifetime.

Nah, the current GF6600GT is the GF7600GT, and the current GF6800GT is the X1950Pro; those are both the killers that hurt the GF8600/HD2600. Either people already own a GF7600GT and see little benefit in moving to the GF8600 (and likely the same for the HD2600), or, if they do look for an upgrade, the X1950Pro and X1950XT are all over the performance/$ range.

Now, I think where the 2900Pro really needs to excel and take back what BOTH ATi and nVidia failed at is the midrange performance segment.

Yeah, I agree. I think the GF8600 and HD2600 will both fail to repeat the success of the GF7600GT and instead simply mirror the undershoot of the X1600, and then the HD2900Pro and GF8800GS/GF8600U etc. will hopefully be the ones to the rescue. Because while in places like the US the GTS-320 is a wicked deal, it's still hot and power hungry, and even expensive for nV to make compared to what a midrange part should be.

The 2900Pro needs to:

be equal to or more or less better than the 79x0GT/GS,
at around AUD $350

Which would easily be possible with 48 shader groups and a strong back end, same with a GF8800GS. Not sure about pricing though; that's for marketing to decide later.

And I think the 6-pin PCIe power connector provides an additional 75W or something? Feel free to correct me. So yeah, these things could be power hungry. I'm not saying these cards will use 225W, but you never know.

Where are you getting a power connector from? There aren't even pictures of these cards yet. They may or may not be PCIe 2.0 parts; it depends a lot on their place in the dev cycle. But like the rumour about the R600, it would make sense to have the PCIe 6-pin as an optional power source for legacy PCIe 1.1 slots; requiring PCIe 2.0 only at this point would be pointless.
These are going to be sub-500-million-transistor parts on 65nm; they should consume less power than the GF8800GTS, and run cooler as well.

Now, the G92 and R670/680 should be more efficient per transistor than both the G80 and R600, but we still won't know if they consume more or less power until the final clocks are picked. The G92 is supposed to be close to 1 billion transistors, so that's a lot, but the 65nm process will help; the question remains how much. And the 65nm R5xx is still a big question mark too.
 
And yet the 360 in general is bad compared to a PS3.

What do you base that on?

Most reviewers are favouring the X360 over the PS3; this one has screenies to show the difference:

http://www.gamespot.com/features/6162742/p-2.html

The X360 comes out on top; both have their issues and benefits, but the X360 comes off as the winner.


I am glad you made that post. That was the example I was looking for: the comparison screenshots. I think some of the PS3 screenshots are more detailed, but I think that can be attributed to the PS3's Cell processor and its Blu-ray disc storage capacity. But outside of that, if you just look purely at color, I think the XBOX 360 looks better.

Other Thoughts:
I wonder when the XBOX 720, or whatever they are going to call it, is going to come out. I wonder if it will have an R800 or R900 chip in it, or if they will go back to Nvidia like they did with the first Xbox.
 

tehrobzorz

There was a graph on VR-Zone that said the new R630 and R610 will be PCIe Gen 2, but I'm guessing that was wrong?

And also, I'm saying there will be a PCIe power connector because performance cards have never not had one... (I think.)


anyways.. yeah! 8900gt ftw xD
 

Hatman

Since Nvidia still has the GTX and Ultra at the top of the market performance-wise, I doubt they would release an 8900 or 9800 until ATI had something that could beat them.
 

bullaRh

Yes, there is an HD 2900 Pro. It's already out.

It's called the 8800 GTS 320.
Go away, troll fanboy. Are you enjoying your FX5200? You should see his posts on the GPUreview forums; he got banned for being a massive troll who seldom adds value to the forum. He's the 9-inch and BM of the GPU world, only smarter. (I'm sleepy and just lashing out because I feel tired and pissed off.)

It doesn't really surprise me; he has been close to a ban in here also.
 

bullaRh

When did Nvidia upgrade their video quality to match ATI's? With the 8800 Series? I have not seen a computer in person with an 8800 card in it, but I have seen a computer with a 7-series card, and the picture is not nearly as good as ATI's. I still do not think Nvidia is on par with ATI's video quality. I have not seen an Nvidia card yet that has more pixel shaders than its competing ATI card.

Actually, the 8800 series has better video quality than ATI's cards at the moment.

8800 series vs x1950xtx

http://static.hardwareluxx.de/hardware/nemig/artikel/G80/G80_AF_8x_hq_hq.jpg

http://static.hardwareluxx.de/hardware/nemig/artikel/G80/G80_AF_16x_hq_hq.jpg

and 8800 series vs 7900gtx

http://static.hardwareluxx.de/hardware/nemig/artikel/G80/G80_AFGTX_8x_hq_hq.jpg

http://static.hardwareluxx.de/hardware/nemig/artikel/G80/G80_AFGTX_16x_hq_hq.jpg

''GeForce 8 performs significantly better texture filtering than its predecessors that used various optimizations and visual tricks to speed up rendering while impairing filtering quality. The GeForce 8 line correctly renders an angle-independent anisotropic filtering algorithm along with full trilinear texture filtering. G80, though not its smaller brethren, is equipped with much more texture filtering arithmetic ability than the GeForce 7 series. This allows high-quality filtering with a much smaller performance hit than previously.''
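For anyone wondering what "angle-independent anisotropic filtering" actually means in that quote, here's a simplified, GPU-agnostic sketch (not the G80's real hardware algorithm), roughly following the textbook approximation used by the generic OpenGL anisotropic-filtering extension:

```python
import math

# Simplified illustration of the anisotropic filtering maths (not G80 hardware).
# The pixel's footprint in texel space is described by the derivatives of the
# texture coordinates; the anisotropy is the ratio of its long and short axes,
# and an "angle-independent" implementation keeps that ratio regardless of how
# the surface happens to lean on screen.

def anisotropy(dudx, dvdx, dudy, dvdy, max_aniso=16.0):
    """Return (sample_ratio, lod) for one pixel; derivatives given in texels."""
    p_x = math.hypot(dudx, dvdx)              # footprint extent along screen x
    p_y = math.hypot(dudy, dvdy)              # footprint extent along screen y
    p_max, p_min = max(p_x, p_y), max(min(p_x, p_y), 1e-8)
    ratio = min(p_max / p_min, max_aniso)     # how many extra samples to take
    lod = math.log2(p_max / ratio)            # mip level picked from the short axis
    return ratio, lod

# A floor texture seen at a grazing angle: stretched footprint -> 8x anisotropy, LOD 0.
print(anisotropy(dudx=8.0, dvdx=0.0, dudy=0.0, dvdy=1.0))
```

This is only the textbook approximation; the point is just that the sample count scales with how elongated the footprint is, rather than with which screen direction it leans, which is the behaviour the review text above is describing.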
 
Since Nvidia still has the GTX and Ultra at the top of the market performance-wise, I doubt they would release an 8900 or 9800 until ATI had something that could beat them.

I think you're forgetting a very important thing: nVidia wants to reduce the cost per chip of the G80 while still charging people high prices, so a 65nm version of the GTX is still very attractive for them. So I suspect they will have a GF8900 by the end of the year regardless of what ATi/AMD brings, and they will bring it to market once they have great yields.

The only thing that would speed that along would be competition; however, they won't sit back too long, as they don't want to get caught wrong-footed like they did with the X1900 and waste the advantage they had.
 

hay00c2

I know this is an old thread, but I have some info on the HD2900Pro I just found on HIS' website: http://www.hisdigital.com/html/product_sp.php?id=341

Model Name: HIS HD 2900Pro 512MB GDDR3 VIVO PCIe
Chipset: Radeon HD 2900 PCIe Series
ASIC: Radeon™ HD 2900Pro GPU
Pixel Pipelines: 320 stream processing units* (Unified)
Vertex Engines: 320 stream processing units* (Unified)
Manu. Process (Micron): 80nm
Transistor: (not listed)
Fill Rate: (not listed)
Memory Size (MB): 512
Memory Type: GDDR3
RAMDAC (MHz): 400
Engine CLK (MHz): 600 (2900XT: 740)
Memory CLK (MHz): 1600 (2900XT: 1650)
Memory Interface: 512-bit
Memory Bandwidth: (not listed)
Max. Resolution: 2560*1600
Bus Interface: PCI-Express x16
VGA: No
2nd VGA: No
DVI: Yes
2nd DVI: Yes
TV-out: Yes
HDTV (YPrPb component output): No
Video-in: Yes
TV Tuner: No
FM Tuner: No

So, based on those numbers and the fact that the only two differences are the ones I've flagged against the 2900XT, how different will it be from the 2900XT, considering Ebuyer has two Sapphire versions: one with 512MB of GDDR3 memory for £158 (http://www.ebuyer.com/UK/product/132550) and one with 1GB of GDDR4 memory for £191 (http://www.ebuyer.com/UK/product/132551)?
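Taking that sheet at face value, the blank "Memory Bandwidth" field can be derived from the listed bus width and memory clock (assuming the MHz figures are effective data rates and the 512-bit bus isn't a typo). A quick sketch comparing the Pro numbers above with the XT figures noted in the same listing:

```python
# Derive the blank "Memory Bandwidth" field from the HIS listing above,
# assuming the quoted MHz figures are effective data rates and the 512-bit
# bus width is not a typo.

def bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: int) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz / 1000.0

print(bandwidth_gb_s(512, 1600))  # 2900Pro per the sheet: 102.4 GB/s
print(bandwidth_gb_s(512, 1650))  # 2900XT per the same sheet: 105.6 GB/s
```

On those numbers the Pro and XT would be within a few percent of each other on bandwidth, so the 600 vs 740MHz engine clock would be the main difference.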
 
That description says 512-bit memory, so based on HIS' description of the XTs, that would mean it's nothing like the HD2900Pro people were talking about when this thread started, and nothing like the one they're talking about nowadays (either 320 SPUs with 256-bit memory like the FireGL V7600, or 160 SPUs with 256-bit memory as the current talk goes).

So to answer your question, we won't know for sure how different they will be until we know what's fact and what's a typo.