AMD releases final R600 (X2900) specs

grifter33

Distinguished
Dec 16, 2006
116
0
18,680
Six weeks from now, the world will get the first retail Radeon X2900 XTX

Late yesterday DailyTech was briefed on the final details for the upcoming R600 retail specifications, just in time for everyone to go on vacation for Chinese New Year.

AMD's guidance claims R600 will feature 700 million transistors. By comparison, the Radeon X1900 series R580 GPU incorporated 384 million transistors into its design; the half-generation before that, R520, only featured 320 million.

As disclosed by DailyTech earlier this year, the GPU features a full 512-bit memory interface with support for GDDR3 and GDDR4. R580 was also similar in this regard as it supported GDDR3 and GDDR4.
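For a sense of what a 512-bit bus buys, peak memory bandwidth is just the bus width (in bytes) times the effective memory data rate. A minimal sketch in Python, using a purely hypothetical GDDR4 clock, since the article notes final frequencies were not yet public:

```python
# Peak memory bandwidth = (bus width in bytes) x (effective data rate).
# The clock below is a hypothetical placeholder, NOT a confirmed R600 spec.
bus_width_bits = 512
effective_clock_mhz = 2000          # hypothetical GDDR4 effective rate

bandwidth_gbs = (bus_width_bits / 8) * effective_clock_mhz / 1000
print(f"{bandwidth_gbs:.0f} GB/s")  # 128 GB/s with these assumptions
```

With these assumed numbers, a 512-bit bus would deliver exactly twice the bandwidth of a 256-bit bus at the same memory clock, which is the whole point of the wider interface.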

On March 30, 2007, AMD will initially debut the R600 as the ATI Radeon X2900 XTX in two separate configurations: one for OEMs and another for retail. The OEM version is the full length 12" card that will appear in high-end systems.

ATI guidance claims the X2900 XTX retail card comes as a two-slot, 9.5" design with a vapor chamber cooler. Vapor chambers are already found on high-end CPU coolers, so it would be no surprise to see such cooling on a high-end GPU as well. The OEM version of the card is a 12" layout and features a quiet fan cooler.

1GB of GDDR4 memory is the reference configuration for Radeon X2900 XTX. Memory on the reference X2900 XTX cards was supplied by Samsung.

Approximately one month later, the company will launch the GDDR3 version of the card. This card, dubbed the Radeon X2900 XT, features 512MB of GDDR3 and lower clock frequencies than the X2900 XTX. The X2900 XT is also one of the first Radeons to feature heatpipes on the reference design.

AMD anticipates the target driver for X2900 XT to be Catalyst 8.36. A WHQL release of the X2900 XTX driver will appear around the Ides of March.

Radeon X2900 will feature native CrossFire support via an internal bridge interface -- there is no longer a need for the external cable found on the Radeon X1000 series CrossFire. There is no Master card, as was the case with other high-end CrossFire setups. Any Radeon X2900 can act as the Master card.

A much anticipated feature, native HDMI, will appear on all three versions of Radeon X2900.

One 6-pin and one 8-pin (2x4) VGA power connector are featured on Radeon X2900, but both connectors are also backwards compatible with 6-pin power supply cables.

AMD claims the R600 target schedule will be a hard launch -- availability is expected to be immediate. Board partners will be able to demonstrate R600 at CeBIT 2007 (March 15 - 21), but the only available cards will be reference designs.

Why was there such discrepancy with the board layouts and designs up until now? An ATI insider, who wished to remain nameless, states: "The original Quad-Stealth design is what we built the R600 on: GDDR4, full-length and dual-slot cooling. As the silicon was further revised, [ATI] took up several alternative designs which eventually included GDDR3 and heatpipes in the specification. The release cards demonstrate the versatility of R600 in each of these unique setups."

Final clock frequencies will likely remain estimates until later this month.

original link
 

sirheck

Splendid
Feb 24, 2006
4,659
0
22,810
ATI guidance claims the X2900 XTX retail card comes as a two-slot, 9.5" design with a vapor chamber cooler. Vapor chambers are already found on high-end CPU coolers, so it would be no surprise to see such cooling on a high-end GPU either. The OEM version of the card is a 12" layout and features a quiet fan cooler.

ah yes, this is what i have been trying to tell the fools who thought
the r600 was a big card.
 

ADM-86

Distinguished
Sep 11, 2006
164
0
18,680
:p let the video card war begin! :twisted: I just hope that we can see some great prices from this competition.
 

King-Of-Kings

Distinguished
Feb 12, 2007
55
0
18,630
My buddy, who owns an 8800 as well, was basically tripping out because he just bought the card a month ago and it will be trumped by another card already.

I don't see it that way. My GTS will still play everything on max smoothly, so what if there's a newer, better card. The 8800's are still a beast. It's just getting crazy with these cards now. I don't think the actual technology in games can catch up to the graphics processing power on these cards. 8O
 

sirheck

Splendid
Feb 24, 2006
4,659
0
22,810
My buddy, who owns an 8800 as well, was basically tripping out because he just bought the card a month ago and it will be trumped by another card already.

I don't see it that way. My GTS will still play everything on max smoothly, so what if there's a newer, better card. The 8800's are still a beast. It's just getting crazy with these cards now. I don't think the actual technology in games can catch up to the graphics processing power on these cards. 8O
there is always something better.
and most of the games are miles ahead of the videocards.
they always are.
 

sirheck

Splendid
Feb 24, 2006
4,659
0
22,810
You mean the videocards are miles ahead of the games. :wink:

what videocard or cpu can play f.s.x. at max res.?
i just went to a 1680x1050 lcd monitor from a 10x7 monitor.

and am using a 68gt. i thought fsx was tough on my 10x7
but it and oblivion kick the $hit out of my 68gt on the
new monitor :lol:
 

SEALBoy

Distinguished
Aug 17, 2006
1,303
0
19,290
You said MOST games. FSX is the ONE game that will make a top-of-the-line system beg for mercy. All the rest (including Oblivion) are playable at full detail on highend systems. So yes, videocards are miles ahead of most games.
 

tamalero

Distinguished
Oct 25, 2006
1,134
140
19,470
You mean the videocards are miles ahead of the games. :wink:

what videocard or cpu can play f.s.x. at max res.?
i just went to a 1680x1050 lcd monitor from a 10x7 monitor.

and am using a 68gt. i thought fsx was tough on my 10x7
but it and oblivion kick the $hit out of my 68gt on the
new monitor :lol:


what's "F S X"?
 

SEALBoy

Distinguished
Aug 17, 2006
1,303
0
19,290
"F S X" is the only thing an X6800 with SLI 8800GTX's is scared of.

@sir heck: Well, you more than doubled the number of pixels, so you shouldn't be that surprised.
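The "more than doubled" claim checks out; a quick sketch of the arithmetic behind it (resolutions taken from the posts above):

```python
# Pixel counts for the two resolutions mentioned above.
old_res = (1024, 768)    # the old "10x7" monitor
new_res = (1680, 1050)   # the new widescreen LCD

old_pixels = old_res[0] * old_res[1]   # 786,432
new_pixels = new_res[0] * new_res[1]   # 1,764,000

ratio = new_pixels / old_pixels        # about 2.24x
print(f"{old_pixels} -> {new_pixels} pixels ({ratio:.2f}x)")
```

So the GPU is pushing roughly 2.24 times as many pixels per frame, which is why framerates fall off so sharply after the monitor upgrade.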
 

kaotao

Distinguished
Apr 26, 2006
1,740
0
19,780
Mr. Heck's got a point. Up until the 8800 series, oblivion was a bitch. And I still haven't seen any benchies for fear with soft shadows enabled that show any good FPS (everything else maxed out of course).
 

tamalero

Distinguished
Oct 25, 2006
1,134
140
19,470
flight sim x, the newest one.

the older ones were more cpu bound but the new one is everything bound :lol:
we know it always requires microsoft expertise in doing amazing and monstrous sized code to slow down the most powerful pc! just like every version of windows XD
 

prozac26

Distinguished
May 9, 2005
2,808
0
20,780
Mr. Heck's got a point. Up until the 8800 series, oblivion was a bitch. And I still haven't seen any benchies for fear with soft shadows enabled that show any good FPS (everything else maxed out of course).

What heck said was:
and most of the games are miles ahead of the videocards.
Two games don't make up "most" of the market.

But yes, there is some truth to his original statement.
 

sirheck

Splendid
Feb 24, 2006
4,659
0
22,810
"F S X" is the only thing an X6800 with SLI 8800GTX's is scared of.

@sir heck: Well, you more than doubled the number of pixels, so you shouldn't be that surprised.

no, not too surprised. bf2 and fear still play good but oblivion and fsx
are still kinda slow and jerky.

i like high settings though.
 

sirheck

Splendid
Feb 24, 2006
4,659
0
22,810
i was getting 140 fps in f.e.a.r. with one 68gt at 10x7
now i get 70 fps at 16x10 8O .

I find that hard to believe. Maybe 800x600 on low-med settings.

that was at 10x7 and max fps.
avg was around 70 to 80.

now it's a max of 70 and avg of 40 :cry:

i have played at 19x12 but didn't ever check fps.
this was on my 32inch hdtv and i rarely hook my comp
up to it though.
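Those numbers are roughly what you'd expect if the game is fill-rate limited, since fps then scales inversely with pixel count. A rough sketch under that assumption (which ignores CPU limits, so real results usually land a bit higher):

```python
def scaled_fps(fps, old_res, new_res):
    """Estimate fps at a new resolution, assuming a purely
    fill-rate-limited game (fps inversely proportional to pixel count)."""
    old_pixels = old_res[0] * old_res[1]
    new_pixels = new_res[0] * new_res[1]
    return fps * old_pixels / new_pixels

# 140 fps max at 1024x768 would predict roughly this at 1680x1050:
print(round(scaled_fps(140, (1024, 768), (1680, 1050))))  # -> 62
```

The prediction of ~62 fps sits close to the observed max of 70, and the gap is consistent with the game being partly CPU-bound rather than purely fill-rate-bound.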
 

Trunkz_Jr

Distinguished
Feb 1, 2007
332
0
18,780
Really? I always thought games are ahead of videocards. I mean they could make the best graphics, but they have to dim them down so that people would actually be able to play them, or else nobody would buy.
 

tamalero

Distinguished
Oct 25, 2006
1,134
140
19,470
Really? I always thought games are ahead of videocards. I mean they could make the best of graphics but they have to dim them down so that people would actually be able to play them or else nobody would buy.

considering most games are now being programmed for consoles and then HORRIBLY ported to pcs, id say NO

remember, it's easier to ask the user to have a super beefy computer than to hire a lot of programmers to port correctly. take the example of Ubisoft, they only port.. and make horrible ports :p