PHOTOS of AMD/ATI R600 Video card

Last response: in Graphics & Displays
March 1, 2007 3:03:26 PM

Here you go, what everyone's been waiting for, and it's supposed to wipe the 8800GTX off the map. They look like they're close to 12" long, lol. Here's the link: http://content.zdnet.com/2346-10741_22-57089.html
These cards are monsters!!! And everyone thought the 8800GTX was huge.
March 1, 2007 3:32:19 PM

Pics of it have been posted already.
These are the OEM version, which is 12" long.
The retail version will be 9.5" long, smaller than the GTX.
March 1, 2007 3:32:53 PM

Quote:
Hate to break it to you, but we've all seen the R600 before; I and some others posted it a couple of weeks ago. Anyway, the one you found is the OEM version; the regular version is only 9.5" because it doesn't have the OEM fan on it.


ahh you beat me to it... oh well :D 
March 3, 2007 4:13:14 AM

I don't care if this has been posted before; give me more photos, as I want one. It looks so beasty.
March 3, 2007 6:12:42 AM

Real news?

Uh, the article says the system requires 300W, which is what we expected; Kyle misread it to mean the card itself requires the full 300W.

BTW, where does the 300W come from when a PCIe 1.1 slot only supplies 75W, plus 75W on a 6-pin connector and 100W on an 8-pin connector? That only adds up to 250W; sounds like he's having trouble adding.
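(For the curious, here's a minimal Python sketch of that arithmetic; the per-connector figures are the ones quoted in this post, not official specs:)

```python
# Power budget arithmetic using the per-connector figures quoted above (not official specs).
PCIE_1_1_SLOT_W = 75   # PCIe 1.1 x16 slot
SIX_PIN_W = 75         # 6-pin PCIe power connector
EIGHT_PIN_W = 100      # 8-pin connector, as assumed in this post

total = PCIE_1_1_SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(f"Available to the card: {total} W")  # prints 250 W, not the 300 W claimed for the card
```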

Also, I find it funny that he writes about the card requiring 13+" of space (without giving the actual card length) and implies it'll be the biggest card, yet the GF7900GX2 would be longer if the 12.5" OEM R600 figures at VR-Zone are accurate.

Looks like rumour-mongering of the InQ variety, but without the natural tendency of most people to distrust the InQ.

Must be a slow news day, especially with [H] not actually getting their hands on one or seeing one live, unlike the EETimes article, which puts the R600 at 200W:

http://www.eetimes.com/news/latest/showArticle.jhtml?ar...
March 3, 2007 6:59:11 AM

You forgot that with the extra 4-pin motherboard connector, PCIe slots can provide 150W. Or am I wrong?
March 3, 2007 7:02:07 AM

4 pins can't carry 75W, so how does that make the PCIe slot 150W?

The 4-pin connector is for SLI/Xfire boards to help carry power across the mobo/chipset instead of supplying both PEG slots with 75W when one is only electrically 4x, etc.; they don't wanna fry the mobo.

PCIe 2.0 will move to 150W, but there's no way the R600 is relying on everyone having that.
March 3, 2007 7:42:47 AM

Vaporware really isn't that much fun to look at :twisted:
March 3, 2007 7:54:18 AM

lol, 4 pins can carry 75W easy, remember P = VI (watts = volts x amps), but that's not worth arguing here; I'm wrong, lol. (PS: 12V x 30A = 360W, i.e. a 12V rail with 30A at its disposal, so you can understand how I thought the 12V molex plug might increase the PCIe power.)
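(A quick sketch of the P = VI arithmetic being thrown around, just to show where the 360W figure comes from; the 30A rail is this poster's example, not a real PSU spec:)

```python
# P = V * I; the 360 W figure is just a 12 V rail with a hypothetical 30 A available.
def power_watts(volts: float, amps: float) -> float:
    return volts * amps

print(power_watts(12, 30))    # 360.0 W in theory on such a rail
print(power_watts(12, 6.25))  # 75.0 W only needs about 6.25 A at 12 V
```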
March 3, 2007 7:59:18 AM

While the article calls the 12" prototype discreet, I just can't see a 12" card really being discreet in the other sense of the word. Merriam-Webster's third definition of discreet is unobtrusive, or unnoticeable. 8O
March 3, 2007 3:36:58 PM

Quote:

PCIe 2.0 will move to 150W, but there's no way the R600 is relying on everyone having that.
We know for certain that the 2900XTX will have two PCIe power connectors, one of which will be 6-pin and the other 8-pin. The sources of power available to the 2900XTX are therefore:

75W from the slot
75W from the 6-pin PCIe connector
150W from the 8-pin PCIe connector

The theoretical maximum power it could draw is therefore 300W, but in practice it will be well short of that.

What we do not yet know absolutely for certain is whether the card requires all of those connectors to be connected at once. One very strong rumour is that the card will work at standard clock speeds by using two 6-pin PCIe connectors (one using an adapter) but that overclocking will be disabled in the driver under those conditions. Connecting a 6-pin and an 8-pin connector will enable overclocking.

If this is true, it means that the highest peak power the card will ever draw at normal speeds, even for a fraction of a second, is less than 225W, and that the average power consumption will be correspondingly lower. A figure of 200W has been suggested in several articles; this seems very reasonable. An overclocked card will need more, but it might still not actually exceed 225W: there needs to be a margin for error if it is to be guaranteed stable.
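(To put that rumour into numbers, a small sketch of the two connector configurations described above, using 150W for the 8-pin as this post assumes; treat it as back-of-the-envelope arithmetic, not a spec:)

```python
# Theoretical ceilings for the two rumoured connector configurations (figures from this post).
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

configs = {
    "two 6-pin (stock clocks, overclocking locked out)": SLOT_W + 2 * SIX_PIN_W,
    "6-pin + 8-pin (overclocking enabled)": SLOT_W + SIX_PIN_W + EIGHT_PIN_W,
}
for name, ceiling in configs.items():
    print(f"{name}: up to {ceiling} W theoretical")  # 225 W and 300 W respectively
```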
March 3, 2007 4:04:11 PM

I think I'm going to refer anyone who's looking for R600 specs to wingy :twisted:
March 3, 2007 6:54:35 PM

Quote:

75W from the slot
75W from the 6-pin PCIe connector
150W from the 8-pin PCIe connector


The 8-pin is spec'd to supply 100W; I think you're mixing up what's expected with what's possible.
Either they or you are relying on totally rebuilt power supplies with different power to the various connectors, or Kyle's adding wrong based on the current layout, or the assumption is that ATI is relying on PCIe 2.0 only, whereas if anything I think that's optional at best.

Quote:
What we do not yet know absolutely for certain is whether the card requires all of those connectors to be connected at once. One very strong rumour is that the card will work at standard clock speeds by using two 6-pin PCIe connectors (one using an adapter) but that overclocking will be disabled in the driver under those conditions. Connecting a 6-pin and an 8-pin connector will enable overclocking.


That would be similar to the GF6800U, which didn't really need the second molex connector unless overclocking or under truly stressful situations, like D3 at high res/AA. I think that's plausible, far more so than the 300W for the card itself alone. And along the same 'optional' thread, if the card is PCIe 2.0 compliant it might make the extra 6-pin connector optional, where it would be required for plugging into a legacy PCIe 1.1 board, but in a newer PCIe 2.0 configuration you could just plug in the 8-pin and leave the 6-pin free.

Quote:
If this is true it means that the highest peak power the card will ever draw at normal speeds, even for a fraction of a second, is less than 225W, and that the average power consumption will be correspondingly lower. A figure of 200W has been suggested in several articles - this seems very reasonable. An overclocked card will need more, but it might still not actually exceed 225W - there needs to be a margin for error if it is to be guaranteed stable.


True, and that's why the 75+75+100 makes sense, giving it 250W of headroom. The GTX supposedly draws a max of around 180-190W, but usually only draws around 140W, which is why the GTX has the two 6-pin connectors for 75+75+75: 150W alone would cause problems at the highest end, but would likely be fine for lighter apps/situations.

Quote:
lol, 4 pins can carry 75W easy, remember P=VI (watts = volts x amps), but that is not worth arguing here,


I know, but I'm talking about current PSU pin specs, not what's theoretically possible. I mean, the whole thing could be powered by a two-pin connector, but based on current PSU/pin design expectations a 4-pin would only supply about 50W across one connector (IIRC they're spec'd for 8-amp max loads per connector, but 'could' carry more). When designing something like this for OEMs, I suspect they're working from 50W for 4-pin, 75W for 6-pin and 100W for 8-pin; anything else would cause more havoc IMO.
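(As a rough sanity check on the headroom argument above, a quick sketch comparing the available budget to the claimed peak draws; the draw numbers are the rumoured figures in this thread, not measured values:)

```python
# Headroom check using the draw figures quoted in this thread (rumoured, not measured).
cards = {
    # name: (available budget in W, claimed peak draw in W)
    "8800 GTX (75 slot + 75 + 75)": (225, 190),
    "R600 (75 slot + 75 + 100, per the post above)": (250, 200),
}
for name, (budget, peak) in cards.items():
    print(f"{name}: about {budget - peak} W of headroom over the claimed peak")
```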
March 3, 2007 7:21:44 PM

I hope we'll get a cooking grill in the package to go with it.

Definitely not buying it.
March 3, 2007 7:33:06 PM

Are you sure that wasn't Prescotts?
The Conroe is actually quite efficient and cool in comparison (IIRC it's cooler than the X2 and AM2, although I'd have to check to be sure); that was one of the advantages of the new design versus the old Intels.
March 3, 2007 8:02:22 PM

Seriously, AMD/ATI are not stupid. They release pics and specs only to whet our appetite. They know we're all wetting our pants because it's the fucking R600. They know we'll drop our pennies to the last cent on this card because of what? It's the fuckin' R600!!!! I think this is a piece of engineering crap (well, not totally, but as far as consumption and design are concerned, yes). I think it doesn't deserve to be bought by anyone, whoever you are. I myself am looking for a high-end card, but that one is definitely not going to end up in my rig. Shame on ATI and anyone blindly looking at it as the Holy Grail or something.

The only reason I see to have this card is if you want to learn to cook and want to throw Hawaiian parties in your room.
March 3, 2007 8:02:52 PM

Yeah, that's what I thought; the guy obviously is adept with computers, and made a good egg, but someone should teach him how to make a sandwich.

Looks like Worcester sauce, like they eat in the UK, but oi, the sandwich looks weak.
March 3, 2007 8:48:22 PM

Ya, are people really gonna buy a card that sucks more power, gets hotter, is bigger, etc., etc... just for a few extra frames per second... and then say ATI/AMD pwns Nvidia?!? Seriously...

By the time the R600 is released, it had better beat the Nvidia offerings by a lot, or else they've just flushed their profits down the proverbial toilet. To delay it so long... and all the other negatives associated with it... man, it better destroy NV or it'll go down as the 2nd biggest colossal failure, next to the Quad FX garbage debacle...
March 3, 2007 9:29:30 PM

Supposedly it's been delayed because of GDDR4 problems.

Actually, I think they KNOW it can't beat Nvidia right now. Otherwise, why would they wait? If it can beat NV, then release it and get the problems fixed. So why NOT release? If it can't beat NV, then delay, claim whatever, and try to make it competitive.

If you had a superior product, why would you wait so long to release with endless delays? You're answerable to your stockholders, who would scream bloody hell for a release and increased profits. But what if you can't beat NV and you released? Hello, Quad FX!

There is no good BUSINESS sense in delaying. The business world does not work that way... letting your competitors walk all over you. ATI/AMD does have something up their sleeve... it's called humble pie. They know they got beat and went back to the drawing board to try to rescue their product...
March 3, 2007 9:46:35 PM

Quote:
Supposedly it's been delayed because of GDDR 4 problems.
Actually, it's supposedly because AMD's OEMs bitched about the launch dates of the X2900XT/XTX and the low- and mid-range X2k cards being too far apart, so it was pushed back "a few weeks" to bring the launches closer together.

Quote:
By the time R600 is released, it had better beat the Nvidia offerings by alot, or else they just crapped their profits down the proverbial toilet. To delay it so long....and all the other negatives associated with it.....man, it better destroy NV or it'll go down as the 2nd biggest colossal failure, next to the Quad FX garbage debacle.........
Nah, as far as video cards go, nVidia's FX series is the biggest failure ever. It was so bad it's almost impossible to do worse than it.
March 3, 2007 9:52:30 PM

Quote:
Well, that seems stupid to me. Why jeopardise your entire company because some launch dates are too far apart? I mean, come on, how long has it been since the release of the 8800GTX, and still no 8600 Ultra? I don't see why they would be complaining about that. Especially if their launch gets too close to the launch of the 8900GTX; then I don't know what AMD will do.
http://www.reghardware.co.uk/2007/02/28/amd_690g_launch/
March 3, 2007 10:31:01 PM

Quote:
Launching Direct X 10 chipsets piecemeal was not welcome among AMD's OEMs, and would not give the best economic value for the firm, despite the clamour for new kit from hardcore gamers and computing enthusiasts, he added.


Well, now you know DAAMIT doesn't cater to gamers and enthusiasts, so fuhgettaboutit. No loyalty there, so I'm amazed at the loyalty among DAAMIT fans. WOW, they're basically saying "screw you, we make more money OEM, so we're gonna do whatever we want."

NICE. :roll:
March 3, 2007 10:55:47 PM

The only way they are gonna make money with it is if they put it in a museum 50 years from now so that people can say how monstrous it was to give birth to such a hideous creature.
March 3, 2007 10:59:50 PM

I agree Taco, I sure hope AMD comes up with SOMETHING mainstream that's a winner. All of us benefit if they do. A monopoly on chips is NOT good for the consumer.
March 3, 2007 11:14:46 PM

I'm only joking around :wink: I don't really think it's THAT bad (I actually think it'll be a powerful card), but I'm just upset about cards getting bigger and bigger each year, as well as more power-hungry (Nvidia or ATI), and the consumers just end up paying for new power supplies, new cases and higher electricity bills. But I guess that's the way it is, and as long as people are willing to pay for what they get, it's not going to end.
March 3, 2007 11:54:09 PM

Yeah, I just hope we'll see more energy saving than power squeezing out of die shrinks. I think AMD took a nice step in the right direction with their lower-power Athlons (I don't remember which ones), and I'm happy to see that there's a growing market for that. The video card market behaves pretty differently right now in that respect, but I hope we'll see the same things from card makers soon.