
R600 Exposed Pics

Last response: in Graphics & Displays
April 11, 2007 9:44:29 PM

Hi, here's something I found on the R600: :) 


http://forums.vr-zone.com/showthread.php?t=143108


April 11, 2007 10:22:08 PM

Looks nice.
April 11, 2007 10:38:48 PM

I loves me that HELLBOY motif, if they throw one of those in with every purchase I might buy one for that alone.

Then I need yours and the Gang's help to figure out how to make it work in Laptop. I got vicegrips and silver solder !! :twisted:
April 11, 2007 10:40:16 PM

Yeah, but with a different cooler. That one is ginormous; the desktop version's will be smaller, like that of the 8800GTS.
April 11, 2007 10:45:02 PM

Damn, that is a massive card. I'm pretty sure the turbo on my Eclipse would fit nice and tight on that card. I think liquid cooling is in store for this thing if I ever get my hands on it.
April 11, 2007 10:59:48 PM

Quote:
Then I need yours and the Gang's help to figure out how to make it work in Laptop. I got vicegrips and silver solder !! :twisted:

hmmm..


:lol: 

OK, you can put the card on your laptop, then bake it, and they will melt together, and fit like they were meant for each other.
April 11, 2007 11:00:45 PM

I thought it was supposed to have hdmi, or is that a higher version?
April 11, 2007 11:05:47 PM

Wow, that thing is the size of this huge video editing card my dad used to use in his computer back in 1994!!!
April 11, 2007 11:08:50 PM

lol... they did make sure to make it longer than the 8800... knew it would come down to length!!! :lol: 
Now... I've seen it for the 3rd time... BRING ON THE BENCHES!!!

Anyone know if these are going to come with integrated cooling options like the g80's?
April 11, 2007 11:09:21 PM

Quote:
I loves me that HELLBOY motif, if they throw one of those in with every purchase I might buy one for that alone.

Then I need yours and the Gang's help to figure out how to make it work in Laptop. I got vicegrips and silver solder !! :twisted:


dont forget the b.f.h. :p 
April 11, 2007 11:11:45 PM

that is almost as long as my 8O
April 11, 2007 11:16:46 PM

Why do they need the last inch or two of black plastic past the cooler?
April 11, 2007 11:30:04 PM

Quote:
Why do they need the last inch or two of black plastic past the cooler?


To make it too big for my case, so I have to go with a more expensive retail version instead of the OEM. :?

Maybe it's for holding the card during installation, or maybe it's part of a cable management or in-case securing design some of the OEM customers have requested...

I wanna see piccies of the retail versions, and some benchies...
April 11, 2007 11:36:34 PM

That makes sense that they would have a way of securing it into the case on the tail end. The torque that PCI-E slot would incur during a standard UPS trip could be troublesome for outfits like Dell.

It is good to see this thing may hit the market soon.
April 11, 2007 11:39:12 PM

I thought the OEM one was meant to have a bigger cooler, thus needing the extra support at the back? Because the OEMs want a cooler-running card to put in than a regular-cooled one...
April 11, 2007 11:39:15 PM

Quote:
wow that thing is the size of this huge video editting card my dad use to use in his computer back in 1994!!!

This reminds me of the original Targa card in 1984 (this was even before the Amiga). It could draw a 3D image at 320x240 in 16-bit color. Wow. There was also a Photoshop-type program (I forget the name). It was made for high-end video editing. If you don't know the price, you can't afford it.
April 11, 2007 11:51:33 PM

I think you're thinking about Video Toaster. Best app for the Amiga, and took a long time for it to be replaced by other apps once the Amiga died.
April 11, 2007 11:56:25 PM

Austin Powers: Your graphics card's a GTX?
Nigel Powers: It's not the size mate, it's how you use it.
April 12, 2007 12:45:56 AM

Quote:
It's not the size mate, it's how you use it.


That
is
PURE
BULL$HIT.

Size always matters. :mrgreen:

...IMO, of course...
April 12, 2007 12:50:27 AM

Big-you-know-what....




=





BIG SOCKS!
April 12, 2007 2:30:21 AM

Quote:
Then I need yours and the Gang's help to figure out how to make it work in Laptop. I got vicegrips and silver solder !! :twisted:

hmmm..


:lol: 

OK, you can put the card on your laptop, then bake it, and they will melt together, and fit like they were meant for each other.

ASUS XG Station. Much easier than baking. :wink: Dunno if the card will fit though.
April 12, 2007 2:50:32 AM

The ASUS XG Station would be terrible for the R600; the XG Station only has 1 PCIe lane, pretty much crippling the card.

If anything I'd wait for the AMD LASSO solution, because on the XG, the R600 would function like the MRX2600/GGO8600 I plan on having in it anyways.
April 12, 2007 3:08:28 AM

Quote:
I think you're thinking about Video Toaster. Best app for the Amiga, and took a long time for it to be replaced by other apps once the Amiga died.

I really did mean the Targa.
Does THIS look BIG enough?? Eh?
The link is a fairly new model. I was talking about a 1980s model in my last post. (This was long before the Amiga & Video Toaster.)

Quote:

AT&T formed the Electronic Photography and Imaging Center (EPIC) in 1984 to
create PC-based videographic products. In the following year they released
the TARGA video adapter for personal computers. This allowed PC users for
the first time to display and work with 32-bit color images on the screen.
EPIC also published the TGA Targa file format for storing these true color
images.

http://www.cs.cmu.edu/~ph/nyit/morrison/1980s.txt
April 12, 2007 3:11:36 AM

Quote:
I think you're thinking about Video Toaster. Best app for the Amiga, and took a long time for it to be replaced by other apps once the Amiga died.

Here I have to agree 100%... remember MOD files? Best sound then and I bet that some could still hold up now (I know, I know.. only 4 channels, or did they have 8 by the end of it?). Strange thing how old technology sometimes is the best still available, eh?
April 12, 2007 3:20:42 AM

Ah OK, I was following the sentence and thinking you were talking about the software at the time and for the Amiga; I misread what you wrote.

Yeah, Matrox used to have cards like that too, with swap in/out memory modules.

The Targa you're talking about would likely have been part of the old Chyron systems.
April 12, 2007 3:24:07 AM

Yep, I hated having to relearn everything, and hate doing it again and again with video/audio/picture editing. I've sorta stuck with Adobe now, just a nice all-in-one package, once they finally killed my CoolEdit; now I use Audition, Photoshop, and Premiere, just for familiarity.
April 12, 2007 3:28:29 AM

Quote:
Ah OK, I was following the sentence and thinking you were talking about the software at the time and for the Amiga; I misread what you wrote.

Yeah, Matrox used to have cards like that too, with swap in/out memory modules.

The Targa you're talking about would likely have been part of the old Chyron systems.

I also agree that the Amiga/Toaster was WAY ahead of its time. Too bad that trend didn't continue (as in Ray-Tracing back in the late 80s !!).
April 12, 2007 3:31:23 AM

Yeah, however I think ray-tracing will make a comeback once they go to the multi-core, multi-functional units like Intel and AMD have been talking about. Likely by 2010-2012 (man, it feels weird writing that and considering it 'soon') we'll start seeing the first real games going in that direction beyond just demos, where it's the core of a commercial retail game.
April 12, 2007 3:32:57 AM

Quote:
Yeah, however I think ray-tracing will make a comeback once they go to the multi-core, multi-functional units like Intel and AMD have been talking about. Likely by 2010-2012 (man, it feels weird writing that and considering it 'soon') we'll start seeing the first real games going in that direction beyond just demos, where it's the core of a commercial retail game.
Talk about crazy lighting! 8O
April 12, 2007 3:35:05 AM

Yeah... and maybe the SID chip will make a comeback, too!
I dont even know what the fcuk ray-tracing is....
April 12, 2007 3:37:31 AM

I'm soooo lame! :?:
April 12, 2007 3:52:04 AM

I noticed in the pic with the power connectors (4th pic down), there is a 6-pin and an 8-pin power connector. WTF?
April 12, 2007 3:53:33 AM

Quote:
Yeah... and maybe the SID chip will make a comeback, too!
I dont even know what the fcuk ray-tracing is....

Ray tracing is a general technique from geometrical optics for modeling the path taken by light, by following rays of light as they interact with optical surfaces.
This technique gives very realistic lighting and shadow effects, such as mirrors reflecting off other mirrors, having a whole lot of different lights and shadows, realistic materials (such as metal), and refraction (like magnifying-glass or water effects). I believe 3D movies (like Toy Story) use ray tracing, not DirectX 7, 8, 9, 10, etc.
Anyway, ray tracing has many advantages over shaders, but requires a WHOLE lot more computing power (as in 20+ CPUs to play Quake 3 using RT).
I think I'm correct, but please yell if I'm way off on something.
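The core loop described above can be sketched in a few lines. This is a minimal illustration, not any particular renderer: intersect one ray with a sphere, then apply Lambertian (cosine) shading toward a point light. All scene values here are made up for the example.

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Distance t along the ray to the nearest sphere hit, or None on a miss.

    `direction` is assumed normalized, so the quadratic's a-term is 1.
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None                      # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def lambert(hit_point, normal, light_pos):
    """Diffuse brightness in [0, 1]: cosine of the angle to the light."""
    to_light = tuple(l - p for l, p in zip(light_pos, hit_point))
    length = math.sqrt(sum(x * x for x in to_light))
    to_light = tuple(x / length for x in to_light)
    return max(0.0, sum(n * l for n, l in zip(normal, to_light)))

# One primary ray: camera at the origin, looking down +z at a unit sphere.
t = intersect_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
hit = (0, 0, t)                          # point where the ray strikes the sphere
normal = (0, 0, -1)                      # surface normal at that point
print(t, lambert(hit, normal, (0, 0, 0)))  # 4.0 1.0
```

A real tracer fires one such ray per pixel and recurses on reflection, refraction, and shadow rays, which is where the "20+ CPUs for Quake 3" cost comes from.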
April 12, 2007 4:05:10 AM

You're right about a lot of CPU power, man. Here's a vid of 3 PS3s using their Cell CPUs to do that ray tracing thing: http://www.youtube.com/watch?v=oLte5f34ya8 . In the real world you'll probably need a quad core OC'd to 4 GHz, 3 R600s, and a dozen sticks of RAM :roll:
April 12, 2007 4:07:56 AM

Nah, that's correct, and one of the main reasons it's used for movie models is that raytracing doesn't experience the kind of errors you see in rasterized solutions, and does a better job at rendering things like images through glass, etc., which is something many movies require.

Jumbles, DX10 is catching up, but there are still advantages to raytracing; the major disadvantage is the heavy workload, which is slowly getting better. Think of it this way: raytracing basically gives you what you should see if the scene were in front of you, rendered from a point in space to your eye with all reflections, shadows, etc. calculated, and anything occluded taken out. So at its best it should look 100% realistic, whereas DX10 is still limited but much faster.

Now think of it in a theoretical way (I'll be talking fancifully a bit to help): let's say that to equal ray-tracing with a rasterized solution under current methods you need a mythical DX12. Of course no cards support it, and to emulate it on a CPU with the resources of a DX10 card, it would render at a fraction of a frame per second, which is similar to what you would need for the equivalent scene with ray-tracing. While we have a lot of shortcuts for rasterized graphics in DX10 and such, we don't have many shortcuts for ray-tracing, so you are stuck basically on high quality, whereas the DX solution could turn down the requirements to medium quality or remove some features (like transparent glass, so you can see fully beyond the window, no matter what angle) and run great on current systems. Another tough thing for raytracing is things like fog, volumetric clouds, and diffusion sources.

Hopefully that helps more than confuses; just trying to watch the hockey games while typing.

One of the old demos that's fun to explore (should try with a modern CPU) is RealStorm's;

http://www.realstorm.com/

But it is old, relatively speaking. There was another cool one, called something like 'against nature' or 'better than nature' or something like that.
April 12, 2007 4:37:16 AM

Thanks for the realstorm. Looks cool!
April 12, 2007 4:50:25 AM

Heckuva benchmark there, Mr. Ape. Too bad this computer I'm on sucks so badly. :(  Is there no way to use GPUs as co-processors (or two, or four, or... more), especially cards with a whole lotta fast memory and multiple cores? Is it a software issue or some intrinsic design issue? That PS3 demo in BigCharb's post was good too; is that a sign of a design strength or just a whole bunch of cores?
April 12, 2007 5:03:14 AM

Unfortunately not for older cards. Hey, think of this: when that demo first came out in its early form we were on XP2000-2400+ rigs. 8O

There could be benefit from things like stream processors, but I'm not sure if the I/O or the type of math would be the issue with co-processing on those cards. I know there's been a lot of talk about using GPGPU for raytracing assistance, and for Raster-Tracing.
Likely with the GPGPU 're-evolution' we may see more and more of these types of apps adapted, but right now it's geared to CPUs. Not even sure if there's enough memory on most cards to store that kind of buffer info (these calculations are hugely taxing on memory too).
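To put a rough number on that memory worry, here is a back-of-envelope sketch. Every figure in it (buffer layout, bytes per ray) is an assumption for illustration, not a measurement of any real tracer; the point is just that working buffers alone eat a meaningful slice of a 2007-era card's 256-512 MB.

```python
# Hypothetical working-set estimate for a ray tracer at 1600x1200.
width, height = 1600, 1200
accum_bytes_per_pixel = 3 * 4      # RGB accumulation buffer, 32-bit float per channel
ray_bytes = 32                     # assumed: origin + direction + payload per ray

framebuffer_mb = width * height * accum_bytes_per_pixel / 2**20
ray_buffer_mb = width * height * ray_bytes / 2**20  # one primary ray per pixel
print(round(framebuffer_mb, 1), round(ray_buffer_mb, 1))  # 22.0 58.6
```

And that is before any acceleration structure (the scene's triangle tree), which typically dwarfs both buffers.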
April 12, 2007 5:53:18 AM

Quote:
I noticed in the pic with the power connectors (4th pic down), there is a 6-pin and an 8-pin power connector. WTF?

The 6 pin is the standard PCIe power connector. The 8 pin is the PCIe 2.0 power connector, which is supposedly rated for twice the current. If you look at the PSU forum's list of power supplies, the ones with the 8 pin connector are noted.
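For reference, the nominal budget works out like this, using the usual PCIe ratings (75 W through the slot, 75 W per 6-pin, 150 W for the 8-pin, i.e. the "twice the current" mentioned above):

```python
# Nominal power budget for a card with one 6-pin and one 8-pin connector.
SLOT_W = 75        # delivered through the PCIe x16 slot itself
SIX_PIN_W = 75     # standard 6-pin auxiliary connector
EIGHT_PIN_W = 150  # 8-pin (PCIe 2.0) connector, double the 6-pin

total_w = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(total_w)  # 300
```

So two auxiliary connectors suggest a card specced to draw somewhere above 150 W, with headroom up to roughly 300 W.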
April 12, 2007 5:55:36 AM

Thx, Ape. Funny how a thread about the new R600 got turned into one about having CPUs rendering potentially superior graphics. Fascinatin' stuff, that raytracing. Sure is pretty when one has the horsepower to do it justice, though. The requirements would go through the roof if there were waving grass, bullets, and such to deal with. The IBM dudes have quite a good system going with those 3 PS3s rendering the car. Maybe scaled up even further you could have a strange kind of LAN party where everyone brings a machine yet only one person gets to play one amazing game. With the dynamic load balancing going on in their software, perhaps the individual power of the machines would not be as relevant: i.e. if machine A can only render 160 x 240, then that is all it renders of the total scene. Since it's running on Linux, it could be made cross-platform, perhaps, if the software was good enough.
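That "each machine renders only what it can handle" idea is easy to sketch: split the frame into tiles and hand them out in proportion to each machine's measured speed. This is just an illustration of the load-balancing concept, not how the IBM/PS3 demo actually works, and the speed numbers are made up.

```python
def assign_tiles(num_tiles, speeds):
    """Split num_tiles screen tiles among machines proportionally to speed.

    speeds: relative throughput scores (e.g. frames/s from a quick benchmark).
    Returns a tile count per machine; leftover tiles go to the machines
    with the largest fractional share.
    """
    total = sum(speeds)
    shares = [num_tiles * s / total for s in speeds]
    counts = [int(s) for s in shares]
    leftover = num_tiles - sum(counts)
    by_fraction = sorted(range(len(speeds)),
                         key=lambda i: shares[i] - counts[i], reverse=True)
    for i in by_fraction[:leftover]:
        counts[i] += 1
    return counts

# Three machines: two slow boxes and one twice as fast.
print(assign_tiles(100, [1, 1, 2]))  # [25, 25, 50]
```

In a real system you would re-measure speeds every few frames so a machine that slows down automatically gets fewer tiles next frame.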
April 12, 2007 7:08:56 AM

I read an article in Scientific American about ray tracing, and I'd like to go into a little more detail that Ape may not have known about. There is a professor working on a Ray Tracing Processor; the demo he created ran at only 20 MHz and produced 15 fps. That was over a year ago. I speculate that it will not be something the GFX cards adopt, but instead something we will buy as an add-in. Since it's a very specific function, once it's performed at an acceptable rate, it's pretty much done. I foresee the first real exposure to RT will be with add-in cards, like the PPU, but without the limited use those provided. Since RT doesn't affect gameplay, one could still play the game with or without one.

In the future, I expect an RTPU to be embedded in all GFX cards, but only years after RTPUs are first offered. Once the message gets out that RT is awesome, it will be adopted by the GFX makers, and then supported by the next DX, or vice versa. It is therefore possible to conclude that RT could only be a few years away, just as Ape said, 2010-2011. Compare this to the Ageia PPU: it was offered as a standalone card, but soon after, Nvidia and AMDTI both adopted PPU functions into their own cards.

Fact is, RT isn't necessarily the hardest thing to process; it's just a matter of having a specialized processor doing the work. The problem with most RT programs is that they render in order. What this professor has done is have the RT done in reverse of the light source, so it's just a matter of working back to the light source rather than away from it; the math apparently is a lot easier to crunch this way, offering a much faster rendering method while maintaining equal picture quality. I'm not technical enough in this area to explain further than what the article said; please correct me if I misinterpreted anything.


Oh yeah, R600 pics look nice, with the bonus of looking authentic too! :lol: 
April 12, 2007 2:24:15 PM

Thanks for the explanation, appreciate that. The technology sounds great; let's hope that professor knows what the heck he's talking about and isn't juicing his findings... :lol: 
Now I can look forward to something else that will cost me $1000 to get the best graphics... it's never going to end, so I better just quit bitchin'... :idea:
April 12, 2007 7:52:06 PM

Quote:
That was over a year ago. I speculate that it will not be something the GFX cards adopt, but instead something we will buy as an add-in. Since it's a very specific function, once it's performed at an acceptable rate, it's pretty much done. I foresee the first real exposure to RT will be with add-in cards, like the PPU, but without the limited use those provided. Since RT doesn't affect gameplay, one could still play the game with or without one.


I disagree, only because look at the PPU itself: like a potential RPU, neither I nor most others will spend $100+ for something that only helps in less than 1% of my situations; I'd rather spend that $100 on a better CPU that would help me in 99.9% of my applications/situations. It's an interesting concept, but as the PPU shows, if we can adapt the CPUs or VPUs to take the role while doing their other functions, then that gives us a global boost. And the reason I think folding it into a VPU is a moot point is because Intel and AMD are talking about folding their VPU functions into those Fusion-type multi-core CPUs, and that will happen about the time we start looking at realistic raytracing coming online, IMO. Think about it: the 16-80 core solutions we are talking about, with the flexibility of the CPU and the throughput of the VPU architecture, are truly the perfect solution for most people, especially if it's as scalable/modular as they are claiming/hoping, where we could add more cores, like 8 at a time.

Quote:
In the future, I expect an RTPU to be embedded in all GFX cards, but only years after RTPUs are first offered.

Yeah except the dead end for graphics cards is only a few years away with these hybrid solutions. I think the RPU like the PPU was a solution for a year ago when the Fusion future didn't have the backing of intel and AMD to the same extent it does now.

Quote:
Once the message gets out that RT is awesome, it will be adopted by the GFX makers, and then supported by the next DX, or vice versa.


Actually I think the opposite: M$ and AMD and nV have too much currently invested in raster graphics to allow raytracing to get a foothold until they are ready and let it. Really, right now it's too early anyway, and none of the current players have anything to gain from promoting raytracing, especially if it means less dependency on DX and VPUs.

I think the first raytraced games will be those small games from mavericks, but where it's no longer an adaptation of an existing game but a ground-up concept, and that's why my timeline is that far out. We need the hardware first, even in its infant stages, in order to get a groundswell large enough for raytracing to be something that could target a million+ users; launching a title like Crysis or UT would require a potential 10+ million user base, and that won't come until a while after Fusion.

Raytraced mainstream games are still a long way off, but it does offer a very nice target to strive for.
April 13, 2007 3:02:44 AM

I see what you mean, Ape; I have to agree that neither NV, AMDTI, nor M$ would have much reason to invest in RT. From my perspective, I was looking at what the customer would enjoy most. Personally, I salivate at the idea of VPUs reaching an "end", since, like you stated, RTPUs are pretty much the last leg for gfx to become "lifelike". I like to think companies form their business plans around what customers want, and when it comes to games, nothing matters more than visual quality; clearly this would be a massive gain for whoever can pull it off first. But, like you stated, with no support it will likely come from a third party; in fact, I wouldn't be surprised if OpenGL were the first to support RTPUs.

So I suppose we need to hope that Fusion and Intel's mirror of it incorporate a sector for RT. After all, if a 20 MHz first draft can churn out more than a top-of-the-line 2006 desktop, these RTPUs must be incredibly efficient beasts, yet simple enough to have very small cores.

Correct me on this, but if RT is based on light dynamics, would that not coincide with other similar physics? Going further, would that not mean that PPU's already in the Nv 88 series could process the RT calculations as well? If so, that could mean what is really holding RT back is the support and drivers, since it would be fair to conclude the horsepower is there.
April 13, 2007 3:58:59 AM

All we have to do to see what happens when a company sticks its neck out in a direction that others are not ready for, or do not want to go, is look at Ageia (NovodeX, until they were bought out). Trying to "force" ppl in a direction (no matter how good that direction may be) will not work.

You attract more flies with honey, and they have yet to provide that sweetness. RT needs the honey first (and grapeape has pointed out where that can come from), but if enough buzz (no pun intended) is created, all others will follow.

I think grapeape's timeline is a fair estimate.
April 13, 2007 4:27:24 AM

Quote:
Correct me on this, but if RT is based on light dynamics, would that not coincide with other similar physics? Going further, would that not mean that PPU's already in the Nv 88 series could process the RT calculations as well? If so, that could mean what is really holding RT back is the support and drivers, since it would be fair to conclude the horsepower is there.


I'll address this when I have a bit more time; just finishing up here at work (about an hour to go), and I don't want to go into too much detail, especially since it's sketchy for me too. The GPGPU power is still left untapped, but unlike physics, there is a memory buffer for the pixels that needs to be kept, whereas physics would have a progressive depletion and replenishment of resources. IMO the only thing limiting the GPGPU option of high-GFlop cards like the R580/G80 is their limited memory options.

I'll see if I can flesh out my views on it later, but you are thinking in the right direction. I just think right now it's the way the architecture is put together and interacts that's holding it back, not the raw processing power, although even that's a little weak for what we need to make UT2K4 in raytracing at an acceptable res.
April 13, 2007 4:46:05 AM

Well glad it didn't look all messed up, because I'm rusty on Raytracing to say the least, but my view of the future is that those Fusion type CPUs will be just perfect, especially when given shared access to large memory pools.

Once that happens things should progress quite naturally I would think.
April 13, 2007 6:08:21 AM

indeed, that would be cool



mmm, UT in RT... 8)