ATI R600

November 8, 2006 9:37:34 PM

So now that the G80 has been released, does anybody know when the R600 is going to be released, or at least have an estimate?


November 8, 2006 10:01:33 PM

Early '07, holding hands right along with Vista.
GDDR4 memory
1024MB RAM
64 unified shader pipelines, 32 TMUs, and 32 ROPs
DX10/DX9 compatibility

The rest of course is pure speculation.

From Wiki.
November 8, 2006 10:04:12 PM

If you'd read a couple of the G80 threads, you'd have seen some discussion about the R600 and its arrival :roll:

Q1 2007 is the estimate.
November 8, 2006 10:29:29 PM

Quote:
If you'd read a couple of the G80 threads, you'd have seen some discussion about the R600 and its arrival :roll:

Q1 2007 is the estimate.

... not to be a jerk or anything... :lol: 
November 8, 2006 10:41:00 PM

:lol:  :lol: 
November 8, 2006 10:48:29 PM

It supposedly only has 64 unified shaders? I thought the R600 was supposed to be a lot better than the 8800, which has 96 or 128 unified shaders. So is the 8800 series better?
November 8, 2006 11:20:30 PM

Who knows? ATI has kept its lips fairly sealed on the subject. Perhaps they've increased the unified shader count, or are tweaking the architecture in general... or trying to pull a Nintendo. All I can say is that the best thing one can do is just wait patiently :? Besides... what use is a DX10 card when Vista and DX10 aren't even out yet?
November 8, 2006 11:34:24 PM

Quote:
Who knows? ATI has kept its lips fairly sealed on the subject. Perhaps they've increased the unified shader count, or are tweaking the architecture in general... or trying to pull a Nintendo. All I can say is that the best thing one can do is just wait patiently :? Besides... what use is a DX10 card when Vista and DX10 aren't even out yet?


That's true... but reading these reviews, I'm pretty excited about playing Oblivion with max settings at 1920x1200...

I'm currently building a [dream] system. I'd been thinking of waiting for the R600, but after reading the reviews of the 8800 GTX it's getting awfully difficult.
November 9, 2006 2:13:21 AM

Quote:
Who knows? ATI has kept its lips fairly sealed on the subject. Perhaps they've increased the unified shader count, or are tweaking the architecture in general... or trying to pull a Nintendo. All I can say is that the best thing one can do is just wait patiently :? Besides... what use is a DX10 card when Vista and DX10 aren't even out yet?


That's true... but reading these reviews, I'm pretty excited about playing Oblivion with max settings at 1920x1200...

I'm currently building a [dream] system. I'd been thinking of waiting for the R600, but after reading the reviews of the 8800 GTX it's getting awfully difficult.

Well, I have no choice BUT to wait, since I just got my rig.

At this point there's no reason to believe ATI will either confirm or deny any reports. But we're already in November, so I'd think they should be saying something soon.
November 9, 2006 3:11:34 AM

My guess is either Feb or Mar '07, though I'm hoping the earlier the better.
November 9, 2006 5:34:29 AM

My advice is, don't worry. ATI has been producing competitive products since the R300. Let ATI take their time and put together a quality product with good drivers. ATI will be ready for Windows Vista and DX10-compatible games when the time is right.
November 9, 2006 6:11:26 AM

AMD's R600 will actually be the second DirectX 10 GPU that [formerly ATI] has manufactured, so I have a feeling they've already had a chance to learn a lot about how to best take advantage of, and design for, DirectX 10. (In case you're scratching your head, I'm referring to the Xbox 360's D3D10 Xenos GPU.)

As for shader processor counts, etc.: nothing is written in stone until the chip is actually taped out and ready to roll, but I doubt they'll sit back and launch an inferior-performing product. Then again, it's all under the name of AMD now *cough* FX-62 vs. Conroe *cough*. I kid, I kid... the R600 will impress, have no doubts about that. I believe that, like the new X1950 PRO, it will be an 80nm product as well (note: the 8800 GTX is 90nm), so it has that going for it too. :wink:
November 9, 2006 6:40:56 AM

I was under the impression that the 360's GPU was DX9. I'm not sure about this, so no flaming. Last I read (and it was a couple of months ago), Crysis was cancelled for the 360 and PS3 because the GPUs couldn't play it on full or something, because they were DX9. It was in one of the really hot (lotsa flames) PS3 vs. Xbox 360 vs. PC threads.

It really doesn't matter to me, because I'll stay with my PC; I don't have the money for both (PC and 360 or PS3), and I need my PC.
November 9, 2006 6:51:48 AM

Quote:
Then again, it's all under the name of AMD now *cough* FX-62 vs. Conroe *cough*. I kid, I kid...


I know you kid, but that's comparing old with new ;) Of course the new one is faster; isn't it supposed to be? lol (ignoring the P4's whole existence.)

Quote:
I was under the impression that the 360's GPU was DX9. I'm not sure about this, so no flaming. Last I read (and it was a couple of months ago), Crysis was cancelled for the 360 and PS3 because the GPUs couldn't play it on full or something, because they were DX9. It was in one of the really hot (lotsa flames) PS3 vs. Xbox 360 vs. PC threads.


The game could simply be too much for the limits of a console. Wait for them to strip it down like they do with all the PC games they put on a console. It will come out if they care about the console market.
November 9, 2006 7:30:55 AM

I actually think I read that they'll release it, but only after I'd written what I wrote above. On the other hand, you can't believe everything you read. I'll do some digging to see what I come up with.
November 9, 2006 8:17:34 AM

This is what I found while taking a 10-minute break from slaving away at work:

No DX10 for Xbox 360

From this it seems like they will still release it, even though this has been said.
November 9, 2006 8:35:10 AM

If they want to make more money, they will find a way to strip it down enough for a console, even though after reading that, they don't want to. But they're in business to make money, so after a while I'm sure they'll just do it.
November 9, 2006 8:58:43 AM

They can port it if they want to.

Another link said they might not need to scale it down. There might be a way to make it look the same and still run smooth. The thing is, they don't need to run it at as high a resolution as on a PC. Because of dot pitch and all that, 800x600 (*edit* OK, maybe a bit higher, but it was to get the point through */edit*) looks worlds apart on a TV versus a PC. Just my opinion, though.
November 9, 2006 9:06:14 AM

Well, I don't personally care if they ever make a port. But if they actually figure out how to do it without degrading it a bit, I'd be surprised. 800x600 is the default TV resolution, though. Of course, just by running it at such a low res it's going to look worse.
November 9, 2006 9:28:37 AM

Quote:
800x600 is the default TV resolution, though.

Thanks, I didn't know that. Just took a wild swing.
November 9, 2006 3:40:52 PM

Quote:
800x600 is the default TV resolution, though.


What are you talking about? 800x600 is no 'default TV resolution'. It may be the default of some crap LCD panel, but TV comes in very strict resolution formats: 640x480i/p (720x480p WS), 1280x720p, 1920x1080i/p. There's no 800x600, which is a computer resolution (SVGA).

Quote:
Of course, just by running it at such a low res it's going to look worse.


Not necessarily. 640x480 can look better than higher resolutions for specific reasons: more colour through the TV, and better edge bleeding/blending, which results in a smoother picture (like adding extra levels of AA), as well as some motion-blur properties. These image benefits are great for movies and games, but terrible for static images like text and HUDs, or when trying to get a bead on someone with a sniper rifle while they're still way off and blurry. AF's effects would also be diminished on a TV compared to a computer for this same reason, so it depends a lot on the type of game you play.

It's not clear-cut which is the better display technology, but the industry is moving to displays with defined edges, so soon YxZ on a TV will look the same as YxZ on a computer, because the displays will be very similar.
November 9, 2006 4:15:54 PM

Quote:
I was under the impression that the 360's GPU was DX9. I'm not sure about this, so no flaming. Last I read (and it was a couple of months ago), Crysis was cancelled for the 360 and PS3 because the GPUs couldn't play it on full or something, because they were DX9. It was in one of the really hot (lotsa flames) PS3 vs. Xbox 360 vs. PC threads.


Actually, the Xbox 360 is DX9.0X (the X for Xbox), which is a midway point between DX9.0C and DX10.
http://www.beyond3d.com/articles/xenos/index.php?p=02#c...

It's commonly referred to as XNA, the M$ game programming tool/code library for the X360 and other hardware that kinda acts as a catch-all for C and DX.

The PS3 is restricted to DX9.0C/SM3.0-style limitations (it's really just a GF7900GT with worse memory bandwidth), but it runs on a special version of OpenGL ES, so it's not really DX at all.

Quote:
No DX10 for Xbox 360


That's not really true; many features that are DX10-only on the PC can be used on the Xenos/X360.

It's just a question of added work and picking and choosing features. It's easier to do that with XNA and DX than trying to use nV's Cg and OGL, but it's still worth it for some titles. It just depends on how it's built. It doesn't have to be built for just one or the other; it can use common libraries, and use a language like Cg to link and compile them for each.

The Wii is also OGL, BTW.
November 9, 2006 11:25:36 PM

You're right, they could port it. Remember when FFVII came out for the PlayStation? It was beautiful. Then they ported it to the PC, and it, well, it wasn't beautiful, but it was still a fun game. Have PCs and graphics cards come far enough to turn the tables on consoles? Or does it depend on the time the developers put into a port to make it look good?
November 10, 2006 2:55:28 AM

Well, I can't always be right, and in all fairness, I really don't know all that much about the 360 and PS3. I've never owned a single console, only emulators.

That's why there are veterans like you guys, TheGreatGrapeApe. Anyhow, I just posted what I read, and as I stated in one of my posts, you can't believe everything that you read (I wasn't completely wrong though, was I?). At least I gave links instead of just stabbing in the dark like most of the horde...
November 10, 2006 3:01:16 AM

No worries, I was just illuminating the subject. It's not something that's well covered, and you had the bits and pieces from what's out there; I was just trying to solidify it further.
November 10, 2006 3:01:36 AM

That's awesome and all, but my TVs can all do 800x600, and every game I've ever played on a TV has looked like garbage because of the low resolutions and the other things you stated. I've never played a console game that I could look at and say "that's some stellar graphics"; look at all those jagged edges.
November 10, 2006 3:05:34 AM

Many thanks for bringing it all together.
November 10, 2006 3:09:27 AM

Well, the only way your TVs would 'do' 800x600 is if they were monitors and not traditional TVs, in which case they don't enjoy the edge bleeding/blending. You're stuck with that 800x600 TV of yours altering the original image, which would be 640x480, 1280x720, or 1920x1080 on a console, so it's not surprising things look like crap. It's like trying to display a different-res image on an LCD: you suffer the effect of interpolation. You're basically adding a double whammy to the picture.
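
If it helps to see what that interpolation actually does, here's a rough sketch (hypothetical Python, nearest-neighbour only) of why stretching a 640x480 frame onto an 800x600 panel duplicates pixels unevenly:

    # Nearest-neighbour upscale: each target pixel copies the closest source pixel.
    # At a 1.25x stretch (640x480 -> 800x600), one source pixel in every group of
    # four gets drawn twice, which is where the uneven blockiness comes from.
    def upscale_nearest(src, src_w, src_h, dst_w, dst_h):
        dst = []
        for y in range(dst_h):
            sy = y * src_h // dst_h  # nearest source row
            dst.append([src[sy][x * src_w // dst_w] for x in range(dst_w)])
        return dst

    # e.g. frame_800x600 = upscale_nearest(frame_640x480, 640, 480, 800, 600)

Real scalers blend neighbouring pixels instead of just copying them, but the resampling step is the same either way, which is why a mismatched resolution always costs you some image quality.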
November 10, 2006 3:12:31 AM

I'm talking about classic old tube TVs from the past 8 years. 640x480 and 800x600 looked like crap. All the TVs, including the current tube TV I have, are capable of doing those resolutions and nothing higher.
November 12, 2006 11:50:12 AM

ATI won't release an inferior card after they've been beaten to market by their competitor :D 
November 12, 2006 12:36:40 PM

How many damn times are people going to ask this question?
November 13, 2006 3:54:49 AM

We're all hoping for that. Just like AMD has to release a better product, but who knows if they're gonna. It'll be good for prices if they do (ATI AND AMD).
November 13, 2006 7:24:58 AM

Quote:
How many damn times are people going to ask this question?


Lots. In fact, I'm working up a new thread as we speak ;) 
November 13, 2006 8:11:57 AM

Quote:
It supposedly only has 64 unified shaders? I thought the R600 was supposed to be a lot better than the 8800, which has 96 or 128 unified shaders. So is the 8800 series better?


No, the Wiki page says 64 shader pipelines. If they keep to the X1900's current 3 shaders per pipeline, then the R600 would have 192 shaders.
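
Spelled out, assuming the X1900's 3:1 ratio really does carry over (which is pure speculation at this point):

    64 pipelines x 3 shader ALUs per pipeline = 192 shader ALUs
    (vs. the X1900's 16 x 3 = 48 pixel shaders)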
November 13, 2006 8:43:57 AM

Quote:
How many damn times are people going to ask this question?


Lots. In fact, I'm working up a new thread as we speak ;) 

Maybe I should start one too!!! :wink:
November 13, 2006 8:45:13 AM

Hmmm... interesting. I have an old 20" JVC CRT TV I bought like eight years ago. I know 480i is the default resolution; however, I could get my MythTV box to "supposedly" display at 800x600. I say "supposedly" because if I started up KRuler it would show 800x600 pixels. Always thought this was strange. It's probably not increasing the image quality, though; my guess is that the TV was scaling it down somehow.

Oh, and I do think 640x480 on a TV/console looks much better than 640x480 on a monitor/computer, though not as good as 1600x1200 on a monitor/computer.

Sorry for being so off-topic. :wink:
November 13, 2006 12:28:03 PM

Such a good video card doomed to use such crappy resolutions :( Makes me cry!
November 13, 2006 1:40:16 PM

In the case of FFVII coming to PCs: the original was probably made to display at 640x480 on a "normal" TV, not HD or anything like that. When they ported it to PC, where most users at the time displayed their games at 800x600 to 1024x768, they would have had to redo most of the graphics in the game at higher resolutions to make it look as good.

The difference in going from a game made to play at 640x480 to most PCs playing at 1024x768-ish would make it look horrible in comparison on the PC, because the detail would not be there.

It's much like playing a compressed DVD (regular old DVD) on a monitor at its native resolution, the most common at the moment being 1280x1024 (17-19" monitors). I see some pixelation at times, especially in smoke effects and the like, even with a decent-quality DVD, when it's displayed at a higher resolution than it was encoded at originally.

I hope that made sense; trying to type fast as I'm at work.
November 13, 2006 1:48:14 PM

If you're talking CRTs, I think 1600x1200 is a lot more common, and for 19" LCDs I think 1600x1200 is the native res on most of those. If not, I'm glad I didn't get anything under 24" :-/
November 13, 2006 4:33:39 PM

Quote:
If you'd read a couple of the G80 threads, you'd have seen some discussion about the R600 and its arrival :roll:

Q1 2007 is the estimate.



To the OP: just ignore this clown. He has no people skills whatsoever and loves to act all hardcore while he sits safely behind his keyboard and monitor.
Clearly, it takes one to know one. You're talking about how prozac has no people skills while you're being aggro. "Just ignore this clown. He has no people skills whatsoever"... priceless. Irony at its best, for sure!

Thanks for making my day, RobsX2. :lol: 
November 13, 2006 5:15:45 PM

Quote:
I hope that made sense; trying to type fast as I'm at work.

Writing messages on the Forumz while at work? Who would commit such a fiendish crime? *looks over shoulder to make sure boss doesn't walk up*
November 13, 2006 5:20:08 PM

Nice... you are proving my points for me again! Keep those people skills coming! :lol: 

Anyway, I actually try not to hold grudges against people on the forumz (because it's pointless... I come on here to talk about hardware, not to flame everybody and their mother when they don't agree with me). But you can think whatever you want and rewrite history however you want. I don't mind.
November 13, 2006 5:34:17 PM

Quote:
It supposedly only has 64 unified shaders? I thought The R600 was supposed to be alot better than the 8800 it has 96 or 128 unified shaders. So is the 8800 series better?


Until less than two weeks ago, most people thought the G80 would have "disunified" shaders, something like 48 pixel and 16 vertex. And now we have 96 or 128 unified shaders. So who knows about the R600?

Personally, considering that the R500 has 48 unified shaders, I'd be surprised if the R600 had "only" 64.

If you look at the last generations of VPU upgrades, they always had at least double the pixel pipelines (or at least pixel shaders) compared to the previous generation. With the R580 combining 48 (pixel) + 8 (vertex, not exactly sure though) for 56 units in total, I'd be surprised to see 64, like I said before.

To me it looks like NVIDIA and ATI are downplaying themselves to make the other one feel it's in a better position than it really is. But like any speculation, I might also be totally off track... :wink:
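
To put rough numbers on that doubling argument (using the figures above, so treat it as loosely as the rest of this):

    R580: 48 pixel + 8 vertex = 56 shader units
    doubled: 2 x 56 = 112, well above a flat 64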
November 13, 2006 5:36:00 PM

Removed by the author :wink:
November 13, 2006 5:48:42 PM

prozac may have been a little rude, but I wouldn't call him a clown. He is generally helpful and constructive. Let's not turn this into another flame war; the last one got out of hand :wink:
November 13, 2006 6:13:57 PM

Quote:
prozac may have been a little rude, but I wouldn't call him a clown. He is generally helpful and constructive. Let's not turn this into another flame war; the last one got out of hand :wink:


Yeah, I realized I spoke a bit too fast. That's why I edited my last message. There's a difference between being rude and being a prick, which he isn't.
November 13, 2006 6:42:33 PM

Quote:
Yeah ok sure. :roll: :roll: :roll: :roll: :roll: :roll: :roll: :roll:

Intelligent. :lol: 

Anyway, I declare this thread a waste of everybody's time (IMO). I'm withdrawing.
November 13, 2006 7:07:04 PM

Ummm... what exactly did prozac do?
November 13, 2006 7:33:11 PM

Quote:
Nice... you are proving my points for me again! Keep those people skills coming! :lol: 

Anyway, I actually try not to hold grudges against people on the forumz (because it's pointless... I come on here to talk about hardware, not to flame everybody and their mother when they don't agree with me). But you can think whatever you want and rewrite history however you want. I don't mind.


He lies. Ever since that one time we totally didn't agree, he's been stalking me from the bushes, trying to catch me slippin'. He is one mean mofo.
November 13, 2006 7:42:41 PM

Stop making alts.