
R600, questions

December 2, 2006 7:39:47 PM

I am building a new PC for gaming. I am running an E6600 and am probably gonna pick up a P5W-DH Deluxe. I am trying to wait for the R600 card because I've only heard good things about it, and I want my card to be DX10 compliant so I don't have to replace it again anytime soon. Can anyone out there fill me in on some general info about this card? Any speculation on a release date, and is it worth waiting for as opposed to just buying something else? I've considered buying a cheaper card so I can at least set up my system (I'm getting really anxious about it), but it seems like such a waste for maybe only a month. Not really sure, could use some opinions about this.

-And also, will I be able to run this card with the E6600 and P5W-DH Deluxe combo? Any issues there that anyone knows of?

Thx for any feedback on this


December 2, 2006 8:00:08 PM

Trust me, wait. ATI is developing it to be king.
I also know personally how anxious you get while waiting to build. When this happened to me, I got screwed by the AMD price cuts, the X1K series, and many other things.
There shouldn't be any problems running the R600 with your mobo and chip. Though you might want to consider the upcoming DFI RD600 mobo (RD600 is the name of the chipset, not related to the R600). Word has it that the best BIOS tweaker in the world is making a BIOS for this new board that can overclock like a dream. And for any of the new DX10 cards, it would be smart to overclock to eliminate that bottleneck.
December 3, 2006 6:26:27 AM

The R600 is planned for an end-of-January 2007 release. My logical self says you should wait, as the card might be a tad expensive, and buying something now and then again in 60 days seems a bit daft.

ATI is very quiet about the card, so anything said at this stage is pure speculation.

Your CPU and mobo will be fine, although a move to quad core when it gets cheaper next year shouldn't be ignored. Graphics cards are, and have been, very powerful over the years, and I doubt a consumer CPU will ever be more powerful than a graphics card going forward. But it doesn't hurt to get close to it.

You might want to ensure you have a proper PSU, and then a case that aids in cooling around the graphics card.
December 3, 2006 12:09:01 PM

Quote:
Your CPU and mobo will be fine, although a move to quad core when it gets cheaper next year shouldn't be ignored

Well, if you need the power of four cores, then it's surely a move to consider; a gamer, however, will not see the benefit of it.
December 3, 2006 1:08:08 PM

True :D 

But the Catalyst drivers support your statement (the last part) in that regard.. :D 

Had to edit, my first reply was a bit moot. Soz.

I personally don't see the bonus of overclocking.
December 3, 2006 4:24:34 PM

The E6600 is already bought, so I doubt I will move to quad anytime soon unless it becomes necessary as an industry standard. The only other thing I have purchased atm is a 20" widescreen LCD monitor. I just seem to be waiting on the video card, and will kind of fit the rest of my comp around it.
December 3, 2006 4:59:34 PM

...well that sucks, because I already bought one. Hopefully it won't be too bad.
December 3, 2006 6:37:36 PM

20" is okay :b It also depends what games you play. DX10 may slam the card down a little, so the 20" res would probably be good. I don't know how smart it is to game Crysis at 2560x1600, or how much of a hit it will take, so we're gonna have to wait and see.
December 3, 2006 6:41:42 PM

a gamer WILL see benefits of it soon

Alan Wake was demonstrated at the Intel Developer Forum in September 2006, running on an Intel Core 2 Quad processor clocked at 3.73GHz. The demonstration took the form of a tech demo, showcasing engine features such as the day/night cycle, volumetric light, weather and physics. It was revealed that the game engine is multi-threaded and able to make full use of all four cores.
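
To put "multi-threaded and able to make full use of all four cores" in concrete terms, here's a rough Python sketch of the idea: independent per-frame jobs get farmed out to a pool of workers. The task names are invented and a real engine would use native threads running in parallel; this is only an illustration of the split, not how the actual engine works.

```python
# Rough illustration only: splitting independent per-frame jobs across
# four workers, the way a multi-threaded engine might farm out physics,
# weather, streaming and audio work. Task names are invented.
from concurrent.futures import ThreadPoolExecutor

def simulate(task, frame):
    # Stand-in for real engine work (physics step, weather update, etc.)
    return f"{task} done for frame {frame}"

tasks = ["physics", "weather", "world streaming", "audio mixing"]

with ThreadPoolExecutor(max_workers=4) as pool:   # one worker per core
    for frame in range(3):
        results = pool.map(lambda t: simulate(t, frame), tasks)
        print(list(results))
```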
December 3, 2006 6:51:24 PM

You will be fine with your 20" if you have it for a while when newer games come out; the lower res on the screen will allow you high frame rates at your max res. I have a 19" LCD and there are many games that give me a lot of trouble at full res (1280x1024).

Best,

3Ball
December 3, 2006 7:13:57 PM

So you say that I would actually have better gameplay with a higher resolution?... Know that I am extremely picky when it comes to frames... as in, if I drop below 60 for about 4 seconds, then I quit playing the game! One specific game that I own but do not play, because it doesn't run up to my standards, is Ghost Recon: Advanced Warfighter, which has support for dual core, I believe? So I wouldn't think I would be CPU limited with that, especially when I even tried it with my CPU @ 2.62GHz and my vid card slightly more OC'd, but not much, maybe all of 10 - 20 more MHz!

All of my temps are great and the system is perfectly clean and stable (24 hours Prime95 stable, all components pass EuroSoft PC Check diagnostic tests with the OC applied, and Memtest checks out as well). I make sure it's in perfect shape every single day so that it doesn't hinder performance whatsoever. I don't even use the WinXP theme, or run AIM, or have an IE window open and minimized, because I want all of my resources free while running a game! (Driver and patch update checks are made at least 2 times daily as long as I am home, and my Catalyst is configured for better performance than the stock settings)... Anyone have any more ideas for me?!?

Best,

3Ball
December 3, 2006 7:52:26 PM

Higher resolutions move the burden off the CPU and onto the GPU.
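
If you want to see why, the raw pixel math tells the story: the CPU-side work per frame (game logic, draw calls) stays roughly constant, while the number of pixels the GPU has to shade scales with resolution. A quick back-of-the-envelope Python sketch (the resolutions are picked just for illustration):

```python
# Back-of-the-envelope: pixels the GPU must shade per second at 60 fps.
# CPU-side work per frame stays roughly the same regardless of resolution.
resolutions = {
    "1280x1024": (1280, 1024),
    "1600x1200": (1600, 1200),
    "1920x1200": (1920, 1200),
    "2560x1600": (2560, 1600),
}

fps = 60
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} px/frame, {pixels * fps / 1e6:.0f} Mpx/s at {fps} fps")
```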
December 3, 2006 8:14:21 PM

But you forgot the price of a 24" is 40% more than a 22", and I know what res they have; I looked at all the 24" specs because I wanted one myself, but I don't think it's worth it now.

22" is really cheap now, and with 5 ms.

And I know people play high res in Oblivion, but I wanna see what hit DX10 puts on the cards, because we haven't seen that yet.. and the price is just not right for 24" or 30", though in a year or so it will be :b at least I hope so.
December 3, 2006 8:45:23 PM

I would wait on the DX10 cards until version 2. I would guess around Q2.

Between the facts that there aren't any DX10 games yet and that you will need Vista for DX10, waiting until nVidia and ATI work out the bugs in the drivers and hardware will only benefit you.

I would fully expect the R600 to beat the 8800, but if things go like last year, a month or two later nVidia will have something that beats it by just a little bit. Then two months later it will be back to ATI.
December 4, 2006 12:48:07 AM

Quote:
at that resolution, bs, no way that xtx can't handle any game with aa or hdr at that resolution, it is your processor

What you fail to realize is that at any resolution lower than about 1600x1200 gaming is more CPU related, while at higher resolutions more graphics power is needed. Set the priority of the games lower to make it more GPU related and you will see a major increase; that XTX is designed to handle about 1920x1200.

That is why it is a waste to get a 20" LCD with an R600: there is no use for it, and a better processor is more efficient for the price.
Actually, at 1280x1024 an X1950 XTX isn't capable of providing a consistently smooth gaming experience in Oblivion, and the list is growing.
December 4, 2006 1:51:17 AM

I'm not saying it's not a powerful GPU, I'm just saying he isn't wasting the card by buying the 20" monitor. Anyway, I am 4 hours into the Prime95 test of my CPU @ 2.55GHz; it's the highest I can go while keeping my side panel on, temps under 50 degrees Celsius, and the voltage under 1.4V, so I think that is where I will be keeping it as long as everything checks out, since that seems to be where my computer is hurting the most.

Best,

3Ball
December 4, 2006 2:24:54 AM

http://www.viewsonic.com/products/desktopdisplays/lcddi... is the LCD I bought. How much is using this going to limit me? Can you link some of the ones you are talking about? Maybe getting something more limited now and upgrading after the bugs are worked out, like KTev said, isn't a bad idea.
December 4, 2006 2:46:19 AM

It looks like a fine LCD. The only problem is, I didn't see anything that stated HDCP, so I have to guess it doesn't support it.

With the lower res you will just be able to run soft shadows and higher AA/AF settings without needing SLI/CF.

Everyone wants a 30" but is it worth the $2k?
December 4, 2006 5:58:23 AM

Damn, that's a pretty nice monitor! (IMO) It looks to me like the res is decently high, so I still don't see you having any problems, but someone else may disagree. How much did you get it for, if you don't mind me asking?

Best,

3Ball
December 4, 2006 11:57:49 PM

I picked it up for $294.99 from ZipZoomFly and there is supposed to be a $50 mail-in rebate. So essentially $255. That's after tax and shipping, which ZipZoomFly does for free.
December 5, 2006 12:06:54 AM

Oh and the screen has a sticker that says, "HI-DEF Widescreen" and also "Enjoy HD format video and gaming." Nothing that says HDCP.
December 5, 2006 1:59:06 AM

That is a great price.

If it doesn't say HDCP somewhere, I think you are going to be SOL for any real HD format video.

I can't find a utility that would identify HDCP support.
December 5, 2006 4:08:28 AM

If I'm not mistaken, HDCP is the copy protection, and I am not aware of monitors doing the decrypting. It's normally the DVD player in the case of home entertainment, or the graphics card in the case of a computer.

What I think you mean is HDMI, which is the interface for connecting an HDCP source device to a TV/LCD etc.

HDMI is just a revised means of connecting HD video (new connector) along with the audio in one neat cable.

The ViewSonic does not have such an interconnect. It only has DVI and analog connections.

With regards to HD, true HD comes in the form of 1920x1080 resolution running in progressive scan mode (1080p). You can use a lower res and still have 1080i, as the video is scaled, and some professionals reckon the difference is not noticeable. The Sony Bravia is one of the examples here: running at something like a 1300x800 res, it's still capable of 1080i. Newer Bravias come in the true 1080p standard.

As far as gaming goes, you will be able to output HD content, same as I'm able to do on my 19" CRT, but the effect might not be as good as on a true HD monitor, yet still better than I can get on my CRT. And this all depends on the res you are using. I say might, as I'm not 100% sure what the display (HD "compliant") does with HD content (i.e. deinterlacing, progressive scan, algorithms for HD, etc.).

Normal HD video I can not comment on, so someone else will have to fill in here.
December 5, 2006 5:20:49 AM

Quote:
If I'm not mistaken, HDCP is the copy protection, and I am not aware of monitors doing the decrypting.


Actually they do decrypt the signal, which is encrypted using the HDCP keys (40 sets of 56-bit keys): encrypted by hardware added to the output device (graphics card) and decrypted by the corresponding hardware on the display.

http://www.digital-cp.com/home/HDCP_Specification_Rev1_...
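
For anyone curious how those keys get used, the 1.x scheme in the spec linked above works roughly like this: each device holds 40 secret 56-bit keys plus a 40-bit Key Selection Vector (KSV) with exactly twenty bits set, and each side derives the shared secret by summing its own secret keys at the bit positions flagged in the other device's KSV, mod 2^56. Here's a toy Python sketch of just that derivation rule, with made-up keys; real keys are issued by the licensing authority, which is what makes both sides actually agree.

```python
# Toy illustration of HDCP 1.x shared-key derivation (made-up key values).
# Each device: 40 secret 56-bit keys + a 40-bit KSV with exactly 20 bits set.
# Shared key Km = sum of your own secret keys at bit positions where the
# *other* device's KSV has a 1, taken mod 2**56.
import random

random.seed(0)
MOD = 2 ** 56

def make_device():
    keys = [random.getrandbits(56) for _ in range(40)]
    bits = random.sample(range(40), 20)          # exactly 20 ones in the KSV
    ksv = sum(1 << b for b in bits)
    return keys, ksv

def shared_key(own_keys, other_ksv):
    return sum(k for i, k in enumerate(own_keys) if other_ksv >> i & 1) % MOD

tx_keys, tx_ksv = make_device()   # e.g. the graphics card
rx_keys, rx_ksv = make_device()   # e.g. the monitor

# With properly issued keys both sides derive the same Km; with random toy
# keys they won't, which is exactly why the keys must come from the authority.
print(hex(shared_key(tx_keys, rx_ksv)), hex(shared_key(rx_keys, tx_ksv)))
```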
December 5, 2006 7:25:31 AM

Now that I didn't know. I thought the DVD player/graphics card decrypts the source (HD DVD / Blu-ray) and then sends it over to the TV/LCD via HDMI.

The HDMI part is true though, but I wasn't aware of encrypting and then decrypting again like you said.

Thanx! :wink:
December 5, 2006 12:44:19 PM

Part of HDCP is a unified secure connection from the computer to the monitor. You need HDCP compliant software, graphics card, and monitor to take advantage of Blu-ray or HD DVD.

I don't know if the disk drive is part of the chain. I don't think you need anything on the motherboard.

If at any point in the chain there is a security risk, the content is scaled down from 1080p. I think it goes to 720p or lower. Yes, your monitor will then upscale it to whatever your monitor's native res is. The upscaled content isn't close to true HD content. If you think it is, try playing your games at VGA resolution and tell me that monitor upscaling makes the game look just as good as 1600x1200.
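
Put another way, the chain behaves like a logical AND: one non-compliant link and the output gets constrained. A hypothetical Python sketch of that decision, where the function and the fallback resolution are mine, just to illustrate the rule:

```python
# Hypothetical sketch of the "weakest link" rule described above.
# If every link in the chain is HDCP compliant, full 1080p is allowed;
# otherwise the player constrains output to a lower resolution.
def allowed_output(player_ok, gpu_ok, display_ok, full_res=(1920, 1080),
                   constrained_res=(720, 480)):
    if player_ok and gpu_ok and display_ok:
        return full_res
    return constrained_res

print(allowed_output(True, True, True))    # (1920, 1080)
print(allowed_output(True, True, False))   # non-HDCP monitor -> constrained
```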

In this situation a CRT will fare better than an LCD or plasma.

If your screen is a lower res than the input stream (lots of TVs don't meet 1080i), your monitor will downscale to its res. In most cases this shouldn't have the problems that upscaling does, but you are not getting the full benefit of HD content.

Another way to see the slight differences: that monitor has both an analog connector and DVI. A video card with both analog and DVI out can send the same res to the monitor. At first you may not notice the difference, but after using the DVI connection for some time, when you go back to analog you will notice text is fuzzy and harder to read and the screen is full of artifacts.

Some other things to note about HDCP: right now I know that the current nVidia cards only support HDCP over single-link DVI. So those with the bigger screens (Dell 30") that use dual-link connections will have issues. I haven't been able to find out if ATI has the same problem.

Games shouldn't be an issue.

HD naming is surrounded with lots of hype and misdirection. They can call a TV HD if it supports just about anything over SD. By that standard monitors were "HD" back in the '90s. Don't trust any little sticker that says HD; you need to look at the specs. If you want the true HD experience you need HDCP, 1080p, and either a DVI or HDMI connection. The screen should also be 16x9 (widescreen), but most computer monitors aren't wide enough at 16x10. The last thing for the total setup is that HD also includes 5.1 audio (even broadcast HD), so you need the audio setup too.
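
If you want to turn that checklist into something you can run down a spec sheet with, here's a rough Python sketch; the field names and the example spec are invented for illustration, not pulled from any real monitor's datasheet:

```python
# Hedged sketch: checking a spec sheet against the "true HD experience"
# checklist above. Field names and the example spec are invented.
def checks_out(spec):
    issues = []
    if not spec.get("hdcp"):
        issues.append("no HDCP support")
    if spec.get("native_res", (0, 0)) < (1920, 1080):
        issues.append("native res below 1920x1080")
    if not (set(spec.get("inputs", [])) & {"DVI", "HDMI"}):
        issues.append("no DVI or HDMI input")
    return issues or ["looks good"]

# Hypothetical 20" widescreen spec, just to show the output.
example_20in = {"native_res": (1680, 1050), "inputs": ["DVI", "VGA"], "hdcp": False}
print(checks_out(example_20in))
```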
December 5, 2006 5:58:57 PM

Quote:

I don't know if the disk drive is part of the chain. I don't think you need anything on the motherboard.


Yeah, disk drives are part of the chain, which was an issue with early BD drives: they could read/write data, but not play HDCP content / movies.

Quote:
If at any point in the chain there is a security risk, the content is scaled down from 1080p. I think it goes to 720p or lower.


Actually it scales down to 1/6 the resolution, to 480p.
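
The 1/6 figure checks out if you just count pixels (assuming 720x480 for 480p):

```python
# Pixel-count check on the "1/6 of the resolution" figure (assuming
# 720x480 for 480p and 1920x1080 for 1080p).
full_hd = 1920 * 1080        # 2,073,600 pixels
sd_480p = 720 * 480          #   345,600 pixels
print(full_hd / sd_480p)     # 6.0 -> constrained image carries ~1/6 the pixels
```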

Quote:
Another way to see the slight differences: that monitor has both an analog connector and DVI. A video card with both analog and DVI out can send the same res to the monitor. At first you may not notice the difference, but after using the DVI connection for some time, when you go back to analog you will notice text is fuzzy and harder to read and the screen is full of artifacts.


That's overly generalized; DVI can also show artifacts, especially over the long distances common in HTPC setups. A properly shielded and calibrated VGA connection can be as good/bad as DVI, and analogue BNC is better than both, so it's not that simple.

Quote:
Some other things to note about HDCP: right now I know that the current nVidia cards only support HDCP over single-link DVI. So those with the bigger screens (Dell 30") that use dual-link connections will have issues. I haven't been able to find out if ATI has the same problem.


Yeah, both have the same limitation; ATI says the R600 won't. However, 1080p can be sent over single link, you just need the proper signalling and setup (remember 1080p content is not just 60Hz, but 20, 24, 30, 50, 60, 100). You can fit 1080p50 in single link with standard blanking, and 1080p60 with reduced blanking. So for now it shouldn't be an issue; it only matters for larger panels, and only non-HDCP net content is currently over 1080p anyways.
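
The blanking point is just pixel-clock arithmetic: single-link DVI tops out at a 165 MHz pixel clock, and the clock is total pixels per line x total lines x refresh rate. Here's a rough Python check using approximate CVT and CVT reduced-blanking totals; the exact blanking figures are assumptions, real timings come from the CVT formulas.

```python
# Rough pixel-clock check on the single-link DVI claim above.
# Single-link DVI is limited to a 165 MHz pixel clock; the clock is
# (total pixels per line) x (total lines) x (refresh rate).
# Blanking totals below are approximate CVT / CVT-RB figures, not exact timings.
DVI_SINGLE_LINK_MHZ = 165.0

modes = {
    "1080p50, standard (CVT) blanking":   (2560, 1114, 50),
    "1080p60, standard (CVT) blanking":   (2576, 1120, 60),
    "1080p60, reduced blanking (CVT-RB)": (2080, 1111, 60),
}

for name, (htotal, vtotal, hz) in modes.items():
    clock_mhz = htotal * vtotal * hz / 1e6
    fits = "fits" if clock_mhz <= DVI_SINGLE_LINK_MHZ else "needs dual link"
    print(f"{name}: {clock_mhz:.1f} MHz -> {fits}")
```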

Quote:
HD naming is surrounded with lots of hype and misdirection. They can call a TV HD if it supports just about anything over SD. By that standard monitors were "HD" back in the '90s.


Who are 'they'? The standards are set; the deceivers are usually trying to sell you something. EDTV is not HDTV, nor SDTV, nor SCTV.

Quote:
Don't trust any little sticker that says HD; you need to look at the specs. If you want the true HD experience you need HDCP, 1080p,


720p and 1080i are still HD, just not the high end of current widely available content, but those two are definitely the most widely available content. Also, there is no single 'true HD'; that's just marketing speak, similar to calling EDTV HDTV. The only thing to consider is 'FULL' 720p or 'FULL' 1080p/1080i, etc., because there is content and playback outside the norm, like Sony's HD cameras (which they IMO misleadingly call Full HD 1080) or movies like T2, both of which are encoded at 1440x1080.
December 5, 2006 10:49:54 PM

Thanks for further clearing that up.

The "they" I was talking about is both manufactures and retailers. I think the FCC or some other government body forced ED TVs from being advertised as HD, but there is still misleading information out there. On the sales side most of the people don't know jack and even if they knew what they were taking about explaining it to the customer is a pain.

Taking advantage of HD is such a mess you would think Congress was involved.
December 5, 2006 11:25:09 PM

Quote:

The "they" I was talking about is both manufactures and retailers. I think the FCC or some other government body forced ED TVs from being advertised as HD, but there is still misleading information out there.


Probably the consumer protection agency. The practice nowadays is to primarily advertise what content a set can display, not the native resolution, and not mention that while it displays 1080i/p content, the image is downconverted to only 800x600. The retailers and mfrs are both responsible for that shift.

Quote:
Taking advantage of HD is such a mess you would think Congress was involved.


Nah, they already relaxed the push to force broadcasters; there's no way they're going to mess with the mfrs. Heck, remember it's the government boobs who think the internet is a series of tubes. :roll:

That's the same dumba$$ who blamed file sharing for creating porn. Damn, file sharing's been around a long time! Must have been 'papyrus file' sharing. 8O

As much as it would be nice to have someone who knows tech in government set a standard, that ain't gonna happen, and of course even if it got started, lobbyists would kill anything that meant one penny less to the mfrs. Best advice: caveat emptor; the dumb consumers need to educate themselves before plunking down a chunk of change. How many think nothing of spending $2K on a computer or TV, yet spend weeks going over $100 purchases like drapes, which have little to hide (Oooh, it's not an HD drape, it's only standard def. :wink: ). The info's out there, people just need to look before they buy.