
HD 5870 Output?

In: Graphics & Displays
January 27, 2011 5:57:18 PM

Hi. I have an Element 32" 720p TV:

http://www.elementelectronics.com/products/tvs/elchw321...

Anyway, I just finished building my new gaming computer, which consists of:

HD5870
Phenom II X6 1075T @ 3.0GHz
6GB DDR3 @ 1600

I used to run an 8800GT to this TV through a DVI-to-VGA adapter, and the TV accepted a 1600x1200 resolution. This is where it confuses me, because the "maximum resolution" listed for the TV is 1366x768. Now that I have a graphics card with HDMI, I wanted to use it, so my question is: should I run 1366x768 over HDMI or 1600x1200 over DVI-VGA? And what's the difference? For the last day I've read up on aspect ratios, and I think 1600x1200 is a 4:3 aspect ratio, so putting it on my TV stretches everything to fit. Should I take the resolution decrease for the correct aspect ratio (because, from my understanding, 1366x768 is the native 16:9 resolution)?

I'm so confused, so any information would help. Thanks in advance.


January 27, 2011 6:05:07 PM

Running it at native resolution would look better, yes, but that resolution is just so small.

Honestly, if you want to keep that TV, I would sell that 5870 and get something like a GTS 450 or 5750. You're not going to come anywhere near that card's full potential at that resolution.

If you have the money I personally would invest in a 1920x1080 monitor. You can get them relatively cheap nowadays.
January 27, 2011 6:29:12 PM

Unfortunately, I just put $1,300 into this computer and have no intention of selling any of it. That also means I can't get a new TV, because I spent all of my money. I won't be able to get a 1080p TV for a while (and it has to be a TV; this is a gaming room we're talking about: Xbox 360, PS3, computer, Wii). The resolution doesn't seem that small to me, though compared to 1080p it is. You say I'm not using the 5870 to its full potential, but at 1920x1080 I've seen people unable to run max AA or anything. If I run at the native resolution (1366x768), I can turn everything to max quality / max AA / max AF and still run games on Ultra at a solid 60 FPS; that's not underpowering the card, in my opinion. Since the native resolution is small, are you saying I should go with DVI converted to VGA (graphics card > DVI-to-VGA adapter > VGA input on TV) to get 1600x1200, or run HDMI at 1366x768? I'm so confused; what's the difference? Why can I run 1600x1200 over DVI/VGA but not over HDMI if they carry the same video signal? And why does my TV accept 1600x1200 if it's only 720p? So many questions..
January 27, 2011 7:12:14 PM

At higher resolutions you don't need max AA and whatnot; the "jaggies" you see without AA are much more noticeable at low resolutions. What I'm trying to say is that a game will look much better at 1920x1080 with 2xAA than at 1366x768 with 8xAA. Regardless, with a higher resolution you gain much better image quality and, in some cases, a larger FOV. Just do a Google image search for 720p vs 1080p and you'll find plenty of comparisons. The difference in image quality isn't something you'll really understand until you see it first hand. If you can, go to a buddy's house who has a 1080p set, play Xbox, and switch the system setting from 720p to 1080p; you'll see the difference. Trust me, I was in the same boat: I was using 1280x1024 up until a few years ago because I didn't realize how much difference there could be.

It should also be noted that 720p on a 20-inch display will look better than 720p on a 32-inch display. Think about it for a second: you have the same number of pixels spread across a larger screen, so each pixel is physically larger, and thus the image is less detailed and a little more blurred.
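To put rough numbers on that pixel-size point, here's a minimal Python sketch computing pixel density (pixels per inch) from a resolution and a diagonal size. The 20" and 32" figures are just illustrative diagonals, not exact specs for any particular set:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The same 720p pixel grid on a 20" screen vs a 32" screen:
print(round(ppi(1280, 720, 20), 1))  # ~73.4 PPI
print(round(ppi(1280, 720, 32), 1))  # ~45.9 PPI
```

The bigger screen has roughly 60% larger pixels for the same signal, which is exactly the extra blur described above.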

And what I mean by underpowering your graphics card is that a GTX 460 768MB would give you the same quality your 5870 does, simply because the 5870 isn't being used to its full potential. Sure, you can max out everything in every game (CPU bottlenecks permitting, because at lower resolutions the CPU has to work relatively harder), but what's the point when a card half the price can max out everything on every game as well?

I also have a gaming room, and I have my computer and consoles hooked up to different monitors/TVs. Personally, my computer needs to be on a monitor at 1920x1200, my consoles are on a TV at 1920x1080, and my old consoles (SNES/NES) are on an old 480i CRT TV.

Anyway, none of this really solves your problem whatsoever; it's more of an explanation. I can't be very helpful when it comes to forcing a computer to run on a TV. Sometimes they work without a hitch; other times a TV and a computer just don't play well together when it comes to setting up resolutions. Honestly, there's a reason monitors and TVs are sold as different things: they aren't the same.

Use 1600x1200 if the picture isn't too distorted; that's what I would do. Again, I can't see the distortion from here, so I don't know how bad it looks.

EDIT: I'm sorry if I sounded snarky; I'm really not trying to be. It's just a very hard thing to explain, and it's easier to show.
January 27, 2011 8:02:31 PM

I don't consider your answer unhelpful at all; you actually clarified some of what I was thinking. On New Year's I was still running a Pentium 4 with an 8800GT, which tells you how often I'm able to upgrade because of money problems. I got this particular graphics card so the computer could last me 5+ years. Most people say it won't, but when games start using multicore technology my Phenom II X6 will be fine, just like my P4 was fine until Starcraft 2 / Cataclysm came out.

I guess I'll just have to try each and take a decent picture of both setups to see the difference. When I was running 1600x1200 I noticed a tiny distortion, and it felt like the colors weren't as vibrant. At first I thought I understood televisions and monitors, but now I'm confused. I understand resolution and definition, but then aspect ratio comes into play and confuses me. If I ran 1600x1200 it would be a 4:3 aspect ratio stretched to fit my screen, but if I ran the native 1366x768 it would be 16:9 and wouldn't need any stretching/zooming/distortion. I'm tempted to take the loss of resolution just to get a sharper image, and I agree with you: I had an old monitor (one of the fat ones) that ran 1000x800 (or something, I really don't remember), bought a flat-screen LCD that ran 1440x900 (which I don't have anymore), and saw a humongous difference.

I don't have the money for a TV for my consoles plus a separate monitor for my computer at the moment, but if I did go with monitors I'd definitely go Eyefinity. So, all in all, I'll take a picture of each resolution with my 1080p camera, on the desktop and then in a game, post them, and see if there's any difference. Maybe I'll look into a 27-inch monitor that supports DVI/HDMI; no smaller than that, though, or my desk (which I had specially made for my big TV) would look tacky.

Thanks for your help. I'm still looking for someone to explain how aspect ratio ties into distortion, resolution, definition, and screen size.
January 27, 2011 8:09:29 PM

Well, aspect ratio is really easy to explain: most games are made with a certain aspect ratio already in mind, which these days is 16:9. Using any other aspect ratio is going to distort the image into something it wasn't designed for.

http://en.wikipedia.org/wiki/Aspect_ratio_(image)

Look at the image on the right hand side of the screen.

If a game is meant to be played at a 4:3 ratio and you play it at 16:9, you're going to have to stretch the image's width greatly to make it fit. Conversely, if you take a game meant for 16:9 and squeeze it into 4:3, you end up with a squished image.
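The stretch factor can be made concrete with a few lines of Python; this is just a sketch of the ratio arithmetic, using an exact 16:9 mode (1280x720) for the screen since the 1366x768 panel is only approximately 16:9:

```python
from fractions import Fraction

def aspect(w: int, h: int) -> Fraction:
    """Reduce a pixel resolution to its aspect ratio as an exact fraction."""
    return Fraction(w, h)

source = aspect(1600, 1200)  # Fraction(4, 3), i.e. 4:3
screen = aspect(1280, 720)   # Fraction(16, 9), i.e. 16:9

# Horizontal stretch needed for a 4:3 image to fill a 16:9 screen:
stretch = screen / source
print(stretch)  # 4/3 -> everything ends up about 33% wider than intended
```

That 4/3 factor is why circles turn into ovals when a 4:3 desktop is stretched across a widescreen panel.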
January 27, 2011 8:33:26 PM

I understand what aspect ratio is on its own, but where does it fit in when you combine it with resolution, screen size, connectivity (VGA/DVI/HDMI), definition, and refresh rate?

I understand this:

Resolution is how many pixels are laid out on your TV/monitor. Since HDTVs have a set resolution, the bigger the TV you get, the more spread out (and blurrier) the same image becomes.

Screen size is the physical size of the screen; the image shown on it is built from however many pixels the screen has, i.e. its resolution.

Connectivity is what I'm confused about. I have a 32" 720p TV, but when I connect it with VGA/DVI I can select a 1600x1200 maximum resolution, while with HDMI I get 1366x768 (720p) as the maximum. If I can get 1600x1200 over VGA/DVI, could I set it as a custom resolution over HDMI and have it work? This is probably what confuses me most: how can my TV display 1600x1200 if it only has 1366x768 pixels? Is it because DVI supports higher resolutions than HDMI, or because HDMI is tied to the standard definitions (720p/720i, 1080p/1080i)?

Definition is the sharpness of the image: the higher the definition, the sharper the image.

Aspect ratio is the shape the content is recorded in and how it's played back on your TV to fill the screen without black bars; it causes most distortion problems.

Refresh rate is how many times per second your TV redraws the screen. Not much to say here, except that a higher rate makes motion blend together more smoothly, giving a more distinct picture.


Now, when you combine them all, it depends on what your TV supports. My TV being 720p and supporting HDMI means that over HDMI I get the definition's screen size, which is 1366x768. However, if I connect it with DVI/VGA (which would be the same video signal as HDMI; HDMI just adds digital audio), I get a higher resolution. This is what confuses me: how does connecting over DVI/VGA change what resolutions the screen accepts? My theory is that it takes the 4:3 1600x1200 image and expands it out to a 16:9 aspect ratio, pulling the image quality with it and causing distortion. If I'm right, why doesn't HDMI support 1600x1200?

I guess my main question is: how do DVI/VGA support 1600x1200 while my HDMI and my TV only support 720p, which has a native resolution of 1366x768? How does a 1600x1200 image fit on a screen that only has 1366x768 pixels?

Also, thanks for all of your help so far.

Edit: I won't be able to buy a monitor or TV anytime soon. When I do get one, it'll be a TV, not a monitor, because I can't afford a TV for everything plus a monitor just for my computer; I can only afford to upgrade the one display I use for everything. That said, my next upgrade would definitely be the best I can get. I didn't buy this TV myself, it was a present, and since this one screen has to do everything, I'd get a 1080p 120Hz TV. A side question: does refresh rate impact graphics performance? If I have 120Hz instead of 60Hz, wouldn't my graphics card have to work harder, since the screen refreshes 120 times a second instead of 60? Would my card be able to run WoW / Starcraft 2 on all Ultra at 1080p and 120Hz?

Best solution

January 27, 2011 8:40:19 PM

PandaBearz said:


Connectivity is what I'm confused about. I have a 32" 720p TV, but when I connect it with VGA/DVI I can select a 1600x1200 maximum resolution, while with HDMI I get 1366x768 (720p) as the maximum. If I can get 1600x1200 over VGA/DVI, could I set it as a custom resolution over HDMI and have it work? This is probably what confuses me most: how can my TV display 1600x1200 if it only has 1366x768 pixels? Is it because DVI supports higher resolutions than HDMI, or because HDMI is tied to the standard definitions (720p/720i, 1080p/1080i)?

HDMI is not "set" to run only at those definitions; look below for details. For reference, both dual-link DVI and HDMI are capable of much higher resolutions than anything we're talking about in this conversation.
PandaBearz said:

Seeing that my TV is 720p and supports HDMI means that over HDMI I get the definition's screen size, which is 1366x768. However, if I connect it with DVI/VGA (which would be the same video signal as HDMI; HDMI just adds digital audio), I get a higher resolution. This is what confuses me: how does connecting over DVI/VGA change what resolutions the screen accepts? My theory is that it takes the 4:3 1600x1200 image and expands it out to a 16:9 aspect ratio, pulling the image quality with it and causing distortion. If I'm right, why doesn't HDMI support 1600x1200?

It does, or at least I think it does. I think the problem here is what I mentioned about computers not playing well with TVs. For reference, different HDMI versions have different maximum resolutions: 1.0-1.2a tops out at 1920x1200, while 1.4 can do 4096x2160. Since even the lowest maximum is 1920x1200, I see no problem here. Have you tried forcing a custom resolution on the computer?
PandaBearz said:

I guess my main question is: how do DVI/VGA support 1600x1200 while my HDMI and my TV only support 720p, which has a native resolution of 1366x768? How does a 1600x1200 image fit on a screen that only has 1366x768 pixels?

Well, your display is evidently capable of accepting 1600x1200; its internal scaler resamples whatever it receives down to the panel's pixels. It is meant to be run at 1280x720 because it's a TV, made for TV content in 720p. Recommended resolution =/= native resolution =/= maximum resolution. Think of it like a car: most cars can go well over the speed limit, but they're meant to be driven at it.
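A rough sketch of the arithmetic that scaler is doing, assuming the 1366x768 panel from the manual (the exact resampling filter the TV uses is unknown; only the scale factors matter here):

```python
# The TV's scaler maps every input frame onto its fixed 1366x768 panel grid.
# A 1600x1200 input therefore gets shrunk on both axes, and since the two
# factors differ, it is also distorted if stretched to fill the screen.
panel_w, panel_h = 1366, 768
in_w, in_h = 1600, 1200

scale_x = panel_w / in_w  # ~0.85: each input column gets < 1 panel pixel
scale_y = panel_h / in_h  # 0.64: vertical detail is discarded even faster
print(round(scale_x, 2), round(scale_y, 2))
```

So "accepting" 1600x1200 doesn't add any detail; it just means the TV will resample that signal down to the same 1366x768 pixels it always had.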
PandaBearz said:

Edit: I won't be able to buy a monitor or TV anytime soon. When I do get one, it'll be a TV, not a monitor, because I can't afford a TV for everything plus a monitor just for my computer; I can only afford to upgrade the one display I use for everything. That said, my next upgrade would definitely be the best I can get. I didn't buy this TV myself, it was a present, and since this one screen has to do everything, I'd get a 1080p 120Hz TV. A side question: does refresh rate impact graphics performance? If I have 120Hz instead of 60Hz, wouldn't my graphics card have to work harder, since the screen refreshes 120 times a second instead of 60? Would my card be able to run WoW / Starcraft 2 on all Ultra at 1080p and 120Hz?

Completely understandable. And no, the refresh rate is just how fast the monitor redraws. At a 60Hz refresh rate you can only see up to 60 FPS on screen, and at 120Hz up to 120 FPS; note that you won't be locked to that FPS unless you turn on V-Sync in the game settings. Your graphics card works just as hard with either monitor. For example, if a benchmark says you'll get 40 FPS in Crysis at 1920x1080 with a 5850 (hypothetically), then it doesn't matter whether you use a 120Hz or a 60Hz monitor: you'll get 40 FPS either way.
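The refresh-rate point can be sketched in a few lines of Python; this is a simplified model (it ignores tearing and half-rate V-Sync behavior), but it captures why the panel's Hz doesn't change GPU load:

```python
def displayed_fps(gpu_fps: float, refresh_hz: float, vsync: bool) -> float:
    """With V-Sync on, the display caps visible frames at the refresh rate;
    with it off, the GPU simply renders as fast as it can."""
    return min(gpu_fps, refresh_hz) if vsync else gpu_fps

# A card rendering 40 FPS shows 40 FPS on both a 60Hz and a 120Hz panel:
print(displayed_fps(40, 60, vsync=True))   # 40
print(displayed_fps(40, 120, vsync=True))  # 40
# Only when the GPU outruns the refresh rate does V-Sync start capping:
print(displayed_fps(200, 60, vsync=True))  # 60
```

In other words, the 120Hz panel only matters once the card is already producing more than 60 frames per second.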
January 27, 2011 9:13:41 PM

With my old computer monitor, if I upped 60Hz to 75Hz the screen would go black, but with this TV 75Hz works; does that mean it's compatible? The TV's maximum resolution in the manual is 1366x768, which is what confuses me. So you're saying that since DVI/VGA offers a maximum of 1600x1200 (I didn't force it; it was there as a preset option), my TV is compatible with it? If so, does that mean I can force 1600x1200 in Catalyst Control Center and it'd be better than 1366x768? I'm just confused about why DVI would have the option in its presets and HDMI wouldn't.
January 27, 2011 9:30:03 PM

I don't know why; honestly, I stopped researching such low resolutions long ago. But theoretically, yes: if you can get it over DVI, you can get it over HDMI. Just try forcing it.

And as for what that TV is MADE for: the marketed resolution is 720p because most consumers will use a TV as a TV, not as a computer monitor, and for TV content it does 720p, not 1080p. I think the "maximum resolution" line in the manual was just bad wording on their part.
January 27, 2011 9:37:23 PM

Thanks again for all of your help. I understand now that TVs aren't monitors: TVs are categorized by definition, which implies a set resolution, while monitors are categorized by resolution directly. I'll attempt to force my TV to 1600x1200 (is that a big resolution, by the way? could I go higher?). I'll then take a picture of both resolutions with a full 1080p camera and compare them to see the difference; hopefully then I'll understand it. I'd imagine 1600x1200 would be stretched out, but maybe it's the native resolution and they were just advertising the TV by definition. By the way, is 1600x1200's aspect ratio 16:9 or 4:3? I think it's 4:3, because 16:12 divided through by 4 reduces to 4:3. In that case, wouldn't you rather have a lower resolution with the correct aspect ratio than a higher resolution with the wrong one? Wouldn't spreading a 4:3 image across a 16:9 screen distort everything? Thanks again.
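The reduce-by-hand check above can be done mechanically with a GCD; a minimal sketch (note the 1366x768 panel doesn't reduce to exactly 16:9, only very close to it):

```python
from math import gcd

def ratio(w: int, h: int) -> str:
    """Reduce a resolution to its simplest aspect ratio."""
    g = gcd(w, h)
    return f"{w // g}:{h // g}"

print(ratio(1600, 1200))  # 4:3
print(ratio(1920, 1080))  # 16:9
print(ratio(1366, 768))   # 683:384, i.e. almost exactly 16:9
```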
January 27, 2011 9:40:01 PM

Your TV, I believe, has a maximum accepted resolution of 1600x1200, so that should be the highest it can go. And yes, 1600x1200 is a 4:3 ratio.

Again, your last question can only be answered by looking at it yourself and deciding which you think looks best.
January 27, 2011 10:06:11 PM

This is where I get confused. I went into my dad's room and hooked my computer up to his 19-inch computer monitor; its native resolution is 1440x900 and the color looks spectacular. Same with the text: everything is very, very clear. Then I hooked it back up to my television and everything seems so... I don't even know how to explain it: the colors look faded and the text has noticeable black fringes around it; nothing looks smooth. At 1366x768 it looks a lot better, but everything is bigger, and I hate that. I got used to the living room's 27-inch iMac, where an open Safari window only took up half the screen; I loved everything being zoomed out. This is so confusing; thanks for your help, though. I guess I'll just do what I said: take pictures of each setup on a few screens and compare them. I'll probably end up going with 1600x1200 just because it'll look better in games, but as soon as I can I'll probably sell this television and get a 1080p set for $50 more; that should've been done in the first place.
January 27, 2011 10:06:29 PM

Best answer selected by PandaBearz.
January 27, 2011 10:09:15 PM

The reason it looks better is because it's made to be used with a computer, and you were running it at its native resolution, which happens to be higher.
January 31, 2011 7:10:55 AM

I'm still confused about how my TV can run at 1600x1200 when it's only 720p. Thanks for all of your help, though.
January 31, 2011 7:55:44 AM

Because it isn't "only" 720p; that's just the native resolution of the panel.

As I said before, native resolution =/= maximum resolution.
March 31, 2011 11:59:19 PM

A comment and a question. First, my comment: I run a 5870 at 720p on a 50" plasma, and once you start tweaking settings it still taxes the GPU to the max in Crysis 1. Some games can always use more horsepower. On a television, I think 720p is still very good; no, I don't have a lot of time using higher resolutions, but 720p is sharp "enough" in my opinion, and I use my computer for all sorts of things on the TV (although on the desktop I run it at 1080 and let the TV downscale, which still looks fine since I set my Windows fonts to a larger size).

Now my question: in the latest CCC, I just cannot find any way to center the image. It's close, but the "show desktop" button at the bottom-right edge of my Win7 taskbar, for example, usually gets cut off completely. The image seems to shift left randomly when restarting the computer and TV.

I have searched for so long and found no answer. I think PowerStrip might be a fix, but I don't want to pay for it. Does anyone know why the centering keeps changing? Is it more a function of the TV interpreting and centering the signal, rather than the video card's output?