Best card for 1920x1080, 1080p setup???

March 7, 2007 8:44:53 PM

I was curious which card (the cheapest one available) would run this resolution on a plasma TV without a single hitch?

Thanks.
March 7, 2007 9:16:18 PM

Do you mean play games at this resolution without a hitch, or just output to a monitor with this resolution?
March 8, 2007 12:37:43 PM

Games. I'd like to buy a plasma in the coming months, and have started looking. I want to connect the TV to my PC and output games onto the big screen (as well as movies).

Hardware is changing so fast that I don't want to get screwed like the guys who bought in 2006 (DRM). I also have to make sure my PC can actually handle the games. I don't want to buy a big screen only to find out games lag, hang, or jump on it (hence the video card question).

I play all types of games (RTS, FPS, simulations, WoW, FEAR, Supreme Commander, etc.). I watch a lot of movies too.

Any thoughts anyone?

http://video.google.com/videoplay?docid=-43286002333697...
March 8, 2007 1:01:53 PM

Oh yes, I read that 1080p is essential if you're sitting a few feet away (as you normally would with games). Most 1080p TVs come with a 1920 x 1080 resolution, hence the need to output properly to a TV like this.

Here is the TV:
http://www.sharpusa.com/products/ModelLanding/0,1058,17...
March 8, 2007 1:14:30 PM

I just read that "gaming monitor dilemma" post. I see the Dell 24" runs at 1920 x 1200. This 52" TV would only run at 1920 x 1080. Would the 8800GTS work well on this TV?

I guess an important question is whether the resolution will be OK. It's double the size of the Dell, with the same resolution. Will it suck in picture quality?
March 8, 2007 1:20:57 PM

Correct me if I'm wrong, but this ain't a plasma TV. Looks like an LCD flat panel to me (not that there's anything wrong with LCDs).

To have a good gaming experience with modern games at this resolution... you pretty much need high-end cards. I would go with the 8800 GTX or GTS (the 640 MB version only).

Of course, high-end ATI cards (X1950 XTX) will probably be OK, but for now Nvidia has the performance crown.

And I would suggest staying away from SLI or CrossFire.

So just go with an ultra high-end card, like the 8800GTX. You could also wait for the R600, but that card probably won't be on the market until summer (if it's not delayed again...).
March 8, 2007 1:24:11 PM

Ok, thanks. What about the resolution question? How will a Dell 24" compare to a 52" plasma/LCD with the same res? The TV is twice the size.

Will it look crappy?
March 8, 2007 1:25:35 PM

The 8800GTS will work well. You will be sitting further away from the TV than you would from the Dell, so you should get a similar level of quality.
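(A quick back-of-the-envelope check of that claim, as a Python sketch. The viewing distances are my assumptions: roughly 2 ft at a desk for the Dell, and the 7-8 ft mentioned later in this thread for the 52".)

```python
import math

def pixels_per_degree(diag_in, aspect_w, aspect_h, h_pixels, view_dist_in):
    # Screen width from the diagonal and aspect ratio.
    width = diag_in * aspect_w / math.hypot(aspect_w, aspect_h)
    pitch = width / h_pixels  # physical size of one pixel, in inches
    # Angle one pixel subtends at the eye; invert for pixels per degree.
    return 1 / math.degrees(math.atan(pitch / view_dist_in))

# Dell 24" (1920x1200, 16:10) at ~2 ft vs. 52" 1080p TV (16:9) at ~8 ft.
print(pixels_per_degree(24, 16, 10, 1920, 24))  # ~40 px/degree
print(pixels_per_degree(52, 16,  9, 1920, 96))  # ~71 px/degree
```

By this crude measure, the TV at couch distance actually packs more pixels into each degree of your view than the Dell does at desk distance, which backs up the "similar quality" point.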
March 8, 2007 1:34:25 PM

If I look at this chart at 1600 x 1200, it says Oblivion will only get 21 fps. That stinks:
http://www23.tomshardware.com/graphics.html?modelx=33&m...

1920 x 1080 = 2,073,600 pixels (TV)
1600 x 1200 = 1,920,000 pixels (THG chart; the Dell is 1920 x 1200)

That means I should get even lower fps on the TV, right? That's not gonna work. Anything below 30 fps is unplayable (and that's the average). You should get around 60 fps average to not drop below 30 fps.
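(Here's that reasoning as a quick Python sketch. It assumes the game is purely fill-rate limited, which is a rough approximation; CPU limits and drivers matter too, as the posts below show.)

```python
# Rough estimate: for a fill-rate-limited game, fps scales roughly
# inversely with pixel count. This ignores CPU limits, memory bandwidth,
# and driver differences, so treat it as a ballpark only.
chart_pixels = 1600 * 1200   # 1,920,000 (THG chart resolution)
tv_pixels    = 1920 * 1080   # 2,073,600 (the 52" TV)

chart_fps = 21               # Oblivion figure from the THG chart above
tv_fps_estimate = chart_fps * chart_pixels / tv_pixels

print(f"TV pushes {tv_pixels / chart_pixels:.0%} of the chart's pixels")
print(f"Naive estimate at 1920x1080: {tv_fps_estimate:.1f} fps")
```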
March 8, 2007 1:42:22 PM

Not sure what your budget is, but a 1080p plasma is quite pricey (last I looked, about $6-8k). All of the "reasonably priced" plasmas are 720p (1366x768).

1920x1200 would look great on a 52". IMO that resolution is unnecessary for a 24" screen; I would rather increase the AA.

$0.02 :) 
March 8, 2007 1:46:05 PM

Whereas if you look here:

http://enthusiast.hardocp.com/article.html?art=MTI5Myw3...

You will see that it should be averaging 47 fps at 1920x1200.

That's with all settings on max, anyway. If you find you can't play at that rate, you can always turn things down a little.

If you want to play with everything at max in 1080p (including all the next-gen games) you will be looking at dual 8800GTXs. How much are you prepared to spend?
March 8, 2007 2:04:17 PM

I've been saving for a while for a nice big-screen TV, but can't afford to spend on dual GTXs. That's not gonna happen. Maybe I can talk the GF into needing a new PC, so I can hand my X1900 XT down to hers and get a new GTX, but that's about as far as my budget goes.

That Sharp 52" is $2500 including shipping and taxes. It's damn expensive, but I'm saving. It's also not $6k. Big-screen prices are crashing.

So in your opinion, guys, will the GTS work with the 52" at that res? Why do the THG charts say differently?
March 8, 2007 2:10:03 PM

BTW - that |H| article was really helpful. The GTS should be fine. When you're sitting a few feet away, I don't think lowering the AA will be that visible, nor will some of the other settings.

Thanks again guys.
March 8, 2007 2:11:53 PM

Why is it different? I dunno - what processor are they using in the THG charts?
March 8, 2007 2:19:11 PM

Excellent point. The |H| article uses a Core 2 X6800 (my E6600 is OC'ed to 3.33 GHz, so it should be comparable).

Here are the THG specs:

Last Update: November 10, 2006
Hardware
Processor: AMD Athlon 64 FX-60 dual-core at 2.61 GHz
Motherboard: NVIDIA nForce4 - Asus A8N32-SLI Deluxe
Memory: Mushkin 2x1024 MB HP 3200 2-3-2
Graphics Card: PCI Express Card
Hard Disk: 2 x Hitachi 120 GB SATA, 8 MB cache
Network: Marvell
Audio: Realtek AC97
Software
OS: Windows XP, Build 2600 SP2
DirectX: Version: 9.0c (4.09.0000.0904)
Chipset Driver: Nvidia nForce4 6.86
Graphics Driver: ATI Catalyst 6.6 / Nvidia Forceware 91.31
Network Driver: Windows default
Audio Driver:
Other Driver: none
March 8, 2007 2:22:03 PM

Last noob question: if the TV's native res is 1920 x 1080, and the GTS only pumps out 1920 x 1200, does this make a difference in quality if the image needs to be stretched?

TY!
March 8, 2007 2:27:52 PM

You should set the GTS to do 1920x1080 - if there's nothing in the drivers to do it, PowerStrip should allow you to.

Failing that, the TV should interpolate. You shouldn't get any noticeable loss in quality, but you might.


The THG system isn't really comparable to the [H] system, is it? :)  I think that explains the 25fps difference! :) 
March 8, 2007 2:33:24 PM

I wonder if my X1900 XT could buy me some time if I get that TV (waiting for better cards). I know it won't be the best setup, but I wonder if it will run OK.

Know of any articles similar to the |H| one, for the X1900 XT at 1920 x 1080?
March 8, 2007 2:49:46 PM

An X1900 XT is still a powerful card. I would not pay $400 to upgrade to an 8800 GTS, considering what you already have. I don't think it will be that much of an improvement.

Honestly, I would wait. You will still be able to play games quite well at that resolution with your current card. Just lower some settings and don't crank the AA.

Buy your new TV/monitor and wait. Upgrade later to an R600 or 8900GTX. They aren't that far off; maybe 2-3 months. An X1900 XT will be good enough for those 2-3 months.
March 8, 2007 3:43:04 PM

Hey, thanks.

Another thing I just researched is the 1080p minefield. OMG, what a joke. There is so much you need to know about this.

I'm not too worried about HDCP yet (won't buy an HD-DVD player for a while). I'd like to play games, though, at a full 1920 x 1080, at 1080p.

Will this MSI X1900 XT card be able to output that?
March 8, 2007 3:45:48 PM

Just a thought: how do you intend to hook your TV up to your PC? Component/DVI/HDMI?

I ask because, according to page 15 of your TV's manual (in the PDF, that is), you need to hook up using HDMI if you want to enjoy 1080p resolutions. So apart from selected ATI models (X1650 Pro, I think) that come with HDMI ports, you're toasted on that one.

Next, if you choose to hook up using the VGA plug, you will be forced to use 1366x768, since the TV doesn't support a higher resolution when connected to a PC (see the bottom of page 15 again for the table showing this), so no go on that one either.

You're left with component, and possibly a DVI->HDMI adapter cable. I didn't see anywhere what resolutions the component inputs support (generally you can still get 1080p over those), but then the thing to make sure of is whether you can output correctly using a component adapter that goes from your GFX card to your TV. From experience, many TVs don't support HDMI connections from a PC's DVI port. So I would look into that one also.

I just wanted to let you know this so you're aware of these constraints, because I was personally foiled in December when I bought a Samsung 32" LCD HDTV to use primarily with a Windows MCE HTPC, and got the awful surprise that it wasn't as compatible as I thought it would be.
Just go through all the small print in there, and look around for people who did exactly what you want to do, to make sure you can achieve it. Also, I personally game on my TV @ 1366x768 using a crappy old ATI 9600 SE 256MB hooked up over VGA and still get nice frame rates in NFS: Carbon and other recent titles. I always disable AA and AF etc., because @ 32" I can't personally see the difference from 5-6' away from the TV... so your 52-incher @ 7-8 feet away will look even better.
My 2c.
March 8, 2007 3:51:34 PM

Quote:

I'm not too worried about HDCP yet (won't buy an HD-DVD player for a while). I'd like to play games, though, at a full 1920 x 1080, at 1080p.

Will this MSI X1900 XT card be able to output that?


Well, if you read the article on Tom's about AnyDVD's recent software upgrade, you would know that you can now enjoy HD-DVDs and BDs on non-HDCP-ready hardware with their new player. :wink:

And about the 1080p minefield: I got quite a few headaches trying to find the perfect TV/monitor, only to still be disappointed in the end.

Quote:

I'd like to play games, though, at a full 1920 x 1080, at 1080p.


The easiest way to play @ 1920x1080 is to find a TV that accepts 1920x1080 as a native DVI resolution (like the 37" model from Westinghouse). Other than that, it's a hit-or-miss situation unless you have access to a plethora of HDTVs that you can try out with different setups, which would be a highly uncommon thing for the average consumer.

And BTW, I have a friend who owns a 37" Westinghouse, and using a 7600GT 256MB GFX card he can play most of today's games at decent frame rates (not talking about Oblivion, obviously) @ 1920x1080 over the DVI connector. Also note that when talking about PC resolutions, as long as the panel's correct resolution is available (be it 1920x1080 or 1360x720), forcing a "1080p" or "720p" mode instead of the plain PC resolution won't show any noticeable difference.
March 8, 2007 3:59:12 PM

I'm sorry for your headaches. What a bummer. That's why I'm taking the time to sort this nonsense out. The companies are hurting themselves. We are techies and can't figure this B/S out. How the heck is your average consumer supposed to navigate this minefield?

I really appreciate your thoughts on this. Will dig deeper.

If a DVI to HDMI cable can carry 1080p, then I might be OK. I'll just need to check that the TV can actually accept that setup, and what res it will run at.

Maybe I should take my PC into Best Buy and hook it up first, to see if it works. Could one do that???
March 8, 2007 4:02:38 PM

I did see that AnyDVD article, but it doesn't really apply. The new TV will be HDCP-compliant, and so will the new HD-DVD player when I eventually get one (when prices come down A LOT). The card would be the problem in that case, but by the time I get the player, I'll probably have to get another card anyway. Interesting, though.

I've read that HDMI cables carry audio and video, so no need for extra cables. I assume a DVI to HDMI cable can't do this (video cards aren't equipped for it either, right?).

Thanks.
March 8, 2007 4:05:01 PM

If most TVs don't support DVI to HDMI, how the heck do people build HTPCs??? What's the solution, get a compliant TV?

There are no video cards with HDMI out, right?
March 8, 2007 4:13:57 PM

Quote:
Maybe I should take my PC into Best Buy and hook it up first, to see if it works. Could one do that???


LOL, I wouldn't be surprised if the clerks looked at you funny, but hey, what won't they do to make a sale. Even if it's not your PC, I'm pretty sure some geek over there could arrange some sort of testing using one of their own PCs, so you could get your answers. But again, that all depends.
March 8, 2007 4:25:04 PM

Quote:
If most TVs don't support DVI to HDMI, how the heck do people build HTPCs??? What's the solution, get a compliant TV?


As I pointed out earlier, by using a standard VGA input on the TV. Not all TVs have a VGA port, but most LCD panels do, since they're basically PC monitors with an NTSC/ATSC tuner built in. But I'm not well versed in the plasma domain, since those weren't the object of my desires at the time.

Quote:
There are no vcards with HDMI out right?


Yes, there are some models with HDMI outputs. Here's one as an example from Sapphire (ATI), but there are some from Nvidia also (I think). Normally, though, they're not of the high-end flavor.
March 8, 2007 4:28:01 PM

If you use the VGA port, though, you'll be limited to a something-x-768 resolution, right? You won't be able to get 1920 x 1080. That TV doesn't accept anything higher than 1366x768 over VGA, but neither do most, as far as I know.

You need HDMI/DVI to get 1080p, or 1920 x 1080.

WTF?
March 8, 2007 4:30:10 PM

scour the "google forums". I've been reading about this matter a little myself since I'm also looking to buy a 1080p tv sometime in August.

During my escapades through google I came across DVI to HDMI cables and some people mentioning that they have successfully used these cables to use 1080p tv's as monitors.

(I think one of them was review on pricegrabber for a 46" aquos).
March 8, 2007 4:38:59 PM

Quote:
scour the "google forums". I've been reading about this matter a little myself since I'm also looking to buy a 1080p tv sometime in August.

During my escapades through google I came across DVI to HDMI cables and some people mentioning that they have successfully used these cables to use 1080p tv's as monitors.

(I think one of them was review on pricegrabber for a 46" aquos).


If it's possible to do that with the specified model (the 52", I mean), then it would only be a matter of enabling 1080p resolution support inside the Catalyst control panel for that connection, and you would be good to go.
March 8, 2007 4:40:48 PM

Geeeez, read this:
http://www.edn.com/article/CA6413792.html?ref=nbop

Are we screwed? If I read this right, you can only view 1080p (1920 x 1080) if you have an HDCP-compliant video card, cable, TV, and player?????

If this is true, then no matter what setup I have, I won't be able to get that res in games using my 1900 XT?
March 8, 2007 4:49:16 PM

Quote:
Geeeez, read this:
http://www.edn.com/article/CA6413792.html?ref=nbop

Are we screwed? If I read this right, you can only view 1080p (1920 x 1080) if you have an HDCP-compliant video card, cable, TV, and player?????

If this is true, then no matter what setup I have, I won't be able to get that res in games using my 1900 XT?


Well, that would explain the f**ked-up images I get when I plug a DVI to HDMI cable from my GFX card into my TV's HDMI inputs. It looks like it takes the full 1080i resolution and squishes it into a 720p image size... looks like hell. That would make the statements in the article accurate for my situation.
Still, my TV only accepts 720p and 1080i through HDMI.
March 8, 2007 4:54:43 PM

This is grounds for a class-action lawsuit against the government for misleading the public. They implemented these standards but did not implement a compliance system to make sure the manufacturers label and inform the public of the capabilities of their products. Have you read about the cock-up with the "HD Ready" vs. "Full HD" argument? OMG. I can't believe these people. How could they mess this up so badly? It's a sad time for consumers.
March 8, 2007 4:55:19 PM

IYO, do you think that it would be safer to go with a 1080i TV?
March 8, 2007 4:56:32 PM

I agree with Lachdan. Keep your X1900XT for now and go for a faster R600 or 8900GTX when they're available. By that time you will probably have more options with HDCP compliant components and, who knows, maybe even an HDMI port on the graphics card itself.
March 8, 2007 4:57:10 PM

I'm running the Westinghouse 37" 1080p screen. Just got it yesterday and I LOVE it. You can do 1080p over DVI, VGA, and HDMI; I think component can do it as well, and I know it does 720p. I haven't really tried any computer games on it yet, but I have run my 360 and it works great. For normal computer stuff, my 7900GS handles it great so far. Maybe later I'll pop in Flight Sim X and watch my card cry.
March 8, 2007 4:59:40 PM

Is there a difference between 1080i and 1080p for GAMES? I know it makes a difference for movies.
March 8, 2007 5:00:40 PM

And another thing: you don't really need 60 fps for Oblivion to run smoothly. It's not a lightning-fast first-person shooter like Q4, FEAR, or CS: Source, for example.
In Oblivion, 35 fps should keep anyone happy.
March 8, 2007 5:00:49 PM

I don't know if they've added it yet, but the only problem you'll encounter with that TV is if you don't have a digital cable/satellite set-top box to use as a tuner, since the models I've seen don't come with any NTSC/ATSC tuner built in. And yes, it's quite a nice panel indeed.
March 8, 2007 5:05:24 PM

With all these DRM problems, I would not buy a 1080p TV.

I have a cheap 32" 720p/1080i Digimate TV, and everything works well. I have it connected to my computer (VGA port) and the image quality is good enough. My graphics card isn't HDCP-compliant, even though I bought it 6 months ago (7900GTX).

I have still managed to play Oblivion on it, and it was beautiful. Also: 720p TVs have a resolution of about 1366x768, which is really easy for a graphics card to output; it's no harder than 1280x1024. So you'll enjoy nearly max quality with current games without upgrading your graphics card.

And if you plan to eventually watch HD movies, just buy AnyDVD HD to disable HDCP.
March 8, 2007 5:06:03 PM

Quote:
IYO, do you think that it would be safer to go with a 1080i TV?


No, interlaced will look like shit.

Now, HDMI IS DVI, it just has some extra capability for carrying audio. The only reason an HDMI-to-DVI cable wouldn't work is if the TV manufacturer either a) fucked up the TV so it can't handle a video signal without audio (dumb) or b) specifically tried to prevent it.

The absolute best thing to do is buy an HDMI-to-DVI cable, borrow a laptop with DVI and go test it on some TVs in a store. I don't know if most stores will let you do that or not though.
March 8, 2007 5:06:49 PM

TV is a whole different story. I called Comcast, and they only broadcast in 1080i. There are no plans to broadcast in 1080p in the next few years, and frankly, from what I've read, the networks can't handle it. It's 1080i for the next 3-5 years at least. 1080p would only be worth it if you plan to get an HD-DVD player, which we all will in time. It's worth it to "future-proof" yourself.

The big question is the HTPC setup. I can't find a solution. Has anyone found a 52" TV (or one with a native 1920 x 1080 res) that can accept full 1080p through DVI??
March 8, 2007 5:11:18 PM

Two very important points for a 1080p TV:

1) Your big screen MUST be able to deinterlace 1080i to 1080p without trouble, otherwise watching TV will be a joke.
2) The TV must accept 1920 x 1080 (or 1080p) over HDMI.

Can anyone post a TV on here that does that????
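(For context on point 1: deinterlacing just means rebuilding full frames from the alternating half-height fields of a 1080i signal. Here's a toy Python sketch of the simplest "weave" method, assuming numpy; real TVs layer motion-adaptive logic on top of this, since weaving fields from a moving scene causes combing artifacts.)

```python
import numpy as np

# Two successive 540-line fields of a 1080i signal:
# the top field carries the odd-numbered scan lines, the bottom the even.
top_field = np.zeros((540, 1920, 3), dtype=np.uint8)     # lines 0, 2, 4, ...
bottom_field = np.zeros((540, 1920, 3), dtype=np.uint8)  # lines 1, 3, 5, ...

# "Weave" deinterlacing: interleave the two fields into one 1080p frame.
frame = np.empty((1080, 1920, 3), dtype=np.uint8)
frame[0::2] = top_field
frame[1::2] = bottom_field
```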
March 8, 2007 5:26:21 PM

http://72.14.209.104/search?q=cache:VF1sf-KIuNwJ:review...

"All in all, I think anyone would have a hard time distinguishing between 1080i and 1080p on today's 1080p HDTVs. But if you really want one that will support this future format, you should wait until the summer of 2006, when almost all 1080p resolution HDTVs will offer 1080p input support. "
March 8, 2007 5:41:35 PM

HDMI = DVI + audio + DRM.

In order for a computer to output an HDMI signal on an HDMI cable, it must include the audio with the video (plus the DRM bit, if applicable).


I use a 6800 to drive my Westinghouse 37". Wish it were faster :( 
March 8, 2007 6:11:36 PM

Quote:
Geeeez, read this:
http://www.edn.com/article/CA6413792.html?ref=nbop

Are we screwed? If I read this right, you can only view 1080p (1920 x 1080) if you have an HDCP-compliant video card, cable, TV, and player?????

If this is true, then no matter what setup I have, I won't be able to get that res in games using my 1900 XT?


I am not an expert on DRM or HDCP. From what I have read and understood, HDCP is required only if the media being played (i.e., HD-DVD/Blu-ray or other protected content) calls for it. The protection policy must be invoked by the content provider on the disc, and the hardware must then meet that requirement; otherwise, the media plays at a reduced resolution.

This raises the obvious question: are the games you will be playing "protected content"? If the game publisher did not invoke a content protection policy, then I suppose the games would play in whatever configuration you set your equipment up in.

Again, I am not an expert on the subject, but it is important to note how this DRM thing is triggered.

My 0.02
March 8, 2007 6:23:59 PM

Quote:
Geeeez, read this:
http://www.edn.com/article/CA6413792.html?ref=nbop

Are we screwed? If I read this right, you can only view 1080p (1920 x 1080) if you have an HDCP-compliant video card, cable, TV, and player?????

If this is true, then no matter what setup I have, I won't be able to get that res in games using my 1900 XT?


I am not an expert on DRM or HDCP. From what I have read and understood, HDCP is required only if the media being played (i.e., HD-DVD/Blu-ray or other protected content) calls for it. The protection policy must be invoked by the content provider on the disc, and the hardware must then meet that requirement; otherwise, the media plays at a reduced resolution.

This raises the obvious question: are the games you will be playing "protected content"? If the game publisher did not invoke a content protection policy, then I suppose the games would play in whatever configuration you set your equipment up in.

Again, I am not an expert on the subject, but it is important to note how this DRM thing is triggered.

My 0.02
The only thing that's not completely accurate about your statement is that both sides of the connection can call for HDCP authentication. So if either the TV or the PC fails the HDCP handshake, you won't get full access to HD resolutions. If you look at a few TVs' specs, you will find that not only are most HDMI inputs HDCP-compliant, but some DVI inputs also require HDCP compliance.

And this is in accordance with the linked article in the OP's earlier post.
March 8, 2007 7:30:44 PM

Quote:
The only thing that's not completely accurate about your statement is that both sides of the connection can call for HDCP authentication. So if either the TV or the PC fails the HDCP handshake, you won't get full access to HD resolutions. If you look at a few TVs' specs, you will find that not only are most HDMI inputs HDCP-compliant, but some DVI inputs also require HDCP compliance.

And this is in accordance with the linked article in the OP's earlier post.

I am not so sure that the TV can "call for the HDCP authentication." How does it do that if the content being played has no content protection? I believe the article in the OP's link is referring to hardware compatibility rather than DRM/HDCP compliance.

FWIW, this Vista blog explains how DRM/HDCP work. I am neither for nor against DRM, but it is there for our understanding.

http://windowsvistablog.com/blogs/windowsvista/archive/...

Cheers!
March 8, 2007 9:15:51 PM

Look at it this way: you're talking about DRM-protected content playing through a secure/non-secure platform (the PC).
What we are talking about is when you JUST WANT TO CONNECT YOUR PC TO AN HDTV SET THAT REQUIRES HDCP COMPLIANCE TO DO ANYTHING ABOVE NORMAL RESOLUTIONS!!!
What I mean is that companies have made recent hardware aware (through the HDCP protocol) of the possibility that you could run hi-def content on a non-secure platform by using, for example, a DVI to HDMI cable that circumvents the HDCP handshaking. But as I said earlier, as of now many DVI-equipped HDTVs have HDCP-enabled DVI inputs, and those are normally the only ones you can use if you intend to output images at 720p, 1080i, or 1080p.

BTW, HDCP is enabled by placing chips in the CE products used to play/view/edit hi-def content. These implement the protocols that take care of the handshaking between the connected products (checking that both sides are secure from any outside, third-party software or hardware that could permit duplication of the content). So in that sense, you need HDCP-enabled hardware on both sides of the connection to even get access to the hi-def resolutions the hardware offers.

Some TV sets can accept hi-def 1080i through component inputs (all recent sets can), but not all will. As I said before, my Samsung HDTV displays component-fed images at no higher than 720p, even though it's capable of 1080i from the HDMI inputs. And if I then plug a DVI to HDMI cable from my GFX card into the (1080i-enabled) HDMI input, I get a reduced-size image, since the TV can't verify the HDCP protocol with my GFX card, which is not HDCP-enabled... It's a mess. You should look around the web to understand the full implications the whole HDCP thing has for the hi-def market. Believe me, I've been following this since day 1, and I only get more disappointed the further I go.

P.S.: This does not affect the HDTVs and HD-DVD players from 6-8 months ago that were 1080i/p capable, since you could go and buy all that, plug everything in through component cables, and get the best picture quality. HDCP was only implemented on very few HDMI-equipped HDTV sets and players at that time (which in turn could be tampered with, since the connection between the two components wasn't secure, theoretically permitting the recording of hi-def content with any component-equipped recording hardware, because the DRM that encodes the films is decoded before the signal leaves the player).

Also, if I'm correct, only very few (if any) devices can even output 1080p through component cables anyway. So if you intend to play stuff @ 1080p, your choice is HDMI or DVI, but as said earlier in this post, many devices enforce HDCP on either of those connections.
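(To pull the last few posts together, here's a hypothetical Python sketch of the decision logic being described. It is an illustration only, not a real API: actual HDCP is a hardware key exchange, and as noted above, real devices vary in when they demand it.)

```python
# Hypothetical model of how an HDCP-gated link decides what to output.
# Illustrates the two triggers discussed above: the content can demand
# protection (HD-DVD/Blu-ray), and the TV itself can gate its HD inputs.

def negotiated_output(content_requires_hdcp: bool,
                      source_is_hdcp: bool,
                      sink_gates_input_on_hdcp: bool,
                      full_res: str = "1920x1080") -> str:
    # Case 1: protected media invokes the policy via the disc/player.
    if content_requires_hdcp and not source_is_hdcp:
        return "reduced resolution (content policy not satisfied)"
    # Case 2: some TVs refuse full-res input without a successful
    # handshake regardless of content, which is what bites PC gaming.
    if sink_gates_input_on_hdcp and not source_is_hdcp:
        return "input rejected or squished (the 1080i->720p case above)"
    return full_res

# A non-HDCP X1900 XT playing games (unprotected content) into a TV
# whose HDMI input demands HDCP:
print(negotiated_output(content_requires_hdcp=False,
                        source_is_hdcp=False,
                        sink_gates_input_on_hdcp=True))
```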
