What kind of video card would be good, not just good but very good without being excessive, for PC gaming on a Samsung 46" LCD TV? Yeah, the TV is my monitor. I currently use an EVGA NVIDIA 8800GT, and it has only recently started to choke on the newest games at high settings.
My real question here is: does the size of the monitor affect video card performance?
I do love the 8800GT for its performance per watt. Compared to today's offerings it's a miser, but I'm having to reduce my resolution now with newer games. Do I really need to go dual-card or SLI to get the best speed with this monitor?
The rest of the system is aging but still very capable: an Intel 8500 dual core, 8GB of Mushkin Redline DDR2, a mainstream but very fast Western Digital ~700GB hard drive, an Auzentech Prelude (man, they have improved driver support like night and day), and an Abit IP35 Pro.
They really make these parts to last for years and years now... too bad many of us need to upgrade regularly to keep the speed coming. I really want the most performance from a new video card, but one with the lowest power draw (idle draw most importantly).
Go with an ATI 4850 X2 or...?
Haven't said hi in a few years. Thanks for the many years of informative networking Tom's!
A quote from member Serwan: "and btw, doesnt matter if ure running a 20 inch or a 100 inch, what matters is the resolution size... how many pixels on the screen... that will determine the Frame Rates you will get..."
...and another member said the same thing, so that answers that question. My 8800GT was never hindered by my TV size.
So if this is correct, and I think it is, then all that is left is: should I upgrade, and to what?
I had just written a reply, and in dealing with all the damn Intellitext ads I lost IE to a lock-up...
I'd like to run all games at the native 1080P resolution. Maximum.
What should I test the card's FPS with? Just the free version of FRAPS?
An article at MaximumPC on the Radeon 4850 X2 recommends more memory bandwidth than GDDR3 provides for monitors above 30". Should I only get a card with GDDR5, then? It was unclear to me whether the author meant GDDR3 was the only thing limiting memory bandwidth on larger monitors. I don't know much about how these components work anymore, let alone bandwidth usage on a GPU. Can anyone comment on this? I'll post there also.
Power usage is hugely important, as I said, but I need the card to perform well across the board and, not least, last me a good while. In Maine we still have some of the highest electricity bills in the U.S.A., and we probably always will.
I won't consider an ATI just yet because of the idle power usage. It's about 25 watts more on average than NVIDIA solutions, and that gets expensive in Maine; it all counts. The dual-GPU (X2) cards are probably a no-go for me since some games still get pissy with that setup, and I like to play some old games sometimes. It's not that I prefer NVIDIA, but there is no other choice.
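To put a rough number on that idle-power gap, here's a quick back-of-envelope calculation. The 25 W delta comes from the post above; the 12 hours a day of uptime and the ~$0.16/kWh rate are just assumptions for illustration, not Maine's actual figure:

```python
# Rough yearly cost of an extra 25 W of idle draw.
# Assumed numbers: 12 h/day powered on, $0.16 per kWh (illustrative rate).
extra_watts = 25
hours_per_day = 12
rate_per_kwh = 0.16

kwh_per_year = extra_watts / 1000 * hours_per_day * 365  # watts -> kWh over a year
cost_per_year = kwh_per_year * rate_per_kwh
print(f"{kwh_per_year:.1f} kWh/yr -> ${cost_per_year:.2f}/yr")  # 109.5 kWh/yr -> $17.52/yr
```

Not huge per year, but it does add up over the life of a card that's left idling most of the day.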
Usually I do not buy the "best of the best" but the near-underdog, as long as the compromise is not too much. A GeForce GTX 260 should be considerably faster than the GeForce 8800GT, but in the many benchmarks I've seen the difference is very minimal... maybe I was looking at the non-GTX 260. I've narrowed it to a GTX 280 (some variety) or a GTX 285 (some variety), and although I haven't forgotten the 275s, I haven't looked at many charts with them tested either.
The article's reference to 30" monitors (as opposed to 30" TVs) is actually a reference to the common resolution of quality 30" monitors: 2560x1600, which is roughly 2 times the pixel count of 1920x1080 HD. There will be no video card performance difference between your TV and my monitor, a 22" BenQ that also runs at 1920x1080. The display device itself is not a variable that affects performance; what matters is the resolution you choose to send to the display.
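The pixel-count math behind that point is easy to check; the GPU's workload scales with resolution, not with the physical size of the panel:

```python
# Pixel counts: the GPU draws pixels, not inches.
pixels_1080p = 1920 * 1080   # a 46" TV and a 22" monitor alike
pixels_30in = 2560 * 1600    # typical quality 30" desktop monitor

print(pixels_1080p)                # 2073600
print(pixels_30in)                 # 4096000
print(pixels_30in / pixels_1080p)  # ~1.98x the work per frame
```

So a 2560x1600 monitor asks the card for nearly double the work per frame, while a 46" 1080p TV asks for exactly the same work as a 22" 1080p monitor.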
Then at 1920x1080 I have more options than I was beginning to believe. That being my max resolution helps me narrow things down some. I hope some of the higher-end cards are not much, if any, faster than one another at a given resolution. 1080p is up there, though.
Check this out though.
While investigating what would limit GDDR3 vs. GDDR5 in driving a 46" display, prompted by the MaximumPC article (the 4850 X2 review) I mentioned above, I noticed NVIDIA only uses GDDR3 yet meets or exceeds AMD/ATI in memory bandwidth performance... I may be getting beyond my tech knowledge there. It would appear to me that some of the higher-end ATI cards are as fast as their NVIDIA counterparts but can choke at the highest resolutions despite the speed, like you were saying?
A few months ago IzzyCraft said this:
"Bigger monitor = more pixels, meaning a cleaner image, so you get a better picture with it, but there are more pixels so more things to draw, which is why more memory is used (1 gig, 2 gig cards). Now, ATI uses GDDR5 because they probably found it cheaper to use faster memory to get the throughput in memory power, the ability to take x amount of data, process it, and put it back out there. NVIDIA, taking the more classic route, increased the memory interface width.
Basically the top performing cards have about the same power or somewhere near.
ATI is like a high pressure hose it has a small nozzle but uses more force to push more water out faster.
Nvidia is like a hose with a much bigger nozzle so water sloshes out in greater quantity.
ATI gets water out by using more force to get water out faster
Nvidia gets water out by widening the hole so more water can get out at once.
But overall when the day is done they both fill a bucket at the same time or just about...
Hope that analogy worked out for you; the only real test of performance is to look up benchmarks."
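The hose analogy maps directly onto the bandwidth formula: bandwidth = effective transfer rate x bus width / 8. Plugging in approximate reference specs from that generation (numbers assumed from memory, so double-check against a spec sheet): a Radeon HD 4870 with GDDR5 at 3600 MT/s effective on a 256-bit bus, versus a GeForce GTX 260 with GDDR3 at 1998 MT/s effective on a 448-bit bus:

```python
# Memory bandwidth = effective rate (MT/s) x bus width (bits) / 8, in GB/s.
def bandwidth_gb_s(effective_mt_s, bus_bits):
    return effective_mt_s * 1e6 * bus_bits / 8 / 1e9

hd4870 = bandwidth_gb_s(3600, 256)  # fast GDDR5, narrow bus ("high pressure")
gtx260 = bandwidth_gb_s(1998, 448)  # slower GDDR3, wide bus ("big nozzle")
print(round(hd4870, 1))  # 115.2 GB/s
print(round(gtx260, 1))  # 111.9 GB/s
```

Both buckets fill at about the same rate, just as the analogy says: one card gets there with faster memory, the other with a wider interface.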
So the MaximumPC article meant that the ATI 4850 X2's method of streaming the information through was quick, but could choke at the very highest resolutions? lol
I may be analyzing it all too much.
Now I just need to find some more reliable comparison reviews. There are some really nice deals on GTX 260s, but I'm not sure how long one would last me, even at $144.00.
Would I get my money's worth going from the GeForce 8800GT to the GTX 260, or should I invest a bit further into the NVIDIA product line and spend about another 50 bucks?
My specs are above in the first post, and I'm wondering where my system will bottleneck; I mean, which card would cause it.
It sucks being FORCED to be ultra-green in my state. If power requirements were no big deal here, or just cost a little more on the energy bill, I'd have a quad core. I hope the upcoming Intel i7s include a quad core that is energy efficient too. I admit I don't keep up with the tech news enough anymore until I need something... anyway, I am drifting off topic too much, like my air conditioner going on and off every 5 minutes over here. haha
I looked around forever for a video card to put in my new build, which I'll be using with my Samsung 50" HDTV, and I pretty much never found anything supporting the idea that one HD card would be better than another as long as it supported 1080 resolution. I went with a Sapphire Toxic 4870. My biggest concern is the reliability of the card being hooked up to a 50" display all the time, and how long it will last.