Image quality of the 4800 series

jackieboy

Distinguished
Jan 8, 2006
Hey guys, I just want to know what you think of the image quality of the new cards. Yes, we've seen benchmarks and such, but I'd like to hear what you think of the image quality they put out. I have an 8800 GT and am thinking about getting a new 4870, so I'd like to hear from the many who have switched. I'm uncertain whether it's worth spending another 300 bucks when my GT performs great and has good image quality. That said, I did like the image quality of my older X1900 XT better than my 8800 GT, even though it was slower. Thanks for any input, guys.
 

dragunover

Distinguished
Jun 30, 2008
Yes, it will definitely be better, but whether you want better performance and image quality is up to you. If I had some extra cash on hand and a weak CPU, PSU, monitor, etc., I might want to upgrade those first.
 
Guest
But he doesn't... For you, Jackie, I'd wait and see if the prices drop a little more. The GT200 refresh (rumored to be the GTX 350, http://www.tomshardware.com/forum/252962-33-nvidia.... not probable, but whatever) should lower prices, and prices should come down anyway as the launch date gets farther away. So I'd wait a little bit.
 

jackieboy

Distinguished
Jan 8, 2006
Glad you brought that to my attention. I don't have a lot of time to keep up with what's going on all the time, so waiting is what I'm going to do.
 
Yeah, unless there's something really pushing you to upgrade, let the prices settle a bit and let a few of the unknowns (drivers, PhysX, etc.) shake out.

The image quality differences should be next to nil. What you likely experienced going from the X1900 to the GF8800 is the difference in defaults, where the ATi default settings are generally preferred by most who run both. Spend a little time to tweak and calibrate, however, and they look equally good (if anything, the GF8800 should've shown better AF quality).

There are a few differences in how they handle AF and AA quality, but they essentially amount to a tiny advantage for nV in AF and a tiny advantage for ATi in AA. For most people the difference will only be visible in screen captures, not in game (this is nothing like the old partial-precision or shimmering issues). You'll of course see anomalies here and there, but overall the quality should be pretty much the same.
 

bifford

Distinguished
Jul 17, 2008
I will just say this: the default settings on the ATI card made everything, desktop and 3D, look very washed out on my computer. The included software doesn't make it particularly easy to adjust the settings to produce rich, vibrant color. I am not saying it can't be done, but it just isn't easy.

The NVidia software has a calibration wizard that makes it so easy to adjust color that even I can handle it with no problem whatsoever. I am sure both cards are capable of producing the same image, as I have seen screenshots to prove it. ATI may just be more for the tweaker crowd, but I found I really missed that wizard.
 


That's pretty much the opposite of the average experience, and I wouldn't say the people who notice the difference are the tweakers; they use that difference to swear that ATi has better IQ than nV, when if they did tweak they would know it's basically the same.
 

sdf

Distinguished
Aug 5, 2007
Ape, or anyone else: what has to be tweaked on the nV card, and how difficult would you say it is for a novice?
 
Depends on the card and monitor.

The main thing is to find a starting point; there are many calibration test images out there. Then tweak until correct.

The usual differences people mention are brightness and gamma (as in the ATi looks brighter or has better gamma in games), and some mention the unspecific 'more vibrant colour'. But if you want things to be overly vibrant, as in beyond oversaturated, that's what the digital vibrance toggle is for on nV cards, and some people really do prefer that even though it's way beyond where calibration would want you to peg things.
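
To put the brightness/gamma/vibrance point in concrete numbers, here's a rough sketch of what those kinds of sliders do to pixel values. It's just illustrative math on a NumPy array scaled to 0-1, not what either driver actually implements:

```python
import numpy as np

def apply_gamma(img, gamma=2.2):
    # out = in^(1/gamma): raising the gamma value brightens the midtones,
    # lowering it darkens them; pure black and pure white stay put.
    return np.clip(img, 0.0, 1.0) ** (1.0 / gamma)

def boost_saturation(img, amount=0.3):
    # Push each channel away from the pixel's grey average, which is roughly
    # the idea behind a "vibrance"-style slider: colours get punchier, greys stay grey.
    grey = img.mean(axis=-1, keepdims=True)
    return np.clip(grey + (img - grey) * (1.0 + amount), 0.0, 1.0)

# e.g. on a random stand-in frame scaled to 0-1:
frame = np.random.rand(720, 1280, 3)
tweaked = boost_saturation(apply_gamma(frame, gamma=2.4), amount=0.3)
```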

I tend to find it depends a lot on what kind of monitor you have. At work I have a very well calibrated but quite cold CRT with settings in the 9000K range, so that's going to be very different from an LCD, which in general can only dream of that option.

Anyhow, probably the most detailed and 'scientific' comparison I've seen is Maximum PC's, which made a valiant effort to compare the two in blind testing. It covers last generation's contenders, but the IQ differences should remain the same even though the performance situation has changed (so the conclusions may not be as valuable as the observations):

http://www.maximumpc.com/article/videocard_image_quality_shootout?page=0,1
 

sdf

Distinguished
Aug 5, 2007
Thanks for the link ape, interesting. It got me thinking though, same as your comments.

If they took both of those rigs, identical except for the GPU, made sure all the settings between both cards were identical as well (contrast, red, blue, green, gamma, brightness, and so on; you probably get the idea), and then compared them, would there be any difference? I figure it would only be a fair comparison when all things are equal, not at default settings. This comparison also got me thinking about that:

http://www.driverheaven.net/reviews.php?reviewid=552&pageid=1

I don't know, but it looks like the nVidia card might have been using a vivid setting while the ATI card looks like it's using the original setting. That said, IMHO the nVidia video looks better, especially with the enhancement, but again I wonder what the video settings were.

Your thoughts?
 
We have discussed this kind of issue many times before, and TGGA is quite correct in what he is saying about IQ levels and the need to set things up properly. There are so many things that affect the finished image that screenshots are really pointless, but people still post them. Generally, as he said, people tend to prefer the default ATI picture, as most find the Nvidia one washed out. The general suspicion is that those who are so anti-Nvidia are remembering the days when Nvidia quality really did suck compared to ATI. These days, though, it's very close.
Your last statement about covers it: "but again wonder what the video settings were."
That's what it comes down to at the end of the day: settings and, most importantly, preference.
In game you couldn't beat the image quality available on an X1950 card, and in my opinion you still can't; it's just that the card isn't up to the job anymore.
ATT worked wonders with those cards, and you could get much more detail and increased frames if you knew what you were doing. Sadly, they seem to be lagging in supporting both the 3-series and the 4-series.
Mactronix
 


Yes, there would be minor differences in the way the cards render. Setting them up with similar defaults doesn't make everything exactly the same, since they use different methods and algorithms to try to achieve the same result. When looking at images closely you will see minor differences in how they output the final render, with one showing more or less of one factor or another. The only way to know which is 'correct' would be if you had a reference raster output to compare to.
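
And if you did have such a reference output (the DX reference rasterizer, for example), the comparison itself is trivial to automate rather than argue over by eye. A small sketch, assuming NumPy/Pillow and made-up filenames:

```python
import numpy as np
from PIL import Image

def rmse_vs_reference(capture_path, reference_path):
    # Load both images as float RGB arrays and compute the root-mean-square error
    # per channel; lower means closer to the reference rasteriser's output.
    cap = np.asarray(Image.open(capture_path).convert("RGB"), dtype=np.float64)
    ref = np.asarray(Image.open(reference_path).convert("RGB"), dtype=np.float64)
    if cap.shape != ref.shape:
        raise ValueError("captures must match the reference resolution")
    return np.sqrt(((cap - ref) ** 2).mean(axis=(0, 1)))

# Hypothetical filenames, just for illustration:
# print(rmse_vs_reference("ati_capture.png", "refrast_output.png"))
# print(rmse_vs_reference("nv_capture.png", "refrast_output.png"))
```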

I figure it would only be a fair comparison when all things are equal, not at default settings. This comparison also got me thinking about that.

My thoughts on that are that it's nothing new; the two companies go back and forth releasing updates to their AVIVO/PureVideo software all the time, trading positions. That review happened to come before the Catalyst 8.5 update in May, which added new features to ATi's toolbox, and the HD4K has additional features beyond those found in the HD3K, so there have been two updates since that test. It's a game of cat and mouse where they keep switching roles. And if you think you know which of the two started it, you'd be wrong, because of course it was Matrox who pushed them both to improve their 2D/video IQ.

It's interesting that the reviewer doesn't agree with you so much on image quality as on colour (which is the typical preference most people have for over-saturated colours). If you look at a comparison that includes more than just the two, and more importantly an image compared to a hardware player, you'll see that the accuracy favours AVIVO, and like I mentioned before, some people prefer settings that may be far from accurate.
It's like when David Chang was talking to Charlie Rose this evening: just because something is 'more authentic' doesn't mean it tastes better. Similarly, just because something is displaying accurately doesn't mean people are always going to prefer the accurate image.
 

bifford

Distinguished
Jul 17, 2008


For NVidia, it is almost a no-brainer. You run a wizard which walks you through calibration. Some of the options will not matter for digital monitors. My favorite part is the color calibration. You have a red, green, and blue color bar. Each bar is divided into 3 sections. To calibrate, you simply move each bar until the color tone of each of the 3 sections matches. It takes less than 30 seconds to achieve excellent results.
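
(For anyone curious, the usual principle behind those match-the-sections patterns is to put a dithered patch of alternating black and full-intensity lines, which averages to half brightness, next to the solid grey level that should produce the same brightness at the target gamma; when the two halves blend together, gamma is set right. Here's a rough sketch of generating such a patch with Pillow; this is my guess at the general idea, not NVIDIA's actual wizard code.)

```python
from PIL import Image

def gamma_patch(width=256, height=128, target_gamma=2.2):
    # Left half: alternating black/white scanlines (averages to 50% luminance when
    # viewed from a distance). Right half: the solid grey that should match that
    # luminance if the display really has target_gamma. If the halves blend,
    # gamma is correct; if not, adjust until they do.
    solid = round(255 * (0.5 ** (1.0 / target_gamma)))
    img = Image.new("L", (width, height))
    px = img.load()
    for y in range(height):
        for x in range(width):
            if x < width // 2:
                px[x, y] = 255 if y % 2 == 0 else 0
            else:
                px[x, y] = solid
    return img

# gamma_patch().save("gamma_check.png")  # view at 100% zoom, with no scaling
```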
 


If that's what you think proper calibration is, then it's no wonder you have problems with the additional options in ATi's control panel.

The calibration we're talking about here requires the proper use of more than just wizards.
 

sdf

Distinguished
Aug 5, 2007
I hope this doesn't get redundant, Ape, but if I decide to go back to nVidia when I upgrade this fall, what do I need to do for proper calibration? Tips or links would be very helpful, either for myself or for others who could use them. Thanks for everything so far.
 

bifford

Distinguished
Jul 17, 2008
Not saying that it is a standardized calibration. You wouldn't use it for professional video or photo calibration, for sure. It is just a simple tool for novice users and/or gamers to achieve a nice look.

Obviously, for professional calibration there is 3rd-party software and hardware available. Prices range all over the place.

Just out of curiosity Ape, could you actually share your tips for a novice to calibrate their screen?
 
The best way to do it is to find sample images like I mentioned. Use those to tweak until the image is correct. This will set your monitor and card for 2D images and normal desktop.

For gaming, if you're pickier about your images, use Fraps screen captures to make sure what the card renders matches what you see. A capture should look similar to other people's captures of the same scene, and it should look the same viewed on the desktop as it did while rendering in-game; if it doesn't, something in the driver profile or the game is tweaking it, and you'd then create a profile for that game.
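
If you want to be a bit more objective about the 'similar to others' part, you can compare the average channel levels of your capture against someone else's capture of the same benchmark frame; a consistent offset points to a gamma or colour tweak somewhere in the chain. A quick sketch, assuming Pillow/NumPy and hypothetical filenames:

```python
import numpy as np
from PIL import Image

def channel_means(path):
    # Average R, G, B level (0-255) of a screenshot.
    return np.asarray(Image.open(path).convert("RGB"), dtype=np.float64).mean(axis=(0, 1))

# Hypothetical captures of the same benchmark frame on two setups:
# mine = channel_means("my_capture.png")
# theirs = channel_means("reference_capture.png")
# print(mine - theirs)  # a large per-channel offset suggests a gamma/colour tweak somewhere
```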

Personally I use calibration hardware and commercial software, but here is a very good collection of reference images for manual tweaking:

http://www.lagom.nl/lcd-test/

It has easy-to-follow instructions too, which is pretty helpful for most people.

For most novices, the free/trial versions of DisplayMate should get you close for little effort, and then you can manually tweak from there. To me it's worth the time; for many it's not. But anyone asking about the IQ differences between the two major IHVs should spend the 30-60 minutes required to make sure their card and monitor are correctly adjusted, especially since a poorly adjusted monitor will mess up AA above 8X and muddy AF a bit; both rely on proper colour, gamma and alpha differences between adjoining pixels to achieve their effects.
 

sdf

Distinguished
Aug 5, 2007
Getting back to the original topic: how much of an improvement is there, if any, in the 4800 series over the previous 3800 series in graphics and DVD playback?
 
No improvement in graphics other than performance, and very little difference in DVD. It's only with Blu-ray and the like that you'd see a performance difference, not a major quality difference.

For games it's still DX10.1, though of course higher settings may become more playable; they won't look better, unlike the move from GF7 to GF8 and, to a lesser extent, X1K to HD2K, and even less so to the HD3K/4K. HD content is the same story: similar features, just faster and more effective in the update. An HD3K can still decode dual streams, for example, it just needs a lot more help from the CPU than the HD4K does. That's not a quality difference, just a relative performance difference, and on a very fast processor it might be unnoticeable.

The biggest differences are between architectures, and now even those differences are very small and really most noticeable in screen captures, usually not in use. Some differences are a little more effective in motion than in a screen capture, like temporal AA, etc., but even that is a small difference. And like most things, some people are more sensitive than others to those differences and would swear they could never be without X or suffer with Y.