dtq :
I would welcome the opportunity to see first-hand the ATI video difference, as every movie I watch is on an 8-foot-wide screen powered by Nvidia. I have no complaints over image quality (it's enough to drop most people's jaws when they see something like Star Wars on it), but I would like to see what the difference is for myself. I suspect that a new projector would do more for me than a new graphics card, though. I quite fancy going Blu-ray and native 1080p on the projector, but I spend more time gaming than watching movies, so new graphics cards come first.
Well, PureVideo's improved a bit since Tom's or Anandtech did reviews in 2005 through 2007. Right now, Nvidia handles picture-in-picture HD content better. That would work for those of us who watch on our monitors.
Still, ATI has better image quality:
Media Enthusiast Observations:
If you expected both images to be similar then this is where the surprises start. There are clearly major differences in image quality during this scene. In the ATI image we can see far more detail in Eva Green's hair, whereas in the Nvidia image it is too dark to make out much of the texture. The same can be said of Daniel Craig's jacket; on ATI the collar is visible, whereas the Nvidia image shows no definition between jacket sections. In terms of skin tone, the Nvidia image is clearly darker with more reddish tones, whereas the ATI is lighter with more of a grey/green effect...
...Overall we can gather the following from the above information. While Nvidia's image looks more saturated, in fact it is not. ATI are using higher levels of cyan content to create the illusion of a more natural image and, to be fair, it works very well. Nvidia are using a redder hue in the image; however, rather cleverly, it is not affecting black or neutral tones, and this is clearly a decision on their part to present a softer, warmer image.
Personally I prefer ATI's image, especially for detail in the mid-tone to black range; however, I have a feeling that Nvidia's image would look more vibrant and full of life on a poorer screen.
http://www.driverheaven.net/reviews.php?reviewid=552&pageid=6
A few pages earlier, in regards to CPU usage and picture in picture:
Not much has changed with video acceleration performance since we last compared ATI to Nvidia. The 3870 still has the edge in CPU usage, especially in VC-1 encoded titles, but Nvidia still have the feature advantage with the ability to play back high-definition content with Aero enabled...
...Nvidia also have a great feature called Dual Stream Decoding. Using Resident Evil: Extinction (AVC codec) as an example, the 9600 will accelerate the movie and ensure that Aero is still being used. The cards will also accelerate a secondary stream. A good example is the picture-in-picture special feature which can be enabled when watching the movie. On ATI hardware, when using picture-in-picture, the average CPU use takes a huge jump from around 3% to 14%, whereas Nvidia hardware stays at less than 8%. These figures really do make us question ATI's apparent dominance in video playback.
http://www.driverheaven.net/reviews.php?reviewid=552&pageid=3
More on image quality:
Media Enthusiast Observations:
With this outdoor scene we see that there is a significant difference in overall brightness between the two images, and the result is that once again Nvidia lose much of the finer detail in areas such as dark hair. There is less of an impact on the rest of the image, and Eva Green's skin and dress appear to have a much warmer tone on Nvidia, although it is hard to say which version of her skin is most true to life...
...There is no doubt in my mind that ATI are displaying a technically better image in this scene; however, on the screens I tested, both look rather good.
http://www.driverheaven.net/reviews.php?reviewid=552&pageid=7
More on image quality:
Media Enthusiast Observations:
In our final Casino Royale comparison we see that once again there is much more detail in the ATI image, with the right side of Daniel Craig's face being significantly more visible. On the Nvidia side the overall image is more alive, and the skin tone gives the impression that the actor has really been through an action-packed adventure in a hot and humid country. It is just more real.
So which is better? Well, again, first looks can be deceiving. The ATI frame immediately looks more neutral, but again this is due to the fact that there is a good 10% higher cast of cyan throughout, which, while it will work well in some instances, does not when there is a predominant face in the scene. A natural skin colour has almost no blue content, especially in a warm jungle setting, and this would tie in with Stuart's comments above that the overall Nvidia image is "more alive". That said, after analysing Daniel's face, we found that the ATI solution had only a 2-3% higher content of cyan tones, so clearly there is some clever coding behind the scenes, analysing the skin tone and reducing the cyan saturation. Still, overall, the technical results are in favour of Nvidia.
http://www.driverheaven.net/reviews.php?reviewid=552&pageid=8
Background environment image quality:
Media Enthusiast Observations:
When we move away from Casino Royale's more human-based content and change to the more scenic Planet Earth BBC documentary, we see that the differences in image quality occur regardless of the content used. In this particular image there are three main areas we highlighted as important. Firstly, the sky: on ATI hardware it is quite washed out, whereas on Nvidia it seems more real, a better shade of blue. Next, when we look at the level of detail in the darker area at the base of the waterfall, there is clearly more detail in the ATI image. Finally, the foliage throughout the scene is more lush and green on Nvidia than it is on ATI...
...Overall, both images are excellent; however, from a technical standpoint I would give Nvidia another point for this test as their image is purer, although the sky tones are slightly flat.
http://www.driverheaven.net/reviews.php?reviewid=552&pageid=9
More background image comparisons:
Media Enthusiast Observations:
When watching the Grand Canyon section we see that the cliff face on the left of the screen is more impressive on ATI; in particular, the small cave area has much more detail visible. We also feel that the colour of the rock on the left is slightly more realistic on ATI. On the Nvidia side we feel that the lower sections of the valley have far better definition to them, especially in the distance. It is also noticeable that the sky is different in the two images; if we had to choose a preference it would probably be ATI's version, as the Nvidia one seems slightly too blue.
Expert Observations:
Taking a section of the sky and analysing the colour components shows again that ATI are using more magenta to give the impression of richer colours (Cyan 27%, Magenta 14%), while Nvidia have 27% Cyan and only 1-2% Magenta. This would tie in with Stuart's comments regarding the Nvidia sky being 'slightly too blue', as without the magenta content the sky would appear lighter and much colder. Glancing over the scene, the Nvidia image is slightly more 'blasted out' (a term signifying less detail in the highlight areas), as you can see if you look at the stream meandering down the centre of the image.
Section 2, a shadow area on the rock face, shows a failing of Nvidia's rendering with a much more saturated image: in direct comparison to the ATI image there is 15% higher black content and moderately higher yellow content. Adjusting the black curve in Photoshop allows the detail to return slightly; however, ATI win this test on almost every level.
http://www.driverheaven.net/reviews.php?reviewid=552&pageid=10
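For anyone curious where figures like "Cyan 27%, Magenta 14%" come from, here's a rough sketch of the naive RGB-to-CMYK conversion behind that kind of analysis. Photoshop uses ICC colour profiles, so its numbers will differ, and the sample RGB value below is just a made-up sky blue, not a pixel from the review:

```python
# Sketch of the naive (non-ICC) RGB -> CMYK conversion used to
# estimate per-channel percentages like the reviewer quotes.

def rgb_to_cmyk(r, g, b):
    """Convert 8-bit RGB to CMYK fractions (0.0-1.0), naive formula."""
    if (r, g, b) == (0, 0, 0):
        return 0.0, 0.0, 0.0, 1.0
    c = 1 - r / 255
    m = 1 - g / 255
    y = 1 - b / 255
    k = min(c, m, y)            # black = what all three channels share
    # Remove the black component from each colour channel
    c = (c - k) / (1 - k)
    m = (m - k) / (1 - k)
    y = (y - k) / (1 - k)
    return c, m, y, k

# Hypothetical light sky blue as a stand-in for a sampled sky area
c, m, y, k = rgb_to_cmyk(186, 219, 255)
print(f"Cyan {c:.0%}, Magenta {m:.0%}, Yellow {y:.0%}, Black {k:.0%}")
# -> Cyan 27%, Magenta 14%, Yellow 0%, Black 0%
```

The point is just that "more magenta in the sky" translates directly into the green channel being pulled down relative to red and blue.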
Underwater scenes and fish:
Media Enthusiast Observations:
Our final image looks at an underwater scene, and there are two areas we feel are worthy of singling out. The first is the darker fish, towards the front and left of the image. On ATI its scales/markings are much clearer, whereas on Nvidia we again see detail suffer. Looking at the bright yellow fish near the centre of the image, we find that the Nvidia image actually shows it to be richer, with the stripes better defined. The ATI image of the fish is a little too washed out for our liking.
Looking across the rest of the image we can see that the overall cast for the Nvidia solution is Magenta biased apart from the blue ocean background. Both manufacturers are displaying the image slightly differently and in the end it will be up to the viewer to ascertain which is the 'better' image. Technically they are both very good, just slightly different.
http://www.driverheaven.net/reviews.php?reviewid=552&pageid=11
Nvidia's PureVideo adds new features to match ATI, but they aren't enabled by default:
With the 9600 and 9800 series of graphics cards, Nvidia have added two new video features to PureVideo HD. Dynamic Contrast Enhancement takes each frame and calculates enhancement values for that frame, which is a change from older methods of contrast enhancement that apply the same changes to every scene, regardless of whether enhancement is required. Colour Enhancement analyses the tone of an image and dynamically adjusts it for each frame. The example Nvidia use to describe this is "skin tones changing from flat to vibrant".
We decided to test these options and the results follow; it should be noted that we also enabled the noise reduction feature in the Nvidia drivers and set it to a level similar to ATI's (ATI enable noise reduction by default). In theory, this should represent the best image quality Nvidia has to offer, without changing edge enhancement values. When we asked ATI what settings we should use for testing, 'default' was the response, so we can assume that the screenshots produced so far represent their best image quality...
...Unfortunately, with this enhanced algorithm there are also some rendering issues, as you can see in the zoomed analysis above. We are seeing stepped bands of pixel discoloration within certain boundaries; this was noticed by Stuart above, and I feel it is to do with the changing of the black content in the attempt to brighten up the overall impression of the image.
http://www.driverheaven.net/reviews.php?reviewid=552&pageid=12
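To make the "per-frame vs fixed curve" distinction concrete, here's a minimal sketch of one generic way to do dynamic contrast: stretch each frame's own tonal range instead of applying the same curve to every frame. This is an illustration only, not Nvidia's actual algorithm, and the sample frame values are invented:

```python
# Generic per-frame ("dynamic") contrast stretch: each frame gets its
# own enhancement values, unlike a fixed curve applied to all frames.

def stretch_frame(frame, low_pct=0.02, high_pct=0.98):
    """Stretch an 8-bit greyscale frame so its own percentile range
    maps to 0-255, clamping anything outside that range."""
    pixels = sorted(frame)
    lo = pixels[int(low_pct * (len(pixels) - 1))]
    hi = pixels[int(high_pct * (len(pixels) - 1))]
    if hi == lo:                 # flat frame: nothing to stretch
        return list(frame)
    return [min(255, max(0, round(255 * (p - lo) / (hi - lo))))
            for p in frame]

# A dark, low-contrast frame: values bunched between 40 and 90
dark_frame = [40, 55, 60, 70, 75, 80, 85, 90]
print(stretch_frame(dark_frame))   # spreads the values across 0-255
```

It also hints at why the reviewers saw banding: aggressively remapping a narrow range of input values onto 0-255 leaves visible gaps between adjacent output levels.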
ATI still renders most images slightly better than Nvidia:
Media Enthusiast Observations:
In this scene the changes to the Nvidia image are beneficial in terms of brightness, but ATI still wins on hair detail. We were also slightly bothered by background colours again in the Nvidia frame. The brightness has a real impact, and in specific sections (for example the blue/green hills across the water on the left) they end up very unnaturally coloured in Nvidia's enhanced image...
...Pictured above is the selected area from the red-circled analysis in the prior image (bottom left of screen). This time, however, the image is converted to CMYK and the yellow channel is isolated from the other colours. The noise within this yellow channel is quite clear to see, and it's a clear indication that the Nvidia algorithm is somewhat struggling to deal with this image.
http://www.driverheaven.net/reviews.php?reviewid=552&pageid=13
They conclude:
We were also quite surprised with the results of Nvidia's colour/tone enhancements; these features have the ability to greatly increase the detail levels, but most of the time this comes at a cost, with other areas of the screen becoming overpowering and unnatural. We firmly believe that features like these can only be considered a success if they can be left enabled at all times; after all, the last thing you want to think about when sitting down to watch a movie is "Do I want to chance the image quality being impaired?". Therefore we cannot recommend any end user enable these options until they have evolved a little more.
So this leaves us with Nvidia default quality against ATI default quality, and the performance or features that both manufacturers offer. In terms of performance and features we have to say that Nvidia is the clear winner, as they currently allow playback with Aero enabled and good CPU usage statistics regardless of whether the image includes one stream or two.
IMHO, ATI wins their test, even though they prefer Nvidia. CPU usage isn't all that great with ATI when Aero is enabled (who watches picture-in-picture anyway?), and they are comparing the latest Nvidia card with a prior-generation 3870. We will see if this holds up when the 4850 and 4870 arrive.
ATI's image quality, except in one face scene in their review, is better than Nvidia's at default, and Nvidia's fix for image quality actually causes more problems than it solves. Both cards' settings probably need tweaking, but I've used PureVideo on a prior-generation 7600GS, and I've used AVIVO from the old All-in-Wonder cards through the 3870 generation and found it much better.
Plus, last time I checked, Nvidia's PureVideo decoder costs an additional $50 or so as a direct download, whereas AVIVO is free and included in Catalyst releases. ATI began as the second-best gamer GPU company but was always oriented towards video, with gaming quality not far behind.
Considering the "tweaks" Nvidia's made at times to their drivers just prior to a game's release (the Crysis demo's water fudging), I'm not sure that Nvidia wins hands down nowadays as the best gamer card. They are ahead in some titles but behind in others, ahead with some cards but behind with others. Overall, it's neck and neck, and I don't think Nvidia deserves their rep, as they've played fast and loose more often than ATI (e.g. their TWIMTBP program, plus behind-the-scenes "tweaks" like boosting the PCIe x16 bus on certain Nvidia boards for 9600 benchies, which actually led to instability overall).
Buy the GTX 280 and enjoy; it won't be a bad card, but ATI's drivers get things balanced once games are released, and though PureVideo's improved over the past few years, IMHO, AVIVO still has it beat hands down where image quality is concerned.
Anyway, I hope this information helps you compare AVIVO and PureVideo. Hopefully, more sites like Tom's and Anandtech will update their comparisons once the new cards arrive.