- Image Quality: Examined
- Intel, AMD, And Nvidia: Decode And Encode Support
- Transcoding Quality Revisited: CUDA Problems?
- Test Setup
- Hardware Decoder Quality: Examined
- Software Decoding: All CPU, All the Time
- Full Blu-ray Transcoding Speed: APP Versus CUDA Versus Quick Sync
- Small Clip Transcoding Speed: APP Versus CUDA Versus Quick Sync
- Transcoding Quality: APP Versus CUDA Versus Quick Sync
- Transcoding Quality: Rated By Software Title
- Playing Devil's Advocate: "There is No Spoon"
- Inside The Black Box: GPGPU Encoding
- Final Words
Hardware Decoder Quality: Examined
Part of what makes transcoding a difficult subject to tackle is that we are dealing with different decoder hardware. Even though a program like Windows Media Player 12 or PowerDVD may use near-universal API calls for DXVA (used for hardware-accelerated decoding), the hardware actually doing the processing is designed differently. For that reason alone, we want to pay some attention to decoder output.
There are a couple of problems to overcome here. When you play back a video, you need to grab the exact same frame for comparison. Two sequential frames in the same clip can differ significantly, which is why you need to be sure you are comparing the exact same frame across multiple platforms. With VLC, this is easy: you can jump to a specific time and take a screenshot using the built-in capture function. However, this limits you to VLC's decoder, which only accelerates the decode portion of the pipeline. It doesn't utilize motion compensation or frequency transform acceleration, even when you enable hardware-based decoding. Moreover, it doesn't support Intel's implementation, leaving only an AMD versus Nvidia versus software decoder comparison.
So, we need to find a way to use something like WMP12 and still capture from the video renderer. There is actually an easy solution to the last part. WMP12, PowerDVD, and other recent video players all use a new MediaFoundation component, known as the Enhanced Video Renderer (EVR). Normally, when you take a screen capture of WMP12, you get nothing, since it comes in as an overlay. If you download the DirectX 11 SDK, there is a setting within the control panel to enable screen capture, which solves the problem of examining decoder hardware. Now, this does use the older DirectDraw code to dump a still frame, but that happens after it has been rendered by EVR. At this point, the video is paused and is no longer behaving like a streaming data set. DirectDraw is simply allowing us to dump that as part of our regular screen capture. Since we are dealing with a static frame, what we have captured is representative of what you see on the screen.
We can address the problem of capturing specific scenes by using an unprotected copy of a Blu-ray-based workingprint segment from Iron Man. Before a movie is completely finished, directors and video editors use this rough cut to add effects and animation, insert or remove footage, or redub audio. Workingprints contain time codes accurate to the centisecond, a property we can exploit to ensure that we consistently extract the same frame for analysis. This isn't without complications: pausing on exactly the right frame demands quick reflexes, and it took countless hours of caffeine-fueled mouse clicks to capture everything we needed.
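Because the workingprint's burned-in time codes are accurate to the centisecond, mapping a time code to a frame index is simple arithmetic. A hypothetical sketch (the frame rate is assumed to be Blu-ray's typical 23.976 fps for film content, and `timecode_to_frame` is our own helper, not a tool used in this article):

```python
FPS = 24000 / 1001  # 23.976 fps, the typical Blu-ray film rate (assumed)

def timecode_to_frame(hh, mm, ss, cs):
    """Map an hh:mm:ss.cs workingprint time code to a frame index."""
    seconds = hh * 3600 + mm * 60 + ss + cs / 100
    return round(seconds * FPS)

# A time code of 00:01:30.50 is 90.5 seconds into the clip:
print(timecode_to_frame(0, 1, 30, 50))  # -> 2170
```

This only tells you which frame you want; you still have to land the pause on it by hand, which is where the countless mouse clicks came in.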
The last issue we have to deal with is scaling. When you watch a Blu-ray movie on a screen larger or smaller than 1920x1080, the video data gets piped through a scaler to accommodate the different resolution. To minimize the effect of scaling, we play our video back on a native 1920x1080 screen. Smart decoders bypass the scaler at native resolution; some don't, and that is often one point where image quality could be improved. Furthermore, decoders inherently differ in the very way they take in data: some may prefer an 8x8 block, while others use a different block size for processing. We care about the final output, since that is what shows the real-world difference, if there is one.
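Once two captures of the same frame are in hand, their difference can be quantified rather than just eyeballed. A minimal sketch of the standard PSNR metric, operating on flat lists of 8-bit pixel values (this is an illustration of the general technique, not the methodology used in this article, and `psnr` is our own helper name):

```python
import math

def psnr(frame_a, frame_b, max_value=255):
    """Peak signal-to-noise ratio between two same-sized frames,
    given as flat lists of 8-bit channel values (higher = closer)."""
    if len(frame_a) != len(frame_b):
        raise ValueError("frames must have identical dimensions")
    mse = sum((a - b) ** 2 for a, b in zip(frame_a, frame_b)) / len(frame_a)
    if mse == 0:
        return float("inf")  # bit-identical captures
    return 10 * math.log10((max_value ** 2) / mse)

# Two hypothetical 2x2 grayscale captures of the same frame,
# differing by one brightness level per pixel (MSE = 1):
ref = [100, 100, 100, 100]
test = [101, 101, 101, 101]
print(round(psnr(ref, test), 2))  # -> 48.13
```

A single number like this can flag that two decoder outputs differ, but it can't tell you whether the difference is aliasing, grain, or a brightness shift, which is why the side-by-side inspections below still matter.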
Hardware Decoder Quality: Are They the Same?
As far as color space manipulation is concerned, AMD actively changes the color profile of video during playback. Transcoded video is unaffected because a transcode never plays the video back, so the data stream is never sent to the renderer. Nvidia and Intel default to the color profile of the application. I intentionally leave AMD's manipulation enabled so you can see the difference.
Even setting aside the saturated image, UVD 3 still looks worse than Intel's Clear Video HD and the fourth generation of Nvidia's PureVideo (VP4). You can see this specifically around the blades of the helicopter to the far right of the image. On the Intel and Nvidia frames, there is no aliasing around the edges, and the workers on the truck are less blurred, which means Intel's and Nvidia's motion compensation algorithms cope with the camera's panning motion more accurately.
This next scene is where Rhodes drives toward the battle between Obadiah and Tony near the end of the film. This is high motion in the extreme, and these three frames were some of the most difficult to capture. It is nearly impossible to tell Nvidia and Intel apart. AMD sticks out like a sore thumb simply because the high brightness setting causes a halo effect on the street lamp. Even when we turn off color manipulation, a very slight degree of graininess remains near the front bumper of the car.
Even with color manipulation, we're hard-pressed to tell AMD apart in dark scenes at first glance. Overall, all three companies show much more consistent results here. The only obvious detail we noticed is that Nvidia looks slightly lighter under Iron Man and the car than our Intel shot. AMD is noticeably brighter in these two areas, but that is to be expected.
In bright scenes, it is harder to distinguish the quality of UVD 3 unless you turn off color manipulation. When you do, you notice that the Nvidia and AMD solutions render the background officer slightly grainier. Even with color settings handed back to the application, some color management still occurs beyond the viewer's control. If you overlay the Nvidia and Intel shots on one another, you can tell that Nvidia is a shade lighter.
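That overlay comparison can also be done numerically: averaging the signed per-pixel brightness difference between two captures of the same frame exposes a uniform shade offset that the eye only registers as "a bit lighter." A hypothetical sketch (the function and the sample pixel values are ours for illustration, not measured data):

```python
def mean_luma_delta(frame_a, frame_b):
    """Average signed brightness difference between two same-sized
    grayscale captures; positive means frame_a is lighter overall."""
    if len(frame_a) != len(frame_b):
        raise ValueError("frames must have identical dimensions")
    return sum(a - b for a, b in zip(frame_a, frame_b)) / len(frame_a)

# Made-up captures where one vendor's output sits two levels brighter:
shot_a = [122, 132, 127, 130]
shot_b = [120, 130, 125, 128]
print(mean_luma_delta(shot_a, shot_b))  # -> 2.0
```

A consistent nonzero delta across many frames suggests a deliberate color-management choice rather than a decoding artifact, which matches what we observed here.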
In this last scene, the camera slowly pans as Gwyneth looks up. AMD's color saturation makes much more sense here, as the scene appears more vivid. With that said, the brightness is very aggressive and tends to wash out a lot of detail. Nvidia is overly sharp; you get more detail, but overall, the picture looks grainier. Intel has the upper hand here: its image shows a good balance between detail and sharpness.