To be completely frank, 1080p video looks so darn good that I'm not sure it will make a big difference. Recall that DVD video is a mere 720x480 pixels: at that resolution, things like edge enhancement and jaggy reduction are quite noticeable. But when we up the ante to 1080p - that is, 1920x1080 pixels - it's hard to complain about image clarity. It looks really fantastic as it is. I'm sure I've offended legions of video quality enthusiasts, but I'm just calling this one as I see it, guys.
This paragraph does not make much sense. You are talking about 1080i video in the HQV benchmarks, where de-interlacing and jaggies are important, but you specifically mention 1080p content, which is not interlaced and does not require any processing at all. Of course 1080p video looks great: the computer does not need to take the two fields of a 1080i signal and combine them into one 1080p frame.
The HD HQV benchmark is a little misleading. It is designed to test the deinterlacing abilities of HD video processors, but all HD-DVDs and Blu-ray discs released so far are encoded in 1080p and don't require deinterlacing at all. Hence it provides little insight into the video quality of HD-DVD and Blu-ray playback, which looks awesome no matter what. However, every HD broadcast in the US, whether over an antenna or cable, is either 1080i or 720p. 720p needs no deinterlacing, just like 1080p, but for the 1080i broadcasts from CBS, TNT, FOX, HBO, etc., the video processing is very important. These HQV tests are designed to exercise the deinterlacing processor in your TV, or in a separate video processor between your cable box and the TV. They also matter a lot for HTPCs, because you can now easily record HD programming on your computer. The only way to distribute the test material in HD is on HD media, so the tests are released on HD-DVD and Blu-ray, but they are really aimed at traditional MPEG-2 broadcasts in 1080i from your local TV station.
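To make the fields-vs-frames point concrete, here is a minimal sketch of "weave" deinterlacing, the simplest way a processor can turn two 1080i fields into one 1080p frame. The function and field names are illustrative, not tied to any particular hardware or the HQV tests themselves.

```python
# Sketch of "weave" deinterlacing: interleave the two 540-line fields
# of a 1080i signal into a single 1080-line progressive frame.

def weave(top_field, bottom_field):
    """Interleave a top field (even lines) and a bottom field (odd lines)
    into one progressive frame."""
    assert len(top_field) == len(bottom_field)
    frame = []
    for even_line, odd_line in zip(top_field, bottom_field):
        frame.append(even_line)   # scan lines 0, 2, 4, ...
        frame.append(odd_line)    # scan lines 1, 3, 5, ...
    return frame

# A 1080i signal carries 540-line fields; weaving yields 1080 lines.
top = [f"even-{i}" for i in range(540)]
bottom = [f"odd-{i}" for i in range(540)]
frame = weave(top, bottom)
print(len(frame))  # 1080
```

Weaving is only correct when both fields were captured at the same instant, as with film content. For true interlaced video the two fields are 1/60 of a second apart, so naive weaving produces combing artifacts on motion, which is exactly the kind of failure the HQV deinterlacing tests are built to expose.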
As you can see, the video cards cannot properly deinterlace 1080i broadcast TV. Nvidia claims HD PureVideo support on their website for all the advanced processing features like Inverse Telecine, Spatial-Temporal Deinterlacing, etc., but apparently it does not work, because otherwise the cards would score more than a zero. This should be brought to the public's attention, and Nvidia should get slapped for this one.
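For anyone unfamiliar with the inverse telecine feature mentioned above, here is a simplified sketch of what it has to undo. Film is 24 fps, but 1080i broadcasts run at 60 fields per second, so frames are spread across fields in a 2:3 cadence; inverse telecine detects and removes the repeats. This toy version uses labeled fields for clarity, whereas a real processor has to find the cadence by comparing actual field contents.

```python
# Simplified 3:2 pulldown (telecine) and its inverse. Real fields also
# alternate top/bottom and repeats are detected by content matching;
# the labels here are purely illustrative.

def telecine(film_frames):
    """Spread 24 fps film frames across 60i fields using a 2:3 cadence."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 2 if i % 2 == 0 else 3  # alternate 2 and 3 fields per frame
        fields.extend([frame] * repeats)
    return fields

def inverse_telecine(fields):
    """Recover the original film frames by dropping repeated fields."""
    frames = []
    for f in fields:
        if not frames or frames[-1] != f:
            frames.append(f)
    return frames

film = ["A", "B", "C", "D"]           # four film frames
fields = telecine(film)               # ten fields: 2 + 3 + 2 + 3
print(inverse_telecine(fields) == film)  # True
```

When a processor fails this step, woven frames mix fields from different film frames and the HQV film-cadence tests score it a zero, which is consistent with the results being complained about here.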