AVIVO vs. Purevideo HD: What You Need to Know about High-Definition Video

nicolasb

Distinguished
Sep 5, 2006
93
0
18,630
I can't see most of the graphs in this article. :(

May I also ask why it is that yet again we have an article appearing on "Tom's Hardware UK & Ireland" which discusses video processing but makes absolutely no mention whatsoever of European video formats, instead focusing exclusively on American/Japanese ones?

It's slightly more forgivable this time round, given that Blu-ray and HD DVD are actually 60 Hz formats internationally, but any reference to SD or DVD video always betrays the author's apparent ignorance of the fact that there is a world outside the borders of the US of A. I wouldn't mind if this happened once, but that you keep doing it concerns me.

Can I assume that this subject will be revisited when RV610 and RV630 ship?
 

icepop456

Distinguished
Nov 27, 2006
19
0
18,510
What is it with the editors not proofreading things? Where are the charts? How in the world is it possible for so many of Tom's Hardware's articles to be missing graphs, show the wrong charts, or draw conclusions that are not supported by any data?

I've been reading TH for 8+ years now and I think I have to find a better review site. Any suggestions?
 

pinchythelobster

Distinguished
Jun 8, 2007
3
0
18,510
Living in the UK, I find there are currently very few titles available in either HD DVD or Blu-ray, so the options are limited. However, I have been watching high-definition content from various sources, read from HDD rather than an optical drive, on my PC for about 6 months now using an Optoma HD70 projector. Originally this was on an Athlon XP 2800+ with a 9800 Pro; that rig didn't like x264 but coped well with 720p VC-1 (it didn't like 1080p). I built a new PC in April with this task primarily in mind: an E4300, 2 GB of RAM, and an 8800 GTS 320 MB.

My new rig has eaten everything I've thrown at it, regardless of codec or resolution. I've watched CPU utilisation fluctuate between 15% and 40% depending on the scene (action scenes with lots of movement cause the spikes), but it rarely goes higher. This is using either PowerDVD 7 or VLC, and it always looks incredible!
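For anyone curious, this is roughly how I sampled the utilisation. A minimal sketch, assuming Python with the psutil package installed (the one-minute window is arbitrary):

[code]
# Sample overall CPU utilisation once a second during playback.
# psutil.cpu_percent(interval=1.0) blocks for one second and returns
# the average load over that second.
import psutil

samples = [psutil.cpu_percent(interval=1.0) for _ in range(60)]
print(f"min {min(samples):.0f}%  max {max(samples):.0f}%  "
      f"avg {sum(samples) / len(samples):.0f}%")
[/code]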

The projector is connected via an HDMI cable to a DVI>HDMI converter. I don't know whether this converter is HDCP compliant, or if it even needs to be, but the connection should in theory be purely digital.

I wonder if there is any difference in performance between reading the content from an HDD and off an optical drive?
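If I get the time, I might check it crudely by timing a sequential read of the same large file from both drives. A rough sketch (Python; the file paths are placeholders, and the file should be bigger than RAM so the OS cache doesn't skew the numbers):

[code]
import time

def read_throughput(path, chunk_size=1024 * 1024):
    """Sequentially read a file and return throughput in MB/s."""
    start, total = time.time(), 0
    with open(path, "rb") as f:
        while True:
            data = f.read(chunk_size)
            if not data:
                break
            total += len(data)
    return total / (time.time() - start) / 1e6

# Same file copied onto the hard disk vs. read straight off the disc:
# print(read_throughput("D:/movie.m2ts"), "MB/s from HDD")
# print(read_throughput("E:/BDMV/STREAM/00000.m2ts"), "MB/s from optical")
[/code]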



(my first post... woohoo)
 

Eurasianman

Distinguished
Jul 20, 2006
883
0
19,010
Interesting article. Normally I can skim through articles like this in no time... without charts, it's kind of hard to skim :oops: :x

Then again, by reading the whole article you gain more knowledge of the topic (that depends entirely on how accurate the article is, though).

*sighs*

I guess, in a way, it's not bad that my ATI X1900XT 512 GDDR3 broke, seeing as it didn't even support HDCP! :?

Why didn't they throw in other cards? My Leadtek 7900GS is HDCP compatible; I'd like to see how well it performs as well.

Btw, is there an actual HD DVD-ROM drive you can buy for computers? I mainly use newegg.com and I can never find an HD DVD-ROM drive, but I can always find Blu-ray ROM drives. Just wondering.
 

flabbergasted

Distinguished
Mar 1, 2006
113
0
18,690
What is it with the editors not proofreading things? Where are the charts? How in the world is it possible for so many of Tom's Hardware's articles to be missing graphs, show the wrong charts, or draw conclusions that are not supported by any data?

I've been reading TH for 8+ years now and I think I have to find a better review site. Any suggestions?

My sentiments exactly. Now I mostly lurk in the forums. Serious reviews are no longer found here.
 

cleeve

Illustrious
*sigh*
Sorry guys. I let them know so they could fix it.

In the meantime, here are the charts:

[attached chart: CPU_usage.jpg - CPU usage]
 

dogman-x

Distinguished
Nov 23, 2006
44
0
18,530
After reading this article, my main question is: how does the Intel G33's integrated-graphics HD video quality rate against these cards? For example, the Shuttle XPC Barebone SG33G5M Deluxe specs include HDCP over an HDMI output connector. Do I really need an add-in card if I don't play newer 3D games?

Also, what about testing with a real 1080p HDTV? Like this one:
http://www.newegg.com/Product/Product.aspx?Item=N82E16889101108
 

cleeve

Illustrious
After reading this article, my main question is: how does the Intel G33's integrated-graphics HD video quality rate against these cards? For example, the Shuttle XPC Barebone SG33G5M Deluxe specs include HDCP over an HDMI output connector. Do I really need an add-in card if I don't play newer 3D games?

Good question. I'm probably going to write a part 3 when the 2400/2600 cards are released; I'll see if I can get Intel to supply me with a G33 sample mobo.

I should add, though: if you have an E4300 CPU or better, you shouldn't have a problem playing HD DVD/Blu-ray with no video acceleration...
 

daverimer

Distinguished
Feb 13, 2007
64
0
18,630
Before reading this article, I knew next to nothing about HD playback on the PC. It provided a great intro to the topic and the current offerings from Nvidia and AMD/ATI. While I have no plans to upgrade for HD content anytime soon (prices are way too high - DUH!), it was informative. Thanks, Tom's!

Dogman-x - not replying to you, not sure why it selected you
 

autoboy

Distinguished
Jan 10, 2007
35
0
18,530
To be completely frank, 1080p video looks so darn good, I'm not sure it will make a big difference. Recall that DVD video is a mere 720x480 pixels: things like edge enhancement and jaggy reduction are quite noticeable. But when we up the ante to 1080p - that is, 1920x1080 pixels - it's hard to complain about image clarity. It looks really fantastic as it is. I'm sure I've offended legions of video quality enthusiasts, but I'm just calling this one as I see it guys.
- The article


This paragraph does not make much sense. You are talking about 1080i video in the HQV benchmarks, where de-interlacing and jaggies are important, but you specifically mention 1080p content, which is not interlaced and does not require any change to the video. Of course 1080p video looks great: the computer does not need to take the two fields of a 1080i frame and combine them into one 1080p frame.

The HD HQV benchmark is a little misleading. It is designed to test the deinterlacing abilities of HD video processors, but all HD DVD and Blu-ray discs released so far are in 1080p format and don't require deinterlacing at all. Hence it provides little insight into the video quality of HD DVD and Blu-ray discs, because they look awesome no matter what. However, every HD broadcast in the US, whether over an antenna or cable, is either 1080i or 720p. 720p needs no deinterlacing, just like 1080p, but for 1080i video broadcast from CBS, TNT, FOX, HBO, etc., the video processing is very important. These HQV tests are designed to test the deinterlacing processors in your TV, or in a separate video processor between your cable box and the TV. They are also very important for HTPCs, because you can now easily record HD programming on your computer. The only practical way to distribute the HQV test material is on HD media, so it is released on HD DVD and Blu-ray, but it is really aimed at traditional MPEG-2 broadcasts in 1080i from your local TV station.
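To make the field/frame distinction concrete, here is a toy sketch of the simplest kind of deinterlacing ("bob"), in Python with numpy. The shapes are illustrative only; real video processors interpolate and blend fields adaptively rather than just repeating lines:

[code]
import numpy as np

def bob_deinterlace(interlaced_frame):
    """Split a 1080i frame into its two 540-line fields and stretch
    each back to full height, yielding two progressive frames."""
    top_field = interlaced_frame[0::2]     # even lines = top field
    bottom_field = interlaced_frame[1::2]  # odd lines = bottom field
    # Crude line-doubling; a good deinterlacer interpolates instead.
    return (np.repeat(top_field, 2, axis=0),
            np.repeat(bottom_field, 2, axis=0))

frame = np.zeros((1080, 1920), dtype=np.uint8)  # one interlaced frame
first, second = bob_deinterlace(frame)
print(first.shape, second.shape)  # (1080, 1920) (1080, 1920)
[/code]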

As you can see, the video cards cannot properly deinterlace 1080i broadcast TV. Nvidia claims HD PureVideo support on its website for all the advanced processing features like inverse telecine, spatial-temporal deinterlacing, etc., but apparently it does not work, because otherwise the cards would score more than a zero. This should be brought to the public's attention, and Nvidia should get slapped for this one.
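For anyone unfamiliar with the term, inverse telecine means undoing the 3:2 pulldown that maps 24 fps film onto 60 fields per second. A toy sketch with frame labels instead of pixels, just to show the field pattern a processor has to detect:

[code]
def telecine(film_frames):
    """2:3 pulldown: alternate 2 and 3 fields per film frame, then
    pair the fields into interlaced frames (4 film frames -> 5)."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields += [frame] * (2 if i % 2 == 0 else 3)
    return [(fields[i], fields[i + 1]) for i in range(0, len(fields), 2)]

def inverse_telecine(interlaced):
    """Recover the original film frames by dropping repeated fields.
    (Real hardware detects duplicates by comparing field content.)"""
    seen, film = set(), []
    for top, bottom in interlaced:
        for field in (top, bottom):
            if field not in seen:
                seen.add(field)
                film.append(field)
    return film

video = telecine(["A", "B", "C", "D"])
print(video)                   # [('A','A'), ('B','B'), ('B','C'), ('C','D'), ('D','D')]
print(inverse_telecine(video)) # ['A', 'B', 'C', 'D']
[/code]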
 

pinchythelobster

Distinguished
Jun 8, 2007
3
0
18,510
My screen is just under 100", but the room it's in isn't particularly large, so I sit maybe only 8-10 ft away. At this distance I do not notice pixellation, but the interlacing on 1080i media along non-straight edges is apparent when you look for it. However, VLC's de-interlace filter seems to alleviate the problem somewhat. Unfortunately I don't have the HQV test, so I can't really make a proper judgement as to the de-interlacing effectiveness.

Is a hardware post processor likely to give me significantly better results?
 

Clob

Distinguished
Sep 7, 2003
1,317
0
19,280
I noticed that the article states that component video only supports resolutions up to 1080i. I do believe it supports a 1080p signal just fine.
 

tlreaves

Distinguished
Oct 26, 2006
29
0
18,530
No, he said that current computer hardware doesn't support the full 1080p resolution over component. The fact is, the content, signal processors, and monitor hardware for PCs, TVs, etc. are not all designed, built, and manufactured by the same company, so they can vary. His observation was limited to the scope of the article (Blu-ray and HD DVD content playback on the PC), not the all-encompassing statement you just made.
 

Clob

Distinguished
Sep 7, 2003
1,317
0
19,280
Firstly, analog Component video is limited to 1080i (interlaced) resolution. The full 1080p (progressive) resolution is not available on analog component video. It is available on an analog VGA output, however.
-The article

Actually, I did some research on the subject....

For most consumer-level applications, analog component video is used. Digital component video is slowly becoming popular in both computer and home-theatre applications. Component video is capable of producing signals such as 480i, 480p, 576i, 576p, 720p, 1080i and 1080p.

http://en.wikipedia.org/wiki/Component_video

Seems only digital component is capable of 1080p.
 

SuperFly03

Distinguished
Dec 2, 2004
2,514
0
20,790
My bad, I didn't know.

No biggie, most don't know. It is unusual for an author to be this active, but Cleeve does a very good job and does it consistently.

One of my favorite people around here, and damn good at video matters; he makes my head hurt sometimes... in a good way. :wink:
 

cleeve

Illustrious
I'll have to look over my notes, guys, to see the sources, but my research indicated that only 1080i was available through component video on the PC (yes, I was referring to the PC and not component in general, and yes, I could have made that clearer).

I'll verify that, though. Just keep in mind that Wikipedia isn't always 100% accurate. Not saying I am either, but I'll find out where I got the info and let you know.

Autoboy: the point is well taken. I probably didn't explain the situation nearly as well as I could have. Keep in mind, though, that noise reduction is something that should work on 1080p as well as 1080i... but yes, the lion's share of the tests focused on interlaced performance, and looking back I could have explained that much better than I did.

Thanks for your feedback gents, a pleasure as always. :)
 

pinchythelobster

Distinguished
Jun 8, 2007
3
0
18,510
My 8800 definitely allows 1080p through component. I don't use it, because I have a DVI>HDMI converter with a long HDMI cable, but the HDTV dongle that was provided with the card is certainly capable of supporting 1080p.
 

dogman-x

Distinguished
Nov 23, 2006
44
0
18,530
I must be missing something. I only know of two types of monitors that can support a 1080p signal:
1) a computer monitor
2) a new 1080p HDTV
Computer monitors use VGA or DVI.

For a new 1080p HDTV, a cable with HDMI on one end and DVI on the other is the best option. My 1080p HDTV (that I'm using to type this post) came with an HDMI/DVI cable included. I just plugged it in to the video card and it worked great.

Why would anyone want to use component for 1080p?

By the way, newegg just lowered the price of my 1080p HDTV to $899:
http://www.newegg.com/Product/Product.aspx?Item=N82E16824112174