AVIVO HD vs PureVideo HD Part 3

Tags:
  • Graphics Cards
  • Video
  • HD
  • Graphics
October 26, 2007 1:38:35 PM

For the last few reviews, you have shown how these cards offload the processor when decoding video while watching DVDs/HD-DVDs. But do these cards help offload the processor when encoding video? For example, if I am converting an MPEG2 movie to AVI, will one of these cards lighten the load?
October 26, 2007 2:46:18 PM

prodystopian said:
For the last few reviews, you have shown how these cards offload the processor when decoding video while watching DVDs/HD-DVDs. But do these cards help offload the processor when encoding video? For example, if I am converting an MPEG2 movie to AVI, will one of these cards lighten the load?


No, unfortunately the video card is not involved in the encoding process.

In the past there were hints that this would happen, but those whispers came and went years ago...
October 26, 2007 2:55:35 PM

Ugh. That's very frustrating. I keep hearing about how much we can do with GPUs (e.g. the password-cracking article on TGDaily a few days ago). Wouldn't it be amazing if that power was harnessed for video?

Thanks for the response, Cleeve.
October 26, 2007 3:01:13 PM

I hear you; encoding assistance is one of the things I've been waiting for.

It's inevitable with the fusion of the CPU and GPU, but we'll have to wait a little longer yet. :) 
October 26, 2007 3:04:14 PM

Yeah, I just got a Zune (very cheap, otherwise it might have been something else) and have been encoding videos from my HTPC DVR. It's quite taxing on my P4 3 GHz, though, so I was hoping I could get some help if I pick up an 8800 GT or ATI 3870. Looks like I'll be buying a new processor sooner than I thought.

Cleeve, how do you have a 2900 and 8800? Is that two separate computers?
October 26, 2007 3:08:20 PM

prodystopian said:
Cleeve, how do you have a 2900 and 8800? Is that two separate computers?


I do have separate computers, but I always find I'm swapping these two in and out of my main box for testing and things. They're both great cards I use all the time, so I decided to have them both in my sig. Wishy-washy of me, I know. :)
October 26, 2007 3:19:13 PM

Don't you know that you need to pick one so that the board members can claim favoritism and fanboy behaviour?
October 26, 2007 3:26:23 PM

lol! Too true, mate...

Too true. :D 
October 26, 2007 6:31:28 PM

It's time for Linux and Mac people to respond to this. They claim their OSes are superior in EVERY way, yet I have never seen a video hardware acceleration benchmark.

Guys, if video acceleration is possible at all on Linux and/or Mac, please let us know! If so, point us to some articles and benchmarks.

By the way, XP still rules, even for the 2600 XT. My Fusion HDTV tuner has smooth 1080i playback on XP, but it is jerky on Vista. Plus, XP worked perfectly out of the box, while Vista wasted my time with dozens of renderer/decoder/driver options to try. Some failed completely, and none works as well as XP.
October 26, 2007 7:12:27 PM

PLEASE TEST MPEG2 HD, especially 1080i. Many of these cards have issues with plain old MPEG2 1080i de-interlacing, and anyone using either a broadcast OTA HD signal or a cable-based QAM tuner will have lots of MPEG2 1080i content to view.

These companies have been slow to fix issues with 1080i decoding, and you have a great opportunity to put a spotlight on them and inform buyers before they purchase a card assuming that MPEG2 decoding just works.

Pretty please? :-)
October 26, 2007 7:42:02 PM

One thing I really don't get is how the sound works on the 2x00 series cards. From what I can tell, the cards only have DVI connectors on them, and I thought DVI was video-only. Putting an HDMI converter on it doesn't seem to me to be a way to get sound out of a video-only channel. Anyone know how this 5.1 decoder works?
October 26, 2007 7:54:38 PM

If you're not using HDMI, the sound on the Radeons works exactly like the sound on a GeForce: you need to use your motherboard's sound chip, or a discrete sound card. Audio is delivered to your regular speaker setup.

If you use HDMI, the sound is delivered to the HDMI device... your television, or an inline HDMI audio receiver.
October 26, 2007 7:56:29 PM

mikesm said:
PLEASE TEST MPEG2 HD, especially 1080i.


I'll try to tackle that in the next review, where I'll hopefully have a Blu-ray playback device at my disposal instead of just the HD-DVD player.

I think the MPEG2 HD titles are all on Blu-ray right now, although I'm not sure; I'll have to look into it...
October 26, 2007 8:25:23 PM

What I'd like to see is a unifying driver that lets you use two different video cards at the same time, like having your GTX and XT both running concurrently.
October 26, 2007 8:48:21 PM

Heheh... well, to have that, Nvidia and ATI would have to cooperate. I don't foresee that happening... :D
October 26, 2007 10:30:49 PM

Great to finally see a benchmark that goes beyond the fairyland of 30" widescreen, full-quality gaming. :bounce:

Thanks Cleeve :hello: 

I do have one question: how about noise? I hear the 2600 XT can be loud compared to the 8600.



Shuttle SG33G5, E6550, 2x1GB DDR2 667, 400GB, videocard is TBD.
October 26, 2007 11:01:05 PM

The 2600 XT I used was a fanless, completely silent version from Sapphire. But I've used a PowerColor 2600 XT and never noticed any noise from it.
October 26, 2007 11:25:13 PM

Cleeve,

I think he was talking about a review of HD 1080i video from TV, not from Blu-ray, although it would still be an interesting article to see how well they cope with 1080p MPEG2. Neither card does full acceleration on MPEG2, despite the misnomer "universal decode" from AMD.

Anyway, I have both the 2400 Pro and the 2600 Pro and found that they stumbled on MPEG2 1080i video from TV. It seems the MPEG2 decoding is done in the shaders of the card, and the deinterlacing is also done in the shaders, so they compete for resources. VC-1 and H.264 are decoded in the UVD, which leaves the shaders available to do the deinterlacing, so the H.264-encoded HQV test disc looks perfect. However, 1080i H.264 is rare, so this test disc really doesn't prove anything for the majority of end users out there. But because that is what the reviewers focus on, 1080i MPEG2 just gets overlooked, despite the fact that it is much more abundant because of the huge number of HD tuners people have in their PCs. There are considerably fewer HD DVD/BD drives out there than MPEG2 HD tuners, and all the movies for them are in 1080p, which does not need deinterlacing.

So, what we would really like to see is a showdown of the video cards when it comes to 1080i MPEG2 video, where it really counts for 90% of HD viewing. I would include such cards as AMD and Nvidia onboard video (should only be able to do bob and weave), a 7300 GS and X1300 (also bob and weave), an X1600 and 7600 GT (capable of 3:2 pulldown, I think), the 8400 and 2400 Pro (nobody really knows what they can do), the 2600 and 8600 (should be able to handle 3:2 pulldown and vector-adaptive deinterlacing), and the 8800 and 2900, which should perform like the X1600 and 7600 GT. What are you going to use for test material? I don't know, but you're the experts, right?
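
If you want a software reference point to judge the cards' deinterlacing against, MPlayer's yadif filter is a decent adaptive deinterlacer that runs entirely on the CPU; a quick sketch, assuming an MPlayer build that includes yadif (capture.ts is just a placeholder file name):

# Adaptive software deinterlacing, one output frame per field
mplayer -vf yadif=1 capture.ts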
October 27, 2007 3:22:55 AM

Hi,

What about the 8600 GT? Is it like the 8600 GTS?
I'm going to buy a new card tomorrow and I can't choose between the HD 2600 XT and the 8600 GT.
Please, I need your advice.

thx
AmiR
October 27, 2007 3:41:56 AM

When the new cards come out from Nvidia and ATI, I'm going to see if I can do an all-encompassing video review of the entire 2x00 and 8x00 lines.
October 27, 2007 7:16:09 AM

Hey, I'm with amirbd. What about the 8600 GT? I have that card right now on my EVGA 650i Ultra motherboard. I was wondering if ATI would be better for watching HD video than Nvidia, which I'm thinking is better for gaming? I mostly watch videos and do graphic design on my PC, and most of the HD video I download and watch is in H.264 format. I'm just really confused...
October 27, 2007 8:46:58 AM

Linux and hardware acceleration...

Well, it exists, actually, with a free implementation - now, what one requires is driver support. Heh.

Right now, video acceleration is done through the X video (Xv) extension. Most current drivers support basic Xv: video scaling, YUV conversion and output (overlay, blit, or pixmap). Xv was originally designed to receive and send video streams, but this functionality is now rarely used.

OpenChrome (for VIA Chrome hardware) has a very complete and Free implementation of X-video Motion Compensation (XvMC): inverse discrete cosine transform and motion compensation - typically, these help in MPEG-2, MPEG-4 ASP and AVC decompression (and since VC-1 is closely related to MPEG-4, it can be accelerated too, but for now the open decoder is too basic to use it).
Another extension is being considered: due to its age, XvMC was never designed for more complete video acceleration than what MPEG-2 requires; as such, Intel is leading the development of a Free specification for video acceleration (called libva, hosted at freedesktop.org and regularly updated).

OpenChrome's implementation of XvMC is being considered for porting to Intel, Matrox and AMD hardware; as there is no feature-complete Free Nvidia driver, XvMC won't be supported on Nvidia hardware for a while (the Nouveau reverse-engineered driver still doesn't have enough 3D features to support XvMC properly).

One thing to note, though, is that under Linux with X.Org, Free drivers usually share code - as such, a feature's quality is identical across the range.
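
If you want to check what your own driver exposes, here's a quick sketch; it assumes the standard X11 utilities and an MPlayer build compiled with XvMC support, and clip.mpg is just a placeholder file name:

# List the Xv adaptors and image formats the driver advertises
xvinfo

# Decode an MPEG-2 clip through XvMC instead of in software
mplayer -vo xvmc -vc ffmpeg12mc clip.mpg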
October 27, 2007 9:24:51 AM

Hi, I'm very curious to know if the quality benchmarks of the 8600 GTS and HD 2600 XT are the same for the 8600 GT and HD 2600 Pro.

:) 

Nik
October 27, 2007 4:52:44 PM

I did not see whether you used 64- or 32-bit Vista for these tests. In other reviews I have noticed a difference in CPU usage that seemed to require a driver improvement for 64-bit (or maybe a 64-bit version of CyberLink). For your next review it would be really helpful to test both, if possible. Which did you use for this test?

Chris
October 27, 2007 6:47:08 PM

zozzlhandler said:
I did not see whether you used 64- or 32-bit Vista for these tests. In other reviews I have noticed a difference in CPU usage that seemed to require a driver improvement for 64-bit (or maybe a 64-bit version of CyberLink). For your next review it would be really helpful to test both, if possible. Which did you use for this test?

Chris


32-bit Vista was used for this test. My bad; I definitely should have noted that in the article.
October 27, 2007 6:49:08 PM

SiDE said:
Hey, I'm with amirbd. What about the 8600 GT? I have that card right now on my EVGA 650i Ultra motherboard. I was wondering if ATI would be better for watching HD video than Nvidia, which I'm thinking is better for gaming? I mostly watch videos and do graphic design on my PC, and most of the HD video I download and watch is in H.264 format. I'm just really confused...


Not sure how the 8600 GT would fare. In theory it should be close, if not identical, to the 8600 GTS, but I'll have to test it.

For gaming, the 8600 GT and 2600 XT are right on par. The 2600 XT is actually a bit better on the whole. When AA is enabled the 8600 GT does a bit better, but the framerates for both are pretty dismal with AA...
October 28, 2007 6:47:57 PM

Autoboy above is correct: the root of most of the 2400 Pro's problems is that MPEG2 is handled in the shaders, and not in the UVD as VC-1/H.264 are. If you dig around in the DLL, you can see this wasn't the original intention; it seems to have happened accidentally (hardware screwup?) and too late in development to fix.

As a result, you can see (via RivaTuner) that the 2400 Pro's shaders max out extremely quickly with MPEG2, thus ATI has to cripple deinterlacing, and thus the failure of the deinterlacing tests above (even though those are in VC-1!). The 2400 Pro is actually _just_ capable of the same vector-adaptive deinterlacing the 2600 XT uses, but not while decoding MPEG2 - you have to tweak things to force this on, then use a combination of software decoding and hardware deinterlacing (possible via bitcontrol and ffdshow).

Similarly, the MPEG2 limitations affect maximum scaling size - again, this is done in the shaders and thus has to be limited. Yet again, H.264 and VC-1 get crippled even when it's only interlaced MPEG2 that is the problem.

This is only one of a whole range of stupid bugs affecting ATI HD playback that just never get fixed. Most irritating of all is that by default the cards do a forced levels expansion on HD (from 16-235 to 0-255) but not on SD, making a single calibration impossible. I reported this in July, three driver revisions ago, and there's been no fix.


Oh, by the way, the 2600 XT is fine for 1080i MPEG2. Even when decoding MPEG2 in the shaders, with denoising, vector-adaptive deinterlacing, and scaling to 1080p, the shaders never go above 50%. A 2600 Pro gets a bit closer to maxing out but will still be OK, while the 2400s will clearly never be able to do all of this at once. However, with denoise off and software MPEG2 decoding (or if ATI fixes it so MPEG2 gets decoded in the UVD), they are just capable of doing vector-adaptive deinterlacing on 1080i MPEG2.

Edit: while I say the 2600 XT is fine for 1080i MPEG2, there is a bug from Catalyst 7.8 onwards where the CyberLink MPEG2 decoder, in apps using EVR, produces a deinterlacing mess. The answer is to use another decoder, in particular ATI's own Avivo one.

Edit 2: while this may sound like a long, whiny post (and it is!), don't get the impression these are bad products. ATI's hardware is _excellent_ for video playback. Their drivers are appalling, though: clearly a complete afterthought to the main gaming market. If you're willing to experiment and understand a bit about the various problems, most can be worked around.
October 29, 2007 1:38:04 PM

Could we assume that the results would be similar for the mobile versions of the cards?
October 29, 2007 3:53:06 PM

I sure would like to see the effect the 2600 can have on an aging AGP system. Taking the load off the CPU is extra important when the CPU can't keep up with it, like my P4 2.53 GHz. If I could buy the 2600 AGP and get seamless 1080p playback, I would buy it in an instant. However, rumour has it the AGP platform bottlenecks 1080p acceleration. I sure would like to see a similar article on AGP systems. Pretty please?
October 29, 2007 5:21:38 PM

I don't see why the AGP bus would bottleneck 1080p; there should be plenty of bandwidth there. I'd be curious whether there are any driver issues, though. If I can set up an AGP test, I'll make it happen.
October 30, 2007 2:52:24 AM

cleeve said:
If you're not using HDMI, the sound on the Radeons works exactly like the sound on a GeForce: you need to use your motherboard's sound chip, or a discrete sound card. Audio is delivered to your regular speaker setup.

If you use HDMI, the sound is delivered to the HDMI device... your television, or an inline HDMI audio receiver.


But I still don't get it. How do you "use HDMI"? The only two ports are DVI, right? And the DVI connector doesn't carry audio, right? So how does the adapter work? Is there a separate cable that goes from the board to the adapter? I see that the adapter is available but I don't understand how it can convert from the video-only DVI to the video-and-sound HDMI.
October 30, 2007 1:17:08 PM

noble5 said:
But I still don't get it. How do you "use HDMI"? The only two ports are DVI, right? And the DVI connector doesn't carry audio, right? So how does the adapter work? Is there a separate cable that goes from the board to the adapter? I see that the adapter is available but I don't understand how it can convert from the video-only DVI to the video-and-sound HDMI.


The DVI connector DOES carry audio on the Radeon 2x00 series, so when you use the DVI-to-HDMI adapter - the one that comes with Radeon 2x00 series cards - it's all ready to go. No separate cabling required for audio.
October 31, 2007 10:48:54 PM

I have an older computer (XP 3000+) and would like to use it as a home theater PC. Is the ATI 2600 card capable of outputting 1920x1080 over HDMI or one of the DVI connections, and standard 1024x768 over the other DVI connection?

I am asking because I would prefer to have the computer in a separate room with a monitor for general use, and run an HDMI, DVI, or VGA connection to my 46" Samsung LCD for watching videos or browsing the internet.

On a separate note, is there any quality difference between the three connections, or would VGA give me the same quality as HDMI or DVI on a 50'-75' run of cable?

Thanks
November 8, 2007 2:09:52 AM

Quote:
I'd be curious whether there are any driver issues, though. If I can set up an AGP test, I'll make it happen.


That would be nice! If you think about it, there are "millions" of old AGP systems that could be used as high-definition HTPCs using these cards and HD-DVD/Blu-ray drives.

Unfortunately, it looks like hardware acceleration (UVD) is NOT enabled in the AGP versions of these cards.

From ATI website:

Quote:
Symptoms:
On AGP versions of the above mentioned graphics cards the Hardware Acceleration does not get enabled. High CPU usage, stuttering HD playback or even system crash are expected behaviours, although DXVA (DirectX Video Acceleration) is enabled by the application.

Solution:
Currently there is no solution

November 8, 2007 5:18:20 PM

This is indeed distressing. Older AGP platforms have the most to gain from hardware acceleration. If your CPU can't keep up with the load of 1080p, that's when hardware acceleration is a must.
With these Core 2 Duo machines, hardware acceleration is just a luxury.

The AGP systems are the ones that really NEED this feature. It would be really great if Tom's Hardware could run some Avivo/PureVideo tests on AGP. If anything, it could highlight the driver failings, putting more pressure on the driver teams.
November 26, 2007 5:47:27 PM

bump.
Any update on AVIVO on AGP?
November 26, 2007 10:43:05 PM

Still waiting for the 3x00 AGP cards to come out. :) 
January 23, 2008 9:02:40 PM

Cleeve,
I'm converting an existing system to an HD video player. Do you think I would be able to achieve the best HD quality possible with the following configuration? Specifically, is the single-core 1800 MHz Athlon powerful enough, considering that the 2600 Pro unburdens it from processing video?

MB: Gigabyte GA-K8N Ultra 9 F3
CPU: AMD Athlon 64 3000+ (1800 MHz)
HD: Seagate 500GB, 3.0 Gb/s, SATA
Display adapter: ATI 2600 Pro
Blu-ray disc drive: TBD

Thanks.
January 23, 2008 9:51:17 PM

Kudos to all who had input on this subject. As I speak for the masses, thank you for interpreting some of these aspects that I've had many questions about... I bought an 8600 GTS to run my HTPC; was that a good choice for $100? Was ATI a better choice for the same cash?
January 23, 2008 10:10:07 PM

pwillikers said:
Cleeve,
I'm converting an existing system to an HD video player. Do you think I would be able to achieve the best HD quality possible with the following configuration? Specifically, is the single-core 1800 MHz Athlon powerful enough, considering that the 2600 Pro unburdens it from processing video?


Should be fine. The main issue will be that, since it's not dual-core, when other apps kick in and steal resources (even just for a second to check status), you may see minor hiccups. The best way to reduce that is to have enough system memory and set the play-ahead buffer higher.

However, it should be fine.

Quote:

MB: Gigabyte GA-K8N Ultra 9 F3
CPU: AMD Athlon 64 3000+ (1800 MHz)
HD: Seagate 500GB, 3.0 Gb/s, SATA
Display adapter: ATI 2600 Pro
Blu-ray disc drive: TBD


I would suggest getting (if you can afford it) the LG BD burner with HD-DVD playback support. While some people say HD-DVD is a dying format, for the price of a BD-only drive you can get HD-DVD-ROM support, which is a good idea since its future now seems more like a PC thing than a stand-alone thing.

Also, at this point in time I would recommend waiting for the HD 3400 series card, as it may offer different performance/features than the HD 2600 Pro for about the same price. Check next week when it's launched. The HD 2600 Pro may still end up better (the 120 SPUs vs. the 40 on the HD 3400 still make me prefer the 2600 for AA, post-processing potential and possible GPGPU potential), but it's better to know than to miss an opportunity this close to a new launch.
January 23, 2008 10:17:28 PM

Ninjaz7 said:
Kudos to all who had input on this subject. As I speak for the masses, thank you for interpreting some of these aspects that I've had many questions about... I bought an 8600 GTS to run my HTPC; was that a good choice for $100? Was ATI a better choice for the same cash?


Maybe, but the differences are still rather minor under Vista. The more intriguing thing for me when this review launched was the difference in support under XP/Vista for the GF8600 series.
I was fighting with Vista on my laptop at the time, and I tell you it made me feel better about my choice of VPUs, since I'm blowing out Vista and replacing it with XP; had I chosen the GF8600M, I might have felt less tempted to do so after reading this.

I'm not sure what the driver situation is now, but when Cleeve first reviewed this it was looking annoyingly similar to past situations where both IHVs had to wait for driver updates to offer what they promised.
September 26, 2010 5:26:30 AM

prodystopian said:
For the last few reviews, you have shown how these cards offload the processor when decoding video while watching DVDs/HD-DVDs. But do these cards help offload the processor when encoding video? For example, if I am converting an MPEG2 movie to AVI, will one of these cards lighten the load?



I use Xilisoft Video Converter Ultimate 6; it offloads the CPU. It requires Nvidia CUDA for this.
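
For a command-line route to the same kind of offload, newer ffmpeg builds can also push the work to the GPU; a rough sketch, assuming a build compiled with CUDA/NVENC support (the file names are just placeholders):

# Decode on the GPU and encode with Nvidia's hardware H.264 encoder
ffmpeg -hwaccel cuda -i input.mpg -c:v h264_nvenc output.mp4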

September 26, 2010 5:45:13 AM

lock it up MM. lock it up!

lol.
September 26, 2010 5:53:56 AM

This topic has been closed by Mousemonkey