Avivo vs. Purevideo, Round 1: The Radeon X1000 vs. Geforce 7000 Generation

January 9, 2007 11:26:09 AM

Video playback is not created equal: our quality and performance analysis shows that quality gaps exist between the video processing and acceleration technologies used by ATI's Radeon X1000 and Nvidia's Geforce 7 generation.
January 9, 2007 2:14:49 PM

nVidia may have won this fight by 5 points, but you have to pay for it :? , making ATI the winner.
January 9, 2007 2:31:10 PM

What a surprise: those of us in the UK are currently prevented from viewing this article - *again* - because the UK site hasn't updated, and your extremely poor geolocation system prevents UK users from using the US site, while French, German, etc. users are allowed to choose whatever site they like.
January 9, 2007 3:37:42 PM

When it comes right down to it, detail (in the distance) is the ONLY setting where nVidia got an advantage over ATI. In the other settings ATI is slightly better in most, with nVidia winning maybe one? And like said before, you have to PAY for it? And then pay again, as it surely uses more CPU. And where are the comments on nVidia's encoder? All I read were complaints about REPORTS on the INTERNET of all places (go figure) about ATI's supposed abilities here (using the hardware itself) and how that's so bad of them, assuming ATI's claim of acceleration comes from their hardware. I saw no quotes of ATI directly claiming their GPUs were responsible for this supposed acceleration and NOT the CPU. I would like that clarified before anyone bashes ATI. Either way, ATI's solution is free since it comes with their GPUs, and nVidia's is a paid-for add-on. I like free, or included, so for value and power consumption ATI's current solution is clearly the winner here.
January 9, 2007 3:40:56 PM

Maybe it's me or maybe it's déjà vu.....

But didn't I read this same article a few weeks ago???
January 9, 2007 4:35:12 PM

The previous article was missing a few things and was not done with up to date nVidia drivers. Therefore it was pulled and re-done / re-written.

nVidia really does need a free decoder; what you are paying for is the inclusion of Dolby Digital decoding.
January 9, 2007 4:46:32 PM

Quote:
Maybe it's me or maybe it's déjà vu.....

But didn't I read this same article a few weeks ago???


Yeah, and didn't ATI win? :) 

It's a conspiracy!!!
January 9, 2007 5:11:52 PM

Quote:
nVidia may have won this fight by 5 points, but you have to pay for it :? , making ATI the winner.


Until you take 5 minutes to search the internet and get PureVideo for free :p 
January 9, 2007 5:12:12 PM

Quote:
The previous article was missing a few things and was not done with up to date nVidia drivers. Therefore it was pulled and re-done / re-written.

nVidia really does need a free decoder; what you are paying for is the inclusion of Dolby Digital decoding.


Dolby Digital decoding? Forgive my ignorance, but Dolby Digital is an audio format, right? What does that have to do with 2D DVD video decoding? Wouldn't the Dolby Digital be handled by the sound card?

Believe me, I'm not trying to give you a hard time; this is very important to me because I just put a Dolby Digital sound card (Auzentech) into my (super cheap) HTPC, and now DVD playback is choppy and my CPU utilization is at 100% (it's only a 1 GHz processor, though). It could play DVDs just fine before I added the sound card. I would have expected the sound card to take work AWAY from the PC, but it seems to have added some??
January 9, 2007 7:24:05 PM

Was the Purevideo de-interlacer set to Automatic or Smart?
January 9, 2007 7:42:57 PM

I have used both a 6600GT and an X1600 and have learned something I thought very odd. I have XP MCE, which naturally comes without a DVD decoder. The DVD MPEG decoder is also needed to watch HD TV. I first tried the X1600 with an old PowerDVD MPEG decoder, and that worked really poorly, with horrific, practically nonexistent deinterlacing (but no judder). I then purchased the PureVideo decoder and naturally switched to the 6600GT. This fixed the interlacing problem in HD, but I dropped a lot of frames (judder). I thought I was really stuck until I tried the PureVideo decoder with the X1600. To my delight and surprise, this worked great!
I've never seen anyone mention this hybrid Ati-NVidia combination before.
January 9, 2007 8:32:04 PM

Nvidia cards no longer need to use the Purevideo decoder for hardware acceleration; they now work with any decoder that supports hardware acceleration, like the ones that come with PowerDVD or Showtime...

Hi guys, it's me, the author, and I want you to know that Nvidia cards no longer require a special Purevideo decoder like the first version of the article stated...

It was this change that prompted us to pull the entire article and rewrite it.

As it stands, Ati and Nvidia are on an almost even keel now: Nvidia has slightly better sharpening or 'enhancement', but Ati's is still very good. Other than that they're both so close to perfect it's uncanny.

I apologise for any confusion.
January 9, 2007 8:52:58 PM

What I want to know is this:

Does the 6600gt support these functions as well?
I haven't upgraded my drivers since 84.21 came out (they're so stable I saw no need to), but if the new drivers unlock some new PureVideo features in the control panel, then I'd be willing to upgrade, as long as they work with the 6600GT.

Anyone know?
January 9, 2007 10:47:34 PM

I know the article is about hardware-accelerated decoding, but it would be nice if the software contender could be something like Media Player Classic + ffdshow.
January 9, 2007 11:14:31 PM

This article only shows the performance of fast GPUs for video playback. However, on my 1080p monitor with my Geforce 6150, which supports all PureVideo features at standard def, I sometimes get choppy playback of DVDs with inverse telecine turned on. With high-def 1080i material the 6150 stands no chance with inverse telecine turned on.

On my 7300LE, standard def with inverse telecine looks great and does not drop frames, but the card cannot handle 1080i with inverse telecine even when overclocked to GS speeds. Since the inverse telecine option is so buried in the control panel, it is not practical to switch it on and off every time I switch between high-def and standard-def material, so I find myself living without 3:2 pulldown, which gives a decent image most of the time and a crappy one often enough to be a PITA.

To add to your rant: when the system autodetects the video card, it should also set the appropriate settings according to the source material and the capability of the video card.

I have been wanting to upgrade to better video cards in my two HTPCs, but I still don't know which cards can handle 3:2 pulldown/inverse telecine for high-def content. This and HDCP are the reasons I have crappy cards. Can the X1600 Pro handle 1080i material with 3:2 pulldown on a 1080p display? What about the 7600GT? I'm not buying an X1900 or 7900 card just for proper 1080i playback unless I have to. I can tell you that the 6150, 7300LE, and 7300GS cannot.
January 10, 2007 12:06:05 AM

Quote:
nVidia may have won this fight by 5 points, but you have to pay for it :? , making ATI the winner.


Until you take 5 minutes to search the internet and get PureVideo for free :p 

soulrider4ever has a good point - I don't know of anyone who has actually paid for PureVideo... so that argument is null.

Anyway, why does the author complain about DVD player software for Windows when WMP works automagically with PureVideo? (Does it do the same for Avivo?)
January 10, 2007 12:20:48 AM

I'll be doing a follow-up article dealing with HD in the next month or two.

The short answer is that an X1600 is supposed to be able to hardware-accelerate 720p; I think you need an X1950 PRO or X1900 GT to accelerate 1080i. Ati cards use shaders on the GPU to accelerate decoding, so they need fast GPUs to do the high-end stuff.

A 7600 GT is supposed to be fast enough to accelerate 1080i because it does it on a separate chip on the card, not the GPU. It depends more on raw clock speeds than the class of GPU.

I'll try to verify all of this stuff in the review.
January 10, 2007 8:26:01 AM

So why do we discuss prices? I know everyone can download everything from the internet and crack it. Should I complain when I see an article on Tom's saying that Vista will cost $140 or $150, and argue with the author because I'll get it for free?
January 10, 2007 10:03:54 AM

Quote:
I'll be doing a follow-up article dealing with HD in the next month or two.

The short answer is that an X1600 is supposed to be able to hardware-accelerate 720p; I think you need an X1950 PRO or X1900 GT to accelerate 1080i. Ati cards use shaders on the GPU to accelerate decoding, so they need fast GPUs to do the high-end stuff.

A 7600 GT is supposed to be fast enough to accelerate 1080i because it does it on a separate chip on the card, not the GPU. It depends more on raw clock speeds than the class of GPU.

I'll try to verify all of this stuff in the review.


Geez, and I always thought 2D video was nothing for modern video cards. :(  Now I have to consider buying a better video card than my gaming rig has, just to support HDTV. BOOO. :) 
January 10, 2007 2:00:22 PM

Quote:
Video playback is not created equal: our quality and performance analysis shows that quality gaps exist between the video processing and acceleration technologies used by ATI's Radeon X1000 and Nvidia's Geforce 7 generation.

Video quality is subjective; accepting the author's learned opinion still doesn't cut it.
January 10, 2007 2:46:07 PM

Quote:

Video quality is subjective


Both yes and no (mostly no, now that I think of it - a clear, sharp picture with awesome colors, no bleeding, and blacks that are actually black will look good to anyone <- fact).. Anyone (who's looking for better quality) can see whether noise reduction on moving objects looks good or not - or whether 2:3 pulldown works. I can tell you my dad would never notice any of these things, but my dad doesn't read TG either..

Audio, on the other hand, is infinitely more subjective... A pair of speakers I may loathe the sound of will be heaven to others <- also fact...
January 10, 2007 4:49:09 PM

Based on Cleeve's response below, I need to correct and clarify my post about the decoder. I was referring to the "PureVideo Decoder" software from nVidia that plugs into WMP.

http://www.nvidia.com/object/dvd_decoder.html

I think (but am not sure) it used to be that you HAD to get this decoder to access any of the hardware acceleration features. That is the way I understood it. nVidia has made the "PureVideo" and "PureVideo Decoder" terminology confusing now. Anyway, the Bronze version is often packaged with the card, and available for download from a few sites. I guess that one is supposed to cover the MPEG licensing (my bad). I thought it also included Dolby Digital extraction (sent to the sound card), but not Dolby Headphone.

It's nice to see nVidia made the features available without the decoder. :)  Maybe in reality the "Decoder" was just a plugin for Windows all the while.
January 10, 2007 7:05:23 PM

Quote:

Video quality is subjective; accepting the author's learned opinion still doesn't cut it.


Not really. I ran quantifiable IQ benchmarks.

Either the cards displayed Noise reduction, pulldown detection, de-interlacing, detail enhancement... or they didn't.

There's nothing subjective about, say, whether a card detects and properly displays 2:3 pulldown or not. Either it does, or it doesn't. It's painfully obvious if it doesn't.

I guess you could argue over a couple of points of scoring minutiae if you were feeling argumentative, but other than that there's not much subjectivity...
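
To put some meat on the 2:3 pulldown point: film is 24 frames per second and NTSC video is 60 fields per second, so the telecine process alternately holds each film frame for 2 fields, then 3. A minimal Python sketch of the cadence (my own illustration, not anything out of the HQV disc):

Code:
# 2:3 pulldown: lay 24 fps film frames out as ~60 Hz interlaced fields.
def telecine(frames):
    fields = []
    for i, frame in enumerate(frames):
        for _ in range(2 if i % 2 == 0 else 3):  # hold for 2 fields, then 3
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

print(telecine(["A", "B", "C", "D"]))  # 4 film frames -> 10 fields (2-3-2-3)

A card that spots that repeating 2-3 pattern can throw away the duplicate fields and rebuild the original progressive frames exactly; a card that misses it weaves fields from different frames together and you get combing. Hence pass/fail.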
January 10, 2007 7:26:26 PM

Quote:

Video quality is subjective; accepting the author's learned opinion still doesn't cut it.


Not really. I ran quantifiable IQ benchmarks.

Either the cards displayed Noise reduction, pulldown detection, de-interlacing, detail enhancement... or they didn't.

There's nothing subjective about, say, whether a card detects and properly displays 2:3 pulldown or not. Either it does, or it doesn't. It's painfully obvious if it doesn't.

I guess you could argue over a couple of points of scoring minutiae if you were feeling argumentative, but other than that there's not much subjectivity...

I already said that 2 posts ago.. Nobody ever listens :-(

OT.. Is it just me, or is the forum slow as syrup today ?....
January 10, 2007 7:28:11 PM

Quote:
Video quality is subjective; accepting the author's learned opinion still doesn't cut it.


While the HQV test is somewhat subjective in some areas, it is thorough across a wide variety of tests, with many being closer to pass/fail than subjective. Based on the tests and the article, ATI and nVidia both have very good quality features. The only subjective part is how they compare to some high-end specialized video processors. I don't think a few 4/5 variations would skew the results much.

I am impressed at how many tests both cards passed. I have seen MANY standard and especially upconverting DVD players and HDTV sets that would not score nearly as well, some costing as much as the video cards.
January 10, 2007 7:38:50 PM

Quote:
I'll be doing a follow-up article dealing with HD in the next month or two.


I would be very interested to see how (or if) HD acceleration is available on a non-HDCP card such as the 7800, for content not requiring HDCP such as WMV9 and QuickTime movies (various trailers are available for download). Also, what happens when you try to play HDCP content on a non-HDCP card and/or monitor? What features still work once you get the lower-resolution output? Does it tax the CPU more because it is not using the HW acceleration, or because it is using the CPU to down-convert, or ???? If only the monitor is non-HDCP, how does that affect the processing? (I would assume it would behave the same as if the card were not HDCP.)
January 10, 2007 8:00:19 PM

Quote:
Video quality is subjective; accepting the author's learned opinion still doesn't cut it.


While the HQV test is somewhat subjective in some areas, it is thorough across a wide variety of tests, with many being closer to pass/fail than subjective. Based on the tests and the article, ATI and nVidia both have very good quality features. The only subjective part is how they compare to some high-end specialized video processors.
I've heard of $$$ video freaks replacing US$40,000 Faroudja line doublers with a high-end PC and DScaler.. DScaler really is an awesome piece of SW.. Too bad I still haven't found a way to use it properly with my Laserdisc and HTPC without turning MediaPortal off...

So I guess a decent piece of hardware doesn't fare all that badly against dedicated video processors... I'm willing to bet a bottle of Islay that my HTPC will beat most sub-US$1000 DVD players....
January 10, 2007 8:23:37 PM

Codec acceleration and video processing acceleration are two different things. Codec acceleration simply offloads from the CPU some of the work of decoding the codec. For DVD and broadcast HD, the video card accelerates the MPEG2 stream, offloading the CPU. This really doesn't matter anymore, because any CPU over 1.2 GHz can decode 480i, 1080i, or 720p material without acceleration.

For HD-DVD and Blu-ray, codec acceleration is much more important because of the higher bitrates of these formats. Any DxVA card (ATI 9600 and higher, and nVidia 5200 and higher) can accelerate MPEG2 streams, but only PureVideo and Avivo cards can accelerate H.264 and VC-1, which are used for HD-DVD and Blu-ray (both formats also support MPEG2, but a pretty good CPU is needed even for this because of the high bitrates). For nVidia, the amount of acceleration (the amount offloaded from the CPU) is dependent on clock speed, because the acceleration is done in a programmable processor on the GPU. A 7300GT, with a low clock of 350 MHz, provides little acceleration, while the lower-class 7300LE at 450 MHz would provide better acceleration. ATI apparently does this processing in the shaders (stream processing). This is what you are looking to explore with part 2 of the article. While video acceleration is an important topic, as CPUs become more powerful it becomes less important. Just like when DVDs were released: the Pentium II 400 could not decode them without help, but eventually CPUs became fast enough that acceleration didn't matter anymore. More important to me is whether or not the cards can handle 1080i source material with the same quality as they did in your HQV benchmark.

This brings me to video processing acceleration. Here it becomes important to know the difference between interlaced and progressive scan images. With a progressive scan image like HD broadcast 720p, all the information for a frame is present, so the video card and CPU simply decode the codec and scale the image to fit the screen, and the image is displayed in full quality. Some scaling is better than others - nVidia has an advanced scaling algorithm as part of PureVideo - but for the most part this is easy on the video card and CPU. HD-DVD and Blu-ray are also progressive, but the CPU overhead is significant due to the advanced codecs and high bitrates. With these progressive images, little processing is required for the video.

Interlaced content is more difficult for the GPU to display. In interlaced images, the full frame is stored in two different fields. It is the job of the video card to restore these fields into a progressive frame that is then displayed on your monitor or progressive scan TV. There are several ways to do this, some easier than others. As is always the case, the easy ways, like bob and weave, do not provide superior image quality. The test DVD that you used tested these algorithms. The software solution uses the simplest algorithm and hence does not do well in the benchmark. The ATI and nVidia cards use specialized hardware to apply advanced algorithms and score well. Where this processing is done is unknown to me. I am assuming it is done in the shaders, which would explain why faster 3D cards handle higher resolutions. Clock speed does not seem to make a huge difference here. All PureVideo and Avivo video cards, from the 6150 and X1300 to the 8800 and X1900XTX, can do these algorithms on 480i DVD material.
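
For anyone who hasn't run into those two algorithms, here is a minimal numpy sketch of both (my own illustration; the cards do motion-adaptive variants of this in dedicated hardware):

Code:
import numpy as np

def weave(top, bottom):
    # Interleave the two fields back into one frame: perfect for static
    # scenes, but moving objects comb because the fields were captured
    # 1/60 s apart.
    frame = np.empty((top.shape[0] * 2, top.shape[1]), dtype=top.dtype)
    frame[0::2] = top
    frame[1::2] = bottom
    return frame

def bob(field):
    # Line-double a single field into a full frame: no combing, but half
    # the vertical resolution is thrown away.
    return np.repeat(field, 2, axis=0)

top = np.zeros((540, 1920), dtype=np.uint8)     # 1080i top field
bottom = np.zeros((540, 1920), dtype=np.uint8)  # 1080i bottom field
print(weave(top, bottom).shape, bob(top).shape)  # both (1080, 1920)

The advanced algorithms are essentially smarter ways of choosing, per pixel, between weaving where nothing moves and interpolating where something does.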

I can tell you from experience that a 6150 and a 7300LE cannot deinterlace 1080i material with 3:2 pulldown and inverse telecine turned on. They have no problem with 720p and 1080p material. With the inverse telecine and 3:2 pulldown options buried in the options menus, it is difficult and impractical to switch them on and off depending on source material. Over-the-air (OTA) and cable (QAM) high-def video is more widespread than HD-DVD and Blu-ray on HTPCs. It would be more helpful to me, and probably a lot of other people out there, to know which video cards can handle HD material with more than simple bob-and-weave deinterlacing, how much of a difference it makes in picture quality, and how to enable it on their cards.

Also, thank you very much for pointing out how to enable 3:2 pulldown and inverse telecine. This information is not well known and will help many users. Now, help us figure out how to use it properly.
January 18, 2007 5:20:21 PM

Quote:
I'll be doing a follow-up article dealing with HD in the next month or two.

The short answer is that an X1600 is supposed to be able to hardware-accelerate 720p; I think you need an X1950 PRO or X1900 GT to accelerate 1080i. Ati cards use shaders on the GPU to accelerate decoding, so they need fast GPUs to do the high-end stuff.

A 7600 GT is supposed to be fast enough to accelerate 1080i because it does it on a separate chip on the card, not the GPU. It depends more on raw clock speeds than the class of GPU.

I'll try to verify all of this stuff in the review.


I would like to believe that, but as another user already stated, his overclocked 7300 was not capable of HD inverse telecine. Aside from that, if you look at the feature set provided by nVidia
Purevideo Table
you will see that the feature set is much greater on the 7600GT vs. the 7300GS, yet the 7300GS has virtually the same clock speed: 550 MHz vs. 560 MHz for the 7600GT. The reasoning that it's an external chip driven by the core clock speed doesn't explain the gap in performance.

I am in the process of building a new HTPC to handle HD content. My prior machine was a Mobile Northwood Celeron @ 2.66 GHz in a p865 MB with a 6600GT. While this setup was OK, of course the CPU peaked during HD playback.

The new rig I am building will revolve around an overclocked E4300 and a 7600GT (this could be a mistake, but the price was right). My question is how the CPU affects dropped frames. The CPU seems to do most of the work. If the CPU is the bottleneck, frames should be dropped. Does the video card introduce another bottleneck now? Could the CPU be at 40% and I'd still be dropping frames because the GPU is maxed? If so, considering the power of today's multicore CPUs, couldn't we expect software decoders to do just as good a job in the near future?

By the same line of reasoning: if I had a powerful CPU like a QX6700 with a weak GPU like a 6200TC, I would be able to encode HD video faster than play it back. That doesn't seem right.
January 19, 2007 3:57:46 PM

Quote:
Nvidia cards no longer need to use the Purevideo decoder for hardware acceleration; they now work with any decoder that supports hardware acceleration, like the ones that come with PowerDVD or Showtime...


Hi, a few comments/questions:

1) Why would one use the PowerDVD or WinDVD MPEG2 decoder when nVidia's PureVideo decoder is available? Is it better in any way? Assume for a moment that the reader, like me, has no interest in running PowerDVD or WinDVD, but rather prefers Zoom Player, TheaterTek, the Microsoft Media Center application, or any other player chosen based on how it performs as a player, to be paired with a separately installed MPEG2 decoder. Sure, you could install Nero to get an MPEG2 decoder for your Zoom Player or MCE application, but why would you, when there's the PureVideo decoder, touted for so long as being so good by so many (at least when paired with an nVidia video card)?

2) The nVidia control panel settings you need to enable, now that you for some reason chose to use PowerDVD for the comparison - do they also need to be enabled when using the nVidia decoder instead? In the Media Center application, for instance, for those running XP MCE. Does the detail enhancement setting work with the PureVideo decoder?

3) Isn't the test supposed to be about correctness, fidelity? Then how can artificially adding stuff not in the original image with a detail slider improve the score?
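
My guess is the detail slider is doing some flavor of unsharp masking, which is exactly why it worries me fidelity-wise. A sketch of the idea (pure assumption on my part about what the slider does, using numpy and scipy):

Code:
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp(img, sigma=1.5, amount=0.7):
    # Subtract a blurred copy to isolate edges, then add a scaled version
    # back. The "extra detail" is synthesized, not recovered.
    blurred = gaussian_filter(img.astype(np.float32), sigma)
    return np.clip(img + amount * (img - blurred), 0, 255).astype(np.uint8)

frame = np.random.randint(0, 256, (480, 720), dtype=np.uint8)
sharper = unsharp(frame)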

The reason I ask these questions is because I'm making these assumptions:

1) Video quality only matters if you have HTPC ambitions. Watching clips or something on your gaming or workstation rig is not done with the same mindset as watching a film on your projector or plasma from your sofa.

2) If you do have HTPC ambitions, you would have read (or been told about) all the old tests saying "yaay, PureVideo!", bought yourself an nVidia card, and installed the PureVideo decoder.

3) If you had the PureVideo decoder, a PureVideo video card, and a PC in your living room, and had read up a little, chances are you'd be using a player capable of VMR9 "exclusive mode" - Zoom Player, TheaterTek, or the MCE application.

From that standpoint, combining an nVidia card with anything other than the PureVideo decoder seems strange.

I would have preferred to see an nVidia card with the PureVideo decoder in exclusive mode (since that is what is currently considered "the way to do it") tossed against an ATI card using whichever decoder gives the best results with ATI cards, in whatever mode and player makes the best of that ATI/decoder combo.
January 20, 2007 5:15:38 AM

Quote:

I am in the process of building a new HTPC to handle HD content. My prior machine was a Mobile Northwood Celeron @ 2.66 GHz in a p865 MB with a 6600GT. While this setup was OK, of course the CPU peaked during HD playback.


You must have something set up wrong. I can do broadcast MPEG2 HD content with a 1200 MHz Athlon XP and a 9600XT on a 1080p monitor. Are you talking about HD-DVD or Blu-ray? If that is the case, then you will need an HDCP video card to use the digital outputs, and most 7600GTs are not HDCP. With Blu-ray or HD-DVD content there is no deinterlacing; it is already 1080p/24. Inverse telecine is not needed, and any video card will be able to display it using VGA. For these formats a 7300GS would work well because the clock is high at 550 MHz, but if you also want inverse telecine for broadcast TV, then a 7600GT is needed.
January 20, 2007 2:03:45 PM

Quote:
I am in the process of building a new HTPC to handle HD content. My prior machine was a Mobile Northwood Celeron @ 2.66 GHz in a p865 MB with a 6600GT. While this setup was OK, of course the CPU peaked during HD playback.


You must have something set up wrong. I can do broadcast MPEG2 HD content with a 1200 MHz Athlon XP and a 9600XT on a 1080p monitor. Are you talking about HD-DVD or Blu-ray? If that is the case, then you will need an HDCP video card to use the digital outputs, and most 7600GTs are not HDCP. With Blu-ray or HD-DVD content there is no deinterlacing; it is already 1080p/24. Inverse telecine is not needed, and any video card will be able to display it using VGA. For these formats a 7300GS would work well because the clock is high at 550 MHz, but if you also want inverse telecine for broadcast TV, then a 7600GT is needed.

I have various 1080i/p content in various formats and, true, I am able to play back some very easily, but not all. For example, 1080 WMV HD stuttered and the CPU was maxed out. This was better on a faster computer, but the CPU was still maxed. I don't anticipate needing to play back HD-DVD or Blu-ray on this PC, so HDCP isn't important to me.

Typically for most video playback I use a Windows port of MPlayer;
alternatively I use Zoom Player with various codecs installed, VMR9.

While these work for 99% of my content, it's the 1% that manages to irritate me enough to throw money at it.
January 21, 2007 3:28:30 PM

You are not getting good WMV playback because nVidia broke the drivers and disabled all acceleration except MPEG2 on AGP cards. I had the same problems with my old 1.2 GHz machine and 9600XT. I thought you were not getting good MPEG2 HD playback. I upgraded to a Sempron 2600+ and a PCI-e 7300GS, and I can play any HD material except HD-DVD and Blu-ray. You might be able to get away with just buying a motherboard with PCI-e.
January 30, 2007 1:03:11 PM

Hello people,

I have an X1650 Pro AGP, and in my humble opinion ATI Avivo is a pure joke.

The encoder is worthless, and the decoder - well, if you like watching DVDs it is OK, like this review shows...

If you like any HD content like WMV/H.264, you'll be hitting not a brick wall but a steel one.

I've been trying to get WMV/H.264 acceleration to work for nearly 2 months with no result.

In my opinion ATI cards CAN NOT accelerate HD content.
I've tried with a gazillion driver versions and DVD players (WinDVD/PowerDVD/Nero ShowTime etc. etc., even ATI's (old) decoder hooked into WMP).

None of them even use DXVA; in PowerDVD you can enable HW acceleration, play an H.264 file, and notice it is NOT using DXVA at all. Same with all the other players....
You can check CPU usage when playing 1080i or 720p files and notice no difference with acceleration enabled or disabled. Same with GPU temperatures..
This is on a 2.3 GHz Athlon XP.
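
If you want to reproduce that check, here's roughly what I mean (a sketch assuming Python with the third-party psutil package; the Task Manager CPU graph tells you the same thing):

Code:
import psutil

# Sample overall CPU load once a second while the clip plays, then compare
# the average between a run with acceleration enabled and one without.
samples = [psutil.cpu_percent(interval=1.0) for _ in range(30)]
print("average CPU load: %.1f%%" % (sum(samples) / len(samples)))

If the GPU were really doing the work, the average should drop clearly between the two runs. For me it doesn't move at all.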

The only files I got DXVA to work with were plain MPEG2 VOB files.

So if you've got a slower CPU that's not able to decode 1080 files, totally DO NOT get an ATI. Sadly I don't have an nVidia AGP card with H.264 acceleration to try PureVideo with, but I'm guessing that will work better, unless they're selling air as well.

Now I'm stuck on this rig, with 95% CPU usage decoding 1080i H.264 files, with some hits into 100% CPU, so it's just not fluid, and ATI can't do shit for me. Actually, I can play 1080 almost fine using the CoreAVC decoder, while with ATI's decoders I can't even do 720p most of the time.. :( 

Also, WMV acceleration is a JOKE. It's the only thing I got to accelerate anything with, but if you compare the colors when accelerated with the original ones, you'll soon notice that ATI just tremendously degraded quality in order to get more speed.
I wonder if the GPU is even decoding anything or if it's just a cheap trick to show less CPU usage. (If you don't believe me, get a WMV like Step Into Liquid, enable WMV acceleration, make sure you've got a patched WMP10, or WMP11, and play it - you'll see what I mean.)

It would be great if Tom's Hardware could review the HD decoding features....
And if you guys get it to work, at least tell us what versions of the decoders etc. you used, because I gave up weeks ago trying to get it to work.

/edit:
Here's another review page showing the WMV problems. They have been there for years already - I think I even saw a page from 2004 (!) noting the bad quality, and still nothing has been fixed:
http://www.computerbase.de/artikel/hardware/grafikkarte...
It's in German, so I hope you can decode a bit of it :p 

Other reports mention that acceleration only works on CPUs with SSE, but even people with Intel CPUs are not getting it to work.
Some blame AGP, or nForce GART drivers, or whatever... beats me.
All I know is that I haven't met a single person whose acceleration actually works. Most people have such fast CPUs that they don't need the acceleration, or use players that don't have acceleration..
January 30, 2007 4:46:27 PM

My previous post addressed some of the confusion about nVidia's "PureVideo" vs. "PureVideo Decoder". All the features of "PureVideo" are available with the latest drivers from nVidia, provided whatever player you choose can enable hardware acceleration. This is what the article stated. PowerDVD, WinDVD, or whichever player, were comparisons against a software decoder used in the days of old (when hardware acceleration was not available or was limited). The article also showed how much better the hardware decoding was (for both ATI and nVidia) compared to the software solution, i.e. enable hardware acceleration if you can.

The "PureVideo Decoder" is now essentially a plugin for WMP so that WMP will use hardware acceleration. And since you theoretically need to purchase the decoder, it is in effect on the same level as other players requiring purchase. Many people have a version of PowerDVD, WinDVD, Nero, or ??? that came with some hardware they purchased, or they bought the software.
February 9, 2007 7:44:38 PM

I would hurry and get round 2 done. If reports are correct, both cards and drivers will score 0.

With the crap in so-called "HD" sets out there and everything else getting on the "HD" bandwagon, "HD Ready" is coming to mean about as much as "Compact Disc Ready" :roll: - buyer beware. :(  I thought HD and HD labeling were supposed to be more strict in their requirements.

PS: Based on the video card reports and many HD sets, including expensive ones, I am beginning to understand why people think 1080p30 is SO much better than 1080i60. Most of the processing for 1080i60-to-1080p30 conversion is garbage, even at just combining the fields. No processing of progressive images is required, and a refresh rate of 30 is easier and a little cheaper, so everyone takes the easy route, and interlaced images can look like crap.
February 10, 2007 9:41:08 PM

Hi folks, interesting thread we've got here. Interesting enough to get me registered on this board, so I'm glad to meet you all ;) 

Here's the brief experience I've had on this subject.
My setup is a SkyStar 2 based PC with a 19" SXGA display. Initially I had a 6200TC (350 MHz, 128MB onboard) card, which was later replaced by a fanless 6600GT.
On my laptop I have a 7300TC card, so I could use the PureVideo acceleration too.
I'm in Europe, so most OTA stuff I watch is probably PAL? Not so sure about that, but anyway. Most HD content (1080i) over here is H.264.

Basically, my observations are that I can't see any picture quality or deinterlacing quality difference between my 6200TC and my 6600GT.
I have used loads of codecs/decoders, and pretty much all the drivers from the 8X.XX and 9X.XX series.
I can't see any difference either between the PureVideo decoder and the latest CyberLink 7 codec.
Is this because CyberLink can use my GeForces' VPU in exactly the same way as the PureVideo codec? I didn't think they looked the same in early 2006, but for the past few months I've been getting my eyes stuck to my screen trying to notice any difference whatsoever, in vain.
As a matter of fact, the only difference I see between the PureVideo/CyberLink codecs and a couple of software-only codecs (DScaler, Gabest open source mpeg2dec) is slightly better sharpness on the PureVideo/CyberLink. Deinterlacing looks the same too, if I set the Gabest codec to bob only and no blend.
I do not use PureVideo anymore, as it causes too much judder/frame pulsating for me.
Obviously, there's a difference in CPU load when using a 100% CPU-based software codec, but my setup is a Conroe E6600 with a P5B-DLX and 2GB RAM, so that's not that important.
This, for me, is true for both SD and HD in MPEG2. The PureVideo codec gives me too much stuttering on MPEG2 1080i, whilst the software codecs run smoothly, but with a higher CPU load.

Voilà, my 2cents...
February 12, 2007 12:15:17 PM

Quote:
I would hurry and get round 2 done. If reports are correct, both cards and drivers will score 0.

With the crap in so-called "HD" sets out there and everything else getting on the "HD" bandwagon, "HD Ready" is coming to mean about as much as "Compact Disc Ready" :roll: - buyer beware. :(  I thought HD and HD labeling were supposed to be more strict in their requirements.

PS: Based on the video card reports and many HD sets, including expensive ones, I am beginning to understand why people think 1080p30 is SO much better than 1080i60. Most of the processing for 1080i60-to-1080p30 conversion is garbage, even at just combining the fields. No processing of progressive images is required, and a refresh rate of 30 is easier and a little cheaper, so everyone takes the easy route, and interlaced images can look like crap.


Wow, I have NO idea what you're talking about. Are you telling me now there are 1080i30 AND 1080i60 versions??? Now I have to worry about buying a monitor/TV that supports the refresh rate I want, as well as the HD version I want??? ARGHHH!!!
February 12, 2007 1:26:27 PM

I think you read it wrong, mate...

1080i = 60 fields per second, 1080p = 30 frames per second
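
If it helps, the raw arithmetic (a quick Python back-of-the-envelope):

Code:
# 1080i60 and 1080p30 carry the same raw line rate; the difference is how
# it arrives (60 half-frames per second vs. 30 full frames per second).
print(60 * (1080 // 2))  # 1080i60: 60 fields/s x 540 lines  = 32400 lines/s
print(30 * 1080)         # 1080p30: 30 frames/s x 1080 lines = 32400 lines/s

The catch is that those 60 fields are snapshots of 60 different instants, so the display has to deinterlace them - which is exactly the processing the posts above complain about.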
February 12, 2007 1:34:31 PM

lol, oops, thanks for setting me straight. :)  I still don't understand his point, though. Is he saying 1080i could look just as good if the processing were better?
February 12, 2007 1:59:15 PM

He did write 1080i60 and 1080p30; you need to read it right :) 

I think he's saying that 1080p is a lot "better" than 1080i, as there's no deinterlacing to do.
With progressive images you have no artifacts like jaggies, poorly blended fields, etc...
Interlacing is 1950s technology.
February 13, 2007 4:39:08 PM

I am saying it CAN look as good if the processing is better, AND depending on the original source video's frame rate.
May 7, 2007 10:56:54 PM

Well, I've been reading the first Avivo vs. Purevideo roundup, prior to the revised version.

In the forum I posted several deficiencies of that first roundup. Unfortunately I can't find my post anymore to quote it, and I even think it got deleted because of the revised version... Anyway:

If you are interested in issues like:

. Avivo vs. PureVideo vs. PureVideo HD regarding H.264 and VC-1
. power consumption when offloading decoding to the GPU
. tool support and maturity
. requirements on SSE capabilities (actually only answered indirectly)

just go ahead and read this article - it's all answered:

http://anandtech.com/video/showdoc.aspx?i=2977

cu,
7oby
May 8, 2007 9:11:07 AM

Yup, I had read it.

Great improvements have been made in the GeForce 8 series (apart from the 8800 - remember the 6800s, anyone?)

It's a shame CoreAVC is still left out of the battle; I'm sure it would have had some interesting things to say!