I've been reading a lot of posts asking for suggestions on a good GPU for HD video playback, and people always respond with this card or that card...
Now, I have a computer science degree from the University of Illinois with a specialization in graphics, and these threads have me doubting my education. I was taught that for the GPU to offload any kind of processing from the CPU, the software has to issue commands to it through an API such as OpenGL, DirectX, or CUDA.
Every time I watch an HD movie in VLC or WMP, the GPU temperature never increases, which suggests these players aren't using the GPU at all. So my gripe is: why are there a million threads asking for a good GPU when people should really be asking for a good software player? Any GPU will suffice for video decoding; it's a dead simple workload, and you will not notice a difference between an 8800 GT and a GTX 260 decoding video. What really makes the difference is the software driving those cards.
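To make the point concrete, here's roughly what that explicit opt-in looks like using a modern FFmpeg/libavcodec build. This is purely an illustrative sketch under my own assumptions: the function name open_hw_decoder is mine, this API didn't even exist in this form back in the 8800 GT era, and I'm not claiming VLC or WMP does it exactly this way.

/* Rough sketch: a player must explicitly attach a hardware decode
 * device (DXVA2 on Windows here) before the GPU does any work.
 * Illustrative only -- not any real player's code. */
#include <libavcodec/avcodec.h>
#include <libavutil/hwcontext.h>

AVCodecContext *open_hw_decoder(void)
{
    const AVCodec *dec = avcodec_find_decoder(AV_CODEC_ID_H264);
    AVCodecContext *ctx = avcodec_alloc_context3(dec);
    AVBufferRef *hw_dev = NULL;

    /* Ask the driver for a DXVA2 decode device; if the GPU or driver
     * can't provide one, the player falls back to software decoding. */
    if (av_hwdevice_ctx_create(&hw_dev, AV_HWDEVICE_TYPE_DXVA2,
                               NULL, NULL, 0) < 0) {
        avcodec_free_context(&ctx);
        return NULL;
    }

    /* Attaching the device context is the explicit "use the GPU" step. */
    ctx->hw_device_ctx = av_buffer_ref(hw_dev);
    av_buffer_unref(&hw_dev); /* ctx now holds its own reference */

    if (avcodec_open2(ctx, dec, NULL) < 0) {
        avcodec_free_context(&ctx);
        return NULL;
    }
    return ctx;
}

Note the failure path: if the device creation fails, or if the player never makes this call in the first place, everything silently runs on the CPU. That's exactly why the GPU can sit idle and cool during playback no matter how expensive the card is.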