
Use CPU to do some GPU tasks?

Last response: in Graphics & Displays
June 4, 2012 7:00:22 AM

Hi,
I had a post a few weeks ago asking about possibly upgrading my PC. I now know that this is quite possible, but I only want to do it as a last resort, considering I still have a year's warranty that would be voided.
I have a good CPU (i7-2600) but a bad GPU (ATI Radeon HD 6450). When gaming, the CPU is only at about 20% load, so I'm wondering: can I use my CPU to do some of the GPU's processing?
It seems a waste to have my CPU just sitting at 20% when, as far as I know, it's possible to have it do some processing for the GPU. Maybe have a few cores do some extra GPU work and leave the rest for the CPU? Just a tiny boost, that's all. Specifically, it would be best if this worked with DX11 games, because enabling DX11 cripples my frame rate in most games to about 10-15 FPS.
Thanks for any answers you guys have.


June 4, 2012 7:09:13 AM

Not really. Your HD 6450 is actually way faster than your i7-2600 at graphics work. There's a good reason we have graphics cards, rather than CPUs, doing graphically intensive work.



The HD 6450 has around 250 GFLOPS. Your i7-2600 has around 70-80 GFLOPS (don't go by the "Sandy Bridge" entries in that chart; those are high-end dual-Xeon systems).
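For a rough sense of where numbers like these come from, peak single-precision throughput is often estimated as execution units × clock × FLOPs per unit per cycle. A quick sketch (the unit counts, clocks, and FLOPs-per-cycle figures below are illustrative assumptions, not measured values):

```python
def peak_gflops(units, clock_ghz, flops_per_cycle):
    """Rough theoretical peak throughput in GFLOPS."""
    return units * clock_ghz * flops_per_cycle

# HD 6450 (GDDR5 model): ~160 stream processors at ~0.75 GHz,
# each able to issue a multiply-add (2 FLOPs) per cycle.
gpu = peak_gflops(160, 0.75, 2)

# i7-2600: 4 cores at 3.4 GHz; assume ~8 single-precision
# FLOPs per core per cycle as a rough ballpark.
cpu = peak_gflops(4, 3.4, 8)

print(gpu)  # 240.0
print(cpu)  # 108.8
```

Theoretical peaks like these overshoot measured benchmark figures (which is why the quoted CPU number is lower), but the gap between the two chips is the point.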

The reason we don't use GPUs for everything is that they're weak at single-threaded work. They shine at massively parallel workloads, though, and that's exactly what you need to display graphics.

Quote from NVIDIA:

The reason behind the discrepancy in floating-point capability between the CPU and the GPU is that the GPU is specialized for compute-intensive, highly parallel computation – exactly what graphics rendering is about – and therefore designed such that more transistors are devoted to data processing rather than data caching and flow control.

More specifically, the GPU is especially well-suited to address problems that can be expressed as data-parallel computations – the same program is executed on many data elements in parallel – with high arithmetic intensity – the ratio of arithmetic operations to memory operations. Because the same program is executed for each data element, there is a lower requirement for sophisticated flow control, and because it is executed on many data elements and has high arithmetic intensity, the memory access latency can be hidden with calculations instead of big data caches.
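To make the quote concrete, here's a toy sketch of "the same program executed on many data elements in parallel". The `shade` function is a made-up stand-in for a pixel shader; Python threads won't actually speed up this arithmetic, so this only illustrates the structure of the work, not GPU performance:

```python
from concurrent.futures import ThreadPoolExecutor

def shade(pixel):
    # Stand-in for a pixel shader: pure arithmetic, no flow control.
    return pixel * 0.8 + 0.1

pixels = [p / 255 for p in range(256)]

# Serial (CPU-style): one element after another...
serial = [shade(p) for p in pixels]

# ...versus the same kernel mapped across workers (GPU-style, conceptually).
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(shade, pixels))

assert serial == parallel  # same result; only the execution strategy differs
```

Because every element runs the identical kernel with no branching, the work can be split across as many execution units as exist; that is the "high arithmetic intensity, low flow control" property the quote describes.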
June 4, 2012 7:22:51 AM

Exactly as sunius said.
Additionally, all game software will already have its graphics data in a pre-optimised format (e.g. block data, pre-calculated tables, etc.), so there's not much else you can do in this respect.

Of course, upgrading your GPU would be the way to go.
June 4, 2012 7:27:30 AM

OK, thanks. I think I've just learned something today XD
I'm just a little too scared to upgrade my GPU right now... maybe in the future. One last thing: I came across something called SwiftShader. Before I download and try it, can anyone tell me whether it's likely to work? Thanks.
June 4, 2012 7:49:13 AM

Interesting application! Kudos to you on your research ;) It's new to me (and to most others, I'm sure).

From a quick look:

It's CPU-based, but it only benefits very specific applications.

From the site:

-----------------------------------
2) Exactly how fast is SwiftShader?
SwiftShader performance can vary significantly depending on the exact CPU and system being used as well as the application being tested. For example, on a modern quad-core Core i7 CPU at 3.2 GHz, the SwiftShader DirectX® 9 SM 3.0 demo scores 620 in 3DMark06, surpassing most integrated graphics hardware.

To see how SwiftShader performs for your application, download the demonstration version and give it a try!


3) What makes SwiftShader so fast?
SwiftShader uses runtime compilation techniques to dynamically build exactly the right code needed to run the graphics commands that a game or application requires. That code is then cached so that the next time the same set of graphics commands is required, it can simply be executed immediately. Beyond that, SwiftShader uses highly advanced graphics algorithms to squeeze every last drop of performance out of the CPU cores available on your system. Please see our SwiftShader Technology page for more information.

-------------------

So, "scores 620 in 3DMark06, surpassing most integrated graphics hardware" tells us that it's definitely not going to approach your current gfx card in game performance.

By all means give it a go and if it improves your gameplay fps do let us know!! However, it's still safe to say that no cpu-based method can ever approach a dedicated gfx card - for gaming!

...but there may be trickery or magic afoot with this application ;) 
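For what it's worth, the "build the code once, cache it, reuse it" idea from the FAQ can be sketched in miniature. This is purely conceptual; the function names and the trivial "kernels" here are invented for illustration, whereas the real product generates specialized machine code for each combination of render state:

```python
_kernel_cache = {}

def get_kernel(render_state):
    """Return a function specialized for this render state,
    building and caching it on first use."""
    key = tuple(sorted(render_state.items()))
    if key not in _kernel_cache:
        # The expensive step: "compile" code for exactly this state.
        if render_state.get("fog"):
            _kernel_cache[key] = lambda colour: colour * 0.5
        else:
            _kernel_cache[key] = lambda colour: colour
    return _kernel_cache[key]

first = get_kernel({"fog": True})   # compiled and cached
second = get_kernel({"fog": True})  # cache hit: reused, no rebuild
assert first is second
```

The payoff is that the per-draw cost drops to a dictionary lookup once a given set of graphics commands has been seen.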
June 4, 2012 8:10:22 AM

As far as I know, upgrading your GPU does not void your warranty. Upgrading memory doesn't, and it's just as easy. You'd have to check, of course. Depending on your PSU, you might need to upgrade that along with the card. But really, sitting around not enjoying your computer while waiting for the warranty to run out is just not the way to go.
June 4, 2012 8:12:10 AM

Getting a playable framerate is more important than image quality. The 6450 is still able to handle plenty of games on lowered settings/res. Before going out and buying a new video card, at least give that a shot. A game might still be enjoyable without having all the eye candy turned on.
June 4, 2012 3:43:50 PM

Hi, thanks for the replies.
I tried SwiftShader, and it's just a .dll plus a readme telling me to put it in my game's folder. I'm not sure if it's working, though - I've noticed completely NO difference whatsoever.

Also, as far as the warranty goes, I never actually read the booklet, but there's a sticker over the part you remove to open the system that says "Read warranty before opening." I assumed that meant opening it voids the warranty - it looks like one of those stickers that prints "Void" over the metal, except instead of "Void" it says "Acer".

And I would say everyone seems to think this GPU is an absolute disaster, going by reviews and what people have said here and elsewhere. Actually, it's not all that bad: I can run GTA IV at 25-30 FPS on medium settings at 1366x768, possibly higher with some tweaking I haven't been bothered to do yet. I can run TF2 at 1920x1080 with everything on MAX settings... so it's not all that awful. I'd just like to not have to worry about "Oh, it passes minimum but not recommended - should I buy the game? Hmm..." It also, surprisingly, runs Combat Arms at a stable 80+ FPS on MAX settings at 1920x1080. The card isn't that bad.

Felt like a rant there, lol!
Thanks for the help, I'll read the warranty now.
June 24, 2012 10:48:39 AM

You have to realize that a lot of people who bash lower-end cards are just trying to convey that they own better cards, whether they really do or not. The thing is, most people play games like TF2/Sims/WoW and not the games that review websites run (with the possible exception of BF3), and the latest-gen low-end cards will handle those just fine. As you said, the 6450 is not a bad card at all as long as you're fine with lowered settings.
June 24, 2012 4:48:27 PM

If - and I do mean "IF" - you want to upgrade your GPU, what is your budget?
Also, why pair an i7 with a low-end GPU?
You could have gone with a 2500.
Or is it an OEM PC?
June 26, 2012 3:46:27 PM

I bought the PC from a store; I've never built my own. TBH I'm 14, but I know enough, and I'm starting a project to build a PC with my friend this summer.
I'm happy to use slightly lowered settings, and this card actually handles TF2 at MAX settings at 1080p at 30-40 FPS, so...
But anyway, thanks :) 
June 26, 2012 5:31:08 PM

So, this PC was built for gaming??
June 26, 2012 6:27:56 PM

Not according to the box, but it does better than most pre-built PCs at its price or above that you can buy in a shop. Especially when VAT is 20% here.
June 26, 2012 6:32:32 PM

You could have gone with a 2500/k and a better GPU IMO :/ 
December 9, 2012 7:05:52 PM

themegadinesen said:
You could have gone with a 2500/k and a better GPU IMO :/ 


Read the thread, didn't build it myself...