Bulldozer & GPU for FPU speculation

speedemon

I've been thinking about AMD's current strategy and why they cut back the FPU on their current CPUs. Software workloads in general are moving toward parallel designs; they're not there yet, but they're getting there.
Could AMD be positioning themselves to "compete" with or even beat Intel on the integer front (where they previously lagged badly) while largely disregarding the FPU because they plan to move FP workloads over to the GPU?
If that happens, I think AMD would have a VERY big advantage in floating-point arithmetic while still being able to compete with Intel on the integer front.
Thoughts? And forgive me for not searching Google before starting this thread.
 
^^ Here's the issue: while it makes some sense, you have to consider that GPUs are built specifically for heavily threaded, data-parallel tasks, like rendering. For anything else, they are VERY weak. So while using the GPU as a giant FP engine sounds like a good idea, if the software doesn't take advantage of the GPU's design, you can kill FP performance by offloading to the GPU.
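
To put that in code, here's a rough CUDA sketch (toy workloads I made up, not from any real benchmark) of the two kinds of FP work: one that maps well onto a GPU and one that doesn't.

Code:
#include <cstdio>
#include <cuda_runtime.h>

// Data-parallel FP work: every element is independent, so thousands of GPU
// threads can run it at once. This is the kind of workload a GPU "FP engine"
// is actually good at.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

// Serial FP work: each iteration depends on the previous result, so there is
// nothing to spread across GPU threads. Offloading this would just add
// transfer and launch overhead on top of one thread's worth of math.
float serial_chain(int n, float x) {
    float acc = 1.0f;
    for (int i = 0; i < n; ++i)
        acc = acc * 0.999f + x;   // dependent chain: no parallelism to exploit
    return acc;
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // GPU-friendly: one kernel launch covers a million independent elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();
    printf("data-parallel result: y[0] = %f\n", y[0]);

    // CPU-friendly: a single dependent chain stays on the CPU's FPU.
    printf("serial-chain result:  %f\n", serial_chain(n, 0.001f));

    cudaFree(x);
    cudaFree(y);
    return 0;
}

The SAXPY kernel spreads across thousands of threads, but the dependent chain can't, so software that looks like the second case gets nothing out of a GPU no matter how much raw FP throughput it has.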

Also, this technique is useless in any task that also requires the GPU to fulfill its traditional role [games, for instance].

I'll say it again, though: BD's design is more suited to server workloads than general desktop computing.