AMD-ATI to make a GPU on a CPU

Or a CPU on a GPU

By Fuad Abazovic: Tuesday 15 August 2006, 08:13
http://theinquirer.net/default.aspx?article=33678


ENGINEERS from AMD and ATI are set to start work on a unified chip that will have the GPU and CPU on the same silicon. We have learned this, more than once, from high-ranking sources close to the companies.

Don’t get too excited as it will take at least eighteen months to see such a dream come true.

This is the ultimate OEM chip, as it will be the cheapest way to get the memory controller, chipset, graphics function and CPU onto a single chip. This is the ultimate integration, as it will decrease the cost of the platform and make even cheaper PCs possible.

CPUs are being shrunk to a 65 nanometre process as we speak, and the graphics guys are expected to migrate to the same process next year. The graphics firms are still playing with 80 nanometre but will ultimately move to 65 nanometre later next year.

DAAMIT engineers will be looking to shift to 65 nanometre if not even to 45 nanometre to make such a complex chip as a CPU/GPU possible.

We still don’t know whether they are going to put a CPU on a GPU or a GPU on a CPU, but either way will give you the same product. µ
-------------

Like oh my god that is the best idea ever. Of course the Inquirer found out first because they are like so totally great.
  1. Well, I suppose this is good for budget system builders, since a unified chip should decrease overall costs. It will probably make a good embedded CPU as well. But it will have to fit in either Socket AM2 or Socket AM3 to be viable. A separate socket just for a unified chip is simply not feasible, but I am sure AMD knows that.

    It might also make a good mobile unified chip as long as power consumption is low enough and heat dissipation is modest.

    For the enthusiast, the chip means nothing. In fact, enthusiasts would avoid it like the plague, since the extra non-essential circuitry for the chipset and GPU will add to build cost. Overclocking will likely be a problem as well.
  2. Never in my life will I take anything The Inquirer says seriously. Never
  3. Quote:

    For the enthusiast, the chip means nothing. In fact, enthusiasts would avoid it like the plague, since the extra non-essential circuitry for the chipset and GPU will add to build cost. Overclocking will likely be a problem as well.

    So very true. I would steer clear for as long as possible if faced with that technology.

    -Miz
  4. Quote:
    AMD-ATI to make a GPU on a CPU

    Or a CPU on a GPU

    …



    Verndewd posted this link a couple of days ago. Look at page 10(??)
    http://www.ati.com/companyinfo/about/AMD_ATI_Investor_Presentation.pdf

    Oh, never trust anything from the Inquirer. Only trust the Enquirer.

    http://img119.imageshack.us/img119/9388/july18issuebh7.th.jpg


    Peace
  5. Seems like a pretty good idea, except for two things: you can't upgrade the CPU and GPU separately, and placing the GDDR would be tricky, because it can't really go into the processor itself... maybe modules?
  6. Quote:
    Seems like a pretty good idea, except for two things: you can't upgrade the CPU and GPU separately, and placing the GDDR would be tricky, because it can't really go into the processor itself... maybe modules?


    Yes,

    The notion of an integrated CPU/GPU has been batted around in the THG forums since rumours of the AMD/ATI merger surfaced. Many people are enthralled with the idea of the GPU on the CPU die, but don't really understand the technical problems with making it happen (the GDDR, for one) or the end-user limitations (upgrading components). They are so focused on the thought of a cheaper component and potentially improved graphics performance that they don't realize the negative aspects.

    Peace
  7. I don't know, you just can't trust the Inquirer at all. But it would be cool, and it will probably happen someday.
  8. Quote:
    …they are so focused on the thought of a cheaper component and potentially improved graphics performance that they don't realize the negative aspects.

    And the negative aspects are...

    Technical problems need to be overcome, yes?
  9. Think outside of the enthusiast mindset: think of what a CPU/GPU on the same die would do for appliances and electronic devices... think of how it would apply to a multimedia cell phone, PDA, or BlackBerry-type device. This type of technology could actually make tablet PCs a reality, something no bigger than a notepad but as powerful as any eMachines box, and possibly make current-day laptops obsolete...

    Quite a number of possibilities and opportunities for AMD/ATI with this idea... just remember, while Texas Instruments may not make desktop PCs anymore, they are one of the biggest chip makers and chip sellers in the world...

    I look forward to seeing what magic they pull out of their hats with this...
  10. I can see benefits from having an integrated solution.

    It could offer a wider variety of...

    You know what? I changed my mind. I was gonna say that it would offer easy upgrades as a packaged unit.

    But then I see, at the same time, a CPU listed under several names to accommodate the different graphics chips.

    I've already had it with Intel's nomenclature (not that AMD is innocent in this either).

    The speed that might be possible is a good thing, but the opportunity for n00b exploitation could be too tempting.

    I ask you, long-time forum members: how would you like to field those questions? (Should I go with the AM2x2100xt, the C2D8200gt, or the AM3x1950xls?)

    It never ends, does it? :(
  11. Quote:

    And the negative aspects are...

    Technical problems need to be overcome, yes?
    I apologize, I should have said "....potential negative aspects" :oops:

    Performance: Will the GPU-on-die CPU actually offer better performance? It would be easy to just say yes: wider bus, on die, no latency, etc. But theorizing these things, producing them, and actually having them work out are all different "horses". Yes, the latency from CPU to GPU will be greatly reduced, but what will happen to the GPU/video-memory latency? That depends on where they put the memory. The latency from the die to the output is another possible bottleneck.
    If I had to hazard a guess, it would be that the CPU/GPU will not be offered as a performance product, but as a bargain product, because of potential performance limiters.

    Memory issues: Where will the memory go? On die? That’s a huge problem: it would eat up valuable die space. Or put the memory on the mobo? Then you either share system memory or add another bank of slots for "video-only RAM", with more bus-bandwidth demands and potential latency issues.

    Thermal issues: Waste heat should theoretically be reduced, to some extent, by the CPU's smaller die size. But will this really solve the GPU waste-heat problem completely?

    Upgrades: What if you want to upgrade your graphics processor? Mobo manufacturers could get around that easily, just as is done now with onboard audio/video: simply leave the user the option to plug a card into a slot and disable the onboard device they don't want. Assuming AMD allows this.
    Will they allow you the option of using an off-die GPU with a GPU-on-die CPU system? If they do, will they permit you to use an Nvidia-chipped card, or only allow ATI-chipped cards?

    Cost issues: Will putting the GPU on die with the CPU really cut costs? The die size will balloon with the GPU co-located. Will the increase in die cost really be offset by not having to buy a graphics card? And what about memory? Will we now have to buy additional RAM for video processing? (This actually has a positive aspect, in that you could potentially "customize" your graphics performance easily by adding memory.)

    The above is in no way all-inclusive, nor is it definitive. Some of the potential issues I listed may not be issues at all, or may already have easy solutions. Some may just be dead wrong.

    Please understand: I'm not saying GPU on die with the CPU is a bad thing. I'm not saying AMD can't or shouldn't try it. I'm not saying AMD is a bad company. I am saying there are potential problems (I corrected myself this time :wink: ) that some people are not seeing because they are focused on possible price reductions.


    Peace
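To put rough numbers on the shared-memory concern raised above, here is a back-of-the-envelope sketch in Python. The figures (dual-channel DDR2-800 for shared system RAM, a 256-bit GDDR3 bus for a mid-range discrete card) are illustrative 2006-era assumptions, not measurements:

```python
# Back-of-envelope comparison of peak memory bandwidth available to an
# integrated GPU sharing system RAM vs. a discrete card with its own GDDR3.
# All figures are illustrative 2006-era assumptions, not measurements.

def bandwidth_gb_s(bus_bits, transfer_rate_mt_s):
    """Peak bandwidth in GB/s: bus width (bits) x transfers per second."""
    return bus_bits / 8 * transfer_rate_mt_s * 1e6 / 1e9

# Dual-channel DDR2-800: 128-bit effective bus at 800 MT/s,
# and the CPU and integrated GPU would have to share it.
shared = bandwidth_gb_s(128, 800)

# Mid-range discrete card: 256-bit GDDR3 at 1400 MT/s,
# dedicated entirely to graphics.
dedicated = bandwidth_gb_s(256, 1400)

print(f"shared system RAM : {shared:.1f} GB/s")   # 12.8 GB/s
print(f"dedicated GDDR3   : {dedicated:.1f} GB/s") # 44.8 GB/s
print(f"ratio             : {dedicated / shared:.1f}x")
```

Even before the CPU takes its cut, the shared pool offers roughly a third of the bandwidth a dedicated card would bring, which is why "where does the memory go?" is the crux of the idea.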
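The thermal question can be stated numerically too: integration does not remove the GPU's heat, it moves both heat sources under a single heatsink. The TDP and die-area figures below are purely illustrative assumptions:

```python
# Two separate chips each get their own cooler; a fused die asks one
# heatsink to move the combined heat. Figures are illustrative assumptions.

cpu_tdp_w = 65.0      # mainstream 65 nm CPU
gpu_tdp_w = 25.0      # integrated-class GPU block
cpu_area_mm2 = 120.0  # standalone CPU die
gpu_area_mm2 = 80.0   # area the GPU block adds to the die

combined_tdp_w = cpu_tdp_w + gpu_tdp_w
combined_area_mm2 = cpu_area_mm2 + gpu_area_mm2

# Areal power density stays in the same ballpark, but the single cooler
# now has to dissipate the sum of both parts.
print(f"CPU cooler alone  : {cpu_tdp_w:.0f} W "
      f"({cpu_tdp_w / cpu_area_mm2:.2f} W/mm^2)")
print(f"fused-die cooler  : {combined_tdp_w:.0f} W "
      f"({combined_tdp_w / combined_area_mm2:.2f} W/mm^2)")
```

So a process shrink may trim each part's share, but the total heat under one heatsink still goes up, which is the "completely solved?" doubt in a nutshell.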
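The die-cost worry can be made concrete with the standard Poisson yield model, yield = exp(-D*A): cost per *good* die grows faster than linearly with area, so one big fused die can cost more than two small ones. The defect density and per-mm^2 cost below are assumed, illustrative values:

```python
import math

# Poisson yield model: yield = exp(-D * A), where D is defect density
# (defects per mm^2) and A is die area (mm^2). Because yield falls off
# exponentially with area, cost per GOOD die grows faster than linearly.
# All figures are illustrative assumptions.

def cost_per_good_die(area_mm2, defect_density=0.005, cost_per_mm2=0.10):
    yield_frac = math.exp(-defect_density * area_mm2)
    return (area_mm2 * cost_per_mm2) / yield_frac

cpu = cost_per_good_die(120)          # CPU die alone
gpu = cost_per_good_die(80)           # GPU die alone
fused = cost_per_good_die(120 + 80)   # single combined CPU+GPU die

print(f"CPU + GPU as separate dies: ${cpu + gpu:.2f}")
print(f"single fused die          : ${fused:.2f}")
```

Under these assumptions the fused die comes out noticeably dearer than the two separate dies, so the savings have to come from elsewhere (packaging, board space, no discrete card) rather than from the silicon itself.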