
AMD's Zacate/Llano Fusion Chips Look Competitive/Delayed!

Leading with Zacate seems like a smart move: a DX11 GPU with 80 stream processors that shows faster performance in DX11 titles, running some at 30 FPS while the Intel Core i5 manages only a few FPS! Good for single-player gaming, and at around 18 watts it interestingly competes against the Atom processor. Intel doesn't seem to want to add DX11 support, which gives AMD a performance edge; at most maybe media extensions for video decoding and media file conversion. A good position for AMD in the mini/Atom market!

Llano is delayed until summer 2011, which raises the question: what will Intel have to compete against Llano? Production ramps up in Q1, which is soon, but AMD doesn't have the fabs Intel has, plus it sometimes has delays, which kind of shoots it in the foot! The AMD K6-III and Athlon 64 FX-51/FX-55 were executed perfectly and put AMD in a good position vs. Intel. 480 stream processors looks pretty impressive, but delaying until next year is a misfire when some good titles, possibly F.E.A.R. 3 and Doom IV, could land for the holidays!

Sandy Bridge looks to have a media access unit and a GPU, and the Llano delay gives Intel some time to make tweaks to the processor. I still think 480 stream processors will be fast, especially given the early Zacate benchmarks, so it should be interesting!
  1. GunBladeType-T said:
    Ontario, I read, is around 9 watts and has integrated graphics for netbooks, which competes directly with Atom netbooks!

    http://www.anandtech.com/show/3920/amd-benchmarks-zacate-apu-2x-faster-gpu-performance-than-core-i5
    Another review puts the Core i5 at 14-19 fps vs. the 27-34 fps range on Zacate.
    Another interesting website!
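    A rough back-of-the-envelope check of those figures (a sketch only; the 14-19 and 27-34 fps ranges are the numbers quoted above, not independent measurements):

    ```python
    # Speedup implied by the quoted framerate ranges (illustrative only).
    i5_fps = (14, 19)        # reported Core i5 IGP range
    zacate_fps = (27, 34)    # reported Zacate range

    low = zacate_fps[0] / i5_fps[1]    # conservative case: 27 / 19
    high = zacate_fps[1] / i5_fps[0]   # best case: 34 / 14
    print(f"Implied speedup: {low:.1f}x to {high:.1f}x")  # ~1.4x to ~2.4x
    ```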
  2. I don't think anybody here likes to read about AMD spanking Intel, gunblade. :(
  3. I think it's more of a wait-and-see thing.
    Too many disappointments in the past.
    Also, many here have claimed in the past that this kind of perf on these smaller solutions isn't desirable to them, though I think that will change.
  4. eyefinity said:
    I don't think anybody here likes to read about AMD spanking Intel, gunblade. :(


    No Windsor handshake puns, man. Anyway, thanks for the post, man; amusing!
  5. Best answer
    GunBladeType-T said:
    http://www.anandtech.com/show/3920/amd-benchmarks-zacate-apu-2x-faster-gpu-performance-than-core-i5
    Another review puts the Core i5 at 14-19 fps vs. the 27-34 fps range on Zacate.
    Another interesting website!


    Yeah, but Zacate won't really have to worry about current Core i5s in the notebook market. It will instead have to worry about Sandy Bridge, which was showing more than 2x the GPU performance of current Core i5s, and that demo wasn't even able to use Turbo for the GPU and might have been a low-end version instead of a high-end one.

    It is nice, though, to see preliminary benchmarks starting to come out. We just need to wait for real results and nothing set up by AMD. I would rather see Anand or THG doing the benchmarking, since they won't try to slant it one way or the other.

    I just hope this is competitive enough to help AMD get back into the notebook market, because ever since Intel put out the Pentium M, which Core came from, AMD hasn't been doing quite as well.

    eyefinity said:
    I don't think anybody here likes to read about AMD spanking Intel, gunblade. :(


    It's more that most of the people here have been around the technology field long enough to know not to trust anything given to you by the maker. If Intel says SB gives 2x the GPU performance of Core i5, great. But for most of us to trust it, we wait and see what third-party sources find.

    There are your occasional fanboys on both sides, but most people here just want to be able to get the best performance for their price range. Some just want the best performance no matter the price, and others want the best performance/value.

    But trusting ANYTHING the maker of a product states is pointless. ATI (and I have owned ATI from the 9700 Pro to my current HD 4870) was using TFLOPS as a marketing gimmick to show 2x performance gains, but the problem was that that never translated into games.
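    For context on how those headline numbers get made: peak TFLOPS is just shader count × ops per clock × clock speed, which is why it can double on paper without games doubling. A quick sketch using the HD 4870's commonly cited specs (assumed here purely for illustration):

    ```python
    # Peak single-precision throughput = shaders * ops per clock * clock.
    # HD 4870 figures (800 SPs, 2 ops via multiply-add, 750 MHz) are the
    # commonly cited specs, used only to show how the headline is built.
    def peak_tflops(shaders: int, ops_per_clock: int, clock_ghz: float) -> float:
        return shaders * ops_per_clock * clock_ghz / 1000.0

    print(f"HD 4870 headline: {peak_tflops(800, 2, 0.75):.1f} TFLOPS")  # ~1.2
    ```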
  6. Best answer selected by GunBladeType-T.
  7. jimmysmitty said:
    Yeah, but Zacate won't really have to worry about current Core i5s in the notebook market. It will instead have to worry about Sandy Bridge, which was showing more than 2x the GPU performance of current Core i5s, and that demo wasn't even able to use Turbo for the GPU and might have been a low-end version instead of a high-end one.

    It is nice, though, to see preliminary benchmarks starting to come out. We just need to wait for real results and nothing set up by AMD. I would rather see Anand or THG doing the benchmarking, since they won't try to slant it one way or the other.

    I just hope this is competitive enough to help AMD get back into the notebook market, because ever since Intel put out the Pentium M, which Core came from, AMD hasn't been doing quite as well.


    It's more that most of the people here have been around the technology field long enough to know not to trust anything given to you by the maker. If Intel says SB gives 2x the GPU performance of Core i5, great. But for most of us to trust it, we wait and see what third-party sources find.

    There are your occasional fanboys on both sides, but most people here just want to be able to get the best performance for their price range. Some just want the best performance no matter the price, and others want the best performance/value.

    But trusting ANYTHING the maker of a product states is pointless. ATI (and I have owned ATI from the 9700 Pro to my current HD 4870) was using TFLOPS as a marketing gimmick to show 2x performance gains, but the problem was that that never translated into games.



    It helps to have a proprietary game engine or API to get your complex chips used to their max potential, or even to try buying out a popular game engine! ATI's gear has some juice and an early entrance into DX11 over Nvidia, but no games like Doom IV or F.E.A.R. 3 (not sure about Vanquish) to undercut Nvidia! They have the horsepower in their chips to do some damage, but no finish-him moment, like Kung Lao splitting Sektor in MK for the fatality!
  8. GunBladeType-T said:
    It helps to have a proprietary game engine or API to get your complex chips used to their max potential, or even to try buying out a popular game engine! ATI's gear has some juice and an early entrance into DX11 over Nvidia, but no games like Doom IV or F.E.A.R. 3 (not sure about Vanquish) to undercut Nvidia! They have the horsepower in their chips to do some damage, but no finish-him moment, like Kung Lao splitting Sektor in MK for the fatality!


    Oh, absolutely. ATI's GPUs have some power but still suffer from an SP clock tied to the core clock. If they could clock their SP units at the same speed as nVidia, I believe they would make nVidia look like an Intel IGP. But they cannot.

    Back in the day, before AMD bought ATI, ATI used to work with game devs. Source, VALVe's HL2 engine, was designed on the 9800 GPU, and to this day you will see fewer problems and better performance on Source-based games with ATI than with nVidia. A guy started up RollerCoaster Tycoon 3 the other day and it had the ATI logo at the beginning.

    That is the one thing I hate that AMD did: they stopped ATI from working with game devs. nVidia still does, so that's why a lot of games give an advantage to nVidia. They are optimized for nVidia's hardware and not for ATI's.
  9. jimmysmitty said:
    Oh, absolutely. ATI's GPUs have some power but still suffer from an SP clock tied to the core clock. If they could clock their SP units at the same speed as nVidia, I believe they would make nVidia look like an Intel IGP. But they cannot.

    Back in the day, before AMD bought ATI, ATI used to work with game devs. Source, VALVe's HL2 engine, was designed on the 9800 GPU, and to this day you will see fewer problems and better performance on Source-based games with ATI than with nVidia. A guy started up RollerCoaster Tycoon 3 the other day and it had the ATI logo at the beginning.

    That is the one thing I hate that AMD did: they stopped ATI from working with game devs. nVidia still does, so that's why a lot of games give an advantage to nVidia. They are optimized for nVidia's hardware and not for ATI's.


    Well, the ATI Radeon DDR All-in-Wonder with TV tuner and DDR RAM did some damage with the Catalyst drivers and was competitive once 3dfx and Matrox weren't competing anymore. The NV20 GeForce 3 Ti 200 was fast and priced pretty well for the market! After that, ATI was very competitive with Hyper-Z and TruForm rendering, which knocked Nvidia out until the GeForce FX/GeForce 6 lineup! Nvidia brought the heat with the GeForce FX 5800 Ultra cooler and GeForce 6 SLI! LMA was comparable to ATI's approach but not as effective, as I believe ATI had the 512-bit ring bus, though that was delayed until R580! Get a Lucid Hydra and you can run both; I wonder if it can replace ATI or Nvidia?
  10. The HD 2900 (R600) had a 512-bit ring bus memory system that pushed out around 106 GB/s of memory bandwidth on GDDR3. I always imagine what it could do with current GDDR5.

    ATI was great from the 9700 Pro to the X1900; it made nVidia work. But their current lineup is not the best overall performer. It's great, but in raw power nVidia is better and has better multi-GPU scaling.

    But ATI has the price advantage, well, somewhat.
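    To put rough numbers on that wondering, peak memory bandwidth is roughly bus width (in bytes) times effective data rate. A sketch; the GDDR5 rate below is an assumed round number from HD 5800-era cards, not anything AMD published for a 512-bit part:

    ```python
    # Peak memory bandwidth ~= (bus width in bytes) * (effective data rate).
    # 1.66 GT/s matches the GDDR3 on the R600-era card; 4.0 GT/s is an
    # assumed HD 5800-era GDDR5 rate, purely for illustration.
    def bandwidth_gbs(bus_bits: int, data_rate_gts: float) -> float:
        return (bus_bits / 8) * data_rate_gts

    for kind, rate in (("GDDR3", 1.66), ("GDDR5", 4.00)):
        print(f"512-bit {kind} @ {rate} GT/s: {bandwidth_gbs(512, rate):.0f} GB/s")
    # -> ~106 GB/s on GDDR3, ~256 GB/s on the assumed GDDR5 rate
    ```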
  11. So how much did ATi/AMD spend on Dirt 2 and AvP?
  12. Not as much as nVidia did here.
    Adding a benchmark plus tweaks?
    Oh, and it's for netbooks under $500
    http://www.xtremesystems.org/forums/showpost.php?p=4553984&postcount=209
    I'm not sayin' anyone's doin' this, just sayin'
  13. ^What was the last game that had an ATI logo in the intro and was optimized specifically for ATI hardware? I can tell ya....

    Source engine. 2004. Just a bit before the buyout began.

    Hell, they even removed the ATI logo from the subsequent HL2 releases.
  14. It would seem, if ATI knew they were for sale, as you allude, that it'd only behoove them to do more BEFORE any sale, not less.
    Create more of a market name, up its worth.
    How many games came out between then and the buyout?
    And nothing?
  15. The ATi logo appears on the Dirt 2 website but the Nvidia one doesn't. When it's the other way around, the game is said to favour Nvidia hardware whether it does or not, so why can it not be implied that this game favours ATi hardware?
  16. So, Crysis will be just another game?
    Not a spearhead to raise the market appeal for nVidia?
    Along with all that comes the "it won't (well, maybe) ever run on ATI cards" talk (unable to, they're too weak, devs forgot to remove "if ATI then fail" from the game bench), etc., etc.
    This is progress?
  17. Mousemonkey said:
    The ATi logo appears on the Dirt 2 website but the Nvidia one doesn't. When it's the other way around, the game is said to favour Nvidia hardware whether it does or not, so why can it not be implied that this game favours ATi hardware?


    It might have been there due to the fact that ATI had a DX11-capable card out quite a bit before nVidia.

    http://www.maximumpc.com/article/features/nvidias_hot_rod_gtx_480_powerful_and_power_hungry?page=0,1

    That's a review from when Fermi first hit. A single card is only a bit slower than an HD 5970.

    Not saying anything bad about ATI, just that they don't seem to work as closely with the game devs as they used to. I think Dirt 2's website only had an ATI logo due to DX11, and not because they optimized it for ATI hardware.
  18. I agree, jimmy. It wasn't an exclusive, only-on-ATI-HW deal, with devs making it inaccessible to other brands.
    This is just another reason why I'm so happy with Intel entering the gfx arena; it'll calm this old cr4p down.
  19. jimmysmitty said:
    It might have been there due to the fact that ATI had a DX11-capable card out quite a bit before nVidia.

    http://www.maximumpc.com/article/features/nvidias_hot_rod_gtx_480_powerful_and_power_hungry?page=0,1

    That's a review from when Fermi first hit. A single card is only a bit slower than an HD 5970.

    Not saying anything bad about ATI, just that they don't seem to work as closely with the game devs as they used to. I think Dirt 2's website only had an ATI logo due to DX11, and not because they optimized it for ATI hardware.

    Of course AMD would never have said anything like:
    Quote:
    DiRT® 2 Offers Exhilarating DirectX® 11 Gaming Experience - Only On ATI Radeon Graphics Cards
    or
    Quote:
    "AMD has worked tirelessly with Codemasters to collectively transform realism in PC gaming through new DirectX 11 technologies only available today on the latest ATI Radeon graphics cards," said Matt Skynner, vice president & general manager, GPU Division at AMD.
    would they?
  20. ^Of course, at the time of Dirt 2's release, ATI was the only one with DX11-capable GPUs.

    I still doubt it. Until I see the game devs themselves saying they worked with ATI, like VALVe did with Source, AMD can say all they want.
  21. They made a big noise about it last year in a press release, so were they being dishonest? Should I no longer believe any of these press releases? Because if that's the case, then the one posted about Nvidia pumping $2 million into Crysis 2 should also not be believed.
  22. For years, ATI had the only DX10.1 cards out as well, and a few DX10.1 games mysteriously had that capability pulled.
    Again, let's see a three-way race in gfx and a push towards a better overall platform, with fewer exclusionary actions
  23. Since there was no other DX11 HW out there, they worked with ATI, à la M$ and nVidia in DX10, which was downgraded, which is why DX10.1 was ever needed to begin with.
  24. Mousemonkey said:
    They made a big noise about it last year in a press release, so were they being dishonest? Should I no longer believe any of these press releases? Because if that's the case, then the one posted about Nvidia pumping $2 million into Crysis 2 should also not be believed.


    They made a big noise, and the thing is, nVidia's Fermi beat them in Dirt 2 under DX11 tests. That tells me it was not ATI-optimized, because L4D2 still runs best on an ATI GPU.
  25. If you remember my posts from back then, Huddy said they weren't going for exclusives; it was done for a DX11 platform and would be done for all DX11-capable HW
    http://www.bit-tech.net/bits/interviews/2010/01/06/interview-amd-on-game-development-and-dx11/
  26. If you read my link above, it's no wonder it was ATI, and not nVidia, in the buyout.
    They both had similar approaches, which meshed better. OTOH.....
  27. So they did work with Codemasters or they didn't?
  28. RH: Absolutely. BattleForge, STALKER, and with Dirt 2 have had engineering visits [from us] and engineering on site and they have access to engineering communication on the phone or email. Along with that we have a co-marketing program, so BattleForge - there was a DirectX 10.1 version out in March - we had a co-marketing program with them and we pushed it as "this is DirectX 10.1, look at what it can do for you and the performance advantages you can get by running 10.1 over 10." We do this by bundling games as well, I think we bundled BattleForge - we've certainly bundled STALKER and we're bundling the hell out of Dirt 2 right now. The engineers and CTO from Codemasters put together a video for us to talk about the co-operation we've done with them.
    http://www.bit-tech.net/bits/interviews/2010/01/06/interview-amd-on-game-development-and-dx11/
  29. JAYDEEJOHN said:
    Oh, and it's for netbooks under $500


    18W is twice the total power consumption of my netbook, so if your CPU + GPU is sucking up that much power, you'd be looking at either a huge battery or poor battery life.
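    To put rough numbers on that (a sketch only; the 48 Wh pack is an assumed typical 6-cell netbook battery, and TDP is treated as average draw, which overstates real consumption):

    ```python
    # Rough battery life: hours ~= pack capacity (Wh) / average draw (W).
    # 48 Wh is an assumed typical 6-cell netbook pack; treating TDP as the
    # average draw is pessimistic but shows the scale of the difference.
    def runtime_hours(capacity_wh: float, avg_draw_w: float) -> float:
        return capacity_wh / avg_draw_w

    pack_wh = 48.0
    for draw_w in (9.0, 18.0):   # ~9 W netbook platform vs. 18 W CPU+GPU
        print(f"{draw_w:>4} W draw -> ~{runtime_hours(pack_wh, draw_w):.1f} h")
    # -> ~5.3 h at 9 W vs. ~2.7 h at 18 W on the same assumed pack
    ```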
  30. JAYDEEJOHN said:
    RH: Absolutely. BattleForge, STALKER, and with Dirt 2 have had engineering visits [from us] and engineering on site and they have access to engineering communication on the phone or email. Along with that we have a co-marketing program, so BattleForge - there was a DirectX 10.1 version out in March - we had a co-marketing program with them and we pushed it as "this is DirectX 10.1, look at what it can do for you and the performance advantages you can get by running 10.1 over 10." We do this by bundling games as well, I think we bundled BattleForge - we've certainly bundled STALKER and we're bundling the hell out of Dirt 2 right now. The engineers and CTO from Codemasters put together a video for us to talk about the co-operation we've done with them.
    http://www.bit-tech.net/bits/interviews/2010/01/06/interview-amd-on-game-development-and-dx11/

    So it's OK for ATi/AMD to do it, but it's underhanded and skews the results if Nvidia does the same?
  31. So, your netbook is fast, decodes video, etc.?
    Or, with this and good gfx as well, will there be a market for the compromise?
  32. Mousemonkey said:
    So it's OK for ATi/AMD to do it, but it's underhanded and skews the results if Nvidia does the same?

    OK, here's what I mean:
    Intel often leads in CPU/HW.
    It goes to M$ et al. to better its HW as well as the SW that'll run on it.
    They don't bang their drum, hold up fake cards, scream and yell.
    ATI and nVidia have traded blows many a time, and ATI doesn't exclude functionality on any HW; if certain other HW is better at some point, then it will show it, whereas if it's borked, then other HW will never have this ability.
    AMD won the 64-bit war, and shares it with Intel.
    No screaming, no chest beating, and again, this is why I welcome Intel into a better whole/complete platform, where now 3 HW makers will help shape gfx, and we win.
    If nVidia had its way, it'd only be nVidia IGPs able to do many things many users would like to do.
    So, in essence, as it's played out in the past, nVidia "investing" $2 million into a game isn't good for everyone, only nVidia; not so with ATI, Intel, and AMD.
  33. So it's fair to say that you've already made your mind up on this one, even though no benchmarks or figures have been seen because the game isn't even out yet.
  34. That was never really my point.
    I'm hoping that nVidia will make the game kicka$$ for everyone.
    Let nVidia gain a li'l more, but in so doing, let that money make the game even better.

    Like I said, Intel worked with M$ on MT; in doing so, some of the work AMD is doing with M$ is already done, and isn't exclusive.

    M$, as well, isn't just going to say "we will change it later" and never make the changes for AMD; they will make those changes, unlike certain devs, etc.

    I like cooperation, as it betters everyone, not just one group, but all who join/use such outcomes.

    You can blame the devs ONCE when they never follow up, but twice? Then it's whoever is pushing such things.
  35. JAYDEEJOHN said:
    So, your netbook is fast, decodes video, etc.?


    It doesn't need to be: it's designed to be small and go a long time between charges, not to play games or run climate simulations.

    If you're only going to have a three-hour battery life, then you might as well buy a full-size laptop.
  36. Why not own the largest car?
    Why not get the coldest freezer, anything under 32....
    If there's a market, there's a market.

    The CPU advantage will be nice compared to Atom.
    The gfx is a plus, and I don't think it'll cost all the time.
    I'm thinking time will prove an expanded market here, and a better product than most comments here currently suggest.
    Just my opinion, o' course.
    Just don't forget: if there's a market for it...
  37. jimmysmitty said:
    The HD 2900 (R600) had a 512-bit ring bus memory system that pushed out around 106 GB/s of memory bandwidth on GDDR3. I always imagine what it could do with current GDDR5.

    ATI was great from the 9700 Pro to the X1900; it made nVidia work. But their current lineup is not the best overall performer. It's great, but in raw power nVidia is better and has better multi-GPU scaling.

    But ATI has the price advantage, well, somewhat.


    What about the dual-GPU Asus Ares 5970? Isn't it the number one dual-board solution? The 512-bit ring bus was pretty advanced and ahead of its time, which led to some delays; Hyper-Z was the killer in performance too.
  38. MarkG said:
    18W is twice the total power consumption of my netbook, so if your CPU + GPU is sucking up that much power, you'd be looking at either a huge battery or poor battery life.


    18 watts isn't that bad with 80 stream processors, and they also have Ontario at 9 watts, which is more of a netbook part!
  39. Mousemonkey said:
    So how much did ATi/AMD spend on Dirt 2 and AvP?



    Are you talking about Aliens vs. Predator the movie? You can check out production cost info on www.boxofficemojo.com.
  40. GunBladeType-T said:
    Are you talking about Aliens vs. Predator the movie? You can check out production cost info on www.boxofficemojo.com.

    No, I was talking about the game. I've seen the movie and I thought it was rubbish.
  41. Mousemonkey said:
    No, I was talking about the game. I've seen the movie and I thought it was rubbish.



    The special effects were cool; the laser scope blew someone up nicely, kind of like in the video game series, and it showed off a lot of hi-tech gear and bases! The plot looked to be survival of the fittest: the Predator-Alien vs. the old Predator for control of the tribe, that simple!
  42. According to the front-page news here, the embedded Tunnel Creek Atom SoCs use anywhere from 3.9 watts down to 2.7 watts TDP, including 3D graphics and HD encode/decode. That's roughly a third to under half the power consumption of Ontario.
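    A quick sanity check of those ratios against the 9 W Ontario figure quoted later in the thread:

    ```python
    # Tunnel Creek TDP range vs. Ontario's quoted 9 W (thread's figures).
    ontario_w = 9.0
    for atom_w in (2.7, 3.9):
        print(f"{atom_w} W is {atom_w / ontario_w:.0%} of Ontario's TDP")
    # -> 2.7 W is 30%, 3.9 W is 43%: roughly a third to under half.
    ```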
  43. Nice "the buck stops here" arguments, guys! But a good point on competing with Fusion: why couldn't Intel license a PPU from Asus and embed it on a mainboard, or on the processor die, if the IGP is outdated according to some articles on the net? Clock that bad boy at 1 GHz or higher, and Asus seems like a good driver team! Multiple solutions in IT, guys, and no anti-trust problems or lawsuits! :o Asus is good with SoundStorm and IGP graphics.
  44. Ummm, maybe Zacate, but I'm not so sure about Ontario's power usage being that high.
  45. Is Atom still plagued by that power-hungry chipset?
  46. Yes, it runs on electrons, quarks, and teslas :bounce:
  47. JAYDEEJOHN said:
    Ummm, maybe Zacate, but I'm not so sure about Ontario's power usage being that high.


    9 watts according to the AT article, and IIRC they were quoting AMD on that.

    Personally, I'd be interested in something like that flip-screen 10" netbook with keyboard and capacitive multi-touch that Dell demoed in the front-page news article here. I gather it uses the new x86-compatible Atom SoC for reduced power consumption, but there are no benchies or even projected battery life yet. I'd think 8-10 hours would be necessary to compete with the iPad. Now if they could dump the LCD for an OLED with at least 720p resolution, that'd be great IMO.