Intel copied us, says head of AMD - page 3

  1. UT3 takes advantage of quads too if I remember correctly.
  2. thunderman said:
    Paid Intel fans bash AMD?....that's the claim Sharikou has made on his blog. Not sure if this is true...However Sharikou is one of the leading experts with a reputable blog, so it could well be possible.

    Link to the Blog:
    http://sharikou.blogspot.com/

    AMD4Life!


    Enough said :bounce: :bounce: :pt1cable: :pt1cable:
  4. harna said:
    "Heck, even when AMD had the superior chip, Intel was easily able to outsell AMD on all fronts."

    ....and that's right Heyyou27, and that's what AMD are threatening to change. Intel won't be on par with A64 until Nehalem. At that time probably we'll all have quad cores. AMD will also have its own platform to take on Intel in every market segment. Don't you think Intel is highly aware of this very serious threat? I'm sure, from what I'm reading between the lines around the net, that AMD has pulled the hand brake on Barcey of its own accord. This TLB issue is no big deal, easily worked around, and just wouldn't affect any home user. So just add 20% performance off the top for users not applying the TLB fix. It just doesn't make sense that AMD are sitting back allowing the Barcey launch to appear flawed unless they aren't wanting, for whatever reason, to deliver in mass market quantities. Going past the C2D in raw speed will serve AMD no purpose simply because their bloody processors are fast enough. I'm sure the Barcey will be primed to tangle with Nehalem when the time comes. We DON'T NEED QUAD CORE PROCESSORS in today's computing environment. Jeez, we haven't even got to 64 bit yet, and most software wouldn't know what to do with a dual core, let alone with 4 or more, so why worry too much about quads and octo cores? VISTA needs at least 4 GB to run well, and we can't feed that until we get proper 64 bit OS support. Crikey! Let's start using what we already have. Buying an Intel quad today is buying you squat. Not even any real speed win; just clock the cheapest of them up with the best of them. Then buy a new motherboard for Nehalem and buy another quad core processor with the same features. There's no feature gain, and the speed difference between the AM2 and Core 2 platforms would not be even remotely visible to the average user. Does 11499 3DMarks compared to 10254 3DMarks make one computer a less capable gamer than the other? No.
    But when a new feature is released, both systems need to be upgraded at the same time anyway, so I hope you enjoyed spending that extra $1000.00 for the privilege of the higher score, and I'll laugh my way to the bank. And while we're on the 3DMark score, I wonder what an Intel GPU might score when it's pushed real hard by a U-Beaut quad core screamer. Oh well, it looks like Intel needs DAAMIT after all, doesn't it? :-) PS: while I read in this forum that Hector is busy claiming Intel is a copycat, I should add my suspicion as to why I think DAAMIT has pulled the hand brake on Barcey. Remember the migration of platform by Prescott: slow 478 CPUs that slowly held their own, then the jump to the 775 platform. Hmmm, Hector, I don't think Intel is the only copycat around. Ouch!


    Hitting ENTER to make paragraphs really helps. Try it some time.
  5. TechnologyCoordinator said:
    What are you smoking? Please pass some this way.

    So you are saying that AMD is purposely sabotaging itself because the market isn't ready for the awesomeness AMD is about to bestow upon it so it needs to be lied to so they think the processors are "defective" so that AMD can deliver quad cores when software catches up?

    Ok Sharikou!
    No, not quite. I feel we are in for a platform change. The last one, from 939 to AM2, did damage to DAAMIT; I think it's Intel's 478-to-775 road for AMD this time. Too many pointers: uninspiring performance that didn't meet expectations, AM2 boards not ready for Barcey (and many have speculated that this is odd given the length of the road map), AMD's lack of urgency over the TLB issue, and bargain sales of Spider platforms with CPU, RAM and mobo at sale prices; it seems odd for a future platform. Hmm, those are my thoughts.

    But I will agree with you on one point: most software you can buy today doesn't take advantage of quad-core. However, the future is quad-core, and that future seems to be arriving now. Look at Crysis, for example; you really need a quad-core to get the most out of that game. I know that's just one example, but watch the next couple of years for a software transformation; you're going to start to see many highly threaded applications being released.


    See, that depends on what you mean by "getting the most out of it." I've finished the game on my trusty 939 X2 3800 and a new Radeon 3870, at 1680 x 1050 and medium settings, with even enough power to run it up at full graphics to see what I was missing. OS: VISTA Basic with DX10. Loved it; the environment was ace, and everyone who saw it run is out buying it, so the experience was good. It's not quad core that did it, or dual core; it was the Radeon 3870. Imagine this on the Spider platform in X-Fire. Do you really need quad to play it? It might be a tad better, but it's not necessary.
  6. justinmcg67 said:
    Hitting ENTER to make paragraphs really helps. Try it some time.

    I'll keep it in mind. lol :-)
  7. Anonymous said:
    UT3 takes advantage of quads too if I remember correctly.

    That's cool. Let me know when you need a quad to play it.
  8. Who gives a crap what he says?

    thunderman said:
    ...that's the claim Sharikou has made on his blog. Not sure if this is true...However Sharikou is one of the leading experts with a reputable blog, so it could well be possible.

    QFT Sigged!
  9. sailer said:
    Looks like he might have sampled some of the loco weed we have here in Nevada. If that's so, you don't want any, TC.

    I'll take all the loco weed you can ship to Australia. I might then be able to pay for my next Intel quad core upgrade. I don't have one because I'm poor.
  10. You don't NEED a quad to play it, but if you have one it will get used. That's all I'm saying.

    http://www.anandtech.com/video/showdoc.aspx?i=3127&p=4
    This is funny. Could someone please ask AMD why they copied Intel 10-20 years ago with the 386 and 486 CPUs, then? That will make them retract that statement in a hurry.
  12. jimmysmitty said:
    Don't forget the first on-die L3 cache that didn't come with a bug; it came out in several different CPUs, most notably the P4EE and the Itanium, back in 2003. Also, the IMC was originally developed by Intel but not seen as beneficial at the time, as the FSB was obviously able to keep up. Let's not forget their push of SATA, which is now the standard for hard drives and is becoming the standard for DVD/HD-DVD/Blu-Ray drives, and now WiMAX.

    Hector is just trying to turn the focus off of their failed launch of Barcy/Phenom and onto Intel. Now, in a way, we can state that AMD copied Intel's first and probably most important innovation of all time: the first x86 microprocessor. The IMC can't hold a candle to that.

    At least thunderman helps us pass the day without Baron here to praise AMD. The only difference is Baron doesn't act like an idiot trapped in 2005. BTW, AMD's own website still promotes Lucasfilm as using their CPUs for the second Star Wars trilogy. Maybe thunderman is just Hector, or maybe AMD is still stuck back in 2005......


    err, didn't AMD use the first L3 cache in their K6-III CPU?
  13. TC - as far as Crysis goes I play it on all High Settings with 16x10 resolution and don't see more than 24-30% processor usage.
  14. harna said:
    I'll keep it in mind. lol :-)


    That's all we ask. You rock. :sol:
  15. yomamafor1 said:
    *yawn*

    I don't even want to try anymore.


    I have attempted to tell this guy that Intel made an IMC back when they were trying to use Rambus memory, which would in fact make AMD the copier, but he never rebuts me or responds to any of us, so I agree with you and I am just going to leave this guy alone. He's really not worth our time. Maybe if we all ignore him he will just go away. lol

    Best,

    3Ball
  16. BSMonitor said:
    L2 cache ...
    On Die L2 cache ...
    Dual Core w/ shared L2 Cache ...
    MMX ...
    SSE, SSE2, SSE3, SSE4
    Dual Channel Memory ...
    RDRAM - fast, but expensive
    DDR2, DDR3 support
    First to 130nm, 90nm, 65nm, 45nm
    Double pumped Integer processing units...
    Quad pumped FSB...
    VLIW Itanium Big Box processors...
    Hyperthreading tech...
    Dual OS tech...
    etc....

    64-bit was a bit premature... No one who bought an Athlon 64 in 2003 ever ran a stable 64-bit OS with it... unless it was a Linux box...
    On-die memory controller... Both Intel and AMD had always been developing this, but Intel decided that it wasn't yet necessary... As Core 2 without an on-die memory controller owns any Athlon with one, it seems Intel was right...


    You can go ahead and add the IMC and 64-bit (EM64T has been around for a while) to the list for Intel, since they came up with those first but just didn't end up implementing them... as you said... it appears they were right.

    Best,

    3Ball
  17. AMD are deserving of the top spot and are being robbed. Intel copy and cheat

    Articles:
    http://www.custompc.co.uk/news/601749/intel-copied-us-says-head-of-amd.html

    http://archive.gulfnews.com/articles/07/12/01/10171434.html

    Quotes from the Article
    Quote:
    AMD's CEO, Hector Ruiz, says that all the major recent innovations have come from AMD, while Intel is ‘trying to catch up’


    Quote:
    Talking to Gulf News, Ruiz said that ‘If you look at the last five years, if you look at what major innovations have occurred in computing technology, every single one of them came from AMD. Not a single innovation came from Intel.’


    I think there are some macro-architectural features that Ruiz was talking about which AMD implemented before Intel did: widespread use of an integrated memory controller (yes, I know about the canceled PIII Timna and the 386SL CPU package; those were not widespread), monolithic multi-core CPUs, "wet" lithography, independent point-to-point buses used for CPU-to-CPU communication rather than a FSB setup, and desktop/server dynamic clock-speed adjustment. Ruiz also pointed out the x86_64 ISA, which was developed by AMD and not Intel. AMD did bring those to market before Intel did, or in the case of x86_64, brought it to market at all, and as such Intel is playing "copycat."

    One notable area in which Intel has done development in the last five years is SSE. AMD has adopted almost all of Intel's SSE instructions (SSE, SSE2, SSE3, SSE4) but Intel has adopted none of AMD's (there is SSE5, but nobody has any chip that runs SSE5 instructions yet). Intel also brought the first 128-bit SSE engine to a chip, something that AMD later put in the 10h family CPUs. Intel also deployed the first shared cache of the two makers: the shared L2 in the "Yonah" Core Duo. That feature made its way into the 10h CPUs as a shared L3 cache. Intel also has a big R&D budget for process development, and I am sure that some of the gate materials Intel has been using at 45 nm will find their way into AMD's gates at 32 nm, if not sooner.

    I am sure there are other things as well that I am not thinking of, but truth be told, both companies have used ideas that the other came up with. Only an idiot would totally ignore everything else in the industry, as no one company has a monopoly on coming up with all of the good ideas and employing all of the smart engineers. I really don't see the point of the article except for its "red meat" aspect. But flamebait can certainly sell copy or attract people to your website, you know...
    Weren't AMD the first to implement DDR1 support (Intel went the path of Rambus), 3DNow! (which Intel countered with SSE), a double-pumped FSB (can't remember when quad-pumped came out) and single-die dual-core?

    Short list in comparison now that I think about it. :)
    Intel had Itanium, but AMD came up with the whole x86-64 design (AMD64), which Intel used for their EM64T instruction set.
  19. cnumartyr said:
    TC - as far as Crysis goes I play it on all High Settings with 16x10 resolution and don't see more than 24-30% processor usage.


    That is because of the split load across the processor cores, of which you have four. When I check CPU usage on my system it shows similar numbers, and you will notice that I have a dual core. Thus, relatively speaking, you are using less of your CPU's power but still feeding more power to the game. With the amount of physics and detail in this game, the ability to offload many physics calculations off of the GPU is vital, especially with the heavy shading involved in the vegetation rendering that basically makes up this entire game. Any game that used 100% of a CPU would not run on anyone's computer; the OS has to run as well. Quad > dual in Crysis... that's all I am saying. Do you need it? No, but it is nice to have. I mean... you do have one, so... why fight it. lol

    Best,

    3Ball
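A toy model of the point above: the same game workload reports a lower overall CPU percentage on a quad than on a dual simply because the denominator is bigger. The numbers here are illustrative, not measurements from either poster's machine:

```python
def overall_usage(busy_cores: float, total_cores: int) -> float:
    """Overall CPU % a task manager would report when `busy_cores`
    cores' worth of work runs on a machine with `total_cores` cores."""
    return min(busy_cores, total_cores) / total_cores * 100.0

# The game does the same amount of work either way; only the number
# of cores it is spread over changes:
print(overall_usage(1, 2))  # dual core  -> 50.0
print(overall_usage(1, 4))  # quad core  -> 25.0
```

So seeing 24-30% usage on a quad is roughly one core's worth of work plus a little spillover, which is what you would expect from a game that is mostly single- or dual-threaded.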
  20. ...and the whole performance-per-watt ideology whilst Intel was touting more gigahertz.
  21. Hector did make some good points though.
  22. Reynod said:
    If it were not for the group splitting from Fairchild Intel would not exist.

    Texas Instruments should have then been appropriately attributed with the design and manufacture of the first microprocessor.

    I miss my old TI programmable ... and my slide rule.

    I have a tear in my eye ...

    .

    Actually, there were two groups who split from Fairchild. Sanders led the other group.
  23. Heyyou27 said:
    ... Heck, even when AMD had the superior chip, Intel was easily able to outsell AMD on all fronts.


    Yes, and many have wondered how.
    The Japanese FTC knows, while the South Koreans and the EU suspect.
    BTW, didn't you use to have a celery?
  24. 3Ball said:
    I have attempted to tell this guy that Intel made an IMC back when they were trying to use rambus memory, so that in fact make AMD a copier,

    He is probably ignoring you because RDRAM had the memory controller integrated into the memory, not the processor.
  25. One notable area in which Intel has done development in the last five years is SSE. AMD has adopted almost all of Intel's SSE instructions (SSE, SSE2, SSE3, SSE4) but Intel has adopted none of AMD's

    This is a real interesting point MU_Engineer, but two things it shows,

    1. The close link between software developers and Intel, which is vital, of course, and certainly in AMD's interest to mimic; since for a large part of that time AMD was conceding a huge frequency advantage to Intel, any software coding advantage would have put AMD further behind.

    2. It has come back to really hit the P4 in the back of the neck, because this architecture, with its long pipeline, really needed streamlined instructions, but of course is not able to utilize SSE3 & 4. The Athlon turns out not to be so afflicted by the implementation of the new instruction sets; certainly not as fast as the new CPUs, but it doesn't get bogged down like the P4 does.
  26. I like trolls. They make me smile a lot.
    Unfortunately, this thread has deteriorated into useful information in spots.
  27. endyen said:
    I like trolls. They make me smile a lot.
    Unfortunately, this thread has deteriorated into useful information in spots.


    :kaola:
  28. thunderman said:
    AMD are deserving of the top spot and are being robbed. Intel copy and cheat

    Articles:
    http://www.custompc.co.uk/news/601749/intel-copied-us-says-head-of-amd.html

    http://archive.gulfnews.com/articles/07/12/01/10171434.html

    Quotes from the Article
    Quote:
    AMD's CEO, Hector Ruiz, says that all the major recent innovations have come from AMD, while Intel is ‘trying to catch up’


    Quote:
    Talking to Gulf News, Ruiz said that ‘If you look at the last five years, if you look at what major innovations have occurred in computing technology, every single one of them came from AMD. Not a single innovation came from Intel.’


    Quote:

    Meanwhile, Intel’s forthcoming Nehalem architecture promises an integrated memory controller, which AMD has had since 2003


    Intel are Evil!


    INTEL TIMNA - WHO'S COPYING WHO?
  29. apache_lives said:
    INTEL TIMNA - WHO'S COPYING WHO?

    Did it actually have a working ODMC? I don't know of anyone who has had the privilege of using one.
    Since it was never released, I'd say it doesn't count.
  30. captaincharisma said:
    This is funny. Could some please ask AMD why they copied off Intel 10-20 years ago with the 386 and 486 CPU's then. That will make them retract that statment in a hurry


    Now this one is dead easy to answer. Just look at the RRP of Intel's CPUs at the time, and that'll tell you why.
    Boy, do they have all of you fooled. The Illuminati controls both Intel and AMD and orchestrates the feigned competition between them to maximize its own profits and maintain world domination through secret electronic surveillance and eavesdropping using hidden back doors built into all Intel and AMD processors. They co-opted the military-industrial complex and provided the clandestine impetus behind DARPA to develop the internet, creating a worldwide net to connect all the processors for amorphous data collection and amalgamation at Area 51. With this data they control world trade and make massive amounts of money manipulating futures prices for commodities and foreign exchange. AMD's current losses (a drop in the bucket to the Illuminati, which subsidizes them) are part of the scheme to feign competition.

    The real research and innovation is done at secret labs and doled out to Intel, AMD and other technology companies. Originally based in Area 51, the labs were moved, as the Area's notoriety increased, first to Springfield, Arizona (just try to find that on your maps) and then, when Arizona became too populous, to several Micronesian islands, until the satellite electronic traffic became so massive that it started causing cancer in lab technicians. Now no one outside the Alliance knows where they are, but there are reports that they have developed terrawaves, a way of sending sonic waves through hard ground as easily as we send microwaves through the air, to send secret data to and from their lab sites. There are also reports that these waves have contributed to geostatic disruptions, causing minor earthquakes and geothermal warming, which is one of the covert causes of global warming.
  32. wow ... do you seriously believe that??

    They are all here in Australia now ... Pine Gap etc etc.

    You can find them from the McDonalds inventory ... they ship tons of prepacked food from Pt Augusta to them weekly.

    Dark matter experiments inside the domes.

    But I could be just pulling your leg.
  33. 3Ball said:
    That is because of the split load across the processor cores, of which you have four. When I check CPU usage on my system it shows similar numbers, and you will notice that I have a dual core. Thus, relatively speaking, you are using less of your CPU's power but still feeding more power to the game. With the amount of physics and detail in this game, the ability to offload many physics calculations off of the GPU is vital, especially with the heavy shading involved in the vegetation rendering that basically makes up this entire game. Any game that used 100% of a CPU would not run on anyone's computer; the OS has to run as well. Quad > dual in Crysis... that's all I am saying. Do you need it? No, but it is nice to have. I mean... you do have one, so... why fight it. lol

    Best,

    3Ball


    I've got a weak video card and a dual core processor (Opty 175). I have to have all my settings, including physics, on low :-( 800*600 sucks...
  34. TechnologyCoordinator said:
    I've got a weak video card and a dual core processor (Opty 175). I have to have all my settings, including physics, on low :-( 800*600 sucks...


    Tsk, tsk... I would have expected better from you, TC. lol

    Best,

    3Ball
  35. endyen said:
    He is probably ignoring you because RDram had the memory controller integrated, not the processor.


    I may be wrong on this one, but the article and interview saying what I have said are on this very website, from just a few months ago; I cannot find the article and don't care enough to spend all of my time looking for it. I like both companies and don't care who makes what, but this guy is ridiculous regardless of what I say. AMD is still in a bind... IMC or not!

    Best,

    3Ball
  36. harna said:


    This is a real interesting point MU_Engineer, but two things it shows,

    1. The close link between software developers and Intel, which is vital, of course, and certainly in AMD's interest to mimic; since for a large part of that time AMD was conceding a huge frequency advantage to Intel, any software coding advantage would have put AMD further behind.


    AMD adopting Intel's extensions for performance reasons is just the tip of that iceberg. The real reason why AMD has adopted just about every new instruction Intel has developed is code-execution compatibility. If AMD CPUs could not execute all of the code that Intel CPUs execute, and execute it in the same manner, AMD would find themselves in a real world of hurt. Incompatibilities could force software makers to compile for one chip or the other (judging from the lack of 64-bit applications, I am guessing very few would release both an AMD-optimized and an Intel-optimized version). Not many will choose to compile for a chip that has market share somewhere in the 20-percent range, so AMD is adopting Intel's instructions for their own good.

    Quote:
    2. It has come back to really hit the P4 in the back of the neck, because this architecture, with its long pipeline, really needed streamlined instructions, but of course is not able to utilize SSE3 & 4. The Athlon turns out not to be so afflicted by the implementation of the new instruction sets; certainly not as fast as the new CPUs, but it doesn't get bogged down like the P4 does.


    The original two revisions of the Pentium 4, Willamette and Northwood, did not have SSE3 support. They had SSE and SSE2 support, with SSE2 being introduced on the Willamette. SSE3 was introduced on the P4 Prescott; in fact, the CPUID flag for SSE3 happens to be "pni," which means "Prescott New Instructions." The only currently shipping chips with SSE4 support are Intel's 45 nm units, which at the moment consist of a handful of Harpertown server CPUs and the Core 2 Extreme QX9650. Every other CPU Intel sells has Supplemental SSE3 or less.
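For the curious, the "pni" flag described above is visible in the `flags` line of `/proc/cpuinfo` on Linux. A small sketch; the flags string below is a made-up sample, not output from a real chip:

```python
# Sample of a /proc/cpuinfo "flags" line; on a real Linux box you would
# read it from the file rather than hard-coding it.
sample_flags = "fpu vme de pse tsc msr pae sse sse2 ht pni ssse3"

def has_flag(flags_line: str, flag: str) -> bool:
    """Check whether a CPUID feature flag appears in a flags line.
    Splitting on whitespace avoids false matches on substrings."""
    return flag in flags_line.split()

print(has_flag(sample_flags, "pni"))     # SSE3 ("Prescott New Instructions")
print(has_flag(sample_flags, "sse4_1"))  # no SSE4 on this sample
```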

    The reason that the P4 got bogged down so much when SSE was not used is that its long pipeline made branches very expensive for the CPU. The Willamette had 20 pipeline stages, the Northwood 21, and the Prescott and Cedar Mill 31. On a branch miss, the pipeline had to be flushed and reloaded, which could take as many clock cycles as the pipeline is long. A 3.00 GHz Prescott executes an instruction every 333 picoseconds, so completely reloading a stalled pipeline would take about 10.33 nanoseconds. A 2.00 GHz Athlon 64 executes an instruction every 500 picoseconds, and it takes the chip only 6.00 nanoseconds to reload its 12-stage pipeline after a stall. This helps make the Athlon 64 friendlier to branchy code and less sensitive to the use of optimized instructions than the P4. There is also speculation that the ALUs in the P4 were partially crippled compared to the PIII's ALU to spur developers to use SSE and SSE2, which would add to the difference between SSE-optimized and non-SSE-optimized performance. Also, the Athlon 64, like the original K7 Athlons, had a ton of x87 FPU power with three FPU pipes, compared to one complex and one simple FPU in the Pentium 4, so you would see quite a bit of difference in performance between a Pentium 4 and an Athlon/Athlon 64 in non-SSE (x87) FP math as well.
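The timing arithmetic in the post above can be reproduced in a couple of helper functions. This is just a sketch; the stage counts and clock speeds are the ones quoted in the post, and the "one cycle per stage" refill cost is the same worst-case simplification the post uses:

```python
def cycle_time_ps(clock_ghz: float) -> float:
    """Picoseconds per clock cycle at a given frequency."""
    return 1000.0 / clock_ghz

def refill_time_ns(clock_ghz: float, pipeline_stages: int) -> float:
    """Worst-case time to refill a flushed pipeline, assuming roughly
    one cycle per stage."""
    return cycle_time_ps(clock_ghz) * pipeline_stages / 1000.0

print(round(cycle_time_ps(3.0)))          # 333 ps/cycle (3.0 GHz Prescott)
print(round(refill_time_ns(3.0, 31), 2))  # 10.33 ns to refill 31 stages
print(refill_time_ns(2.0, 12))            # 6.0 ns for a 12-stage Athlon 64
```

The Athlon 64's shallower pipeline wins on refill time even at a much lower clock, which is the post's point about branchy code.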
    The latest AMD innovation is their Hybrid Graphics technology (read my post in the GFX section)…Intel are already copying it.
    Yeah, sure, the argument is 'well, Intel started before AMD', but things have moved on; the student has outgrown the teacher, so that is an obsolete argument. I'm not seeing anything innovative out of Intel lately; they sit back and watch AMD invent technologies because the evidence suggests they lack the capability. Core2Duo is light years ahead of Intel's last-generation NetBurst…however AMD still have the more advanced processors, namely the X2 and Phenom.
    I cannot deny AMD’s poor financial predicament…but at least they still command a huge amount of respect for all the contributions they have made to the computer industry. AMD have their dignity…that counts for a lot.

    AMD4life!
  38. thunderman said:
    I cannot deny AMD’s poor financial predicament…but at least they still command a huge amount of respect for all the contributions they have made to the computer industry. AMD have their dignity…that counts for a lot.


    Respect, dignity and $5 will get you a cup of coffee at Starbucks. And given that $5 is just about what AMD stock is worth right now, there is very little respect or dignity left in the shell that was once AMD.

    Thundy, it's perfectly OK with me for you to be securely seatbelted into the passenger seat of the BaronMobile with MrsBytch riding shotgun, but keep in mind that if you're serious about AMD4Life, you might have a very short life expectancy. :pt1cable:
  39. thunderman said:
    Core2Duo is light years ahead of Intel’s last generation Netburst…however AMD still have the more advanced processors namely the X2 and Phenom.
    AMD4life!


    There are a few unique features in the K8s and Phenoms, but part of something being advanced is being able to get the technology to perform. The Phenom may be a monolithic quad-core with three levels of cache, multiple power planes, and an integrated memory controller, but the Phenoms don't outperform the two-dies-on-an-old-school-shared-FSB Core 2 Quads clock-for-clock (although it is not a big difference), and especially not in terms of absolute performance, due to lower clock speeds. The Core 2 Quads also run much cooler at higher clock speeds than the Phenoms do, especially if you compare the Yorkfield to the Phenoms. I bet that some of the differences between the Core 2s and the Phenoms stem from the fact that the Phenom is such an ambitious project; there are significant wrinkles that will get ironed out, increasing both clock-for-clock performance and absolute performance (by increasing clock speed).

    The numbers don't lie though. Currently Intel has the faster and more efficient CPUs on the market, at least as far as midrange and high-end units are concerned. Their "bigger hammer" approach of adding bigger caches, more dies, and upping clock speed is working out better at the moment than AMD's "sharper scalpel" approach of trying to increase IPC by increasing internal bandwidth.
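The clock-for-clock vs. absolute-performance distinction above boils down to the rough model performance ≈ IPC × clock speed. A toy sketch; the IPC and clock figures here are hypothetical, not measured numbers for any real chip:

```python
def relative_performance(ipc: float, clock_ghz: float) -> float:
    """Crude single-thread performance model: average instructions per
    cycle times clock frequency (billions of instructions/second)."""
    return ipc * clock_ghz

# Hypothetical: chip A wins slightly clock-for-clock (higher IPC) but
# loses outright because it ships at a much lower clock.
chip_a = relative_performance(1.05, 2.3)
chip_b = relative_performance(1.00, 3.0)
print(chip_b > chip_a)  # True: the clock deficit outweighs the IPC edge
```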
  40. MU_Engineer said:
    part of something being advanced is being able to get the technology to perform.


    The Suzuki RE5 rotary was light years ahead of the competition. A 497cc NSU Wankel cranking out 64 bhp at the rear wheel, it should have taken over the market in 1974 and never looked back. Suzuki built a whole plant to make just that model. Unfortunately, they had the usual Wankel rotor tip cracking, and the bikes used up more fuel than a Peterbilt hauling logs up Pikes Peak. (Comparisons to Phenom strictly coincidental.) The result was that the tech wasn't up to snuff and they sold about three of them. (Comparisons to Phenom... blah blah blah.)
  41. Anonymous said:
    3dnow! (which Intel countered with SSE)



    Intel came out with MMX, then AMD came out with 3Dnow!

    1Haplo
  42. MU_Engineer said:
    The reason that the P4 got bogged down so much when SSE was not used is that its long pipeline made branches very expensive for the CPU. The Willamette had 20 pipeline stages, the Northwood 21, and the Prescott and Cedar Mill 31. On a branch miss, the pipeline had to be flushed and reloaded, which could take as many clock cycles as the pipeline is long. A 3.00 GHz Prescott executes an instruction every 333 picoseconds, so completely reloading a stalled pipeline would take about 10.33 nanoseconds.


    MU, great post as usual. I have one comment to add to the above quote, however. Much is said about how long Cedar Mill's pipeline is, but not much is usually said about how spectacular its branch predictor is. It stands to reason that the effort that went into lengthening the pipeline (i.e. increasing the frequency) was matched by effort in increasing the hit rate of the BTB.

    About the ALU speculation: I can't say for sure, but it seems more likely to me that the floating-point / integer cluster trade-offs were due to transistor budget reasons (yes, this could be considered 'intentional,' but it isn't nefarious). Again, just my reasoning.

    Disclaimer: I work for Intel, and to my knowledge, saying something is "good" does not break an NDA. :wahoo:

    Also, just a general comment about this thread: as I've stated before, people inside the industry aren't so passionate about a business. I have coworkers who are married to AMD and Freescale employees, and it simply isn't a big deal. And "copying" each other is what feeds the industry. AMD copied x86, and Microsoft copied WIMP, but who cares now?
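The deeper-pipeline-vs-better-predictor trade-off described above can be sketched as an expected-penalty estimate. Every number here (predictor accuracies, branch frequency, full-flush cost) is an illustrative assumption, not a published figure for either chip:

```python
def avg_penalty_cycles(stages: int, predictor_accuracy: float,
                       branch_freq: float = 0.2) -> float:
    """Expected misprediction cycles per instruction, assuming a full
    pipeline flush (~one cycle per stage) on every mispredicted branch
    and that `branch_freq` of instructions are branches."""
    return branch_freq * (1.0 - predictor_accuracy) * stages

# A 31-stage pipeline with a hypothetical 97%-accurate predictor pays
# about the same average penalty as a 12-stage pipeline with a 92% one:
print(round(avg_penalty_cycles(31, 0.97), 3))  # ~0.186 cycles/instruction
print(round(avg_penalty_cycles(12, 0.92), 3))  # ~0.192 cycles/instruction
```

That is the sense in which a much better BTB can pay for a much deeper pipeline.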
  43. 1haplo said:
    Intel came out with MMX, then AMD came out with 3Dnow!

    1Haplo


    Whoops! my bad. :lol:
  44. OlSkoolChopper said:
    Respect, dignity and $5 will get you a cup of coffee at Starbucks. And given that $5 is just about what AMD stock is worth right now, there is very little respect or dignity left in the shell that was once AMD.

    Thundy, it's perfectly OK with me for you to be securely seatbelted into the passenger seat of the BaronMobile with MrsBytch riding shotgun, but keep in mind that if you're serious about AMD4Life, you might have a very short life expectancy. :pt1cable:


    AMD is just a little short on money and released a mid-range product with a minor bug... They're not a bunch of jobless hobos like you depict them as in every one of your posts.
  45. Can Not said:
    AMD is just a little short on money and released a mid-range product with a minor bug... They're not a bunch of jobless hobos like you depict them as in every one of your posts.


    Try telling that story to your bank when you are just a little short of making your mortgage, each and every month. But wait! There's more! The 300k house you bought 1.5 years ago is now worth only 150k...
  46. Can Not said:
    AMD is just a little short on money and released a mid-range product with a minor bug... They're not a bunch of jobless hobos like you depict them as in every one of your posts.

    AMD is very short of money, and released a poor performing product with a major bug. They will soon be a bunch of jobless hobos once Hector takes the money and runs.
    hahahah..... this dude on page 1 said... double cheeseburger quads... <---- LOL
  48. Can Not said:
    AMD is just a little short on money and released a mid-range product with a minor bug... They're not a bunch of jobless hobos like you depict them as in every one of your posts.


    AMD Staff, Q4 2008:

    :bounce: