Star Wars Episode III used AMD Technology

  1. So What?

    The article you referenced is over 2.5 years old. At that time AMD had a great product.
  2. yawn?
  3. DELETED Yeah Amd USED to be on top, and no, no Amd4life and no Intel4life, A greaterproduct4life.
  4. <-- leaves a can of flammable fuel to be lit.

  5. Ahh, there's no fanboi like an AMD fanboi.
    Where's Baron when you need him?
  6. And if they made another Star Wars and used AMD Barcelona chips, so what? The chips don't make the movie, people do. I seem to remember how some movies used to be made using Apple computers, and again, so what? But that did give rise to a lot of people who to this day claim that Apple is better at graphics than IBM based machines.

    No, using one chip or another, or one OS or another, doesn't prove a thing other than that was what was used at the time.
  7. DELETED I wouldn't pay much attention to him. All of his posts are similar to this. This is actually the first time that I have read a post of his that had backing information that I knew to be true, but still, nonetheless... even if they used it right now to make another movie... it wouldn't make me go buy one! He can go AMD4Life all he wants... if he gets stuck with a Phenom or a 6400+, that is his problem.

    Best,

    3Ball
  8. <--runs like hell...



    2 secs later..

  9. Grimmy said:


    First you steal my gas can, then you burn up, blow up, or otherwise get rid of this foolish thread. Well, all I can say is....Good work. :sol:
  10. :lol:

    Well... I could have made it easier, if I knew where to find this guy:

  11. Where are those people when you need them?
  12. *Yawn.*

    In other news, Pixar uses Intel.

    Guess what???

    I still don't care either way.
  13. Wow,

    I never knew that AMD actually made fan heaters - well, they've been chucking out a lot of hot air lately...

    Maybe they have a new AMD heater coming out, as they need to make some money somewhere.......

    The films were still turds anyway, nowhere near as good as episodes 4 - 6.... and they probably used Z80s on those beauties..........

    Why doesn't this guy just sit around his AMD chanting "it's not fair" to the AMD fairies. "We almost had it.... "

    PS: Fairies are not real
  14. Hellboy said:
    The films were still turds anyway, nowhere near as good as episodes 4 - 6.... and they probably used Z80s on those beauties........


    The effects were done by hand, frame by frame (rotoscoping). It was a very long, tedious process.

    I have a theory that thunderman is really an alias / alter ego for BM. Think about it... do you ever see them in the same thread?
  15. thunderman said:


    0/10

    Troll harder
  16. Nice comments, Grimmy... I use my parents' old AMD as a space heater
  17. Episode 3 wasn't that great.
  18. metrazol said:
    Episode 3 wasn't that great.


    I liked it the best of 1 - 3, but 3 and 2 were close. 1 was terrible, but the original 4 - 6 > all, especially 5 and 6. imo!

    Best,

    3Ball
  19. thunderman said:


    Hey newbie, no one really cares; just enjoy the freaking movies, for god's sake
  20. So what; the storyline makes a good movie, not chips
  21. I never got why people didn't like Episode 1. I thought it was excellent (maybe not the best out of the six), and the fight scene at the end is arguably the best fight from any of the movies.
  22. I thought they were all really good films. Of course, ESB and RoTJ are exceptional pieces of film history. People will be watching those movies until we really have lightsabers.
  23. jkflipflop98 said:
    I thought they were all really good films. Of course, ESB and RoTJ are exceptional pieces of film history. People will be watching those movies until we really have lightsabers.


    Let's hope that happens soon.
  24. SEALBoy said:
    Let's hope that happens soon.



    They will be powered by 3 GHz K10s.. just the heat coming off of them focused into a beam. Intel isn't up to it.
  25. Did you know that Star Wars Episodes 4, 5, and 6 did NOT use Intel or AMD? And they were better than the new ones by 1,000,000%!

    End of story.
  26. Empire4Life :bounce:
  27. SEALBoy said:
    I never got why people didn't like Episode 1. I thought it was excellent (maybe not the best out of the six), and the fight scene at the end is arguably the best fight from any of the movies.


    Well, I didn't dislike 1, but in relation to the others it wasn't that great. I hated Jar Jar, but Darth Maul is my fave char in Star Wars, so it is a dilemma lol.

    Best,

    3Ball
  28. thunderman said:


    ROFLMAO, you must have gone hunting the web looking for Intel insults; shows how much of a fanboy you are.

    Here, have some free interesting facts about AMD, Intel, and you.
    *AMD was a smaller company that actually made CPUs for Intel before breaking away.
    *Intel's original Pentium had a math bug (the FDIV bug) and some chips had to be recalled.
    *AMD beat Intel to the 1GHz barrier, and actually scaled better back then, pre-Pentium 4.
    *AMD's K5 was their first in-house design and was a flop like the Phenom - late and too slow.
    *AMD's K6 was actually another company's design bought up - NexGen's Nx686.
    *AMD's K7 design success was thanks to the Alpha design team and FSB design (DDR).
    *AMD's K8 design took the main elements from old designs - check out Intel's Timna and see where Fusion comes from.
    *DELETED
    *AMD's Phenom follows the K5 pattern - in-house design, late and lacking.
    *K8 was the first design that actually made AMD some money.
    *DELETED
    *Intel's Israel team brought us the greatest CPUs and designs the world has ever seen.
    *AMD will always be second best; Intel is king, same as ATi vs Nvidia.

    Meh, that's all that comes to mind atm...

    Peace out DELETED
  29. 3Ball said:
    Well, I didn't dislike 1, but in relation to the others it wasn't that great. I hated Jar Jar, but Darth Maul is my fave char in Star Wars, so it is a dilemma lol.

    Best,

    3Ball



    Ya, seriously dude, Darth Maul is the baddest villain I think I've ever seen. He's way scary.
  30. At least my articles are current and my intentions are to discuss the information, not start a flame war.

    As others have said:

    YAWN
  31. In other news, the rest of Earth's population that actually has a clue is waiting with bated breath for the new Intel 45nm offerings. . .
  32. jkflipflop98 said:
    In other news, the rest of Earth's population that actually has a clue is waiting with bated breath for the new Intel 45nm offerings. . .


    I, for one, am waiting eagerly to see how mobile Penryn performs. A 2.8GHz Penryn at under 35W sounds really nice.
  33. lol I suppose he'd still find useless info after AMD dies. A true fanboy indeed.
  34. Sure, they used AMD. That's all good, but what we want to REALLY know is......

    wait for it..........

    whose keyboard did they use huh? Whose Keyboard??!!!
  35. Quote:
    Apparently Intel is having trouble at 45nm, but they say the reason for delay is they have no competition. LOL. At least they are good liars.


    Compared to AMD's recent track record, Intel are practically saints when it comes to PR honesty. :lol:

    40% faster anyone? :pt1cable:
  36. Quote:
    Apparently Intel is having trouble at 45nm, but they say the reason for delay is they have no competition. LOL. At least they are good liars.


    Actually, everything has just been "rumored". Apparently, Intel may have delayed Yorkfield in order to provide better support for OEMs and motherboard manufacturers. The 45nm process is, at the moment, superb (evident from the on-time release of the 45nm dual cores).

    http://www.xbitlabs.com/news/mainboards/display/20071221231218_Mainboards_Found_Guilty_of_Delaying_Intel_s_New_Quad_Core_Microprocessors.html
  37. Quote:
    Apparently Intel is having trouble at 45nm, but they say the reason for delay is they have no competition. LOL. At least they are good liars.


    Never give it a rest, huh? It's like you have "no life."
  38. chaosgs said:
    DELETED Yeah Amd USED to be on top, and no, no Amd4life and no Intel4life, A greaterproduct4life.


    A GREAT PRODUCT for life.......AT A GOOD PRICE.
  39. Lucasfilm used AMD; I remember when AMD was gloating about this like it was some sort of Nobel Peace Prize.

    Macs now use Intel, and according to Apple, Macs are better than PCs, although the gist of their commercials is aimed at Vista. Except how Macs (which use the same components as PCs now, mind you) can run Vista faster than a PC.

    It doesn't matter what they use as long as they do a good job. Do you think anyone cares what PCs they used to make FF7: Advent Children? No. Just that it looks damn good. That's what matters.

    But I am sure that Lucasfilm will use AMD even though Phenom underperforms compared to Intel.

    BTW, they only use what will encode the films the fastest and with the best quality.
  40. TechnologyCoordinator said:
    At least my articles are current and my intentions are to discuss the information, not start a flame war.

    As others have said:

    YAWN



    :lol:

    Gotta love your posts........ :kaola:

    [CPUs] 24/7 WallSt: Ruiz Needs to Go

    [CPUs] Tom's: Phenom Fails Most AM2 Compatibility Tests

    [CPUs] AMD: Tier-Ones, Channel In Same Boat On Barcelona

    [CPUs] AMD Closes Below $8: Wall Street not Happy with Delays

    [CPUs] The Inquirer: How AMD turned Barcelona into a right royal mess

    [CPUs] AMD Issues "Stop Ship" Order for Opterons; TLB Errata Cripples K10

    [CPUs] Chestnuts Roasting on an Open Phenom Fire - 140 Watts?

    [CPUs] AMD Drops QuadFX Like a Bad Habit

    [CPUs] AMD Stock Plummets 25% in Response to K10 Release

    [CPUs] Barcelona Doesn't Deliver - Benchmarks Deemed Non-Compliant

    [CPUs] Barcelona Availability Issues?

    [CPUs] AMD Earnings Announcement Today

    [CPUs] Sharikou Pushes Back BK Prediction

    [CPUs] Hal Licino: How AMD's Failures Are Triggering An Intel Monopoly
  41. sailer said:
    And if they made another Star Wars and used AMD Barcelona chips, so what? The chips don't make the movie, people do. I seem to remember how some movies used to be made using Apple computers,


    Yes, Babylon 5 during its second and third seasons, but it used Amigas for its first season. I'm an AMD fan because Phenom isn't as bad as Intel fanboys claim. It is, core for core, faster than X2, and even if AMD is stuck in the same market position as in the K6-2 budget days, that's not horrible. Once they get their debt paid off they can do R&D again and maybe Intel will become complacent.

    AMD copies Intel, Intel copies AMD. Intel forces Netburst on the corporate and Dell home buyer world for years, AMD uses too much hype pushing the Phenom out the door. The world continues to turn.

    I'll just never forgive Intel for their Netburst days (even though I owned a Northwood and have been given an old Prescott to play around with as a 3rd PC). AMD meets my budget and my needs right now. So, I think I'll go Phenom with a 3850 come income tax time. At least my ASUS 690G board has the BIOS available to make Phenom work, and that $199 price is tempting.
  42. 3Ball said:
    Well, I didn't dislike 1, but in relation to the others it wasn't that great. I hated Jar Jar, but Darth Maul is my fave char in Star Wars, so it is a dilemma lol.

    Best,

    3Ball


    The Roger Roger robots were rubbish, and the stormtroopers should have been introduced in 2 or 3 to show how they were developed or trained up etc.

    Darth Maul was a mouldy baked bean with a cocktail stick, a character that wouldn't seem out of context with the Power Rangers; absolutely not effective enough. They chose the wrong actor for that one - yeah, he pulled a menacing face, but it wasn't acted out well, like "OK, the Maul plot, we put it in but couldn't be bothered with it".... The red and black makeup was a joke that looked like someone at a tattoo / body modding place had made a killing.
    Maul should have left a wound in everyone's heart - add a bit of spinning heads and spewing green acid sick, not a double-ended baton dancer with "I love mum" tattooed on his head.

    And what was the makeup on that queen character - looked like someone went to town on Natalie Portman with a Sesame Street Big Bird sitting on her face with a tummy upset.


    Episode 1 for me was Lucas's most embarrassing moment... (apart from the Ewok Adventure)

    Episode 2 got better, but 3 got a bit lovey-dovey..... and a couple of kids got knocked out...

    For me the best bit was in 2, going through the minefield with that explosion. What effects - but after seeing the originals and then going to the new ones, there's no comparison....
  43. Yup. All based on current data backed up with links to sources.

    Not one of those is based on an article that is SEVERAL YEARS OLD, like this thread is.

    caamsa said:
    :lol:

    Gotta love your posts........ :kaola:

    [CPUs] 24/7 WallSt: Ruiz Needs to Go

    [CPUs] Tom's: Phenom Fails Most AM2 Compatibility Tests

    [CPUs] AMD: Tier-Ones, Channel In Same Boat On Barcelona

    [CPUs] AMD Closes Below $8: Wall Street not Happy with Delays

    [CPUs] The Inquirer: How AMD turned Barcelona into a right royal mess

    [CPUs] AMD Issues "Stop Ship" Order for Opterons; TLB Errata Cripples K10

    [CPUs] Chestnuts Roasting on an Open Phenom Fire - 140 Watts?

    [CPUs] AMD Drops QuadFX Like a Bad Habit

    [CPUs] AMD Stock Plummets 25% in Response to K10 Release

    [CPUs] Barcelona Doesn't Deliver - Benchmarks Deemed Non-Compliant

    [CPUs] Barcelona Availability Issues?

    [CPUs] AMD Earnings Announcement Today

    [CPUs] Sharikou Pushes Back BK Prediction

    [CPUs] Hal Licino: How AMD's Failures Are Triggering An Intel Monopoly
  44. Hellboy said:
    The Roger Roger robots were rubbish, and the stormtroopers should have been introduced in 2 or 3 to show how they were developed or trained up etc.

    Darth Maul was a mouldy baked bean with a cocktail stick, a character that wouldn't seem out of context with the Power Rangers; absolutely not effective enough. They chose the wrong actor for that one - yeah, he pulled a menacing face, but it wasn't acted out well, like "OK, the Maul plot, we put it in but couldn't be bothered with it".... The red and black makeup was a joke that looked like someone at a tattoo / body modding place had made a killing.
    Maul should have left a wound in everyone's heart - add a bit of spinning heads and spewing green acid sick, not a double-ended baton dancer with "I love mum" tattooed on his head.

    And what was the makeup on that queen character - looked like someone went to town on Natalie Portman with a Sesame Street Big Bird sitting on her face with a tummy upset.


    Episode 1 for me was Lucas's most embarrassing moment... (apart from the Ewok Adventure)

    Episode 2 got better, but 3 got a bit lovey-dovey..... and a couple of kids got knocked out...

    For me the best bit was in 2, going through the minefield with that explosion. What effects - but after seeing the originals and then going to the new ones, there's no comparison....


    lol, I just like the fact that he was by far the most athletic and most skilled fighter out of all of them. We all know he only died because he was the bad guy. I thought he looked cool. lol, but I guess that's just me. I also felt his entrance to the ending fight was epic. I have never seen so many people afraid of one man in my life (even when they have 2 Jedi standing right next to them and they all have weapons). lol

    Best,

    3Ball
  45. I don't doubt Maul's athleticism... as for his skill, yeah, taking on a Jedi Knight and an almost-Jedi Knight together was pretty cool. Doubt he'd last against Yoda or Windu though...
  46. yipsl said:
    I'm an AMD fan because Phenom isn't as bad as Intel fanboys claim. It is, core for core, faster than X2, and even if AMD is stuck in the same market position as in the K6-2 budget days, that's not horrible. Once they get their debt paid off they can do R&D again and maybe Intel will become complacent.

    AMD copies Intel, Intel copies AMD. Intel forces Netburst on the corporate and Dell home buyer world for years, AMD uses too much hype pushing the Phenom out the door. The world continues to turn.

    I'll just never forgive Intel for their Netburst days (even though I owned a Northwood and have been given an old Prescott to play around with as a 3rd PC). AMD meets my budget and my needs right now. So, I think I'll go Phenom with a 3850 come income tax time. At least my ASUS 690G board has the bios available to make Phenom work, and that $199 price is tempting.


    Yes, K10 is faster than K8 clock for clock, however, AMD finds itself in a unique position: They have negated their own product.

    The K10 is faster than K8 clock for clock, but AMD has yet to stabilize K10 at clock speeds high enough for it to exceed K8 in sheer speed. With the higher heat generated per die due to the extra 2 cores, and the shortcomings of SOI at 65nm, it will be a long time (if ever, using SOI) before K10 equals or exceeds 90nm K8 in clock speed or clock-speed potential. AMD needs to get Phenom stable at 3GHz (and not on 1 or 2 cherry-picked engineering samples used as demonstrators under controlled lab conditions) consistently enough for viable retail presentation in order for it to compete against 90nm K8.

    In regards to core count, Supreme Commander kiddies' uneducated whining aside, there are few applications that will actually make use of 4 cores now, or in the near future. This negates the need for a quad, and thus the usefulness of the slower K10s, for the majority of users. In those ~2% of applications where core count does matter (graphics rendering, video editing, etc.) the cores tend to get loaded to max capacity, setting the stage for the TLB bug to occur... meaning that where K10 could be advantageous to a user because of its higher core count, it negates its usefulness due to its own flaws. In short, the TLB bug negates K10's single notable advantage at this time... core count. Perhaps B3 will "solve" all the problems, but given AMD's recent track record for PR vs. reality, B3 falls into the same category C2D B2 did in 2006... wait and see what the product itself does, not what the promoters claim it will do. If someone wants an AMD quad, they should wait for B3, at the very least. If they just want an AMD, there is really no reason to buy a K10 right now when K8 is still a superior product.
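
    As an aside on that TLB erratum: software tells the affected B2 parts apart from later steppings by decoding family/model/stepping from CPUID leaf 1. A minimal sketch of that decoding, following the x86 CPUID field layout (the raw EAX value below is illustrative of a family 10h, model 2, stepping 2 part, i.e. a B2 Phenom):

```python
def decode_cpuid_eax(eax: int) -> dict:
    """Decode family/model/stepping from the EAX value returned by CPUID leaf 1."""
    stepping = eax & 0xF
    base_model = (eax >> 4) & 0xF
    base_family = (eax >> 8) & 0xF
    ext_model = (eax >> 16) & 0xF
    ext_family = (eax >> 20) & 0xFF
    # On AMD parts the extended fields only apply when the base family is 0xF,
    # which is how family 10h (0xF + 0x01) is encoded.
    family = base_family + ext_family if base_family == 0xF else base_family
    model = (ext_model << 4) | base_model if base_family == 0xF else base_model
    return {"family": family, "model": model, "stepping": stepping}

# Illustrative raw value for a family 10h (K10), model 2, stepping 2 (B2) part
sig = decode_cpuid_eax(0x00100F22)
print(sig)  # {'family': 16, 'model': 2, 'stepping': 2}

# A fixed B3-stepping revision would report stepping 3 instead
affected = sig["family"] == 0x10 and sig["stepping"] < 3
print(affected)  # True
```

    Roughly this kind of check is what let BIOSes decide whether to apply the (performance-sapping) TLB workaround on early K10s.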
  47. sailer said:
    And if they made another Star Wars and used AMD Barcelona chips, so what? The chips don't make the movie, people do. I seem to remember how some movies used to be made using Apple computers, and again, so what? But that did give rise to a lot of people who to this day claim that Apple is better at graphics than IBM based machines.

    No, using one chip or another, or one OS or another, doesn't prove a thing other than that was what was used at the time.


    For the most skilled users, e.g. guys like John Gaeta (whom I mention because he will be familiar to people who watched the trailer at the end of The Matrix), they get whatever workstation(s) they want.

    One other aspect of 3D: people who have worked in the field a while have been burned more than once by promises made by various 3D software & hardware vendors. By burned I mean the software vendors make a promise like "of course you can import a DEM file and get a *.wrl (VRML)", and then the program converts a huge DEM mesh into a one-polygon VRML mesh.

    A link to an article about Dan Gregoire, who did pre-vis work on some of the Star Wars films (the second series of 3). I mention him because AMD, I think, had a feature about him in the "power user" section on their website.
    http://www.pcmag.com/article2/0,2704,1961662,00.asp

    An article describing the crew who worked on Star Wars:
    http://www.starwars.com/community/event/celebration/news20050404.html

    I learned finite element analysis from a real skilled ANSYS VAR. He charged $100 an hour and had a 9-month backlog, in 1989. Verrry skeptical of claims made by hardware & software vendors, very inclined to buy what he trusted.

    And a lot of those guys do trust AMD. They will have done some of the most exciting work of their lives on Opteron workstations in the 2003-2006 period. If they had a trusty AMD dual dual-core workstation a year ago, that's what they will use.

    Until they feel the need for more speed.

    This is what I want: dual 771

    http://www.newegg.com/Product/Product.aspx?Item=N82E16813151048

    Heck, they're both awesome (AMD & Intel) at the loaded workstation level. I don't know how the dual 771 quad core compares to a dual dual-core Opteron. The Harpertown quad-core Intels have not been in stock for a while.
    http://www.newegg.com/Product/Product.aspx?Item=N82E16819117149

    (that's the link for the 2.33GHz quad Xeon with the 12MB cache)

    I'm not sure if that's caught up in the same process problems that are delaying the Q9450.
  48. It's funny how the thunder guy starts these topics but leaves them alone and never comes back.....
  49. yipsl said:
    I'll just never forgive Intel for their Netburst days (even though I owned a Northwood and have been given an old Prescott to play around with as a 3rd PC).


    Performance-wise, the Netburst chips were just as (if not more) competitive with the Athlons at the time as Phenoms are now in comparison to C2D. So I will never forgive AMD for the Phenom days. Price aside, you could buy a decent Pentium 4 and OC it quite well for $200, so I mean... the price wasn't that bad. I know it was high for how good the chips are, but I think that it is a little contradictory to say "I will never forgive Intel for Netburst" when AMD is in a very similar boat with Phenom, and then say that you will be buying from them. Maybe you should consider VIA or a PowerPC processor... lol, just a thought!

    Best,

    3Ball