Computer History: From The Antikythera Mechanism To The Modern Era

Commodore Amiga

Many believe the Amiga line included the best home computers ever released. The first Amiga was introduced in July 1985. The A1000 model in its basic edition had 256KB of RAM and was offered with an optional color monitor. Commodore used the famous 16-bit Motorola 68000 CPU for the first Amiga models, clocked at 7.16 MHz in the U.S. version and 7.09 MHz in the European model.

Despite the rather weak CPU, the Amiga had amazing graphics and audio capabilities thanks to its dedicated custom chips, called Denise (graphics) and Paula (audio). In addition to these two chips there was also a third (initially called Agnus and, after its upgrade, renamed Fat Agnus), which managed access to the shared RAM for the other custom chips as well as the CPU.

Agnus also incorporated a functional block called the blitter, which was responsible for boosting the Amiga's 2D graphics performance. In other words, the blitter played the role of a co-processor capable of copying large amounts of data from one region of the system's RAM to another without tying up the CPU, helping to increase the speed of 2D graphics rendering.
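
To picture what the blitter did, here is a minimal C sketch of a plain rectangular block copy between two bitmaps. The function and parameter names are purely illustrative, not Amiga system calls; the real blitter worked on bitplanes in chip RAM, could also combine up to three sources with logic operations, and ran in parallel with the 68000.

#include <stdint.h>
#include <stddef.h>

/* Illustrative software "blit": copy a w x h block of pixels from one
 * bitmap to another. The Amiga's blitter performed this kind of block
 * copy in hardware, without involving the CPU. */
static void blit_copy(const uint8_t *src, size_t src_pitch,
                      uint8_t *dst, size_t dst_pitch,
                      size_t w, size_t h)
{
    for (size_t y = 0; y < h; y++) {
        const uint8_t *s = src + y * src_pitch;  /* start of source row      */
        uint8_t *d       = dst + y * dst_pitch;  /* start of destination row */
        for (size_t x = 0; x < w; x++)
            d[x] = s[x];                         /* copy one pixel           */
    }
}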

The A1000 was also equipped with a 3.5" floppy drive, and its operating system (OS) was AmigaOS, which offered multitasking at a time when the IBM-compatible PC's DOS was single-tasking.

The AmigaOS kernel, known as Kickstart, was loaded from a floppy disk into a 256KB memory board installed inside the A1000. Once Kickstart had been loaded, that 256KB of RAM became write-protected.

During this period, some designers held strong ties to their creations, which was clearly the case with the Amiga. The first A1000s carried the signatures of their makers inside the chassis, along with the paw print of a dog that belonged to one of the team members (Jay Miner). Production of the A1000 stopped in 1987, when the model was replaced by the A500, which was a huge success, and the flagship A2000.

The new Amigas used an upgraded version of AmigaOS, and the A500 was equipped with 512KB of RAM, which could be expanded up to 9.5MB. That's right: the Amiga was an open platform and could be upgraded rather easily by the standards of the period. Hands down, the A500 was the most successful product in Commodore's portfolio, and it sold in huge numbers, especially in the European market. A key role in the Amiga's success was played by its huge software library, especially video games.

The Amiga had advanced graphics capabilities, supporting resolutions up to 640 x 512 (interlaced) and displaying up to 4096 colors simultaneously in a special mode called HAM (Hold-And-Modify), which was practical mainly for static images. It also had strong audio capabilities, which, combined with the enhanced graphics, made this computer popular not only among gamers but also among professional graphics designers, video production studios and the film and music industries in general (although most music artists preferred the Atari ST computers because of their built-in MIDI ports).
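
To make the HAM idea concrete, below is a short C sketch of how a HAM6 pixel value can be turned into a color. The type and function names are hypothetical, but the logic follows the hold-and-modify principle: each 6-bit pixel either selects one of 16 palette entries or keeps the previous pixel's color and replaces a single 4-bit red, green or blue component, which is how 4096 colors can appear on screen at once.

#include <stdint.h>

/* 4-bit-per-channel color, as used by the original Amiga chipset. */
typedef struct { uint8_t r, g, b; } Rgb4;

/* Illustrative HAM6 decoding: the top 2 bits of a 6-bit pixel select the
 * operation, the low 4 bits carry either a palette index or a new
 * component value. */
static Rgb4 ham6_decode(uint8_t pixel, Rgb4 prev, const Rgb4 palette[16])
{
    uint8_t ctrl = (pixel >> 4) & 0x3;  /* control bits                   */
    uint8_t data = pixel & 0xF;         /* palette index or new component */
    Rgb4 out = prev;                    /* "hold" the previous color      */

    switch (ctrl) {
    case 0: out = palette[data]; break; /* set color from the palette */
    case 1: out.b = data; break;        /* "modify" only blue         */
    case 2: out.r = data; break;        /* "modify" only red          */
    case 3: out.g = data; break;        /* "modify" only green        */
    }
    return out;
}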

The Amiga was one of the first computers with an open architecture. It had two expansion slots, one on the side and one on the bottom. Its integrated circuits (ICs) weren't soldered directly onto the motherboard but were mounted in sockets, so they could be easily removed and replaced with enhanced ICs.

The same applied to the CPU, which could be easily replaced with the more powerful Motorola 68010. The CPU could also be upgraded to the much stronger 68020, 68030 or 68040 through the side expansion port. A memory upgrade was also possible (up to 1MB) directly on the motherboard, although only Fat Agnus supported the additional memory. Moreover, Amigas could accept a hard drive or special boards that emulated IBM-compatible PCs, so they could run the corresponding software.

The last Amiga from Commodore was the A4000T, released in 1994, just before the company declared bankruptcy. It might sound strange that a company that built one of the most popular computers went bankrupt, but the domination of the IBM-compatible PCs sealed the fate of home computers. Microsoft also played a significant role in this with its Windows OS, which offered the highly desired graphical interface.

Nonetheless, Amiga computers (especially the first ones) were way ahead of their time and had a strong following, which continued to support the platform long after Amiga production stopped.


Comments from the forums
  • Memnarchon
    Wow. I am glad they decided to name it just a "bug".
    Imagine if they had found a bug like "Parastratiosphecomyia stratiosphecomyioides" and left it at that?
    Today we would be saying: "Hey, Skyrim is a good game, but these Parastratiosphecomyia stratiosphecomyioides don't let me enjoy it."
    xD
  • CaedenV
    I have a mouse signed by Doug Engelbart :D one of my most prized nerd possessions.
    Sadly, the mouse is nothing special. He was speaking at my friend's college, and she stole a mouse from their PC lab and had him sign it for me lol
  • bit_user
    Seems like Seymour Cray sure liked wires! I'd read point-to-point wiring was used to reduce path length (which allows higher clock speeds). Might've made sense in the first generation or two, but what's that mess of wires in the pic with a 1995 caption? If he was still using so much wiring in his Cray-3, no wonder they weren't reliable.
  • bit_user
    Quote:
    NeXT Cube

    For comparison, we should mention that a typical PC during this period had 640KB of RAM, upgradeable up to 4MB, and used mostly the 8086, 8088 and 80286 Intel CPUs; only the high-end PCs used the faster 80386. PCs supported screen resolutions of 640 x 350 maximum, with 16 colors or 720 x 348 monochrome.
    Not quite. Check out when IBM introduced the PS/2 - also 1988. Along with it came the MCA bus and VGA graphics, sporting up to 256 colors (from a palette of 262,144) and resolutions up to 640x480.

    Quote:
    In other words, the difference between a NeXT and a typical PC was quite dramatic.
    True, but it's a mistake to compare this with PCs. Its real target (at that price, anyway) was workstations. I'm no expert on old workstations, but I know some Sun models from the mid-1980's had 1280x1024 monochrome screens. Text on those things was beautiful, due to the lack of a shadow mask.

    A fun fact about the original NeXT "cube" is that its magnesium case was about 1 cm higher, on one side. Since they couldn't afford to re-cast it, they had to go to market with this flaw.
  • bit_user
    Nice article. Thanks!

    Also, thanks for the shot of the Maltron keyboard (love the design of the two-handed version, but far too expensive). It's too bad Space Orb didn't make the cut, especially given your closing note about peripherals.
  • Myrmidonas
    This is how articles about "Computer History" should be written: with respect for the human history that its timeline parallels. A truly remarkable, totally educational and very enjoyable article to read.

    The ancient part of computer history is often neglected, since the very term "compute" is falsely identified as one of the scientific children of the modern era. But Mr. Aris Mpitziopoulos dodges this mistake successfully.
  • aldaia
    Going back to computer history: contrary to popular belief, the Intel 4004 is not the first microprocessor, although it is the first microprocessor known to the general public. In 1968 Garrett AiResearch started the design of the CADC microprocessor for the US Navy's new F-14 Tomcat fighter. The CADC controlled the moving surfaces of the aircraft and the display of pilot information. Previous Central Air Data Computers were mechanical designs. The final architecture was a 20-bit, multi-purpose, microprogrammed, pipelined multi-processor using P-channel MOS technology. The first CADCs were delivered to Grumman in early 1970. The CADC not only predated the 4004 (F-14s were flying before the first 4004 was delivered), it was also a much more powerful and complex design. In 1971, one of the designers wrote a paper on the design which was approved for publication by Computer Design magazine. However, for national security reasons, the U.S. Navy did not approve the paper for publication until April 21, 1998. For this reason, the CADC remains fairly obscure in spite of its historical importance.
  • klubar
    Great article! But I think you missed a couple of seminal machines. To name a few... the IBM 360/370 -- which were some of the most successful machines ever ... and introduced the concept of software compatibility. Also, the 370s were the first machines that ran virtual machines. I remember running VM several levels deep. It was also a very popular machine for the "cloud" (or SaaS) of its time -- then called timesharing -- VM370/TSO.

    Two other historically important machines were the PDP-8 and PDP-11. The PDP-8 could support 20 or so timesharing users on 8K of memory (really). These were the early minicomputers.

    And of historical interest -- the IBM 1620 -- which was a "decimal" machine that did its arithmetic via lookup tables.
  • bit_user
    Anonymous said:
    The CADC not only predated 4004 (F-14's where flying before the first 4004 was delivered) it also was a much more powerful and complex design. In 1971, one of the designers wrote a paper on the design which was approved for publication by Computer Design magazine. However, because of national security reasons, the U.S. Navy did not approve this paper for publication until April 21, 1998. For this reason, the CADC remains fairly obscure in spite of its historical importance.
    A lot of technological "firsts" did happen in the defense sector, but it's almost irrelevant. The commercial sector had to invent it independently, and they were subject to normal commercial pressures of the day. While the military innovations can lead to successors, they often become evolutionary dead ends. I'm sure a few exceptions exist, where defense contractors got some tech declassified and commercialized, but that's not the norm.

    aldaia, I don't mean to pour cold water on your post. It was informative and well-written, so I up-voted it. I'm just thinking that, in the context of a timeline article, it seems to me that the narrative is really tracking the evolution of the technologies leading to our modern machines. In the case of somebody doing something in relative isolation, while it's an impressive accomplishment, it's not a good fit for a small article like this.
  • Aris_Mp
    To be frank I didn't know about the CADC's existence till now.

    As for the IBM 360, I do mention it in the article, since IBM actually established its name in the mainframe market with this line. I also talk about the PDP-8 and PDP-11.
  • bit_user
    Any thoughts about doing operating systems? That one should be fun.
  • CatsMum
    And the Olivetti Programma 101? The first desktop computer, sold in large numbers in the USA during the early to mid 1960s.
  • 10tacle
    I love these articles as one who grew up being a part of the PC and console gaming world since 1982 and the Apple II. My first console was the Atari 2600 in 1983 as a kid for a Christmas gift like millions of others. I kept that until 1991 when I bought a Sega Genesis, the first tech/gaming purchase ever with my own money. Then came a Nintendo 64 in 1997, and in 1998 my first gaming PC, a Dell D333 Pentium II with Nvidia's first AGP video card, the Riva 128. It's just been amazing watching how the past 20+ years of PC tech and console gaming have evolved.
  • Michael_335
    You failed to mention Alan Turing. Really?
  • Aris_Mp
    There is a whole section about Alan Turing. Check page 7.
  • editorsthocp
    Here is maybe an idea for an addendum to your article: a small list of other computer museums in the rest of the world, like Bletchley, Paderborn, Paris, London, Warsaw, Athens and Munich. Since the globe has become really small, this might be interesting for readers who travel and would like to pay a visit. All these museums have a kind of unique collection reflecting developments in their country.
    A few examples: Bletchley is very interesting because they are very active in restoring wartime computers. In London there is Babbage's machine, constructed from the original designs, along with other pre-binary machines. In Munich there is a Univac still intact, and in Warsaw you'll find early machines from the Eastern Bloc.
  • WyomingKnott
    "Although some companies like Apple managed to obtain some ideas that initially started at Xerox PARC, Xerox failed to utilize many of them; that's one of the reasons why today Xerox's contributions are not well known."

    !!!!!!!

    Some truly great inventions came out of PARC; some truly great minds innovated freely. Sometimes I wonder how much was invented there and lost. PARC is legendary.
  • gue22
    Thanks for the article!
    One glaring omission: Apple Lisa and the Mac!
    Can't believe I overlooked it reading twice.
  • 10tacle
    Anonymous said:
    Thanks for the article! One glaring omission: Apple Lisa and the Mac! Can't believe I overlooked it reading twice.


    The Macintosh is referenced on page 23 (Apple I And II: Switching PSU And The Lack Of Cooling Fans). The Lisa was a sales failure, and that's why it wasn't mentioned as a notable advance, even though it did have a few firsts, such as Apple's first attempt at a graphical user interface. However, its $10K price tag (US) in 1983 - which is $24K in today's dollars - made it entirely impractical for most businesses.

    It was especially impractical for households, which at that time could buy a complete Apple II computer with a monochrome monitor and dot matrix printer for $3K. I know...my parents had one, and I used it as a kid for word processing and gaming.