
A Complete History Of Mainframe Computing
By Rich Arzoomanian

IBM's System/360 Series

When most people think of a mainframe, they think of IBM's System/360 family, arguably the most important computer architecture ever created. In many ways it is similar to the 8086 in that it set the standard for an industry and spawned a long line of descendants that are still alive and thriving to this day. One big difference is that IBM actually intended the System/360 to be important, whereas the 8086 gained an importance its creator could never have foreseen. In fact, as many of you know, Intel even tried to kill off that instruction set with the Itanium.

But let's get back to the matter at hand. Prior to the System/360, IBM had something of a mess on its hands, having created many systems that were incompatible with each other. Not only did this make it difficult for customers to upgrade, but it was also a logistical nightmare for IBM to support so many different operating systems on different hardware. So IBM decided to create what we almost take for granted today: a compatible line of computers with differing speeds and capacities, all capable of running the same software. In April 1964, IBM announced six computers in the line, with performance varying by a factor of 50 between the highest- and lowest-end machines. This actually doubled the design goal of 25, which in itself posed many problems for IBM. Scalability of this magnitude was said to be impossible even by the famed and brilliant Gene Amdahl; it was never a simple matter of making something 25 times "bigger" than the smallest model, and each implementation really had to be designed from the ground up.

Today it is common to disable parts of a processor, or to underclock it, to create lower-performing variants. But back then it was not economically feasible to build a high-end processor and artificially lower its performance for marketing purposes. So IBM added "microprogramming" to the System/360, so that all members of the family used the same instruction set (except the lowest-end Model 20, which executed a subset). Each instruction was broken down into a series of "micro-operations" specific to that implementation. The underlying processors could therefore be very different, which allowed scalability of the magnitude IBM wanted and, as mentioned, even twice that.
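As a loose illustration of the idea (a sketch only: the model names and micro-operation sequences below are hypothetical, not IBM's actual microcode), the same architectural instruction can expand into different micro-operation sequences on different implementations:

```python
# Hypothetical sketch: one architectural instruction ("ADD"), expanded into
# a different micro-operation sequence on each implementation.
MICROCODE = {
    # A low-end model might push operands through a narrow datapath,
    # one byte at a time.
    "low_end": {
        "ADD": ["fetch_operand_byte"] * 4 + ["alu_add_byte"] * 4 + ["store_result"],
    },
    # A high-end model adds a full 32-bit word in a single micro-operation.
    "high_end": {
        "ADD": ["fetch_operands", "alu_add_word", "store_result"],
    },
}

def expand(model: str, instruction: str) -> list[str]:
    """Translate an architectural instruction into model-specific micro-ops."""
    return MICROCODE[model][instruction]

print(len(expand("low_end", "ADD")))   # 9 micro-ops on the narrow datapath
print(len(expand("high_end", "ADD")))  # 3 micro-ops on the wide datapath
```

Both models present the identical "ADD" to the programmer; only the expansion differs, which is what let one instruction set span a 50x performance range.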

This probably sounds familiar, since something similar has been implemented in x86 processors since the Pentium Pro (or really, NexGen's Nx586). As mentioned, however, IBM planned this; the x86 designers did it because the instruction set was so poor it could not be executed directly with any efficiency. This microprogramming also had one very important advantage that could not easily be replicated on a microprocessor: by writing new microcode modules, the lower-end System/360s could be made compatible with the very popular 1401, and the higher-end machines with the 7070 and 7090. Since this was done in hardware, it was much faster than any software emulation, and older software generally ran faster on the System/360 than on its native machine, the System/360 being the more advanced design.

Some of these advances are still with us today. For one, the System/360 standardized the byte at eight bits and used a 32-bit word, both powers of two, which simplified the design. All but the lowest-end Model 20 had 16 general-purpose registers (the same number as x86-64), whereas most previous computers had an accumulator, possibly an index register, and perhaps a few other special-function registers. The System/360 could also address an enormous 16 MB of memory, although no machine of the time could be fitted with that much. The highest-end processor ran at a very respectable 5 MHz (the speed at which the 8086 was introduced 14 years later), while the low-end processors ran at 1 MHz. Models introduced later, in 1966, also had pipelined processors.
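The 16 MB figure follows directly from the System/360's 24-bit addressing (a detail of the architecture not spelled out above); a quick arithmetic check:

```python
# System/360 addresses were 24 bits wide, so the addressable range is 2**24 bytes.
ADDRESS_BITS = 24
addressable_bytes = 2 ** ADDRESS_BITS
print(addressable_bytes)                   # 16777216 bytes
print(addressable_bytes // (1024 * 1024))  # 16 MB
```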

While the System/360 broke a lot of new ground, in other ways it failed to adopt important technologies. The most glaring deficiency was the lack of dynamic address translation (except in the later Model 67). This not only made virtual memory impossible, but also made the machine poorly suited to proper time-sharing, which was becoming practical as the performance and resources of computers increased. IBM also eschewed the integrated circuit in favor of "solid logic technology," roughly a halfway point between discrete transistors and true integrated circuits. On the software side, conversely, IBM was perhaps a bit too ambitious with OS/360, one of the operating systems designed for the System/360. It was late, used a lot of memory, lacked some promised features, and remained very buggy long after release. It was a well-known, high-visibility, and dramatic failure, although IBM eventually got it right, and OS/360 spawned very important descendants.

Despite these issues, the System/360 was incredibly well received: over 1,100 units were ordered in the first month, far exceeding even IBM's goals and capacity. Not only was it initially successful, but it proved enduring and spawned a large clone market; clones were even made in what was then the Soviet Union. It was designed to be a very flexible and adaptable line, and was used extensively in all types of endeavors, perhaps most famously the Apollo program.

More importantly, the System/360 started a line that has been the backbone of computing for almost 50 years, and represents one of the most commercially important and enduring designs in the history of computing.

Top Comments
  • 12 Hide
    Ramar , June 26, 2009 7:39 AM
    Wonderful article, thanks Tom's. =]

    Killed a good hour of my day, and I very much enjoyed it.
  • 11 Hide
    pugwash , June 26, 2009 8:17 AM
Good article, although not quite "Complete". There is no mention of Colossus (which was used to break Enigma codes from 1944) or the Manchester Small-Scale Experimental Machine (SSEM), nicknamed Baby, which was the world's first stored-program computer and ran its first program in June 1948.
  • 10 Hide
    1ce , June 26, 2009 7:55 AM
    Really cool. One observation, on page 7 I think the magnetic drum is rotating 12,500 revolutions per minute, not per second....If my harddrive could spin at 12,500 revolutions per second I'm sure it could do all sorts of amazing things like flying or running Crysis.
Other Comments
  • 2 Hide
    neiroatopelcc , June 26, 2009 9:11 AM
    So the ABC was in fact the first mobile computer? The picture does show wheels under the table at least :)  But I guess netbooks are easier to handle, and have batteries
  • 2 Hide
    dunnody , June 26, 2009 10:11 AM
I am with pugwash - it's a good article, but why does it seem a bit US-centric? No mention of Alan Turing or "Baby" and the Enigma code-cracking machines of Bletchley Park.
  • 3 Hide
    Anonymous , June 26, 2009 11:47 AM
    Err what about the Zuse Z3?
  • 2 Hide
    candide08 , June 26, 2009 12:48 PM
    I agree with others, in that I am surprised that there was not even a mention of a Turing machine or other very early "computers".

    Surely they qualified as Mainframes of their times?
  • 2 Hide
    Anonymous , June 26, 2009 1:11 PM
    It's a shame that multiplication, addition and division benchmarks are not persistently noted throughout the article.

    I know that now a days it's very much dependent on software design, but it would still be nice to follow the progression in terms of calculation power of the machines.
  • 2 Hide
    theholylancer , June 26, 2009 2:05 PM
    25 pages??? i love ad block but damn this is annoying
  • 2 Hide
    vinnyny , June 26, 2009 2:20 PM
    Where can we get an 80/80 of this article without all of the noise? No PDF?
  • -4 Hide
    scook9 , June 26, 2009 2:27 PM
    So.....can it play Crysis?

    Out of curiosity, since its a metric I am more familiar with, what would the TeraFLOPS rating be in the newest and bestest from IBM. And how much would one of those bad boys set you back in the wallet.

    Was a very educational and interesting article.
  • 0 Hide
    lamorpa , June 26, 2009 3:04 PM
    "The 704 was quite fast, being able to perform 4,000 integer multiplications or divides per second. However, as mentioned, it was also capable of doing floating point arithmetic natively and could perform almost 12,000 floating-point additions or subtractions per second. More than this, the 704 added index registers, which not only dramatically sped up branches, but also reduced program development time (since this was handled in hardware now)."

    Many of these statements are sure to be wrong. 1) For sure, it would not be faster at floating point than integer. 2) Index registers have to do with memory addressing, not branching.
  • 7 Hide
    ta152h , June 26, 2009 3:33 PM
    First, I agree with the title being misleading, and I apologize for it. It was never intended to be a complete guide, which would be virtually impossible. I don't know why that title was chosen.

    The choice of computers was U.S. centric, because computers were U.S. centric. I chose only one mechanical computer, and it was made by IBM, since they were the dominant company. To add more computers would have been boring, and none of them were important technological milestones. So, while they might be specifically interesting to you, I was of the opinion too many computers from the same time frame would be boring. I almost chose the EDSAC over the EDVAC, but, went with the first design over the first implementation.

    With regards to the index registers, "the IBM 704 added index registers and a “TSX” instruction that would branch to an address but leave the address of the TSX in an index register. A single unmodified branch could use that index register value to return."

    Loops involve branching, branching involves memory addressing.

    With regards to floating point vis-a-vis integer, you need to be more careful about what you're sure of. For one, multiplies and divides are generally slower, being much more complex. But, more to the point, this information is available directly from IBM.
  • 1 Hide
    Anonymous , June 26, 2009 4:08 PM
As one who lived the mainframe era from the 2K machines for $500K...this story is incomplete without the story of the competition that was the force behind the commercial introduction, at a furious pace, of things we take for granted today.

    Any mention of mainframes without the Honeywell H-800 series, the H200 series or Multics leaves out systems that have had a large influence on computing as we know it. The H-800 was one of the first multiprocessing systems of the late '50s, the H-200 was Honeywell's answer to the 1401 in the '60s and Multics merely contributed much of the hardware architecture for the Intel CPU used in today's PCs and foreshadowed UNIX and many of the development tools we use today. I saw no mention of GE and their 600-6000 series. And NCR. (Remember the term "BUNCH" as the competitors to IBM.)

So starting in the '50s, you should also have the history of the BUNCH woven in, even to their demise. Not every great idea originated from IBM (though many did).

  • 2 Hide
    jackshaftoe , June 26, 2009 5:01 PM
    What, and no mention of Lawrence Waterhouse and his work during WW2??? :p 
  • 2 Hide
    Anonymous , June 26, 2009 5:10 PM
Nice article, it was fun to review that history. I would have added mention of the groundbreaking Cray machines, especially the seminal Cray-1 (and its successor, the X-MP) as the first "supercomputer." The X-MP looked like a futuristic chaise lounge with the main circuits in a center column surrounded by a circular padded bench. They were so arranged to reduce interconnecting wire lengths, as the speed was limited by the time it took electrons to travel through the interconnects...a speed-of-light limitation! The later Cray-2 was unique in that it was completely immersed in a bath of liquid Fluorinert to cool the dense circuitry.
  • 2 Hide
    jsloan , June 26, 2009 5:38 PM
    the first computer programmers were all women!

    http://abcnews.go.com/Technology/story?id=3951187&page=1
  • 2 Hide
    aspireonelover , June 26, 2009 5:38 PM
    Great Article! I learned something new today! I've never been so "into" the computer history before.
    Thanks Rich Arzoomanian for writing this article.
  • 3 Hide
    jsloan , June 26, 2009 5:54 PM
    all jokes aside, this is the best tom's hardware article i have read to date. thanks for taking the time, effort and expense for putting it together.