How long has overclocking been around?
Just a thought, because I don't remember hearing about overclocking in the 386/486 days — how long has overclocking PCs been around? I remember buying coprocessors to speed up that generation of PCs.
It was around then too, using jumpers on the motherboard though. I remember overclocking my 486 DX2-66 MHz to a whopping 80 MHz by upping the bus speed. I kinda miss the days when you could put any CPU in any board.
Another milestone was my Celeron 300A at 464 MHz. That was a nice overclock for its time.
The 386s had the Turbo button too, which was a form of overclocking without actually overclocking -- that didn't make a lot of sense, I know.. lol
Well, back in the Spring of '81 I remember one of my friends was doing all kinds of stuff to speed up a Trash (TRS-80). It may have even been a legitimate project for a hardware class (this was Georgia Tech). I don't remember the specs, but I know a soldering iron was involved.
I remember a Comdex in Atlanta in '86 (I think, maybe '85) where some vendor had an IBM XT clone set up running at 10 MHz. It was running rings around the IBM AT set up next to it.
It has been around since at least the 8086. I replaced the clock generator crystal on my Laser XT to run at 14.31 MHz instead of the usual 10. It was actually 9.54 but rounded up to 10. Boy howdy was SOPWITH hard to play at 14 MHz. Not to mention Dad would have kicked my skinny AZZ if he ever knew I did that. After that I overclocked my 486SX-25 to 33, then my P90 to 133 (that is when I needed an active cooling fan, which I screwed onto the passive heatsink). Hell, every proc I have owned since then has been overclocked. Some more than others, and some smell worse than others when they do actually die. Overclocking will never die!!!!!
Back in the mid-80s we used to install a variable crystal/oscillator replacement device into the "state of the art" (at the time) IBM PC/AT systems we sold as CAD workstations (AutoCAD - the granddaddy of PC CAD programs). The device had a pair of wires connected to a manual dial-type control. We'd literally rip out the clock crystal (it was a plug-in unit, as were most things back then), stick the device's wires into the appropriate pin receptacles (or alligator-clip them onto the leads) and then start cranking the clock frequency until the system crashed. We'd then dial it down a notch and send it off to the client.
Most of the time we'd get at or near 10 MHz before the AT's 286 chip would start to freak out - this on a system that shipped stock at 6 MHz, and later 8 MHz for the "improved" model (the original IBM PC ran at 4.77 MHz IIRC).
Ah! The fun we had back in the day! No boring BIOS multipliers - just good old physical pliers and some manual tuning. :-)
Wow, this is quite the education for me. I love learning my computer history. All I know is that any time you can manipulate bus speed or frequency multipliers, you can either overclock or underclock a system (Turbo button anyone?).
So I guess as long as they've been making CPUs, there's been over(under)clocking.
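The bus-speed/multiplier relationship mentioned above really is just multiplication. Here's a quick sketch using figures from this thread (the function name is mine; the 300A numbers assume its famous locked 4.5× multiplier):

```python
# Effective CPU clock = front-side bus speed x clock multiplier.
# Overclocking raises either factor; underclocking (Turbo button off) lowers it.

def effective_clock(fsb_mhz, multiplier):
    """Return the effective CPU clock in MHz."""
    return fsb_mhz * multiplier

# A stock "233 MHz" Pentium: 66.67 MHz bus x 3.5 multiplier
print(effective_clock(66.67, 3.5))   # ~233 MHz

# The classic Celeron 300A trick: the multiplier was locked at 4.5,
# so you raised the bus from 66 MHz to 100 MHz instead.
print(effective_clock(66.67, 4.5))   # ~300 MHz stock
print(effective_clock(100, 4.5))     # 450 MHz overclocked
print(effective_clock(103, 4.5))     # ~464 MHz, as mentioned above
```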
Anyone remember the stink-up from the original Pentiums with unlocked multipliers?
I think I may even still have the article I read back then. A lot of shonky system dealers were buying low-specced parts and simply upping the multiplier. That's when Intel introduced the idea of locking multipliers (or at least a range; you couldn't get many parts that took more than 66*3.5=233). At the time, I thought that was still sick, being able to clock a P-120 @166 or @200.
And then I found out something even more disturbing last year, when my 386 finally died and I junked it to make an ipcop.org router with a P-120 (@200). Not only did I find out that my alleged *genuine* Intel 386 was, in fact, an Am386 (TQFP, soldered to the motherboard), but that it was a 20 MHz Am386 running at 25 MHz. (Seems AMD have always been good overclockers; that thing never crashed once...)
I'd have to say this was back in '96 or '97. The first really good overclocking motherboard was the Asus P55T2P4 with the HX chipset. You could OC the chipset through jumper settings to 75 MHz from the standard 66 MHz. But you had to buy the AT version because it had a different layout than the ATX. The other was the Abit IT5H, which was capable of 83 MHz.
Check out these archives!
The oldest and most interesting example of overclocking that I know of is described in Richard P. Feynman's autobiography "Surely You're Joking, Mr. Feynman".
It happened in late 1943 or early 1944 at Los Alamos. And while there were no computers involved, the principle is applicable, given that the purpose of overclocking is to increase the throughput of the CPU.
The short version is that a series of mechanical calculating machines were being used in series to perform complex modelling calculations. Each machine would perform a single calculation, or at most 3 or 4, and the result was sent on to the next calculator. These machines used cards similar to punch cards. Feynman describes a situation in which multiple sets of different-coloured cards were being sent through the chain in order to speed up the process.
It could also be argued that the modified procedure is an early example of a multi-core processor, because the different sets of calculations were for separate equations.
Incidentally, in computing, we use the term "de-bug" for the process of troubleshooting and correcting software and hardware problems. I heard an interesting account of the origins of the term. Perhaps someone here would be interested in following this up; I don't feel any pressing need to.
Anyway, the story is that the RAM on the original computer (and some later models) consisted of small iron rings wrapped in wire, through which electricity was passed. I don't remember the specific details of the operation of the memory. Now, according to the story, there were a very large number of these rings stacked on top of each other, and there were many stacks. The memory was in a large room or building that was not especially well-sealed. Insects would get into the room, then into the rings, where they would get fried. The current was low enough that the body of the insect would not burn, and the rings were small enough that something the size of a housefly would remain trapped in/on the ring. The result was a short circuit, leading to memory failure. It was necessary to remove the insects by hand, which was a long and tedious job. But it is interesting that the original use of the term de-bug was literal, as opposed to today's usage.
Oh yeah, as far as I know, the terms "core memory" and "core dump" also arose from the design and structure of the memory used in the computer.
Mind you, this story could very well be a computer world urban legend.
I have heard the same story.