The physical level...

Aethren

Distinguished
Dec 28, 2008
20
0
18,510
Greetings,


What I'm about to ask might make some people frown or yell "NOOOOB" at the top of their voices, but flame away, I'll still ask it:

How is a CPU capable of processing data, at the physical level?

Sure, I know it basically uses a bazillion semiconducting transistors that switch on and off to modulate electrical input/output signals... all said and done, they're etched onto silicon wafers by acids and UV light through photolithography, the dies packed into neatly aligned rows across the wafer.

I'm also aware that they run on binary, executing assembly-level instructions, and that they prefetch data from memory and so on and so forth, blah blah blah... frankly, though, all a transistor does is let current pass or block it, given the polarity of the charge and the alignment of the "holes" and electrons in the PNP junction, and this is predictable (e.g. we KNOW that if a negative charge is applied in a certain state it passes, whereas a positive one is blocked, and vice versa for the other state).

My question is: how does a series of switches resolve mathematical algorithms just by having current run through it?

By analogy: if I know for a fact that throwing a rock on the ground causes it to break (positive passes if the holes are aligned right) and that throwing the same rock into water causes it to sink (negative passes if the holes are aligned left), then I don't see how throwing a million rocks is going to solve an equation/algorithm/addition or anything else for that matter.

Also, how does each individual transistor get "picked"? (i.e. receive charge at the right time, in the right order, so as to respond in a specific way)

There surely aren't enough pins on a socket to select even a small group, let alone single transistors.

I've been messing with computers since I was six... I had a 386, a 486 DX2, a Pentium 75 (100 if on "turbo" lol), a K6, a Pentium III, and have possibly messed with every single processor after that... I OC, I set RAM timings, I adjust clock rates, I fine-tune memory bandwidth and I match FSBs, I take it apart and put it back together; I basically know the thing inside out for all practical purposes, but it baffles me to think that a series of switches can actually process data... I'm pretty convinced this s**t is alien, man O_O AHuHAuHAUh

Well, if there are any CPU engineers among us, I'm all ears for a coherent answer...


Thanks, and g'night...
 

WR

Distinguished
Jul 18, 2006
603
0
18,980
Well, it seems you understand how the basic field-effect transistor works. CPUs have no moving parts, so we're talking entirely about electricity. DC (direct current) electricity, for that matter.

When electricity passes through an ON transistor, the charge that flows through can be used to modulate one or more other transistors. That way, the output of one transistor is the input of another transistor. Very few of the transistors are controlled directly by the pins on the CPU package. Many of the pins are there to supply static power (1) and ground (0), but that still leaves a few hundred left over to handle data.

By combining two or more transistors in series - connecting the source for one to the drain of another - you form a basic AND gate that permits current flow only if all transistors are switched on. By combining them in parallel - connecting all the sources to a common input and all the drains to a common output - you get an OR gate, where any one ON transistor lets the current through. Lastly, by using ground (0) as source and connecting the drain to power (1) through a resistor, you get a NOT gate.
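
To see those three gates in action, here's a minimal sketch in Python that models each transistor network as an ideal on/off switch. That's a deliberate simplification (real transistors are analog devices with thresholds and leakage), but it's exactly the abstraction digital logic is built on.

```python
# Minimal sketch: modeling ideal transistors as on/off switches.
# A real MOSFET is analog, but for logic purposes it is either
# conducting (gate voltage high) or not, which is all we need here.

def and_gate(a: bool, b: bool) -> bool:
    # Two transistors in series: current reaches the output only if
    # BOTH are switched on.
    return a and b

def or_gate(a: bool, b: bool) -> bool:
    # Two transistors in parallel: current reaches the output if
    # EITHER one is switched on.
    return a or b

def not_gate(a: bool) -> bool:
    # Inverter: when the transistor is on, it pulls the output to
    # ground (0); when off, the resistor pulls it up to power (1).
    return not a

# Truth table for the AND gate:
for a in (False, True):
    for b in (False, True):
        print(int(a), int(b), "->", int(and_gate(a, b)))
```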

Just like binary 1's and 0's are the building blocks of all digitized data, AND, OR, and NOT gates are the building blocks of all the complex logic and SRAM circuits you see in a modern CPU. Engineers use concise symbols to represent these gates rather than drawing out individual transistors. There are variations in the basic transistor that permit current flow in only one direction, or deal with threshold voltages, or tweak power consumption, yields, and high frequency operation, but the basic purpose is the same.
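
As a hedged sketch of the memory side of that claim: two cross-coupled NOR gates (each buildable from the transistor arrangements above) form an SR latch, the conceptual ancestor of the SRAM cell. The fixed iteration count below is just a stand-in for the feedback loop settling, which a real circuit does in a fraction of a nanosecond.

```python
# SR (set/reset) latch from two cross-coupled NOR gates -- the basic
# trick that lets a handful of transistors HOLD a bit, not just pass it.

def nor(a: int, b: int) -> int:
    return int(not (a or b))

def sr_latch(s: int, r: int, q: int) -> int:
    """Return the stored bit q after applying set/reset inputs."""
    for _ in range(4):        # let the feedback loop settle
        q_bar = nor(s, q)     # each gate's output feeds the other's input
        q = nor(r, q_bar)
    return q

q = 0
q = sr_latch(s=1, r=0, q=q)   # set   -> q becomes 1
q = sr_latch(s=0, r=0, q=q)   # hold  -> q stays 1 (the "memory")
print(q)                      # 1
q = sr_latch(s=0, r=1, q=q)   # reset -> q becomes 0
print(q)                      # 0
```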

For starters, try googling "binary adder" and "binary multiplier" to see how the basic gates are formed into circuits that perform basic math. There are decades of optimizations in a modern IC; that would be way too much detail to cover here.
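
As a preview of what that search will turn up, here's a toy Python model of a ripple-carry adder: a full adder built from the gates above (XOR itself is made of AND/OR/NOT), chained eight times to add two 8-bit numbers. Real CPUs use faster carry schemes (carry-lookahead and friends); this is just the textbook starting point.

```python
# Addition from pure gate logic: chain full adders so each stage's
# carry-out "ripples" into the next stage's carry-in.

def xor(a: int, b: int) -> int:
    # XOR from the basic gates: (a OR b) AND NOT (a AND b)
    return int((a or b) and not (a and b))

def full_adder(a: int, b: int, carry_in: int):
    """Add three bits; return (sum_bit, carry_out)."""
    s = xor(xor(a, b), carry_in)
    carry_out = int((a and b) or (carry_in and xor(a, b)))
    return s, carry_out

def add8(x: int, y: int) -> int:
    """Ripple-carry addition of two 8-bit values, one bit at a time."""
    result, carry = 0, 0
    for i in range(8):
        a = (x >> i) & 1
        b = (y >> i) & 1
        s, carry = full_adder(a, b, carry)
        result |= s << i
    return result  # carry past bit 7 is discarded, just like real hardware

print(add8(100, 55))  # 155 -- computed purely by simulated gates
```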
 
Start off with a simpler CPU such as the Z80. Look at the internal architecture and the instruction set. Look at the ALU (arithmetic logic unit). Look at how the control bits derived from the instruction decoder interact with the input data and the ALU. Look at flag bits. Look at simple assembly language programs. Look at... well, you get the idea.

The Z80 is a good place to start because the instruction set is hardwired instead of embedded in microcode. What's microcode?
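
If it helps to see the fetch/decode/execute cycle in motion before tackling real Z80 documentation, here's a minimal Python sketch. The opcodes are invented for illustration (they are NOT Z80 opcodes), and real decoding happens in hardwired logic rather than an if/elif chain, but the flow is the same.

```python
# Toy three-instruction machine, showing what the decoder and ALU do.
# Hypothetical opcodes: 0x01 n = load n into A, 0x02 n = add n to A,
# 0x03 = halt.

program = [0x01, 5,    # LOAD A, 5
           0x02, 3,    # ADD  A, 3
           0x03]       # HALT

a = 0          # accumulator register
pc = 0         # program counter
zero_flag = False

while True:
    opcode = program[pc]                  # FETCH
    if opcode == 0x01:                    # DECODE + EXECUTE
        a = program[pc + 1]
        pc += 2
    elif opcode == 0x02:
        a = (a + program[pc + 1]) & 0xFF  # 8-bit ALU add, wraps at 256
        zero_flag = (a == 0)              # flag bit set by the ALU
        pc += 2
    elif opcode == 0x03:
        break
    else:
        raise ValueError(f"unknown opcode {opcode:#x}")

print(a)  # 8
```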
 

Aethren

Distinguished
Dec 28, 2008
20
0
18,510
First off, thank you very much for the replies.

Okay, the basic gate structure I can deal with (partly, lol), but I still fail to grasp how you design a logic pattern flexible enough to take "any" input and work out whatever floating-point operation is requested from a matrix of (predictable, no less) logic gates... although it does start to make a very tiny bit of sense.

This is making me feel pretty much brainless...

I'll try to find large, detailed schematics of a Z80, since getting the actual thing is a bit far-fetched unless I break into a museum... or Intel ^^


"XX"

On a further note, I stumbled across a LOT of s*it concerning the "upper limits of processing power due to quantum limits imposed by the Planck scale, beyond which no matter can be further compressed", or so it states. It goes on to explain how an atom can compute (picture this) by assuming a "probable" state of "1", "0", or "any infinitesimal step between one and zero possible within a sphere geometry in regions of over-Planck size around it"... meaning: a transistor with (here it gets nasty) INFINITE states that can PROCESS... following this logic (which I assume is the driving logic behind the over-inflated concept of "Quantum Computing"), ANY matter can process by deriving an output from specific signal stimuli... (whoa, this is tiresome to even imagine...)

So on a random note (I'm just extrapolating wildly here), it seems all matter can process data given certain conditions (hence the Omega Point Theory, look it up on Wikipedia...), and I am thus an @sshole for even mentioning this...

A qubit... seems like random poking of matter that "goes right" by chance...

Probability is overrated...


Thanks, fellas, and please do disregard the mind-bogglingly eye-numbing text from the "XX" onward ^^
 

Aethren

Distinguished
Dec 28, 2008
20
0
18,510
Hope you all have a pleasant night. I'll spend the rest of my week with a Z80 up my @ss so as to gain some insight from it as it moves, or I may just read the schematics, whichever is less painful... (did I mention wiring it up prior to the aforementioned, possibly failed insertion?)

G'night, and don't think about quantum mechanics... it'll shrink your di*ks off...
 


4745454b

Titan
Moderator
I know this is of no real help at all, but the ones and zeros in a CPU are the same as on a punch card. I too didn't start with computers until I was about 8, when I got my own 286 (a hand-me-down from my dad's office; at nine I was replacing the 286 motherboard with a 386 and never looked back). My dad tells me stories about punching his own cards, but I too don't really understand how 0 and 1 = TF2. Frankly, I don't care.
 

Aethren

Distinguished
Dec 28, 2008
20
0
18,510
Which is also a mystery, by the way... I can't see how rubber-band-connected pieces of scrap metal perforated at set intervals amount to anything at all...
 

cadder

Distinguished
Nov 17, 2008
1,711
1
19,865
One of the simplest arithmetic operations is adding two 8-bit bytes. You can probably find diagrams of how this works. Multiplication is a little more complicated. I believe things like trigonometry and logarithms are actually implemented as series, IOW a LOT of multiplying and adding.

Of course the CPU also has to take the contents of one memory address and copy them somewhere else, things like that. Remember that the CPU doesn't think in terms of the alphabet, just numbers. Software and other processors take care of interfacing with the keyboard, monitor, etc.

You can do a lot with this, and we did in the old days. CPU designers have gotten more clever since then, and a LOT faster, but unless you want to get into designing CPUs, the concepts from 20 years ago should satisfy you for now.
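
To make the "series" point concrete, here's a small Python sketch that computes sin(x) using nothing but multiplication and addition, via its Maclaurin series: sin(x) = x - x^3/3! + x^5/5! - ... Real FPUs use more refined methods (CORDIC, minimax polynomials), but the spirit of reducing trig to multiply-and-add is the same.

```python
import math

def sin_series(x: float, terms: int = 10) -> float:
    """Approximate sin(x) by summing the first `terms` series terms."""
    result = 0.0
    term = x  # current term, starts at x^1 / 1!
    for n in range(terms):
        result += term
        # next term: multiply by -x^2 / ((2n+2)(2n+3))
        term *= -x * x / ((2 * n + 2) * (2 * n + 3))
    return result

print(sin_series(1.0))  # 0.8414709848...
print(math.sin(1.0))    # matches to many decimal places
```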
 

V3NOM

Distinguished
Jun 18, 2008
2,599
0
20,780
Getting kind of off topic here, but WOW! I love this site now :p

One pound of DNA has the capacity to store more information than all the electronic computers ever built; and the computing power of a teardrop-sized DNA computer, using DNA logic gates, will be more powerful than the world's most powerful supercomputer. More than 10 trillion DNA molecules can fit into an area no larger than 1 cubic centimeter (0.06 cubic inches). With this small amount of DNA, a computer would be able to hold 10 terabytes of data and perform 10 trillion calculations at a time. By adding more DNA, more calculations could be performed.
DNA COMPUTERS!
see more here: http://computer.howstuffworks.com/dna-computer.htm

here's the answer to your quantum computing questions! http://computer.howstuffworks.com/quantum-computer.htm

:sol: kickass site if a little old... talking about P4's and 10GB hard drives in some of the examples :lol:
 
"I'll try to find large, detailed schematics of a Z80, since getting the actual thing is a bit far fetched unless I break into a museum... or Intel ^^ "

All you need is a detailed block diagram of the Z80. Once you have a general idea of what is going on, you can either work downward into how the various blocks are built or upward to the more abstract software. An '80's era book on computer architecture will help you.

My first computer was the '70s-era HAWK Air Defense System computer. It had a non-byte-oriented 24-bit word and 8K x 25-bit memory, and was built with small-scale TTL logic. It had three operating modes: Program Run, Single Instruction, and Single Clock Mode. In SCM, you could trace each fetch and execute cycle through all the logic. Sometimes we had to do that to repair the damned thing. Lots of circuit cards, lots of interconnections.

I do not miss the good old days. I'm 62. I also do not miss the old vacuum tube days. Unfortunately, I am still dealing with vacuum tubes. Vacuum tubes are the only way to generate what I regard as practical amounts (hundreds of kilowatts) of microwave (radar) power.

And quantum computing????

 

V3NOM

Distinguished
Jun 18, 2008
2,599
0
20,780
Quantum computing? Something the size of one of your hairs (what hair? :D) performing calculations probably millions/billions/trillions of times faster than your vacuum tubes :lol:
 

croc

Distinguished
BANNED
Sep 14, 2005
3,038
1
20,810


Not to rain on your theory, but a digital computer will never be as fast as an analogue computer for analogue calculations... Nor will a digital amplifier and CD ever get as nice a sound as a vinyl record and a vacuum-tube push-pull amp. Digital is not the be-all-end-all in a basically analogue world. A slide rule will get you to 99.5% accuracy on a trig function almost as fast as a 3+ GHz CPU, if you can still remember how to use one.

Sorry for going so far off topic.
 

croc

Distinguished
BANNED
Sep 14, 2005
3,038
1
20,810


I try not to link to Wikipedia, but in this case they do have some good information...

http://en.wikipedia.org/wiki/Z80

G'night...
 

spathotan

Distinguished
Nov 16, 2007
2,390
0
19,780
The thing about vinyl and CDs really is true. Those big, clumsy old records just have that crisp sound to them, especially through some nice stereo speakers. Almost makes me want to get a USB turntable.
 

croc

Distinguished
BANNED
Sep 14, 2005
3,038
1
20,810
On our side of the world, Aussie...

After Aethren gets his head around the Z80, maybe he should try a tri-state CPU like the TI 9904, eh?
 

Aethren

Distinguished
Dec 28, 2008
20
0
18,510
Not at all, don't worry about it.

science.box.sk had some interesting posts on the workings of quantum processors (i.e. the claim that a single nitrogen atom floating in a 1 cm cube of tetrachloride solution would yield more processing power from its near-infinite states than all the computers ever made put together, as the qubit function nears infinity).

A qubit, or quantum bit, can be thought of as a globe or sphere: imagine a ball whose upper (north) pole is "1" and whose lower (south) pole is "0"... now imagine how many other points are possible on the surface of that sphere besides exactly the north or south extremity. Now figure that as this ball moves around, these near-infinite states are continuously shifted and randomized. Now figure that for each specific possible state of probability, the qubit exists and does not exist at the same time (it has the "potential" to be there), and all this at near light speed (electron displacement among probable orbital states and all that sh*t). And now (damn, this is getting too long...) imagine each one of these equals a "clock" cycle of 2^13-bit bandwidth...
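
For what it's worth, here's a very loose Python sketch of that sphere picture: a single qubit is a pair of complex amplitudes whose squared magnitudes sum to 1 (a continuum of states), yet any measurement collapses it to a plain 0 or 1. The numbers are purely illustrative; this is nowhere near a real quantum simulator.

```python
import math, random

# Qubit state: amplitudes for |0> and |1>, with |alpha|^2 + |beta|^2 = 1.
alpha = complex(math.cos(math.pi / 8), 0)  # amplitude for |0>
beta = complex(math.sin(math.pi / 8), 0)   # amplitude for |1>

p0 = abs(alpha) ** 2   # probability of measuring 0
p1 = abs(beta) ** 2    # probability of measuring 1
assert abs(p0 + p1 - 1.0) < 1e-9

# Repeated measurements: the continuous state only ever yields 0 or 1.
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[0 if random.random() < p0 else 1] += 1
print(counts)  # roughly 85% zeros, 15% ones for this state
```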

Presumably, once this kind of processing is made possible, there will be no further need to develop any other CPU ever again (in terms of power), only ways to improve its use.

But this is so far into the future it's not even worth mentioning yet (but I did, therefore I am an @sshole again).

 

Aethren

Distinguished
Dec 28, 2008
20
0
18,510
Note: quantum processing is BOTH digital AND analogue...

PS: Wow, I am graduating in @sshology tonight with my meaningless posts ^^