
The physical level...

December 28, 2008 7:44:21 PM


Greetings,


What I'm about to ask might make some people frown or yell "NOOOOB" at the top of their voice, but flame away, I'll still ask it:

How is a CPU capable of processing data, at the physical level?

Sure, I know it basically uses a bazillion semiconducting transistors that switch on and off to modulate an input/output variable of electrical signals... all said and done, they're made from a hundred silicon sheets hot molded by acids and UV light through photolithography, then packed into neatly aligned rows of a perfect matrix and stacked into a wafer.

I'm also aware that they run on assembly-based binary systems and that they prefetch data from solid-state memory and so on and so forth, bla bla bla... frankly though, all a transistor does is let current pass or block it, given the polarity of the charge and the alignment of the "holes" and electrons in the PNP pattern, and this is predictable (e.g. we KNOW that if a negative charge is applied in a certain state it passes whereas a positive is blocked, and vice versa for the other state).

My question is: how does a series of switches resolve mathematical algorithms just from being run through by current?

By analogy: if I know for a fact that throwing a rock on the ground causes it to break (positive passes if holes on the right) and that throwing the same rock into water causes it to sink (negative passes if holes on the left), then I don't see how throwing a million rocks is going to solve an equation/algorithm/addition or anything else for that matter.

Also, how does each individual transistor get "picked"? (i.e. receive charge at the right time in the right order to reply in a specific way)

There surely aren't enough pins on a socket to select even a small group, let alone single transistors.

I've been messing with computers since I was six... I had a 386, a 486 DX2, a Pentium 75 (100 if on "turbo" lol), a K6, a Pentium 3 and possibly messed with every single processor after that... I OC, I set RAM timings, I adjust clock rates, I fine tune memory bandwidth and I match FSBs, I take apart and mount again, I basically know the thing inside out for all practical purposes, but it baffles me to think that a series of switches can actually process data... I'm pretty convinced this s**t is alien man O_O AHuHAuHAUh

Well, if there are any CPU engineers among us, I'm all ears for a coherent answer...


Thanks, and g'night...


December 28, 2008 8:07:00 PM

I never thought of CPUs that way haha. I'm bookmarking this and will check the replies tomorrow.
December 28, 2008 9:39:45 PM

Well, it seems you understand how the basic field-effect transistor works. CPUs have no moving parts, so we're talking entirely about electricity. DC (direct current) electricity, for that matter.

When electricity passes through an ON transistor, the charge that flows through can be used to modulate one or more other transistors. That way, the output of one transistor is the input of another transistor. Very few of the transistors are controlled directly by the pins on the CPU package. Many of the pins are there to supply static power (1) and ground (0), but that still leaves a few hundred left over to handle data.

By combining two or more transistors in series - connecting the source for one to the drain of another - you form a basic AND gate that permits current flow only if all transistors are switched on. By combining them in parallel - connecting all the sources to a common input and all the drains to a common output - you get an OR gate, where any one ON transistor lets the current through. Lastly, by using ground (0) as source and connecting the drain to power (1) through a resistor, you get a NOT gate.
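The three gates described above can be sketched in a few lines of Python, treating each transistor as nothing more than a boolean switch (a toy model for illustration, not a real CMOS schematic; the function names are my own):

```python
# Toy model: treat each transistor as a switch controlled by its input.
# Series switches form AND, parallel switches form OR, and a single
# switch pulling a resistor-fed output to ground forms NOT.

def AND(a: bool, b: bool) -> bool:
    # Two transistors in series: current flows only if both are on.
    return a and b

def OR(a: bool, b: bool) -> bool:
    # Two transistors in parallel: current flows if either is on.
    return a or b

def NOT(a: bool) -> bool:
    # The ON transistor shorts the pulled-up output to ground.
    return not a

# Everything else is built by wiring these together. XOR, for instance:
def XOR(a: bool, b: bool) -> bool:
    return AND(OR(a, b), NOT(AND(a, b)))

for a in (False, True):
    for b in (False, True):
        print(int(a), int(b), "->", int(XOR(a, b)))
```

The point of the toy model: each function's output can feed another function's input, exactly the way one transistor's output drives the next transistor's gate.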

Just like binary 1's and 0's are the building blocks of all digitized data, AND, OR, and NOT gates are the building blocks of all the complex logic and SRAM circuits you see in a modern CPU. Engineers use concise symbols to represent these gates rather than drawing out individual transistors. There are variations in the basic transistor that permit current flow in only one direction, or deal with threshold voltages, or tweak power consumption, yields, and high frequency operation, but the basic purpose is the same.

For starters, try googling "binary adder" and "binary multiplier" to see how the basic gates are formed into circuits that perform basic math. There are decades of optimizations in a modern IC; that would be way too much detail to cover here.
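As a preview of what that search turns up, here is a ripple-carry adder built from nothing but those gate operations (a simplified Python sketch, not an optimized hardware design):

```python
def full_adder(a, b, carry_in):
    """One-bit full adder expressed purely with AND/OR/NOT logic."""
    axb = (a or b) and not (a and b)                   # a XOR b
    s = (axb or carry_in) and not (axb and carry_in)   # sum bit
    carry_out = (a and b) or (axb and carry_in)
    return s, carry_out

def add8(x, y):
    """Add two 8-bit numbers one bit at a time, like the hardware does."""
    result, carry = 0, False
    for i in range(8):
        bit, carry = full_adder(bool(x >> i & 1), bool(y >> i & 1), carry)
        result |= bit << i
    return result & 0xFF  # wraps around at 256, just like an 8-bit register

print(add8(100, 55))   # 155
print(add8(200, 100))  # 44 (overflow wraps around)
```

Each bit's carry output feeds the next bit's adder; that chain of switches is the "series of switches solving an addition" the question asks about.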
December 28, 2008 11:16:01 PM

Start off with a simpler CPU such as the Z80. Look at the internal architecture and the instruction set. Look at the ALU (arithmetic logic unit). Look at how the control bits derived from the instruction decoder interact with the input data and the ALU. Look at the flag bits. Look at simple assembly language programs. Look at... well, you get the idea.

The Z80 is a good place to start because the instruction set is hardwired instead of embedded in microcode. What's microcode?
December 29, 2008 12:44:03 AM

First off, tzank you very much for the replies.

Okay, the basic gate structure I can deal with (partly lol), but I still fail to grasp how you design a logical pattern flexible enough to take "any" input and work out whatever floating-point operation is requested, all from a matrix of (predictable, no less) logic gates... although it does start to make a very tiny bit of sense.

This is making me feel pretty much brainless...

I'll try to find large, detailed schematics of a Z80, since getting the actual thing is a bit far fetched unless I break into a museum... or Intel ^^


"XX"

On a further note, I stumbled across a LOT of s*it concerning the "upper limits of processing power due to quantum limits imposed by the Planck scale, from which point no matter can be further compressed", or so it states. It goes on to explain how an atom can compute (picture this) by assuming a "probable" state of "1", "0", or "any infinitesimal step between one and zero possible within a sphere geometry in regions of above-Planck size around it"... meaning: a transistor with (here it gets nasty) INFINITE states that can PROCESS... following this logic (which I assume is the driving logic behind the over-inflated concept of "Quantum Computing"), ANY matter can process by deriving an output from specific signal stimuli... (woah, this is tiresome to even imagine...)

So on a random note (I'm just extrapolating wildly here), it seems all matter can process data given a certain condition (hence the Omega Point Theory, look it up on wikipedia...), and I am thus an @sshole for even mentioning this...

A Q-bit... seems like random poking of matter that "goes right" by chance...

Probability is overrated...


Thanks fellas, and please do disregard the mind-bogglingly, eye-numbing text from the "XX" onward ^^
December 29, 2008 12:48:55 AM

Hope you all have a pleasant night, and I'll spend the rest of my week with a Z80 up my @ss so as to gain some insight from it as it moves, or I may read schematics, whichever is less painful... (did I mention wiring it prior to afore-mentioned execution of possibly failed insertion?)

G'night, and dont think about quantum mechanics... it'll shrink your di*ks off...
December 29, 2008 12:50:57 AM

Oh and I meant "thank you", not "tzank you" on the first line of prior post... (damned french keyboard drives me nuts)
December 29, 2008 12:59:37 AM

hehe quantum computing ftw! have fun trying to understand what it means..
December 29, 2008 1:08:53 AM

I know this is entirely of no real help, but the ones and zeros in a CPU are the same as on a punch card. I too didn't start with computers until I was about 8, when I had my own 286 (a hand-me-down from my dad's office; at nine I was replacing the 286 motherboard with a 386, and never looked back). My dad tells me stories about punching his own cards, but I too don't really understand how 0 and 1 = TF2. Frankly, I don't care.
December 29, 2008 1:16:43 AM

No I'm not ^^
December 29, 2008 1:19:42 AM

Which is also a mystery, by the way... I can't see how rubber-band-connected pieces of scrap metal, perforated at set intervals, can encode anything at all...
December 29, 2008 1:22:25 AM

One of the simplest arithmetic operations is adding two 8-bit bytes. You can probably find diagrams of how this would work. Multiplication is a little bit more complicated. I believe things like trigonometry and logarithms are actually implemented as series, IOW a LOT of multiplication and adding. Of course the CPU also has to take the contents of one memory address and copy them somewhere else, things like that. Remember that the CPU doesn't think in terms of the alphabet, just numbers. Software and other processors take care of interfacing with the keyboard and monitor, etc. You can do a lot with this, and we did in the old days. CPU designers have gotten more clever since then, and a LOT faster, but unless you want to get into designing CPUs, the concepts from 20 years ago should satisfy you for now.
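To make the "series" point concrete, here is a sketch of how sine can be computed with nothing but multiplies and adds via its Taylor series (real FPUs use more refined methods, but the principle stands):

```python
import math

def my_sin(x, terms=10):
    # sin(x) = x - x^3/3! + x^5/5! - ...  just multiplication and addition.
    result, term = 0.0, x
    for n in range(terms):
        result += term
        # Each term is the previous one times -x^2 / ((2n+2)(2n+3)).
        term *= -x * x / ((2 * n + 2) * (2 * n + 3))
    return result

print(my_sin(1.0), math.sin(1.0))  # the two agree to many decimal places
```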
December 29, 2008 1:24:56 AM

getting kind of off topic here, but WOW! i love this site now :p 

Quote:
One pound of DNA has the capacity to store more information than all the electronic computers ever built; and the computing power of a teardrop-sized DNA computer, using the DNA logic gates, will be more powerful than the world's most powerful supercomputer. More than 10 trillion DNA molecules can fit into an area no larger than 1 cubic centimeter (0.06 cubic inches). With this small amount of DNA, a computer would be able to hold 10 terabytes of data, and perform 10 trillion calculations at a time. By adding more DNA, more calculations could be performed.

DNA COMPUTERS!
see more here: http://computer.howstuffworks.com/dna-computer.htm

here's the answer to your quantum computing questions! http://computer.howstuffworks.com/quantum-computer.htm

:sol:  kickass site if a little old... talking about P4's and 10GB hard drives in some of the examples :lol: 
December 29, 2008 1:27:27 AM

"I'll try to find large, detailed schematics of a Z80, since getting the actual thing is a bit far fetched unless I break into a museum... or Intel ^^ "

All you need is a detailed block diagram of the Z80. Once you have a general idea of what is going on, you can either work downward into how the various blocks are built or upward to the more abstract software. An '80's era book on computer architecture will help you.

My first computer was the '70s-era HAWK Air Defense System computer. It had a non-byte-oriented 24-bit word, 8K x 25-bit memory, and was built with small-scale TTL logic. It had three operating modes: Program Run, Single Instruction, and Single Clock Mode. In SCM, you could trace each fetch and execute cycle through all the logic. Sometimes we had to do that to repair the damned thing. Lots of circuit cards, lots of interconnections.

I do not miss the good old days. I'm 62. I also do not miss the old vacuum tube days. Unfortunately, I am still dealing with vacuum tubes. Vacuum tubes are the only way to generate what I regard as practical amounts (hundreds of kilowatts) of microwave (radar) power.

And quantum computing????

December 29, 2008 1:32:29 AM

quantum computing? something the size of your hair (what hair? :D ) performing calculations probably millions/billions/trillions times faster than your vacuum tubes :lol: 
December 29, 2008 1:49:08 AM

V3NOM said:
quantum computing? something the size of your hair (what hair? :D ) performing calculations probably millions/billions/trillions times faster than your vacuum tubes :lol: 


Not to rain on your theory, but a digital computer will never be as fast as an analogue computer for analogue calculations... Nor will a digital amplifier and CD ever get as nice a sound as a vinyl record and a vacuum tube push-pull amp. Digital is not the end-all-be-all in a basically analogue world. A slide rule will get you to 99.5% accuracy on a trig function almost as fast as a 3+GHz CPU, if you can still remember how to use it.

Sorry for going so far off topic.
December 29, 2008 1:57:44 AM

Aethren said:
Hope you all have a pleasant night, and I'll spend the rest of my week with a Z80 up my @ss so as to gain some insight from it as it moves, or I may read schematics, whichever is less painful... (did I mention wiring it prior to afore-mentioned execution of possibly failed insertion?)

G'night, and dont think about quantum mechanics... it'll shrink your di*ks off...


I try not to use links to wikipedia, but in this case they do have some good information...

http://en.wikipedia.org/wiki/Z80

G'night...
December 29, 2008 1:58:27 AM

good afternoon! :D 
December 29, 2008 2:00:33 AM

The thing about vinyl and CDs really is true. Those old big clumsy records just have that crisp sound to them, especially through some nice stereo speakers. Almost makes me want to get a USB turntable.
December 29, 2008 2:02:33 AM

On our side of the world, Aussie...

After Aethern gets his head around Z80's, maybe he should try a tri-state CPU like the TI 9904, eh?
December 29, 2008 2:03:20 AM

:kaola: 
December 29, 2008 2:09:42 AM

Not at all, don't worry about it.

science.box.sk had some interesting posts on the workings of quantum processors (e.g. that a single nitrogen atom floating in a 1 cm cube of tetrachloride solution would yield more processing power, from its near-infinitesimal states, than all the computers ever made put together, as the Q-bit function nears infinity).

A q-bit, or quantum bit, can be thought of as a globe or sphere: imagine a ball whose upper (north) pole is "1" and whose lower (south) pole is "0"... now imagine how many other states are possible for a point on a 3D sphere other than exactly the north or south extremity. Now figure that as this ball moves around, those near-infinite states are continuously shifted and randomized. Now figure that for each specific possible state of probability, a Q-bit both exists and does not exist at the same time (it has the "potential" to be there), and all this at near lightspeed (electron displacement in probable L, P, Q Pauling states and all that sh*t). And now (damn, this is getting too long...) imagine each one of these equals a "clock" cycle of 2^13-bit bandwidth...
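For what it's worth, that sphere picture has a standard mathematical form: a qubit is a pair of complex amplitudes α and β with |α|² + |β|² = 1, and measurement collapses it to 0 or 1 with probabilities |α|² and |β|². A toy sketch (illustrative only; the function name is my own):

```python
import cmath
import math
import random

def measure(alpha, beta):
    """Collapse a qubit state alpha|0> + beta|1> to a classical bit."""
    p0 = abs(alpha) ** 2
    assert abs(p0 + abs(beta) ** 2 - 1.0) < 1e-9, "state must be normalized"
    return 0 if random.random() < p0 else 1

# An equal superposition: a point on the sphere's "equator".
alpha = 1 / math.sqrt(2)
beta = cmath.exp(1j * math.pi / 4) / math.sqrt(2)  # the phase picks the longitude
samples = [measure(alpha, beta) for _ in range(10000)]
print(sum(samples) / len(samples))  # roughly 0.5
```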

Presumably, once this processing is made possible there will be no further need to develop any other CPU ever again (in terms of power), only ways to improve its use.

But this is so far into the future its not even worth mentioning yet (but I did, therefore I am an @sshole again).

December 29, 2008 2:11:42 AM

Also, wax cylinders play beautifully too ^^
December 29, 2008 2:15:57 AM

Note: Quantum processing is BOTH digital AND analogue...

PS: Wow, I am graduating in @sshology tonight with my meaningless posts ^^
December 29, 2008 2:22:29 AM

I think Wr answered you accurately:
For starters, try googling "binary adder" and "binary multiplier" to see how the basic gates are formed into circuits that perform basic math. There are decades of optimizations in a modern IC; that would be way too much detail to cover here.


I myself can't fully grasp how CPUs today work either, but as Wr said, the basics are AND, OR, and NOT gates. With those gates, one can build simple integer math computations. But before you can go there, you also have to learn the binary system and how it maps to decimal. For example:
00 - 0
01 - 1
10 - 2
11 - 3
You'll need more bits to count higher numbers. :)
That's how they represent and process data/numbers.
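That counting pattern, in code (a trivial sketch):

```python
def to_binary(n, bits=4):
    """Write n as a string of bits, the way the table above counts."""
    return "".join(str(n >> i & 1) for i in range(bits - 1, -1, -1))

for n in range(4):
    print(to_binary(n, 2), "-", n)
# prints the same table: 00 - 0, 01 - 1, 10 - 2, 11 - 3
```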

As you can see, learning just the basics is already difficult.
You seem to love computers; you should have taken a computer engineering course in college, where you'd learn all these basics (and that's before any higher-level programming). The computer industry is big,
big enough to make Bill Gates and the Intel CEO among the top 10 richest in the world.
December 29, 2008 2:35:04 AM

Remember, the computer is nothing more than a nice box of electronic circuits without an OS of some kind. You need some kind of code/instructions or an OS to make it all work.

Think of DNA as the program for our bodies.

Or just read this.

http://homepage.cs.uri.edu/faculty/wolfe/book/Readings/...
December 29, 2008 2:52:06 AM

Regarding quantum computing: I've been reading those topics since 1996 in high school, and I'll just say they're too far off to be of any use to consumers for a long time.
All I want to see is improved CPUs and GPUs that can handle CRYSIS easily, at a low price of course! Oh, and also greatly improved HD speeds or SSD pricing! :)
December 29, 2008 3:01:58 AM

Computers work in binary. 1's and 0's.

Any number can be expressed in binary. Any instruction can be expressed in binary, or as a series of binary numbers, etc. Color, as an example, is 32-bit in most computers, so each color is defined by a string of 1's and/or 0's 32 digits long.
This gives you plenty of color range LOL

Transistors are binary. Either current flows, or it does not.
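The 32-bit color example, concretely: four 8-bit channels packed into one binary string (a sketch; real pixel formats vary in channel order):

```python
def pack_rgba(r, g, b, a):
    """Pack four 0-255 channel values into one 32-bit integer."""
    return (a << 24) | (r << 16) | (g << 8) | b

orange = pack_rgba(255, 165, 0, 255)
print(f"{orange:032b}")  # the 32-digit string of 1's and 0's described above
print(hex(orange))       # 0xffffa500
```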
December 29, 2008 10:51:54 PM

Isn't that just amazing though? As I type this message right now, millions of transistors are flicking on and off... 1's and 0's flying through my computer, a modem, kilometres of phone line to the THG server... just astounding.
January 3, 2009 12:12:27 PM

Astounding, and even though it almost always works, we're still very pissed when we BSOD out of some failed overclock =D

The consumer is never happy... now bring me my Octo-Core >=[

=D
January 3, 2009 10:09:09 PM

croc said:
Not to rain on your theory, but a digital computer will never be as fast as an analogue computer for analogue calculations... Nor will a digital amplifier and CD ever get as nice a sound as a vinyl record and a vacuum tube push-pull amp. Digital is not the end-all-be-all in a basically analogue world. A slide rule will get you to 99.5% accuracy on a trig function almost as fast as a 3+GHz CPU, if you can still remember how to use it.

Sorry for going so far off topic.


Well, if you want an accurate reproduction of the original sound, there is no competition: CDs FLATTEN vinyl any day of the week, not to mention they have higher dynamic range and better frequency response. (Technically, both are capable of >20kHz, but in practice very few records actually contain information above about 15kHz, especially after they have been played a few times, while CDs contain everything up to 22kHz.) CDs also have a lower noise floor, and SACDs and DVD-Audio are better still. Sure, vinyl has a unique sound, but that is actually because of its inaccurate reproduction. Of course, most modern CDs are mastered terribly, which means they sound terrible, but that isn't a fault of the technology itself.

Tube amps and transistor amps are both roughly equivalent in accuracy now, but the older tube amps actually added distortion (which caused the "tube" sound). Also, in a double blind test, I doubt you or anyone else could hear the difference between a good modern tube and transistor amp of similar power and distortion ratings, if both are played at below the rated power (they distort differently if used past the rated power, but I would hope that you aren't typically doing that to your amps). In the cases that there are audible differences, the differences are actually because the vacuum tubes are adding distortion, while the solid state amps are acting as a near perfect theoretical amplifier, with almost no change in the signal aside from the signal power.