Hi all,
I’m looking for an explanation of how CPUs work. I’m going to explain roughly what I know already (I say that with some self-scorn, as at this level I know very little), and give a brief idea of the level I’m at in general.
Let’s say I’m using Windows XP and I tell the computer to factorise a cubic equation (perhaps via a small macro written in Excel). As a general explanation: executing this command will have the CPU fetch from RAM the information on what to do; it will then start decoding the macro within Excel. It will return the lower-level commands to memory and repeat until everything is in the form it likes (because the language is Visual Basic, and 1s and 0s are what’s actually required?).
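To make that concrete, here is roughly what I picture “decoding into lower-level commands” to mean, sketched in Python purely for illustration (I know the real macro would be Visual Basic and the real low-level steps would be binary machine instructions; the tiny instruction set below is something I’ve made up):

```python
def evaluate(program, x):
    """A toy 'CPU': one accumulator, executing one simple step at a time.
    The instruction names here are invented for illustration only."""
    acc = 0
    for op, arg in program:
        if op == "loadc":   # put a constant into the accumulator
            acc = arg
        elif op == "addc":  # add a constant to the accumulator
            acc += arg
        elif op == "mulx":  # multiply the accumulator by x
            acc *= x
    return acc

# The single high-level line  p(x) = x**3 - 6*x**2 + 11*x - 6
# "lowered" into one-tiny-step-at-a-time instructions (Horner form):
program = [
    ("loadc", 1),    # acc = 1
    ("mulx", None),  # acc = x
    ("addc", -6),    # acc = x - 6
    ("mulx", None),  # acc = (x - 6) * x
    ("addc", 11),    # acc = x^2 - 6x + 11
    ("mulx", None),  # acc = x^3 - 6x^2 + 11x
    ("addc", -6),    # acc = p(x)
]

print(evaluate(program, 2))  # prints 0, so x = 2 is a root
```

One readable line of maths becomes seven tiny steps, each simple enough that (I assume) circuitry could do it. That’s my mental picture of the “repeat until it is in the form it likes” part, anyway.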
Once it has its 1s and 0s, it starts executing the commands, trying various factors until the conditions are met. It then returns the solution to memory and re-processes it until it is in a form Excel likes, finally displaying it on the screen.
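And for the “trying various factors” part, this is the sort of loop I imagine, again only a guess sketched in Python (the divide-the-constant-term trick is the standard rational-root idea; I have no idea what Excel would really do):

```python
def integer_roots(a, b, c, d):
    """Find integer roots of a*x^3 + b*x^2 + c*x + d = 0 by trial:
    any integer root must divide the constant term d.
    Only a sketch -- ignores d == 0 and non-integer roots."""
    candidates = [n for n in range(1, abs(d) + 1) if d % n == 0]
    roots = []
    for n in candidates:
        for x in (n, -n):                       # try both signs
            if a * x**3 + b * x**2 + c * x + d == 0:
                roots.append(x)
    return roots

# x^3 - 6x^2 + 11x - 6 = (x - 1)(x - 2)(x - 3)
print(integer_roots(1, -6, 11, -6))  # [1, 2, 3]
```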
That’s roughly how things work in general, from what I can gather. Now for the CPU bit, which is even less clear to me.
Electricity flows towards the CPU from the RAM in fits and starts, i.e. 1s are electricity and 0s are no electricity? As the first of these 1s and 0s enter the CPU, it somehow recognises what it must do with the rest, and then somehow points the rest of the electricity in the right direction. Almost like on/off switches in a puzzle: each combination of on and off results in a new set of ons and offs. But how? What makes this actually happen?
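The nearest I can get to putting that into code is something like this: pretend each wire carries a 1 or a 0, and build fixed little switch-combinations (“gates”) that always turn one pattern of ons and offs into another. A Python sketch of the idea (the gate names are real ones I’ve come across; the wiring below is a textbook adder, not anything I actually know about real CPUs):

```python
# Each "gate" is fixed wiring: given its input signals (1 = electricity,
# 0 = none), it always produces the same output signal.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    """Adds three single bits -- one tiny cell of an adder circuit.
    The outputs are just new on/off patterns fixed by the wiring."""
    s = XOR(XOR(a, b), carry_in)
    carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
    return s, carry_out

def add4(x, y):
    """Chain four cells together and you can add 4-bit numbers."""
    carry, total = 0, 0
    for i in range(4):
        bit_x, bit_y = (x >> i) & 1, (y >> i) & 1
        s, carry = full_adder(bit_x, bit_y, carry)
        total |= s << i
    return total

print(add4(0b0011, 0b0101))  # prints 8, i.e. 3 + 5
```

What I can’t picture is what those gates physically are, or how the first incoming 1s and 0s decide which gates the rest of the electricity gets steered through.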
This last bit is really hazy, and I constantly struggle to put it into words. I have a feeling I know what happens, but nothing concrete. For years I have wondered what exactly happens; now I want to find out.
My level of education is this: postgraduate, up to the equivalent of a Masters, with specialities in business, accounting, maths and education. I’m really dyslexic, though (reading technical material I’m unfamiliar with is torture), so researching is my weakest point, which is why I’ve asked here rather than trawling the impossibly colossal WWW. So basically, I’m a smart girl mostly, and I can understand most levels of explanation, but where there’s heavy use of unfamiliar terminology, my head’s going to explode.
For anyone with an interest in this sort of thing, I figure it could be quite a cool topic, and one which doesn’t mention CPU manufacturers. There are really two topics here: how computers work on a general level, with operating systems and programs etc., and then what physically happens at the CPU level.
P.S. If you don’t like the topic idea or think it’s too dumb, please simply PM me with your insults and leave the thread clean.