I worked for Intel's Microprocessor Research Labs (from 2001 to 2006) doing microarchitecture research.
Someone requested an AMA from an EE with CPU design experience, so here I am!
(1) How much of today's CPU architecture is basically the same as the first CPUs ever designed?
The overall pipeline of a modern CPU would be familiar to a mainframe architect from the 1960s. One major development since then is dynamic branch prediction - usually credited to Jim Smith's 1981 paper.
Of course, everything is bigger now.
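Smith's scheme can be sketched in a few lines. This is a toy model, not any real design; the table size and the way the program counter indexes it are illustrative choices of mine:

```python
# Sketch of a two-bit saturating-counter branch predictor, the scheme
# associated with Smith's 1981 paper. Table size and indexing are
# illustrative, not taken from any actual CPU.

class TwoBitPredictor:
    def __init__(self, entries=1024):
        # Counter states: 0-1 predict not-taken, 2-3 predict taken.
        self.table = [2] * entries  # start in "weakly taken"

    def predict(self, pc):
        # Predict taken if the counter for this branch is 2 or 3.
        return self.table[pc % len(self.table)] >= 2

    def update(self, pc, taken):
        # Saturating increment on taken, decrement on not-taken.
        i = pc % len(self.table)
        if taken:
            self.table[i] = min(3, self.table[i] + 1)
        else:
            self.table[i] = max(0, self.table[i] - 1)
```

The two-bit hysteresis is what makes loop branches cheap: a loop branch that is taken nine times and then falls through mispredicts only the final iteration of each trip, instead of twice per trip as a one-bit scheme would.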
(2) Are components like logic gates still individually laid out during planning, or are past component designs stored in some database to be called upon for use in new designs? If so, how often do they have to be revised to suit the needs of new processors?
Intel is not like most design houses. There is (or, at least, was) still much done by hand. Most places use libraries of circuits, with tools combining them to satisfy a high-level specification written in a hardware description language (HDL).
These libraries must be reevaluated for every new process (45 nm to 32 nm, etc). Depending on how much labor you are willing to spend, you can have many different implementations of each cell that trade off size against speed.
(3) With such intricate circuits, how do you keep track of what effect an individual design element will have on the overall operation of the CPU?
There are many levels of abstraction to help deal with complexity. An architect has a very high-level view, looking at pipeline stages. A logic designer is concerned with a single block: its specific inputs and outputs, and the function it must perform. A circuit designer converts a specific block of HDL into transistors.
Software handles the overall simulation (although only small pieces can be simulated at high detail).
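As a toy illustration of what those abstraction levels buy you: the same 4-bit adder can be modeled behaviorally (roughly the architect's view) or gate by gate (closer to the circuit designer's view), and simulation checks that the two agree. The function names and bit width here are mine, not any real tool's:

```python
# Two models of the same 4-bit adder at different abstraction levels.
# Checking one against the other is simulation-based validation in
# miniature; names and bit width are illustrative.

def adder_behavioral(a, b):
    # Architect's view: just the arithmetic, truncated to 4 bits.
    return (a + b) & 0xF

def full_adder_gates(a, b, cin):
    # Circuit view of one bit: two XORs, two ANDs, one OR.
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def adder_gate_level(a, b):
    # Ripple-carry chain of four one-bit full adders.
    carry, result = 0, 0
    for i in range(4):
        s, carry = full_adder_gates((a >> i) & 1, (b >> i) & 1, carry)
        result |= s << i
    return result
```

Exhaustively comparing the two models over all 256 input pairs is feasible at this scale; for a real block it isn't, which is why only small pieces get simulated at high detail.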
(4) As I understand it, modern computers are still using essentially the same BASIC developed in the 60s converted to machine code to execute instructions. Do you think the future progression of CPU technology will require going back to the beginning and designing processors that can utilize a new, and perhaps more efficient, high-level programming language (if such a thing could exist)?
BASIC is actually a high-level programming language! (From a CPU point of view).
It's true that modern assembly language is similar to that of the original machines (in fact, even a CISC like x86 is simpler than many that used to exist - think VAX).
It's unlikely that this will change.
There is a lot of work required to make a new instruction set, and there is usually little to gain. Remember, anything that is done in software will be much harder to do in hardware. A Python interpreter is a complex piece of code; you wouldn't want to try to do it in hardware.
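To make that point concrete, here is a toy stack-machine interpreter. The opcodes are invented, and a real interpreter is vastly bigger, but even this stripped-down dispatch loop is the kind of conditional, data-dependent logic that software handles trivially and fixed-function hardware would not:

```python
# Toy stack-machine interpreter. Opcodes and encoding are invented
# for illustration; a real language interpreter is far larger.

def run(bytecode):
    stack, pc = [], 0
    while pc < len(bytecode):
        op = bytecode[pc]
        pc += 1
        if op == "PUSH":
            stack.append(bytecode[pc])  # next slot is the literal
            pc += 1
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError(f"unknown opcode {op!r}")
    return stack.pop()

# (2 + 3) * 4
run(["PUSH", 2, "PUSH", 3, "ADD", "PUSH", 4, "MUL"])  # returns 20
```

Every `elif` arm here would be a dedicated functional unit in hardware, and you would be stuck with exactly those opcodes forever; in software you just edit the loop.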
(5) What do you think will happen as transistors approach the single-atom scale and become subject to the effects of quantum mechanics?
Quantum effects have been a problem for a long time. When they start to dominate, that will be the end of transistors as we know them. We will have to move to rod-based computing (like a Tinkertoy, except built with carbon nanotubes), spintronics, or something even wilder.
I believe economics will limit us before physics does. A modern fab costs billions of dollars, and it is becoming harder to charge a premium price for CPUs.
Feel free to ask for more on any of these subjects or anything else.
Proof submitted to mods.