Why have computers advanced so fast in the past 50 years?

First of all, they have not advanced that much at all, mostly due to poor coding.
If we could build chips that work in hexadecimal instead of binary, for example, THEN it would be a revolution.
If we get to use graphene, THEN it will be a revolution.

So far all we have been doing is making things smaller and pushing more power through them to make them go faster, and you might remember the first time that completely stopped, when Intel tried to make a 4 GHz single-core CPU :D.
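For what it's worth, the reason that approach stalled is roughly arithmetic: dynamic power in a CMOS chip scales with capacitance times voltage squared times frequency, so the heat climbs much faster than the clock does. Here's a tiny back-of-the-envelope sketch in Python; the capacitance and voltage numbers are invented round figures purely for illustration, not real Pentium 4 data.

```python
# Rough sketch of why "just clock it faster" hit a wall.
# Dynamic switching power in CMOS scales roughly as P ~ C * V^2 * f,
# so raising the clock (and the voltage needed to sustain it) blows up
# the heat that has to be dissipated.

def dynamic_power(capacitance_nf, voltage, freq_ghz):
    """Approximate switching power in watts: P = C * V^2 * f."""
    return capacitance_nf * 1e-9 * voltage ** 2 * freq_ghz * 1e9

# Made-up round numbers just to show the trend, not real chip data.
for freq, volts in [(2.0, 1.3), (3.0, 1.4), (4.0, 1.5)]:
    print(f"{freq} GHz @ {volts} V -> ~{dynamic_power(30, volts, freq):.0f} W")
```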

Now we have multiple cores, and still most programs can't use more than 2 cores anyway.

Wait till some genius, like Carmack in his day, creates a programming language built for multiple cores. THEN we might see a revolution.
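To be fair, you can already spread work across cores in ordinary languages, it just takes deliberate effort from the programmer. Here's a minimal sketch using Python's standard multiprocessing module; the problem size, chunking, and worker count are arbitrary choices for illustration.

```python
# Split a big sum into chunks and farm them out, one worker per core.
from multiprocessing import Pool, cpu_count

def partial_sum(bounds):
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    n = 10_000_000
    workers = cpu_count()                      # one worker per available core
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], n)            # last chunk picks up the remainder
    with Pool(workers) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total, sum(range(n)))                # both lines should match
```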
 
It's not just computers; it's a phenomenon I call accelerated exponential evolution.

Things used to take 20 years or more between improvements/revolutions. Then, as technology, engineering and manufacturing progressed and understanding became more evenly spread,
progress itself became easier to accomplish, so more people were progressing faster and sharing the knowledge, which in turn made more of the species smarter by proxy.

With more and more people working on projects, things got even faster and took a little less time to be 'improved' upon, say 15 years,
then ten, then five, and eventually we hit the limits of possibility with current materials/manufacturing. So we improve those to accomplish what we need them to, or find/invent new materials that fit the bill.
I don't want to do anyone's homework for them, but your question cannot be answered in a simple statement, so asking in these forums (or on any other computer site) won't get you the answers you want/need.
You need to look into things in a bit of depth and discover what it was that made X or Y leap in development possible.
Sorry, but nothing good comes easy, man.
Moto
 
^Same as the theory of relativity :) it once had firm stature, now it's more of a loose guideline hehe
http://io9.com/5829403/moores-law-may-soon-be-broken
but as I said earlier, it's a case of 'oh, we hit the limit', /stop,
and someone will come along later and say 'oh, it's an issue, not a brick wall', work to find a way around the problem, and then CPU tech will progress to the next 'limit'.
I would love to be able to watch humanity's progress from sludge to whatever it ends up being, all in one go :)
Moto
 
Don't think I've read much Baxter; I may have, but I read voraciously hehe,
but I've read a fair bit of sci-fi/sci-fantasy over the years and even watched as a lot of it became fact, which btw is awesome when you realise it :)
It happens a lot more these days due to A.E.E., obviously; something 'hi-tech' I saw in a film as a 15-year-old may have taken 20 years to become reality, but nowadays you see something in a film and two years later it's in the shops lol

I particularly like the Culture series by Iain Banks, but I'd happily settle myself in a Firefly-type 'Verse :p
Moto
 
Simple: government pouring money into R&D, mostly through universities [research grants] and DARPA.

Even then, since the switch to digital circuits back in the late '60s, the tech itself really hasn't changed, just the speeds [more transistors due to smaller fabrication processes].
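If you want a feel for what 'more transistors due to smaller fabrication processes' means in numbers, here's a rough doubling calculation in Python. The 1971 starting point (the Intel 4004's roughly 2,300 transistors) and the two-year doubling period are the usual textbook figures for Moore's law, not precise data for any particular chip.

```python
# Back-of-the-envelope Moore's law: transistor count doubling every ~2 years.
start_year, start_count = 1971, 2_300   # Intel 4004 ballpark figure

for year in range(1971, 2022, 10):
    doublings = (year - start_year) / 2
    print(year, f"{start_count * 2 ** doublings:,.0f}")
```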