
The future of computers...

September 6, 2006 1:18:32 AM

Unfortunately, the world of physics is meeting the world of computers, and Moore's Law is no more. For as long as I can remember, life in computers has meant that my computer today would be half as fast as a new computer a year and a half from now...

Today is different: Intel has stepped up with a 30-40% increase in speed and it's considered a BIG deal. Yikes... If you had told me that 8 years ago I would have said Intel sucks, but today the properties of silicon, electrons, heat and vibration are all coming together to produce a world where the only significant gains will come from more processors and better software.

Our best hope for processors is a movement away from silicon, but don’t expect anything anytime soon.

Historically things moved so fast that Intel and the other microprocessor companies could take shortcuts in certain areas to save development cost, while being unhindered in other areas to make up the ground. Now things have gotten so small and switch so fast that the basic properties of the materials just can't handle it. What Intel did with the Core 2 Duo isn't anything new; they just took everything they did in the past and optimized it. And there is a lot left to optimize.

Why does the Core 2 Duo go so fast? Intel is constantly turning off the parts of the processor that are not being used. When my power bill is too high, I turn off the lights when I'm not in a room. I guess I should work at Intel too! Such a simple idea, and it makes a huge drop in temps and voltages.
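Just to put rough numbers on that idea, here is a back-of-the-envelope sketch in Python (every block name and wattage below is made up purely for illustration; these are not Intel's actual figures):

Code:
# Rough illustration of clock/power gating: only the blocks doing work
# burn dynamic power. All numbers are invented for illustration.
blocks = {
    "alu": (5.0, True),         # (watts when active, currently in use?)
    "fpu": (8.0, False),
    "simd_unit": (10.0, False),
    "l2_cache_half": (6.0, True),
}

always_on = 12.0  # leakage and uncore power that gating can't remove

ungated = always_on + sum(watts for watts, _ in blocks.values())
gated = always_on + sum(watts for watts, in_use in blocks.values() if in_use)

print(f"all blocks powered:    {ungated:.1f} W")
print(f"idle blocks gated off: {gated:.1f} W")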

What all this results in is computers that will stay useful for many more years than we have ever seen in the past. It also means that gains in processing power will keep shrinking.

Life isn't over: solid-state memory and more cores will mean more speed. Hard drives will grow in size and thus also in speed. Optical storage continues to grow, though that too is hitting a limit. Graphics processors will continue to explode: add a processor and you can calculate twice the pixels without facing as much diminishing returns. More pipelines mean more calculations. LCD and organic displays show promise of huge monitors with many applications running side by side. Real desktops that exceed 30 inches will become commonplace. Games will grow brilliant in detail. Life is grand for the graphics world.

The software world will have to step up and start optimizing every aspect of every activity possible. As the pace slows down, programmers will be given the time they need to work towards perfection.

I remember programming applications for companies and facing issues where we could have done things better but didn't. We knew that in a few years the computers would be significantly faster, and we wouldn't have to deal with the problem.
On another note:

With the brilliant Sound Blaster X-Fi series, we now have another core dedicated specifically to sound processing, and what's great is that it works WONDERFULLY!

In truth I already have so many processors in my machine that it would be futile for me to try and count them. I have a processor on my sound card, one on my video card, one to encrypt data on the Wi-Fi channel, two in my main machine, one to control the memory and system resources (the northbridge), and many more (USB, SATA, IDE, southbridge) to control resources I choose to use.

If anything is for sure, it is that what I say now will not be true sometime later. I hope to hear from you all on what you think will happen…

Mike

September 6, 2006 1:52:09 AM

Quote:
Unfortunately, the world of physics is meeting the world of computers, and Moore's Law is no more. For as long as I can remember, life in computers has meant that my computer today would be half as fast as a new computer a year and a half from now...

(snip)


I enjoyed reading your post. It is imperative you understand defeating Russia had nothing to do with computers. I attribute that to the fact we had mandatory gym class on MWF.
September 6, 2006 1:57:12 AM

What is this?
September 6, 2006 2:00:25 AM

Quote:
Why does the Core 2 Duo go so fast? Intel is constantly turning off the parts of the processor that are not being used. When my power bill is too high, I turn off the lights when I'm not in a room. I guess I should work at Intel too! Such a simple idea, and it makes a huge drop in temps and voltages.


Isn't that more relevant to power consumption than to speed?
September 6, 2006 3:59:33 AM

Moore's Law is directly related to speed... In fact he specifically stated that CPU speeds have been increasing at a rate of 2x per 1.5 years... it is a common misconception that it was 2x per 2 years...

I met and had dinner with the man... he was very nice...

Transistor counts have NOT followed a predictable path... In fact, transistor counts didn't have much to do with speed in the early days... (Remember RISC)

Power consumption has become linked to speed... because you can't fry eggs on your processor.

Electrical energy plus resistance (which comes into play as you switch at kHz, MHz, GHz rates) = heat

If you run parts of the processor that are not being used, then the heat of the processor goes up... and as any overclocker knows, heat is the enemy of speed.

Edited: I was wrong... From everything I can find... even from very reliable sources (which Wikipedia isn't always), it seems that he did mention transistor count... But his law also seemed to be directly related to the speed of the computers...
September 6, 2006 4:02:18 AM

There seems to be, as there has always been, talk about the limits of silicon. If not silicon, then what? And why not stick with silicon? Make the best CPU you can make, make it the smallest and most thermally efficient, and then just build more and more cores. If silicon has a limit, change the limiting factor to the number of cores. Software could follow if the need/demand/desire were there. CPUs would no longer be measured by GHz or cache or instruction sets; CPUs would be sold and targeted and sought after by the number of cores they have. Power would become an issue once a CPU had, say, 128 cores, so something like SpeedStep would have to be in place to turn the cores on and off as needed.

But really, how far can this go? At what point would it just be crazy? How much (supercomputers and servers and such aside) does a computer really need? Looking at a computer from 10 years ago and taking a brief second to look at even a not-so-great computer of today, one could only think "The sky is the limit...". But I guess, from what I have read, silicon is the limit.
September 6, 2006 4:04:37 AM

Yes, I know that cooler processors can have higher clock speeds, and I know that heat can become a threat, but by your theory shouldn't AMD's Cool'n'Quiet make their CPUs run faster? Under load the temperatures will still be the same, so underclocking and undervolting a CPU while not in use shouldn't affect its performance under load. Sure, it'll be cooler when idling, but I fail to see how that could lead to an increase in speed while the CPU is actually being used.

The main reason for the increase in speed is the newly designed architecture, is it not?
September 6, 2006 4:15:20 AM

Quote:

I enjoyed reading your post. It is imperative you understand defeating Russia had nothing to do with computers. I attribute that to the fact we had mandatory gym class on MWF.


LoL.

Quite right.
September 6, 2006 5:04:03 AM

What I think will happen is the exact opposite of what you seem to think is good: we'll do away with all the separate coprocessors. Through higher integration and faster point-to-point buses they'll no longer be needed, and instead we'll move further towards higher performance per watt, smaller size, and lower cost.

Having so many chips to do what software can do is wasteful, if we can do it in software fast enough.
Actually, we may already be able to do most of these tasks faster on the CPU, but marketing departments will trick people into paying $50 more for a part, instead of putting that $50 (and $50 each for two other parts) towards a $150 more expensive, faster CPU. The buses are holding us back at the moment, but as they get faster, things like dedicated video memory will matter less and less. We see only the beginnings now with TurboCache technology. Well, there was AGP aperture memory, but in its prime the buses were too slow.

Right now it's mostly a gamer mindset that is pushing the industry forward; most people have no desire to upgrade their systems, they merely want them to keep working ("most" meaning in the real world, not on a computer-oriented web forum). Ironically, Windows Vista and later bloatware will force some to move on, but I predict more and more people will resist this, because the OS must necessarily evolve to the point where the average use doesn't justify continual purchases.

Something else the future will bring is far more substantial: alternatives to Windows will become user-friendly enough that average people can use them without help. Driver architectures will evolve to support this, and people will start asking themselves, "Why did I buy an 8 GHz PC with 6 GB of memory and 4 TB of storage to do email and web surfing?"

Not everyone thinks like this of course, but the majority DO, and the majority of systems sold are low-end.

What does the future hold? Eventually hardware will be mature enough that we can isolate ourselves from it more, as we do with things like the insides of our VCRs or DVD players. That too is more than a few years off, but with higher integration, smaller size and lower cost, it will come. Along with that there will be tighter integration with software and firmware, and eventually the hard drive is gone - not just replaced with a screw-in drive that uses flash memory instead, but entirely unnecessary once memory density is high enough to fit a sufficient amount on a card (in a slot).

I mostly ignored what the OP wanted, which was to center the discussion on the CPU - because I don't think CPU evolution matters nearly as much as everything else.
September 6, 2006 5:39:46 AM

"There seems to be, as there has always been, talk about the limits of silicon. If not silicon, then what? And why not stick with silicon? "

From what I have read, it is not too much of a stretch to say that one day we will see organic materials and parts used in conjunction with computer technology. Think about it: how fast do your nerves send messages throughout your body? This is just one of those things I read about and I am just throwing it out there for discussion; if anyone has anything to add, feel free.

If you want to read about this subject, this article is very, very interesting!

http://www.cnn.com/2003/TECH/09/19/wow.tech.life.comput...

Quote from article:
"Interfacing biology with silicon computers and getting neurons -- or nerve endings -- and cells to talk to chips maybe the most practical application, but at this stage the main stumbling block for bio-molecular computers is their production."
September 6, 2006 6:03:59 AM

Quote:
...Think about it, how fast do your nerves send messages throughout your body?...


LOL, not to undermine your point, but I just remembered bio class - many basic nerve signals travel at about 200 km/hour (about 120 mph, so roughly twice highway driving speed)... that's not all that fast. But yes, the computational power of the nervous system, the brain in particular, could be a promising model to build bio-computational chips/devices on.
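Quick conversion, just to make the "not that fast" point concrete (the copper figure is the usual rule-of-thumb signal speed, roughly two-thirds of the speed of light, not a measurement of any particular board):

Code:
nerve_speed_m_s = 200 * 1000 / 3600   # 200 km/h -> ~55.6 m/s
signal_in_copper_m_s = 2.0e8          # rough rule of thumb, ~2/3 of c

print(f"nerve signal:  {nerve_speed_m_s:.1f} m/s")
print(f"copper trace:  {signal_in_copper_m_s:.1e} m/s")
print(f"ratio: {signal_in_copper_m_s / nerve_speed_m_s:.0f}x faster")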
September 6, 2006 1:32:18 PM

I don't think the brain has much real "computing" power... what it has is decision making... If you doubt this, tell me in less than 1 second what 7/11 is to 255 decimal places. It also does real multitasking. Imagine a computer that is constantly doing millions of things all at the same time. Then feed it a diet of carbs, vitamins, Prozac and other necessities.

I agree with the fact that there are other materials that could take its place. I remember years ago IBM was working with copper.
September 6, 2006 2:21:48 PM

The future of computer processors is organic or fibre optic. There is already a fibre-optic processor in existence, and there is research into using real organic material to perform processor calculations - i.e., using brain material grown in labs to store and use information passed to it via electronic signals. This is the future, and once we hit either of these we will probably then hit the limit of how fast things can go.
September 6, 2006 2:28:50 PM

Quote:
I don't think the brain has much real "computing" power... what it has is decision making... If you doubt this, tell me in less than 1 second what 7/11 is to 255 decimal places. It also does real multitasking. Imagine a computer that is constantly doing millions of things all at the same time. Then feed it a diet of carbs, vitamins, Prozac and other necessities.

I agree with the fact that there are other materials that could take its place. I remember years ago IBM was working with copper.


Although true, I think they have a way to "train" the brain to do it. Remember learning your times tables? When you first started you weren't very good; after a few days/weeks you could get the answer in a minute. The organic material could be trained in a similar way. The only problem is it will make mistakes occasionally, so it's not the best; I think I'll stick with fibre optic. I can't remember the site off the top of my head, but the processor was 2 cm high and 15 cm wide. They were hoping to shrink it down to the same size as current processors within 10 years, so that's like 2017 for them to be refined, then maybe another 10 years or so for retail, so it's not that far off.
September 6, 2006 2:50:05 PM

I still like my idea of the multi-multi-multi CPU. Isn't that what a Cell processor is? From what I've read it is, or it at least seems to be. Stats for the PS3's Cell call for 3 to 12 times the computing power of an Opteron 248.
And what about RISC, or AI, or VR? I could talk about this stuff forever. :)
September 6, 2006 2:57:23 PM

Quote:
Moore's Law is directly related to speed... In fact he specifically stated that CPU speeds have been increasing at a rate of 2x per 1.5 years... it is a common misconception that it was 2x per 2 years...

I met and had dinner with the man... he was very nice...

Transistor counts have NOT followed a predictable path... In fact, transistor counts didn't have much to do with speed in the early days... (Remember RISC)

Power consumption has become linked to speed... because you can't fry eggs on your processor.

Electrical energy plus resistance (which comes into play as you switch at kHz, MHz, GHz rates) = heat

If you run parts of the processor that are not being used, then the heat of the processor goes up... and as any overclocker knows, heat is the enemy of speed.


Incorrect. Moore's Law does not link to speed; Moore's Law refers to the complexity of the chip, i.e., more transistors.

Here's Moore's actual statement:

Quote:
The complexity for minimum component costs has increased at a rate of roughly a factor of two per year ... Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000. I believe that such a large circuit can be built on a single wafer.


It's all about chip complexity and the number of transistors. While the doubling of transistors "effectively" doubled speed from the early '70s into the mid-to-late '90s, that was not what Moore's Law was about.

As this link shows, the number of transistors on the wafer has approximately doubled every 2 years. Moore's Law has held.
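You can also sanity-check the arithmetic in the 1965 statement quoted above with a few lines of Python (the ~64-component starting point is my assumed round number for the chip complexity of the day, not a figure from Moore's paper):

Code:
# Moore's 1965 projection: component count doubling roughly every year.
# The starting count is an assumed round number, not Moore's exact figure.
components_1965 = 64
years = 10              # 1965 -> 1975
doublings = years // 1  # one doubling per year in the original paper

components_1975 = components_1965 * 2 ** doublings
print(components_1975)  # 65536 -- i.e. the ~65,000 in the quote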
September 6, 2006 3:01:22 PM

The law of diminishing returns applies to the multi-multi-multi-multi-etc. CPU approach, and the law of exponential increases in heat and power also applies.

They have been doing this multi-computer strategy for a long time; they are called supercomputers, and they produce enough heat to warm a hundred houses, while also requiring the power of some towns in America.
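The usual way to put numbers on those diminishing returns is Amdahl's law: if some fraction of the work is inherently serial, piling on processors only speeds up the rest. A quick sketch (the 90% parallel figure is just an example, not a measurement of any real workload):

Code:
# Amdahl's law: speedup = 1 / (serial_fraction + parallel_fraction / n)
def amdahl_speedup(parallel_fraction: float, n_processors: int) -> float:
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / n_processors)

for n in (2, 4, 8, 64, 1024):
    print(f"{n:5d} processors -> {amdahl_speedup(0.90, n):.2f}x")
# Even with 90% of the work parallelizable, 1024 processors give
# less than a 10x speedup; the serial 10% dominates.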

I remember a while ago MIT students were given a task to build a computer that runs as fast as the current top-end home computers without spending more than $1000.

They got about 30 old computers, networked them together and clustered them with Linux software... The money was spent on network gear. The computers they got dumpster diving, and the power consumption was huge, while the heat buildup would have warmed my house in winter (I live in Rochester).

Mike

PS My favorite MIT experiment was to find out how far they could toss a car... they tossed it 5 miles... :)  Maybe we should take all that crappy used-up military equipment and start tossing it at our enemies...
September 6, 2006 3:05:48 PM

Organic? Like, alive? Something somewhat *alive* in my computer? Could you imagine what some people would think if they ever thought there was a living thing in their computers?? Someone would say that it was alive to the point of being aware, and it would be a bad Hollywood repeat where the robot learns he's a real boy. Then we would be scared and try to kill the machines, but they would turn on us (Terminator and I, Robot). And then... ahh, never mind.

I also think the computer industry is ever-changing not out of a need for faster computers, because let's face it, if there were C2Ds everywhere, there wouldn't need to be a faster computer for many years. The industry is just that, an industry: they want your money. Computers will evolve and change and be whatever it takes to get your dollar. Intel and AMD have guys right now sitting in a lab somewhere in a bomb bunker with tattoos on their foreheads that say "Whatever it takes!". Just to get your money.
September 6, 2006 3:28:55 PM

Quote:
How about nanotech and quantum computers?

I heard that they successfully added 1+1 in a drop of liquid.

http://computer.howstuffworks.com/quantum-computer2.htm

But nanotech sounds promising and the timeline is supposedly ~10 years to see some real products.


I forgot about quantum computers, but they just give me a headache. I believe the first one they built gave the answer to their question before they asked it and before they turned it on. 8O :?

Edit: OK, maybe not. Dunno if I'm mixed up on this one or not; I'm sure there was some computer that did that.
September 6, 2006 4:02:31 PM

The answer was 42!
September 6, 2006 4:46:44 PM

Quote:
The answer was 42!


Now they are stuck in an attempt to find the question to the answer
September 6, 2006 7:24:02 PM

Quote:


I remember a while ago MIT students were given a task to build a computer that runs as fast as the current top-end home computers without spending more than $1000.

They got about 30 old computers, networked them together and clustered them with Linux software... The money was spent on network gear. The computers they got dumpster diving, and the power consumption was huge, while the heat buildup would have warmed my house in winter (I live in Rochester).

Mike

PS My favorite MIT experiment was to find out how far they could toss a car... they tossed it 5 miles... :)  Maybe we should take all that crappy used-up military equipment and start tossing it at our enemies...


Where would one find information on MIT experiments? I'm highly interested.
September 6, 2006 8:40:24 PM

Personally, I see no reason for Moore's Law or its implications for computer calculations per second to fail. Silicon will inevitably be replaced - the chances that all of these alternatives, from carbon nanotubes to quantum computers, will fail to surpass silicon-based lithography methods seem vanishingly small.

I suspect computers will continue to grow not only faster but smaller - in ten years I expect to buy jackets with computers built into them (not cheap jackets, granted) that will help me navigate while on a backpacking trip. My shoes may tell me how far I've jogged (by speaking, wirelessly, into a small speaker embedded in or attached to my ear).

In another 10-15 years I expect to be able to purchase a device that rests within my eye and projects information into my visual field (in the form of text and images superimposed over my vision). Computers are already, as another poster has stated, more than capable of doing the simple things - gaming and other demanding processes require more computing power, but nothing will stop us from developing networked, decentralized "distributed computing" systems with components embedded in nearly everything, from our clothing and furniture to our appliances and specialized computers (for things like gaming).

Regarding the brain's power - estimates have placed the brain's capacity at around 10 to the power of 14-16 calculations per second. Current computers process around 10 to the 9th cps. Why can the brain analyze language, images, and concepts so much better, and why can computers do math so much faster? The brain is organized in a highly parallel design and it excels at pattern recognition, which is why you can listen to someone and easily understand what they are saying, and understand what is expected of you, yet you cannot calculate simple math problems at a speed even close to what a computer can accomplish.

This will change, of course - within 20 years I would expect computers to reach the level of human computation (while still exceeding human abilities in various specific applications, of course - such as simple calculations). On around this time scale the Turing test will be passed, and you'll start to look forward to calling tech support, as a computer that can reason like a human while simultaneously having rapid access to the wealth of information available on the internet (and the ability to analyze said information at rates that humans, without enhancement, will never compete with) will solve your tech-support or billing problems much better than some underpaid, grumpy human.
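For what it's worth, the 20-year figure falls out of simple doubling arithmetic if you take the estimates above at face value (the cps numbers are the rough figures quoted in this post, not measured facts, and the doubling periods are assumptions):

Code:
import math

brain_cps = 1e15      # middle of the 10^14 - 10^16 estimate above
computer_cps = 1e9    # rough figure for a current machine, as above

doublings_needed = math.log2(brain_cps / computer_cps)   # ~19.9

for doubling_period_years in (1.0, 1.5, 2.0):
    years = doublings_needed * doubling_period_years
    print(f"doubling every {doubling_period_years} years -> ~{years:.0f} years")
# ~20 years if performance doubles yearly; ~30-40 at an 18-24 month pace.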

If you're really interested in taking an in-depth look at an analysis of the history and future of computers (and humankind), read Ray Kurzweil's "The Singularity Is Near." His evidence is more than compelling, and, in my opinion, his conjectures about the future are right on the mark.

One more thing - we might be tempted to say that dual-core CPUs and GPUs don't double performance (because they don't, in real-world applications), and therefore question Moore's Law - but this is inaccurate, in my opinion. We are currently at the edge of a certain type of computing, with fast single-core CPUs driving things. Software is beginning to be developed to take advantage of multiple cores, and with quad cores coming out in a year or so, and 32-core CPUs coming in a few years, we're really going to start seeing some massive improvements in processing power. We just need software to catch up. Once you design a multi-threaded application, the steps required to optimize it for use with 4 cores, instead of 2, and so on, are smaller than those required to change from single-threaded to multi-threaded.
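The "going from 2 cores to 4 is the small step" point is easiest to see in code: once the work is expressed as independent tasks, the worker count is just a parameter. A minimal sketch using Python's standard library (the tile-rendering task is a made-up stand-in for any CPU-heavy job):

Code:
from concurrent.futures import ProcessPoolExecutor

def render_tile(tile_id: int) -> int:
    # Stand-in for some CPU-heavy, independent chunk of work.
    return sum(i * i for i in range(100_000 + tile_id))

def render_frame(num_workers: int) -> list:
    # Once the frame is split into independent tiles, scaling from
    # 2 workers to 4 (or 32) is only a change to num_workers.
    with ProcessPoolExecutor(max_workers=num_workers) as pool:
        return list(pool.map(render_tile, range(64)))

if __name__ == "__main__":
    print(len(render_frame(num_workers=4)))  # 64 tiles, computed in parallel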
September 6, 2006 9:26:14 PM

Quote:
My shoes may tell me how far I've jogged (by speaking, wirelessly, into a small speaker embedded in or attached to my ear).


You can already buy shoes that do something like that. One of my friends has them. I think it's basically a pedometer/gyroscope that talks wirelessly to a device in his pocket. When you hook it up to your computer, it can send the information to a website that will track all of the data for you and allow you to compare yourself to others. I think it's a Nike product, actually.
September 6, 2006 9:29:56 PM

What you said is cool and all, but I don't think it will happen, for one reason: the computer is only going to be as smart as the person or persons who build and program it. The computer could then be as smart as humans, but I don't think it will be smarter.

The one advantage a computer will have is the ability to perfectly "remember" what it has been told, such as the solution to a common recurring problem that might pop up 20 years later when every person has forgotten it.

However, if such a computer could be made to perform the tasks you mentioned, it would benefit mankind, because we would not have to be bothered with remembering and re-teaching previously learned information. That would free us up to find new things. For example, if you ever watch the TV show Survivor you can see how much trouble some people have with making a fire. That task of making fire has been simplified in our society so much that no one really knows how to do it "the old-fashioned way".
September 6, 2006 9:49:01 PM

Quote:

The main reason for the increase in speed is the newly designed architecture, is it not?


Core 2 Duo has its roots in the Pentium M, which in turn has its roots in the Pentium III architecture, so the main reason is not necessarily a new architecture. It probably has more to do with the clock speed and the ability to dynamically allocate cache to each core (plus there's a lot of cache to go around). I hadn't heard of the ability to shut down parts of the processor that are not in use, but the resulting savings in heat and power consumption probably contribute to a higher possible clock speed.
September 6, 2006 9:51:35 PM

You don't understand Moore's Law (which isn't a law, it's a self-fulfilling prophecy!), which has NOTHING to do with speed. Anyone who says so is a fool and does not know their history.

Moore's Law in fact relates to the CHEAPEST PRODUCTION PROCESS (not to the general top-of-the-line process) and the fact that every 2 years (or so) the number of transistors per square inch doubles. This is still happening today, and Moore's Law is in no danger of falling over any time soon.

The speed issue is a totally different thing, and it is more related to thermal dissipation and internal resistance than to transistor count. In order to maintain low heat, next-generation transistors have to be lower power and more efficient due to the higher density.

Of course, Moore's "law" will expire (because it relates to transistor counts) when we transition to the next computing paradigm. First it was gears, then valves, then transistors, and then integrated circuits (featuring transistors). However, the lifetime of transistors as the main components inside CPU cores is pretty limited from here on out. There will be variations on the basic idea before it finally expires (tri-gate transistors etc.), but eventually you can't make them any smaller.

So, it'll be necessary to abandon the transistor and photolithography completely at that time.

It's anyone's guess what will replace them, but I'm personally betting on self-assembly of carbon nanotubes as the way forward. These will most probably be quantum computers, and hence very much faster at certain jobs (like encryption and decryption) than computers today.

What we can predict with almost certainty, though, is that overall computing power will continue to increase at an exponential rate for the foreseeable future.

Transistor count and CPU frequency will have about as much to do with computing power in the future as perfect vacuums and glass tube production have today, i.e. none. What's important is IPC and the total efficiency of your architecture.
September 6, 2006 9:54:29 PM

Quote:
I don't think the brain has much real "computing" power... what it has is decision making... If you doubt this, tell me in less than 1 second what 7/11 is to 255 decimal places. It also does real multitasking. Imagine a computer that is constantly doing millions of things all at the same time. Then feed it a diet of carbs, vitamins, Prozac and other necessities.

I agree with the fact that there are other materials that could take its place. I remember years ago IBM was working with copper.


Actually, many consider the brain to be the most powerful computer on the planet by several orders of magnitude.

The ability to decide requires making infinite (well, not really, but virtually) other decisions in reaction to the environment. How could anyone ever code all that business logic? Even today's computers can't process all that information that quickly. And yet, I can decide that of all the food in the world, an ice cream sundae is what I want to eat right now.
September 6, 2006 10:30:16 PM

Programming optimized multithreaded applications is very hard. You often end up with huge bottlenecks that tie up one CPU while the others wait for another thread to start. This programming is a true art form and not so much a science.

And ice cream is too cold for my liking... I want chocolate cream pie!

I hope all you optimists are right... I need to believe that spending $2000 every few years, all so I can play some games, is a bit crazy.

If the Xbox or PlayStation came out with keyboard and mouse controls, I wouldn't buy a new PC for about 5-10 years. Unfortunately control pads suck for first-person control.

I would like to say that the next big trend will be, and IS, the combining of TVs, audio receivers, video decoders, video recorders, and all of the controls (and damn remotes) into a solution that includes your home PC.
September 6, 2006 10:39:01 PM

I agree with Mobius that Moore's Law does not govern the speed of electronic devices, but... it does relate to the fact that more transistors will allow a device to do more things. It was not very long ago that a processor could only perform one operation per clock cycle; that was most likely the period when speed equalled faster/better. Then processor architecture changed to allow multiple operations per clock cycle, and that was when the "processor rating" came into play... what can it do in a given period of time?

All this time engineers knew that in order to go faster they needed to reduce the voltage that processors run on. HERE IS WHY: a transistor is simply a switch; you can switch it on and off faster if you reduce the voltage needed to do so. Older processors were fed five volts; today's processors are fed as little as 1.3 volts and have to be monitored closely or the transistors won't switch like they are supposed to. All this gets tied to the software that tells transistors when to be on or off. Sloppy software can waste clock cycles and make a very fast piece of hardware seem slow; the flip side is that software purposely written for a specific device and debugged properly can be surprisingly fast... consider that car airbags are sensor- and software-driven.

Heat in electronics causes resistance; resistance can be overcome with more voltage, but at the cost of generating more heat. There is a finite point where the voltage or current will exceed the device's ability to dissipate the heat, and it will fail.
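The standard approximation behind that voltage point is that dynamic switching power goes roughly as P ~ C * V^2 * f (switched capacitance times voltage squared times clock frequency). A quick sketch of why dropping from 5 V to 1.3 V buys so much headroom (the capacitance and frequency values are placeholders, not figures for any real chip):

Code:
# Dynamic switching power: P ~ C * V^2 * f. Illustrative numbers only.
def dynamic_power(cap_farads: float, volts: float, freq_hz: float) -> float:
    return cap_farads * volts ** 2 * freq_hz

C = 1e-9                                  # placeholder switched capacitance
old = dynamic_power(C, 5.0, 100e6)        # 5 V part at 100 MHz
new = dynamic_power(C, 1.3, 100e6)        # 1.3 V part at the same clock

print(f"power ratio: {old / new:.1f}x")   # ~14.8x less switching power at 1.3 V
# The quadratic voltage term is why the saved heat can be spent on frequency.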

Just a few things to think on.

P.S. The transistor theory as written here has been greatly simplified and does not take into account new multi-state transistors.
September 6, 2006 10:44:10 PM

Quote:
Programming optimized multithreaded applications is very hard. You often end up with huge bottlenecks that tie up one CPU while the others wait for another thread to start. This programming is a true art form and not so much a science.


I think a lot of the difficulty is just that we have been thinking about coding in linear terms for so long; we just need to change the way we do things. Writing code in C++ or Java, creating threads is still not the best it can be. We need to write new languages and get used to coding asynchronously. I do agree that there will always be bottlenecks, but the more experience coders get and the better education becomes in this regard, the less often these will come up.
September 6, 2006 10:46:39 PM

Quote:
You don't understand Moore's Law (which isn't a law, it's a self-fulfilling prophecy!), which has NOTHING to do with speed. Anyone who says so is a fool and does not know their history.

Moore's Law in fact relates to the CHEAPEST PRODUCTION PROCESS (not to the general top-of-the-line process) and the fact that every 2 years (or so) the number of transistors per square inch doubles. This is still happening today, and Moore's Law is in no danger of falling over any time soon.


Much of what you say is true... but transistors per square inch cannot and will not continue to grow as long as silicon is still in place... at around 9-11 nm the electrons jump across the silicon onto the paths next to them.

I would also agree that it isn't truly a law, but that is what it's been called for years. I didn't invent the English language, I only try to use it (and often badly at that).

Quote:
Programming optimized multithreaded applications is very hard. You often end up with huge bottlenecks that tie up one CPU while the others wait for another thread to start. This programming is a true art form and not so much a science.


I think a lot of the difficulty is just that we have been thinking about coding in linear terms for so long; we just need to change the way we do things. Writing code in C++ or Java, creating threads is still not the best it can be. We need to write new languages and get used to coding asynchronously. I do agree that there will always be bottlenecks, but the more experience coders get and the better education becomes in this regard, the less often these will come up.

An art form is something you accomplish that has no right answers. Science is based on truth and evidence. The world of multithreaded applications interacting with other multithreaded applications that interact with even more multithreaded applications becomes too complex a problem to solve, especially when you add a big black box called the user to the equation.

Anyways, I'll let the discussion go on for now... I tend to want to talk about everything that is said...
Anonymous
September 6, 2006 10:54:45 PM

Oh jeez...
September 6, 2006 11:18:14 PM

I think there will come a time when there is no longer a need to push CPUs to the limit, even though there will be room to...

Honestly, how fast can a webpage load? How realistic can a 'game' get? The hardware might reach a point where there will no longer be a need to go so fast, simply because it is all as good as it can be. For example... a few years ago opening 'My Computer' would take a few seconds, but now on almost any decent system the process is instantaneous. No matter how much faster a CPU gets, processes can only be done so fast. It's a difficult point to explain... but basically, if the process is instantaneous, why does a CPU need to be faster?
September 6, 2006 11:19:14 PM

Quote:
An art form is something you accomplish that has no right answers. Science is based on truth and evidence. The world of multithreaded applications interacting with other multithreaded applications that interact with even more multithreaded applications becomes too complex a problem to solve, especially when you add a big black box called the user to the equation.

I definitely agree that coding is not a science per se. It is more of a technical trade. There are no 'correct' answers when coding anything beyond the simplest program. Again though, this is a little off topic.
September 6, 2006 11:23:17 PM

Quote:
I think there will come a time when there is no longer a need to push CPUs to the limit, even though there will be room to...

Honestly, how fast can a webpage load? How realistic can a 'game' get? The hardware might reach a point where there will no longer be a need to go so fast, simply because it is all as good as it can be. For example... a few years ago opening 'My Computer' would take a few seconds, but now on almost any decent system the process is instantaneous. No matter how much faster a CPU gets, processes can only be done so fast. It's a difficult point to explain... but basically, if the process is instantaneous, why does a CPU need to be faster?


I can agree with you that a simple user interface will only go so fast (one refresh of the monitor, to be precise) and then any additional speed will go unnoticed. I do not agree that a game will ever be (at least in the next 25 years) realistic to the point where it cannot be improved. Coding will be the limitation on this; that is the only reason hardware could get to a point where it is not that important (since it will be so fast).
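To put a number on "one refresh of the monitor": at a typical refresh rate, anything that completes within one frame cannot look any snappier, so there is a hard floor on what a faster CPU can add to a simple UI (the refresh rates below are just common examples):

Code:
for refresh_hz in (60, 75, 85):
    frame_ms = 1000.0 / refresh_hz
    print(f"{refresh_hz} Hz -> one refresh every {frame_ms:.1f} ms")
# Any UI action that finishes inside that window already looks instantaneous.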
September 7, 2006 2:14:50 AM

35-inch monitors that wrap around your desk and fill your eyes with wonderful, super-high resolutions at very fast response rates. No more need to MAXIMIZE windows to see the data; just put each window in the place that best meets your needs.

Now that's power:) 

I think I'll buy that 35-carat flawless diamond soon:) 
September 7, 2006 2:47:12 AM

Processors - multi-core... going to terahertz
Graphics - multi-GPU
Physics - multi-PPU

with those three - CPU, GPU, PPU - in one package

Memory - higher clock frequency DDR, QDR (quad data rate)?
Hard disk - optical interfaces to the motherboard
- terabytes, petabytes, exabytes
September 7, 2006 5:28:43 PM

No need for 35-inch monitors; I read a while ago that "near-eye displays" are/were equivalent to 61-inch screens, with no need for desk space (or a desk).
September 8, 2006 10:42:08 PM

And will there be any 3rd or 4th CPU producer besides AMD and Intel, like Cyrix in older times (which was bought by VIA, still producing but out of the race now)? If there were, it would be good for competition and for prices. For example, why doesn't Nvidia show itself in the CPU arena? Or why don't IBM, Sun, etc. develop an x86-architecture CPU that we can use in our PCs?
September 8, 2006 11:18:37 PM

Quote:
Did you know that 89.72% of all quoted statistics are false?



This is 100% true
September 8, 2006 11:34:11 PM

Quote:
Groveling_Wyrm wrote:
Did you know that 89.72% of all quoted statistics are false?



This is 100% true


Did you guys know that 57.03% of all quoted statistics are made up on the spot?
September 8, 2006 11:49:49 PM

Quote:
The future of computers...

Unfortunately, the world of physics is meeting the world of computers


I really did hesitate to post in this thread, because you seem to be so ignorant on the subject (don't take this offensively: ignorance is what we all have before knowledge).
Anyway, do yourself a favour: read some more threads, more books & more articles before you start a thread like this, as you don't even seem to grasp the present of computers.

Again, don't take my advice offensively; just trying to be helpful.


Cheers!
September 8, 2006 11:54:43 PM

Quote:
The future of computers...

Unfortunately, the world of physics is meeting the world of computers


I really did hesitated to post on this thread, because you seem to be so ignorant on the subject (don't take this offensively: Ignorance is what we all have, before knowledge).
Anyway, do yourself a favour: read some more threads, more books & more articles before you start a thread like this, as you don't even seem to grasp the present of computers.

Again, don't take my advice offensively; just trying to be helpful.


Cheers!

How can we really take you seriously? I mean, from your pic you seem to be no more than 7 years old or so. I'm sorry, but I won't be called ignorant by a small child.
September 9, 2006 2:59:31 AM

Claims of others' ignorance without adding to the conversation, making a statement, or showing a small amount of intelligence... hmmm... enough said.

This must mean the man has a crystal ball... I want one of those:) 

Any claim he could make would be as much speculation as the rest of ours, for that is what this thread is about... speculation! If he were so smart he wouldn't be on the Tom's Hardware forums but running some large company producing the products of the future... Instead he trolls others' posts... Have fun, guy!

We have MP3s on our computers today, but the quality sucks... I would like to see better and higher quality sound stored on computers... start adding entire video collections to your computers, then supply the entire house with this content on demand. Luckily this isn't the future but is happening now; you just need to go to a local consumer electronics store. Also, why is all audio (except for DualDiscs) in hi-fi and not surround sound...

In my 20 years of computer experience, the future of consumer computers has never been so uncertain.

---------

The one thing I am absolutely certain of is the business world. It has a LONG way to grow using the technologies of today. They just haven't had the vision or the know-how of what to do... they try their best, slopping through the chaos of the world, but slowly REAL progress is made.

Sure, business computing will continue to provide more data, faster, and more integrated. Consumer data will be shared with suppliers' data, and vice versa. This is happening every day at a very fast rate.

Many things we don't see. IBM, Cisco, Oracle, and Xerox all build products that none of us physically see in our personal lives. Well, that's not absolutely true; you can see the IBM logo on the cash registers of the stores we shop at every day. The real money is in how those cash registers work with the rest of the business. As inventory is sold, supplies are sent out automatically. Predictions are made, and more inventory ordered... The back ends talk to the suppliers, who use other software (probably IBM's, SAP, etc.) to make their products and to talk to their resources. It goes all the way down to the people standing in the animal auction houses buying sheep for the wool for the shirts. And of course every company needs to have its own customized software to niche out the market. Soon the replacement items you are GOING to buy will be shipped before you enter the store.

Xerox we see at our companies, but surprise: most of the large Xerox copiers you use every day at work aren't even owned by the company you work for. Xerox leases them out, charges a monthly fee, then charges a per-page fee. Xerox handles the maintenance and supplies the paper and toner.

Cisco builds most of our networks. We use their machines every time we call a bank and get stuck in the big phone systems (yes, they make those phone systems). We don't see Cisco when we connect to Tom's Hardware, but it's there, carrying our stupid claims that we know what's up and everyone else are idiots.

These systems grow larger every day at a VERY fast rate. I built systems for small companies... I built systems that would pump sewage from tanks and then report to the company how much sewage is coming in and how much is leaving. Pumps can be controlled and adjusted based on everything from when it rains to the 8-in-the-morning dump you just took before reading this.

I built systems for Xerox that would store thousands of documents detailing how each customer would be charged for their service and copiers. Before my software, there was a floor dedicated to these documents and a staff assigned to maintaining them... you would think this happened decades ago, but it didn't...

Most applications I have written are things that should have been done 20 years ago, yet only now does someone say... hmmm, we shouldn't have 60 people maintaining this data when we could put it on computers and have access to it in mere moments...

Most computers are used for this, and these applications are one of the biggest reasons our country produces items so efficiently. IBM doesn't need to join the small microchip industry when there are buckets of money to be made simply in making business software.

Businessmen seek out the big gains first, then they work on the smaller ones... They build big, then shrink... Layoffs, followed by hires. New computer systems that end up profitable, and badly planned and poorly executed systems that drag on for years. Domino's has been working for 10 years on a point-of-sale system that will meet all their chain stores' needs, and they have yet to come up with one:( 

But life in consumer electronics is so much more fun:)  And yet VERY unpredictable... Do we really need HD DVD/Blu-ray when most people don't even notice the difference between VHS and DVD?

Mike

PS, I have fun talking about computers, so shut up or say something... Doesn't matter if it's dumb or not... Isaac wrote about robots... yet I haven't seen a robot yet... was he stupid? If so, then he is not only stupid but rich.
September 9, 2006 7:13:47 AM

On a very practical note, read Thomas Friedman's "The World Is Flat." Half the book is about how current computing is affecting our world.