Solved

Intel predicted 10GHz chips by 2011

saaiello
March 13, 2010 5:01:27 AM

http://www.geek.com/articles/chips/intel-predicts-10ghz...



I know it's a really old article from 2000, but whatever happened to 10GHz chips? Are they still working on this? Will it happen anytime soon? Now, I know Intel went the multi-core approach instead, but is this still in the future? Is a 10GHz chip just not necessary?


JUST SO EVERYONE KNOWS I HIT BEST ANSWER BY ACCIDENT MEANT TO HIT REPLY.


So I still want to hear everyone's opinion on this. Thanks.
jimmysmitty
March 13, 2010 5:18:18 AM

That was from back when they used an architecture called "NetBurst". It was great for high clock speeds because it had an extremely long pipeline, but the overall IPC dropped a lot.

That's why back in the Pentium 4 days they had CPUs running at a stock 3.8GHz with the stock fan keeping them happy, and that could overclock pretty high for the time and the process used (130nm and 90nm). Compare that to now, where 3.8-4GHz is a typical overclock on 45/32nm.

There is even a record-holding Pentium 4 out there clocked at 8GHz.

But since the IPC drop made it a worse performer, Intel dropped NetBurst, took what they learned, and applied it to the Pentium III Coppermine-based design, which had better IPC but trouble reaching clocks above 1GHz. They revamped it, and that same base architecture is what the Core 2 and Core i series run on. It also traces its lineage back to the original Pentium architecture.

So in the end, we probably won't have a 10GHz CPU. It's a nice goal, but I doubt it will come anytime soon, and certainly not by 2011.
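The tradeoff described above boils down to throughput being roughly clock speed times IPC. A minimal sketch with made-up numbers (these are illustrative assumptions, not measured figures for any real chip):

```python
# Rough sketch of the clock-speed vs. IPC tradeoff described above.
# All numbers here are illustrative assumptions, not measured values.

def throughput_gips(clock_ghz: float, ipc: float) -> float:
    """Billions of instructions retired per second: clock * IPC."""
    return clock_ghz * ipc

# Long-pipeline "NetBurst-style" design: high clock, low IPC.
netburst = throughput_gips(clock_ghz=3.8, ipc=0.6)

# Shorter-pipeline "Core-style" design: lower clock, higher IPC.
core = throughput_gips(clock_ghz=2.6, ipc=1.5)

print(f"NetBurst-style: {netburst:.2f} GIPS")  # 2.28
print(f"Core-style:     {core:.2f} GIPS")      # 3.90
```

With these assumed numbers the lower-clocked design retires more instructions per second, which is exactly why chasing raw clock speed stopped making sense.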
saaiello
March 13, 2010 5:21:29 AM

Yeah, I already knew that about the Pentium 4s and all that, but I forget where I read that they were experimenting with something other than silicon that lets them reach higher clock speeds at lower voltages. I read this maybe a year ago.


Edit: found what it was. It was called gallium arsenide.
ct1615
March 13, 2010 5:24:59 AM

Intel predicts the death of the GPU every year too... can't wait to play Crysis 2 on the GMA 5000!!!
March 13, 2010 5:35:03 AM

As one of the comments put it, actual computing power today is much higher at the same clock speed than on architectures from a few years back. Intel probably thought they could keep going with NetBurst, but they hit thermal limits and other problems.

What I love about that article is some of the comment predictions:

What I'd like to know is if, by 2011, it will be necessary to have expansion cards (i.e. graphics cards, sound cards, or whatever) at all. The future computer I envision (and maybe it's just a coolaid trip) would be a box, with ports, that could be configured to do any number of functions, with the functionality added just-in-time… I'd like to see systems where _all_ resources could be dedicated to a specific task if the user so desired. The current model of dedicated components isn't terribly efficient. - by TropicalPunch
---

Hopefully, by 2011, we won't even NEED ports on our machines… everything should be wireless. Wireless keyboards, mice, printers are all common today; imagine 2011. Wireless video and chips that use so little power that you can power a whole PC with 2 AAA batteries for months.
Also with technology like LEP at our fingertips, imagine printing a display on a sheet of (polymer)paper and circuits on the back side of that sheet. You'll have an interactive sheet of paper that can be manipulated with input or just have it hold a lot of information. Either way, I doubt we'll be using clunky beige boxes on our desks for things like email and web browsing. - by Big Picture
[at least the boxes aren't beige anymore]
---

I think that thinking about where the desktop pc is going to be in ten years is a little short-sighted. We're already moving away from the desktop with all manner of specialty appliances that focus on one area of computing. From game consoles to internet appliances to cell phones with email, we're getting away from the all-in-one computing unit. I, for one, doubt that in 10 years computing will still revolve around a single Central Processing Unit and function as we now know.
Think how computers have changed in the last ten years. The internet totally came out of left field for most, yet is probably one of the biggest phenomena of the century.

I think that in ten years, no one will even think to care. - by ee
---

[love this one]
doesn't matter the speed of intel's chip in 2011 because motorola and ibm will already have chips out by then that are more effective, use energy better, and run applications ready for the new iMac's that are introduced at the 2011 MacWorld. They'll run at somewhere near 7GHz and still be faster than an 11GHz or even 128GHz sh*t that intel puts out.. - by StickWithApple
---

Back in grad school I worked on computing with light and transistors that had ten states (0-9 or base 10) rather than 2 (0 and 1 or binary). Anybody who doesn't think that these types of technology won't be commercially available by 2011 is kidding themselves. In addition, new OS capability to scale up and out will radically change how we compute. Maybe clock speeds will only be 10 GHz by then but dozens and dozens of processors may coexist on a single chip that process data in base 10 (or hex) instead of base 2, effectively performing hugely more complex computations with fewer transistors and (relatively) lower clock speeds than would currently be needed. I have seen the future and it ROCKS!!…. (Oh, Windows 2010 is still a slug…. Peace. - by Sinful
---
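[The digit-count side of that one is easy to check: representing n in base b takes about floor(log_b n) + 1 digit positions, so base 10 needs far fewer positions than binary, at the cost of each position having to distinguish ten states instead of two. A quick sketch, purely illustrative:]

```python
# Digit positions needed to represent n in a given base,
# counted exactly with integer division (avoids float-log pitfalls).

def digits_needed(n: int, base: int) -> int:
    digits = 0
    while n:
        n //= base
        digits += 1
    return max(digits, 1)  # zero still takes one digit

n = 1_000_000
print(digits_needed(n, 2))   # 20 binary digits
print(digits_needed(n, 10))  # 7 decimal digits
```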

The point is, within ten years, we won't be using silicon-based computers. They'll be made obsolete by DNA/protein type bio-computers or maybe molecular computers.
- by MonkeyMan
---

What Intel seems to be saying is that they plan to still produce silicon chips a few years after other technologies have taken over.- by MonkeyMan
---

It sounds like Intel is saying to people that a CPU with the power of a 10GHz Pentium 3 will be out in 10 years… Help me out.
- by Dooda Shaggy
---

Lastly, the processor really isn't the bottleneck…it's the dang OS. Well, that and the fact that Intel's architecture is from the 1960s (SISC is SOOO archaic!). - by Perplex
---

One word my friends: distributed clock architecture. Instead of having one central clock, which limits the speed of all the components, you have clocks throughout the chip. Much as I detest the PowerPC architecture, IBM has got a PowerPC chip running at 4.5 Ghz equivalent in the lab with this new technique.
- by Desert Eagle
---

Here is my prediction by 2011 technology that will be common place.
Clock speed at 25 GHZ:
Design rule down to .018 microns (only 18 nanometers)
more decentralized connections leads to shorter and hence even more transistors. Resulting in perhaps 1000 ALUs with system memory on chip. Result 25 trillion instructions per second complete computer on a chip around a square inch for less than $500. - by Peter
---

All we'll care about at that point will be packaging, because all system functions will be all but invisible. Video Card? I should hope that the visual interface is integrated into the microprocessor itself for bus speed capabilities. And yes, Apple will still reign supreme innovator in the ease of use and aesthetic packaging department! Idjut! - by MacJedi
---

My prediction: in 10 years mainline systems will be made from 10's – 1,000's of inexpensive (small die size, $1 chips) that have 1GB DRAM, 4 CPU's, and 4-16 10Gbs serial I/O's, integrated on a single die, running at whatever GHz works with respect to power constraints. One such chip will be sufficient to support (built into) a screen running 3D virtual reality Java apps. But a lot of them will be needed to support manipulation of gigapixel video derived from multi-headed cameras consisting of 1,000 $1 CMOS imager chips. - by outside-the-box
---

My ideal view of a future computer would be massive multiprocessing – say, multiple 10-GHz chips each with their own memory and bus, working in parallel or each of them running different programs when I am multitasking.
---

We all better start worrying about secondary
RF radiation from 500 MHz and up.
---

I seem to recall that the technical journals have been predicting “absolute no-way-around-it” physical limits on the performance of electronics (and specifically, transistors) ever since I started reading them in the 1970's. Strangely enough, these same journals also carry articles about clever new devices that blow the predicted limits away.
No matter how many times this cycle is repeated, there seems to be no end to folks who are willing to start it over again with another set of “physical limits”. Perhaps someday they will be right, but by then their batting average will be about 1e-6.
---

And doesn't pay attention to news either. Check out this link, to an article from 2001:
It says that 10GHz chips will be out in 2005.
The saddest thing is that the article was also part of the geek news.
GEEZE GUYS PAY ATTENTION! - by Noob
---

saaiello
March 13, 2010 5:39:04 AM

Yeah, I was getting a good laugh at them myself. I like the one that basically said the computer will be invisible. :lol:

Best solution

jimmysmitty
March 13, 2010 5:46:58 AM

saaiello said:
Yeah, I already knew that about the Pentium 4s and all that, but I forget where I read that they were experimenting with something other than silicon that lets them reach higher clock speeds at lower voltages. I read this maybe a year ago.

Edit: found what it was. It was called gallium arsenide.


Not sure about that. I know Intel is working with fibre solutions.

ct1615 said:
intel predicts the death of the GPU every year also....can't wait to play Crysis 2 on the GMA 5000!!!


Yeah, but they might be right. If they can get Larrabee to work, it might cause that. Even Gabe Newell of Valve is excited about Larrabee.

saaiello said:
LOL that does sound good. I was bored and saw some old article about this and was just wondering if there was any new thoughts on this.

gallium arsenide
http://www.logitech.uk.com/gallium_arsenide.asp


Hah. Use poison. I can see it now:

New toys from China with chips made of gallium arsenide. Would be fun.
saaiello
March 13, 2010 5:51:06 AM

Best answer selected by SAAIELLO.
saaiello
March 13, 2010 5:51:51 AM

Oops, meant to hit reply but hit best answer, lol. Well Jimmy, I guess you're the best answer.
saaiello
March 13, 2010 5:54:20 AM

jimmysmitty said:
Not sure about that. I know Intel is working with fibre solutions.

Yeah, but they might be right. If they can get Larrabee to work, it might cause that. Even Gabe Newell of Valve is excited about Larrabee.

Hah. Use poison. I can see it now:

New toys from China with chips made of gallium arsenide. Would be fun.



I thought Larrabee was dead or put on hold indefinitely.
March 13, 2010 10:54:19 AM

"Design rule down to .018 microns (only 18 nanometers)"
That's the only thing that came close to being right.

And gallium arsenide has been around for ages.
March 13, 2010 11:15:02 AM

Wow, those 10-year-old comments are SOOOO fun to read. People bowing before their 400MHz P3s... we've even surpassed the Core 2 age now.