Has technology come to a standstill?

January 7, 2010 3:48:28 PM

I am talking specifically about CPUs. When will we break the THz barrier? Will we ever mature past GHz? We used to make chips faster; now we make them smarter. So will we just continue to make chips smarter, or will we make them faster too? The fastest anyone runs their CPU right now is around 4.8 GHz; when will we go past 5 GHz and break the THz barrier?

Moreover, will we need that speed? And for what?


January 7, 2010 4:00:01 PM

Software needs to catch up before we really need to go much further.
January 7, 2010 4:44:00 PM

Yeah, so when software catches up, will it need speed or smarts?
January 7, 2010 4:48:07 PM

Upendra09 said:
Yeah, so when software catches up, will it need speed or smarts?



It seems like all the gaming software companies have given up on PCs, considering how much better our hardware is than consoles.


Why can't Bayonetta be written for the PC? It looks like it could be very good.


PCs are in a stalemate, Upy. We've got all this tech, never had processors so fast, the ATI 5800 is a new milestone, we're still waiting for the Nvidia GT300 series, and games on the PC are sparse. All the major game manufacturers are putting PC games last, it seems. Only a few now are dedicated PC developers. Maxis for one - can anyone else think of any?
January 7, 2010 4:49:04 PM

Upendra09 said:
Yeah, so when software catches up, will it need speed or smarts?



Storage speed. HDDs are too slow.
January 7, 2010 4:51:55 PM

Until we can mass-produce chips that don't produce so much heat at the voltage required in the 3+ GHz range, we will only be making smarter chips. Although there are chips out there (made by IBM) that can do 350 GHz (at room temperature) and have been overclocked to 500 GHz (thanks to near-absolute-zero temperatures).

http://www.tomshardware.com/news/ibm-500ghz-chip,2992.h...

(an almost four-year-old article)
January 7, 2010 5:18:17 PM

pat said:
Storage speed. HDDs are too slow.




Good point.


I/O to the hard disk is still the biggest bottleneck hitting PCs now.

SSD drives are far too expensive and, sadly for the engineering companies who repair machines, far too reliable, but far too small at the moment.


Someone should make a hybrid drive using the SSD as a type of cache.

A 64 GB cache would rocket if used right on a 1 TB drive with 64 MB of internal memory. Maybe that's the answer... Windows could be set up to use the internal SSD and spread data, files and programs over the magnetic disk...

Oh well, back to reading posts somewhere completely different on how good AMD is.
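The hybrid-drive idea above can be sketched as a small read cache sitting in front of a slow disk. A minimal Python sketch of the concept (the class, the sizes, and the LRU eviction policy are illustrative assumptions, not how any real drive firmware works):

```python
from collections import OrderedDict

# Toy model of the hybrid-drive idea: a small, fast SSD tier caching blocks
# from a large, slow magnetic disk. On a miss, the block is fetched from
# "disk" and promoted, evicting the least recently used block if needed.
class HybridDrive:
    def __init__(self, disk, cache_blocks):
        self.disk = disk                    # block -> data (the big slow tier)
        self.cache = OrderedDict()          # the small fast tier, in LRU order
        self.cache_blocks = cache_blocks
        self.hits = self.misses = 0

    def read(self, block):
        if block in self.cache:
            self.hits += 1
            self.cache.move_to_end(block)   # mark as recently used
            return self.cache[block]
        self.misses += 1
        data = self.disk[block]             # slow path: magnetic disk
        self.cache[block] = data            # promote into the SSD tier
        if len(self.cache) > self.cache_blocks:
            self.cache.popitem(last=False)  # evict least recently used
        return data

disk = {n: f"data-{n}" for n in range(1000)}
drive = HybridDrive(disk, cache_blocks=64)
for _ in range(3):                          # a hot working set, read repeatedly
    for block in range(32):
        drive.read(block)
```

Here the hot 32-block working set fits inside the 64-block cache, so after the first pass every read is a hit, which is exactly the effect the post is hoping for.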
January 7, 2010 6:55:09 PM

Even with solid state, hard drive speeds are too big a bottleneck, and frankly, Windows isn't a good multi-CPU environment (I partly blame DLLs; static libraries would solve a lot of synchronization issues within Windows...).
January 7, 2010 6:59:49 PM


warmon6 said:
Until we can mass-produce chips that don't produce so much heat at the voltage required in the 3+ GHz range, we will only be making smarter chips. Although there are chips out there (made by IBM) that can do 350 GHz (at room temperature) and have been overclocked to 500 GHz (thanks to near-absolute-zero temperatures).

http://www.tomshardware.com/news/ibm-500ghz-chip,2992.h...

(an almost four-year-old article)


Alright, that is what I was looking for: a good reason why we aren't going faster. So do we need a material less dense or more dense than silicon?

Hellboy said:
Good point.


I/O to the hard disk is still the biggest bottleneck hitting PCs now.

SSD drives are far too expensive and, sadly for the engineering companies who repair machines, far too reliable, but far too small at the moment.


Someone should make a hybrid drive using the SSD as a type of cache.

A 64 GB cache would rocket if used right on a 1 TB drive with 64 MB of internal memory. Maybe that's the answer... Windows could be set up to use the internal SSD and spread data, files and programs over the magnetic disk...

Oh well, back to reading posts somewhere completely different on how good AMD is.


Why don't they make the link between those two things smarter?

And no, I am not an AMD fanboy; just wait and see, I will recommend an Intel processor as soon as it makes sense.

What is HKMG (High-K Metal Gate), and why is it so controversial?
January 7, 2010 7:03:40 PM

gamerk316 said:
Even with solid state, hard drive speeds are too big a bottleneck, and frankly, Windows isn't a good multi-CPU environment (I partly blame DLLs; static libraries would solve a lot of synchronization issues within Windows...).


Why?
January 7, 2010 7:12:12 PM

Upendra09 said:
Alright, that is what I was looking for: a good reason why we aren't going faster. So do we need a material less dense or more dense than silicon?


Well, that I cannot say. I don't have a good knowledge of which materials won't build up or produce heat so quickly.

Although if I had to guess, I would say maybe something denser but thinner would do, but as I said, I don't know this well, so don't take my word for what would be a better material than silicon.
January 7, 2010 7:33:38 PM

Upendra: HKMG stands for high-k metal gate. It reduces leakage in transistors and can possibly boost performance per transistor considerably (when it's designed well).

Also, there are experimental diamond chips (instead of silicon) that can be clocked at 80 GHz with 300 micrometre transistors, the reason being that they can take extreme amounts of heat.

IMO the future of computers won't be the traditional design. It will either be:
Stacked carbon nanotube chips
Biological computers
Quantum computers - by far the most potential
January 7, 2010 8:03:40 PM

Yeah, I have heard of biological CPUs made out of our DNA; it was on TH a few months ago.

What do people mean when they say gate-last and gate-first? JDJ always talks about it. And what is the "gate", and what does it do?

We don't have a lot of diamond though, do we? And if we mass-produce it, the cost of diamond will be less than dirt.

Carbon nanotubes are still being developed and look promising.

What do you mean by quantum?

Yeah, I know Google is my best friend, but here I can get answers that are somewhat simplified instead of reading PDFs full of jargon that I don't understand, and besides, I don't know a lot of physics yet.
January 7, 2010 8:40:40 PM

gamerk316 said:
Even with solid state, Hard Drive speeds are too big a bottleneck, and frankly, Windows isn't a good multi-CPU environment (I partly blame DLL's; static libaries would solve a lot of syncronization issues within windows...).

Then don't use an old OS; use W7, not XP, as the global changes are in W7, not XP.
January 7, 2010 9:17:12 PM

Upendra09 said:
Yeah, I have heard of biological CPUs made out of our DNA; it was on TH a few months ago.

What do people mean when they say gate-last and gate-first? JDJ always talks about it. And what is the "gate", and what does it do?

We don't have a lot of diamond though, do we? And if we mass-produce it, the cost of diamond will be less than dirt.

Carbon nanotubes are still being developed and look promising.

What do you mean by quantum?

Yeah, I know Google is my best friend, but here I can get answers that are somewhat simplified instead of reading PDFs full of jargon that I don't understand, and besides, I don't know a lot of physics yet.


Quantum computers are very strange. Instead of each bit representing either a 1 or a 0 at any time, a quantum bit can be in a superposition of both at once, so two qubits can represent 00, 01, 10 and 11 at any given time. It's strange, but it means that the computing power increases exponentially. Currently the largest quantum computer is only 16 qubits, but it is still pretty powerful.

Also, there is a company called Gemesis that has a machine that uses heat and pressure to make diamonds, but it's currently expensive.
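The exponential growth described above can be made concrete: an n-qubit register takes 2^n complex amplitudes to describe, so every extra qubit doubles the state space. A tiny Python illustration (pure arithmetic bookkeeping, not a quantum simulator):

```python
# Each classical bit adds one bit of state; each qubit doubles the number of
# amplitudes needed to describe the register, so the state space grows as 2**n.
def amplitudes_needed(n_qubits):
    return 2 ** n_qubits

# The 16-qubit machine mentioned above already spans 65,536 basis states.
sizes = {n: amplitudes_needed(n) for n in (1, 2, 16)}
```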
Anonymous
January 7, 2010 9:38:39 PM

Technology will never stand still, only brains.
January 7, 2010 9:41:45 PM

Quote:
Technology will never stand still, only brains.


And trees. They like to stand still.
January 7, 2010 9:52:13 PM

My vote is for brake technology as most likely to be at a stand-still :p 

January 7, 2010 10:07:26 PM

Firstly we need to drop x86, and then we need to drop the transistor.

Technological advancement for CPUs comes down to a few factors:

1) Money
2) Competition (poor competition stagnates innovation)
3) Demand for performance
4) Power consumption
5) Physical size limitations

If we ignore the first 3, since they are obvious, the latter two are what have driven us to where we are today. If vacuum tubes did not use so much power, and weren't as big, they could still be useful for computation. But because demand for performance was growing - and making computers the size of cities but slower than an Atom and more power-hungry than the whole TOP500 combined wasn't practical - there needed to be a technological shift. Hence the discrete transistor. But you can't run Crysis on a computer that requires billions of discrete transistors because power consumption and physical size would prevent it. So the IC was born.

But since then we've only had relatively minor innovations. We've gone from microns to nanometres but we're still using transistors to run our computers as we have been for 40 years. At some point we're going to need a technology shift, and it can't come soon enough.
January 7, 2010 10:12:58 PM

Toots and the Maytals did a song with Willie Nelson called:

Still Is Still Moving to Me

but Toots best will always be

Get Up, Stand Up

Now I am going to be singing that for the rest of the day.

Somehow there could be great relevance, or even revelance, in these words; and I will let you know if I find it.

But on topic: we could be waiting for a cold snap. Or did we just have one of those?

:D 
January 7, 2010 10:18:10 PM

randomizer said:
Firstly we need to drop x86, and then we need to drop the transistor.

Technological advancement for CPUs comes down to a few factors:

1) Money
2) Competition (poor competition stagnates innovation)
3) Demand for performance
4) Power consumption
5) Physical size limitations

If we ignore the first 3, since they are obvious, the latter two are what have driven us to where we are today. If vacuum tubes did not use so much power, and weren't as big, they could still be useful for computation. But because demand for performance was growing - and making computers the size of cities but slower than an Atom and more power-hungry than the whole TOP500 combined wasn't practical - there needed to be a technological shift. Hence the discrete transistor. But you can't run Crysis on a computer that requires billions of discrete transistors because power consumption and physical size would prevent it. So the IC was born.

But since then we've only had relatively minor innovations. We've gone from microns to nanometres but we're still using transistors to run our computers as we have been for 40 years. At some point we're going to need a technology shift, and it can't come soon enough.


I think getting rid of x86 would be great. Then we could also get rid of you-know-who, and the stifling of innovation such as is being discussed in this topic.

Nice post Random
.
January 7, 2010 10:27:53 PM

sighQ2 said:
I think getting rid of x86 would be great. Then we could also get rid of you know who, and the stifling of innovation; such as is being discussed in this topic.

Nice post Random
.

I don't believe Intel is stifling innovation in the grand scheme of things. x86 is stuck here because it is so widespread, and dumping backwards compatibility is a big no-no these days. Switching architectures is going to be a painful process. If anyone is stifling innovation in the IC arena, it's the market, not the manufacturers. Intel is stepping on other x86 competitors, but they themselves are not stifling innovation of other architectures. Look at Itanium. It was an Intel brainchild but simply didn't take off because the market rejected the product.

Furthermore, Intel certainly aren't stifling innovation of completely new technologies. The manufacturers know that the silicon-based IC can't be relied on forever, so they are working on R&D into other areas as are universities. They want to be the first to the market with a new technology that will be accepted so that they can dominate the market, like x86 does at the moment. But I still think that the market is going to reject anything new until it is absolutely necessary to change, and therefore I firmly believe that the market is stifling innovation more than anyone else.
January 7, 2010 10:31:01 PM

yannifb said:
Also, there is a company called Gemesis that has a machine that uses heat and pressure to make diamonds, but it's currently expensive.




The production of synthetic diamond has been going on for quite some time, and by several companies/entities.

http://en.wikipedia.org/wiki/Synthetic_diamond

Unfortunately, production is far too cost-prohibitive for any kind of consumer-level computer/tech products to appear any time soon. All we can do is hope for more companies to become interested in advancing the technology.
January 7, 2010 10:36:38 PM

I can't help but think the next great thing won't be from anybody we would expect. It will come from some company or group of people who think outside the box and don't necessarily pay mind to the logical progression of tech.
January 7, 2010 10:36:50 PM

mtyermom said:
The production of synthetic diamond has been going on for quite some time, and by several companies/entities.

http://en.wikipedia.org/wiki/Synthetic_diamond

Unfortunately, production is far too cost-prohibitive for any kind of consumer-level computer/tech products to appear any time soon. All we can do is hope for more companies to become interested in advancing the technology.


I know, but the whole reason Gemesis and another company, Apollo, are so important is that they built their diamond-making machines to research future CPU tech.

Saw it on the science channel, lol :D 
January 7, 2010 11:21:20 PM

yannifb said:
I know, but the whole reason Gemesis and another company, Apollo, are so important is that they built their diamond-making machines to research future CPU tech.

Saw it on the science channel, lol :D 


Sad I missed that, and I watch a lot of Science/TLC. Here's to hoping that these companies have 'holy grail' type breakthroughs sooner rather than later ;) 
January 8, 2010 12:48:47 AM

Quote:
Technology will never stand still, only brains.


Like yours? Why aren't you using jennyh any more?

fazers_on_stun said:
My vote is for brake technology as most likely to be at a stand-still :p 


I nominate you as the funniest guy in this thread and the CPU Forums


randomizer said:
Firstly we need to drop x86, and then we need to drop the transistor.

Technological advancement for CPUs comes down to a few factors:

1) Money
2) Competition (poor competition stagnates innovation)
3) Demand for performance
4) Power consumption
5) Physical size limitations

If we ignore the first 3, since they are obvious, the latter two are what have driven us to where we are today. If vacuum tubes did not use so much power, and weren't as big, they could still be useful for computation. But because demand for performance was growing - and making computers the size of cities but slower than an Atom and more power-hungry than the whole TOP500 combined wasn't practical - there needed to be a technological shift. Hence the discrete transistor. But you can't run Crysis on a computer that requires billions of discrete transistors because power consumption and physical size would prevent it. So the IC was born.

But since then we've only had relatively minor innovations. We've gone from microns to nanometres but we're still using transistors to run our computers as we have been for 40 years. At some point we're going to need a technology shift, and it can't come soon enough.


What is an IC? Integrated Chipset? What's wrong with transistors? What else can we use?

What is the point of going smaller and smaller? It creates less heat and uses less power, but this seems counterintuitive: wouldn't something that small and that powerful create more heat, and possibly self-destruct by melting itself? I mean, all that power is so concentrated, and the heat is all in one place.
January 8, 2010 1:27:13 AM

IC = Integrated Circuit. Transistors have nothing wrong with them except that they are ancient technology. What else can we use? You tell me; that's what the researchers are trying to find out. I'm sure if you went back to the 30s and 40s they'd have asked the same questions: "What's wrong with vacuum tubes? What else can we use?"

The transistors in an IC are not powerful on their own; we just have billions of them. And yes, that does produce ridiculous amounts of heat. Why do you think they sell 1.5 kg heat sinks to keep them cool? :D Pull off your heat sink and watch the temp go up about 50°C and the system shut down in a second or two.
January 8, 2010 2:31:35 AM

Hellboy said:
It seems like all the gaming software companies have given up on PCs, considering how much better our hardware is than consoles.


Why can't Bayonetta be written for the PC? It looks like it could be very good.


PCs are in a stalemate, Upy. We've got all this tech, never had processors so fast, the ATI 5800 is a new milestone, we're still waiting for the Nvidia GT300 series, and games on the PC are sparse. All the major game manufacturers are putting PC games last, it seems. Only a few now are dedicated PC developers. Maxis for one - can anyone else think of any?


In my eyes Valve is a dedicated PC developer; yes, I know they make stuff for consoles, but we always come first, and we get many things free that they don't.
January 8, 2010 2:36:13 AM

Quote:
What is the point of going smaller and smaller?


The smaller the chip is, the less distance the electricity has to travel, so less voltage needs to be applied to get the end result (longer wire = more resistance). The less the voltage, the faster they can signal the 0s and 1s. Effectively, 0.5 volts drops to zero volts a lot faster than 1 volt: a signal actually takes a split second to die away, and the less voltage, the less potential there is and the faster it drops. I think we need to explore fiber optics of some sort for the internal workings of a computer. Light is much easier to pulse at a stupidly fast frequency, with hardly any loss, and it doesn't create much heat... not only that, it's pretty much endless bandwidth.

Excuse my "quotes"; I'm new to this site...
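The "lower swing settles faster" point above can be illustrated with a first-order RC discharge model, where the time for a node to decay from V0 to a fixed threshold Vth is t = RC·ln(V0/Vth). A rough Python sketch (the resistance, capacitance, and threshold values are made-up illustrative numbers, not measurements of any real chip):

```python
import math

# First-order RC model: a node discharging through resistance R into
# capacitance C falls from V0 toward 0 as V(t) = V0 * exp(-t / (R * C)).
# Solving for the time to reach a fixed threshold Vth gives t = R*C*ln(V0/Vth).
def time_to_threshold(v0, vth, r=1e3, c=1e-12):
    return r * c * math.log(v0 / vth)

t_half_volt = time_to_threshold(0.5, 0.1)   # 0.5 V swing down to 0.1 V
t_one_volt = time_to_threshold(1.0, 0.1)    # 1.0 V swing down to 0.1 V
```

In this toy model the half-volt swing crosses the threshold sooner than the one-volt swing, matching the intuition in the post; real transistor delay involves much more than a single RC constant.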
January 8, 2010 11:08:12 AM

JAYDEEJOHN said:
Then dont use old OS, use W7, not XP, as the global changes are in W7, not XP


I'm simply making the argument that no matter how you code Windows, since only one copy of a DLL can exist at any point within the system, and two programs can't both access it at the same time without the possibility of corruption, there will be a performance bottleneck whenever two programs need access to said DLL. If you return to static libraries (i.e. every program has its own unique copy of the library loaded into memory), that bottleneck goes away.

Besides, DLLs were simply a memory hack to get around the old 16-bit memory limit; having only global DLLs saved a ton of space...
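The contention argument can be sketched as a toy model: one shared library instance that serializes callers through a lock, versus per-program static copies that share nothing. A Python sketch (the class names are invented for illustration; real Windows DLL loading and data sharing are considerably more involved than this):

```python
import threading

# Toy model of the argument above: a single shared library copy must serialize
# concurrent callers through one lock, while statically linked per-program
# copies share no state and never contend with each other.
class SharedDll:
    def __init__(self):
        self._lock = threading.Lock()
        self.calls = 0

    def invoke(self):
        with self._lock:        # only one "program" inside the DLL at a time
            self.calls += 1

class StaticCopy:
    def __init__(self):
        self.calls = 0

    def invoke(self):           # private copy: no lock, no contention
        self.calls += 1

def run(programs, calls_per_program=1000):
    threads = [
        threading.Thread(
            target=lambda lib=lib: [lib.invoke() for _ in range(calls_per_program)]
        )
        for lib in programs
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

shared = SharedDll()
run([shared] * 4)               # four programs all sharing one DLL instance
statics = [StaticCopy() for _ in range(4)]
run(statics)                    # four programs, each with its own copy
```

The shared instance needs the lock to stay consistent (all four threads funnel through it), while each static copy is touched by only one thread, which is the bottleneck-versus-memory trade-off the post describes.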
January 8, 2010 1:02:14 PM

protokiller said:
In my eyes Valve is a dedicated PC developer; yes, I know they make stuff for consoles, but we always come first, and we get many things free that they don't.



Valve is not a dedicated PC club - yes, they make money selling PC games online.

And they are more PC than Xbox.

But they still do Xbox games :s


Creative Assembly does PC stuff only, but I can't seem to rack my brain into thinking of who else is PC-only.
January 8, 2010 1:13:58 PM

We will always have transistors as long as we are working with any kind of solid state electronics.
January 8, 2010 6:17:50 PM

randomizer said:
I don't believe Intel is stifling innovation in the grand scheme of things. x86 is stuck here because it is so widespread, and dumping backwards compatibility is a big no-no these days. Switching architectures is going to be a painful process. If anyone is stifling innovation in the IC arena, it's the market, not the manufacturers. Intel is stepping on other x86 competitors, but they themselves are not stifling innovation of other architectures. Look at Itanium. It was an Intel brainchild but simply didn't take off because the market rejected the product.

Furthermore, Intel certainly aren't stifling innovation of completely new technologies. The manufacturers know that the silicon-based IC can't be relied on forever, so they are working on R&D into other areas as are universities. They want to be the first to the market with a new technology that will be accepted so that they can dominate the market, like x86 does at the moment. But I still think that the market is going to reject anything new until it is absolutely necessary to change, and therefore I firmly believe that the market is stifling innovation more than anyone else.


Actually, one could make a logical case that Intel, by trying to do away with x86 in favor of Itanium, was pushing progress, whereas AMD, by introducing x64 and hence prolonging x86, stifled it. And ditto with Larrabee vs. more-of-the-same GPUs from AMD...

IMO Moore's law will end in about 6 years, after the 15 nm node, so some alternative will have to be found or all those predictions of AI computers smarter than Einstein by 2050 will not come to pass. Which may be a good thing :p.
January 8, 2010 6:42:05 PM

jsc said:
We will always have transistors as long as we are working with any kind of solid state electronics.


True - I think Random was maybe referring to the old binary transistor logic model that has been around since 1970 or earlier, at least for ICs. Somebody once mathematically proved that the most efficient logic radix would be base 3, or ternary logic, since it is closest to the number e, about 2.71828, which is the basis of natural logarithms. However, binary has a number of advantages: it's easy to implement, it's more reliable due to better noise immunity (a noise spike would have to rise to or exceed half the supply voltage on the Vin/Vout switching curve, or more if hysteresis is used), and there are huge libraries of binary logic available.

There are other types of processing circuits, of course - analog computers, which can solve differential equations directly, and which gave birth to such oddities as neural networks that model a nervous system by storing information in the form of weighted interconnections. However, none of these has ever amounted to any significance compared to the binary digital logic circuitry that pervasively surrounds us.
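The base-3 claim can be checked numerically with the usual "radix economy" cost model: representing numbers up to N in base b takes about log_b(N) digits of b states each, for a cost of b·ln(N)/ln(b), which is minimized at b = e. A quick Python check (the cost model is the standard textbook one, not something stated in this thread):

```python
import math

# Radix economy: representing numbers up to N in base b takes log_b(N) digits
# of b states each, for a total "hardware cost" of b * log_b(N), i.e.
# b * ln(N) / ln(b). The factor b / ln(b) is minimized at b = e ~ 2.71828.
def radix_economy(b, n=10**6):
    return b * math.log(n) / math.log(b)

costs = {b: radix_economy(b) for b in (2, 3, 4, 10)}
cheapest = min(costs, key=costs.get)    # ternary wins among integer bases
```

Among integer bases, 3 comes out cheapest (base 2 and base 4 tie exactly, since 4/ln 4 = 2/ln 2), which is the mathematical result the post is alluding to.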
January 8, 2010 7:58:17 PM

gamerk316 said:
I'm simply making the argument that no matter how you code Windows, since only one copy of a DLL can exist at any point within the system, and two programs can't both access it at the same time without the possibility of corruption, there will be a performance bottleneck whenever two programs need access to said DLL. If you return to static libraries (i.e. every program has its own unique copy of the library loaded into memory), that bottleneck goes away.

Besides, DLLs were simply a memory hack to get around the old 16-bit memory limit; having only global DLLs saved a ton of space...

Sorry, but the newer OSes do have better MT/MC perf, and I agree with you also; maybe W8, as it may be 64-bit only as rumored.
January 8, 2010 8:08:29 PM

ru_1980 said:
Quote:
What is the point of going smaller and smaller?


The smaller the chip is, the less distance the electricity has to travel, so less voltage needs to be applied to get the end result (longer wire = more resistance). The less the voltage, the faster they can signal the 0s and 1s. Effectively, 0.5 volts drops to zero volts a lot faster than 1 volt: a signal actually takes a split second to die away, and the less voltage, the less potential there is and the faster it drops. I think we need to explore fiber optics of some sort for the internal workings of a computer. Light is much easier to pulse at a stupidly fast frequency, with hardly any loss, and it doesn't create much heat... not only that, it's pretty much endless bandwidth.

Excuse my "quotes" new to this site....


Thank you. So by what you're saying, couldn't we just keep going smaller and smaller, to atomic sizes, if we got the technology? And how could we use fiber optics for processors, or ICs?

jsc said:
We will always have transistors as long as we are working with any kind of solid state electronics.


What do you mean by solid-state electronics?
January 8, 2010 8:43:36 PM

gamerk316 said:
I'm simply making the argument that no matter how you code Windows, since only one copy of a DLL can exist at any point within the system, and two programs can't both access it at the same time without the possibility of corruption, there will be a performance bottleneck whenever two programs need access to said DLL. If you return to static libraries (i.e. every program has its own unique copy of the library loaded into memory), that bottleneck goes away.

Besides, DLLs were simply a memory hack to get around the old 16-bit memory limit; having only global DLLs saved a ton of space...




Not arguing with your point at all - I fully agree...


Having said that, I also feel a disturbance in the Force... almost as if a billion souls cried out at once: "Memory Hog!! Windows Ate My Ramz!! OMG!!!"
January 8, 2010 8:51:12 PM

Upendra09 said:
I am talking specifically about CPUs. When will we break the THz barrier? Will we ever mature past GHz? We used to make chips faster; now we make them smarter. So will we just continue to make chips smarter, or will we make them faster too? The fastest anyone runs their CPU right now is around 4.8 GHz; when will we go past 5 GHz and break the THz barrier?

Moreover, will we need that speed? And for what?


I would predict that for software to catch up it will eventually become highly multithreaded, and at some point CPU manufacturers will be racing toward larger numbers of cores...

Just a wild guess...

Edit: it seems I was not looking very far...
January 8, 2010 9:13:07 PM

Scotteq said:
Not arguing with your point at all - I fully agree...


Having said that, I also feel a disturbannce in the force.. almost as if a billion souls cried out at once.... "Memory Hog"!! Windows Ate My Ramz!! OMG!!!"

lol
While I agree they do eat memory like a fat boy in a candy shop, it's part of the CYA bad-programming model it has to play with as well, as a lot of apps aren't so tightly coded for efficiency. Also, as we see more and more MT/MC, there'll always be single-threaded apps, and since the MC/MT approach goes against efficiency for single-threaded perf, Intel's Turbo, and eventually AMD's solution, is the right move for this scenario.
January 8, 2010 10:31:18 PM

JAYDEEJOHN said:
lol
While I agree they do eat memory like a fat boy in a candy shop, it's part of the CYA bad-programming model it has to play with as well, as a lot of apps aren't so tightly coded for efficiency. Also, as we see more and more MT/MC, there'll always be single-threaded apps, and since the MC/MT approach goes against efficiency for single-threaded perf, Intel's Turbo, and eventually AMD's solution, is the right move for this scenario.



While I agree with everything you have said in this thread, don't say that in front of keith or zooty. :lol:
January 8, 2010 10:36:05 PM

What is MT/MC?
January 8, 2010 10:37:36 PM

Multi-thread/multi-core (at least, that's what I'd guess from context).
January 8, 2010 10:41:33 PM

Technology tends to come to standstills, but then a new platform of technology is researched and the explosion of exponential improvement starts again until the limits of the new platform are reached. We've really squeezed all we can out of silicon processing technology now. The new research into water-based CPUs looks really exciting, especially when you see the theoretical speeds they can reach (I think we're talking about 20 GHz single-core). However, right now the best direction for computer technology, especially processors, is the way GPUs are going with their stream processors.
January 8, 2010 11:07:59 PM

Realistically, I think the next shift in technology would be towards graphene-based transistors rather than something radically new. The (currently prototype) bilayer pseudospin FET (BiSFET - not the same as BISFET) can run with 2-3 orders of magnitude lower power consumption than a MOSFET, something that is really needed.
January 8, 2010 11:13:22 PM

OK, please everyone, consider your comments and leave personal ones out, as I was just alerted to complaints. I'm asking now; I'll deal with it later.

Now back on topic: I couldn't care less one way or another if footy or whomever sided with me or against me. Companies do things they've forecasted as successes; sometimes their HW solutions leave us wanting more, but if it's in the right direction, getting there is what's important, regardless of who comes out first or who has the best implementation.
So, AMD fans, though some are misunderstood complaining about benches: turbo, or some type of it, will be seen on AMD chips in the future. So for those who think it's just a gimmick: for DT it certainly isn't, possibly for server, but certainly not for DT.
January 9, 2010 1:55:14 AM

Griffolion said:
Technology tends to come to standstills, but then a new platform of technology is researched and the explosion of exponential improvement starts again until the limits of the new platform are reached. We've really squeezed all we can out of silicon processing technology now. The new research into water-based CPUs looks really exciting, especially when you see the theoretical speeds they can reach (I think we're talking about 20 GHz single-core). However, right now the best direction for computer technology, especially processors, is the way GPUs are going with their stream processors.


What do you mean by water-based?
January 10, 2010 1:31:13 AM

That makes sense; water would be a better conductor.
January 10, 2010 2:17:41 AM

Water is a terrible conductor if it is pure.