Top-10 Hardware for 2006

We have posted the second installment of our year-end Top-10 series. This time, we are looking at the top 10 most significant pieces of hardware for this year. What do you think? Do you agree with our choices? Which ones would you add?
  1. Not much to say here... C2D pwns all...

    Everything else seems fairly vanilla, expected... maybe not the whole Apple-Intel and Dell-AMD... but meh... throw some competition out there...
  2. Intel C2D!

    I've heard that the culture at Intel is a brutal and fast-paced one. You produce results and you do it quickly, or you get out. AMD has awakened a giant that was sleeping soundly in its NetBurst bed. I think that for AMD the next couple of years hold brief periods of matched performance with Intel; the rest of the time is catch-up mode :lol: .
  3. Having talked to both AMD and Intel reps, I can say AMD is a friendlier nicer company, kinda like Jackie Chan, and Intel is a cooler, stricter company, like Bruce Lee...

    Of course, Bruce Lee can kick Jackie Chan's ass 100 ways to oblivion...
  4. Intel's Core is no big thing. Faster, but just an evolutionary change. AMD will stomp Intel in two years or so, as they have seen a different path to tread and have more clever engineers than Intel. Soon Intel will have 4, 8, whatever cores and wake up and say, "Shit! We need to be over there and catch up with what AMD is doing." The CPU race just flip-flops back and forth. I'm an AMD fanboy, but I don't really pay attention to the CPU arms race. Neither at home nor at work do I need to order the fastest CPU on the shelf.

    Mac on Intel, and AMD in Dell PCs, is by far the most important change.

    Third is perhaps the Sansa e200 for being a better choice than the iPod (just got my wife an e250 as an Xmas gift).

    Nvidia's 8800 card wasn't out of the blue, just expected, and it arrived a month or two earlier than needed.
  5. Shouldn't there be an option for "other"?
    Not that I'd vote that way (I'd say the 8800 or the C2D), though the 8800 and DX10 won't really show their colours till sometime next year.
  6. I would say that Blu-ray would be best for computer users, since it holds more data, making it easier to back up data and a lot of other stuff.

    I disagree with Intel being dead last before Conroe came out. Even though the Pentium D wasn't the best on the market, all my gamer friends still bought either it or a Pentium 4. In my eyes Intel has always been better than AMD: easy overclock, faster. I don't know anyone with an AMD. I'll give AMD another try next year and purchase a processor.

    Must be the location, I guess; Texas loves Intel more.
  7. Quote:
    I would say that Blu-ray would be best for computer users, since it holds more data, making it easier to back up data and a lot of other stuff.

    I disagree with Intel being dead last before Conroe came out. Even though the Pentium D wasn't the best on the market, all my gamer friends still bought either it or a Pentium 4. In my eyes Intel has always been better than AMD: easy overclock, faster. I don't know anyone with an AMD. I'll give AMD another try next year and purchase a processor.

    Must be the location, I guess; Texas loves Intel more.

    Stop, wait, you are a bit backwards. You purchased Intel when AMD was performing better, and now you want to purchase AMD now that Intel has taken the performance crown? Masochism at its finest, hehe. Next year Intel will most likely still be the performance leader, so buy Intel! In fact, always buy Intel; they are just better! My Intel stock tells me so.
  8. People have to admit that the Wii blew everyone away!

    Makes me wonder, though... if the PS3 had a larger supply and a lower price, would it run on par with the Wii?


    I've heard that there are like a few PS3s in stores, but no one will buy them. Everyone wants the Wii!

    Now, who the h3ll saw that coming?!?!?! 8O
  9. I know Intel is better. Man, I wish I was old enough for stock. I've played stock games and I'm real good at making money. But I'm going to get AMD when it starts (if it ever does) getting back up on its feet. I hope AMD never goes out of business, otherwise we will be in big trouble.
  10. Well, even though the Core 2 was pretty amazing, it's more of an evolution than a revolution, and this is a computer forum, which automatically equals a bias towards CPUs. But I don't think one should completely disregard the Wii. With Nintendo being the odd man out in the console war (ahem, IBM), Nintendo has suddenly propelled itself to a very competitive 2nd-place console, and time will eventually tell whether it stays there or takes both the 360 and PS3 by the head. I commend Nintendo for thinking out of the box and implementing features and controls that neither MS nor Sony would have even come up with.

    The best part about it all is that the specs on the Wii are inferior, yet it offers so many other things that it sells.

    One last thing Sony, Microsoft, Intel, and AMD can all take note of from Nintendo is their well-prepared launch. They take the stores by storm and have been able to sell over 1,746,000+ consoles so far. If only AMD or Intel could execute as well as Nintendo. Granted, there are many other factors that differentiate a CPU launch from a console launch, but it's still something Intel and AMD could learn from: when you've got a hot product, it's good to have it in stores while it's still hot.
  11. Yep, Core 2 Duo rocks. The price, performance, and power consumption set a new bar. The DX10 card I think is not really that good, since there's no OS or software that fully takes advantage of it yet. All this new stuff coming out this year is making me want to upgrade and has made my system look outdated.
  12. Quote:
    had to say Conroe, as it has, or at least will, re-ignite the CPU race, which for the last few years has become a bit embarrassing for Intel.

    it is what is likely to really help drive software devs to finally change to multi-threaded software and make use of the new hardware, now that the big name in CPUs has made a worthwhile product.

    oh, and chaosGC, just stfu, you are making a fool of yourself. the Pentium Ds were, like NetBurst, a joke, only doubly so. they were faster only in a GHz way and not a computational one. jesus, get with the times.

    You need to STFU. You think the Pentium Ds were worthless? HELL NO. They paved the way for Conroe; without the Pentium D there is no dual core, no Conroe, and AMD may not have had it either. AND THEY ARE STILL F-ING POWERFUL. Second place is never bad. Oh yeah, NetBurst is a joke (sarcasm); it got us to where we are and is still good. Wow, Conroe is UP TO 40% faster, f-ing wow. I love Conroe too, but the Pentium D was good competition for AMD. SO YOU STFU, FOOL.
  13. The gap between AMD and the C2D is bigger now than the gap between the Pentium and AMD was back then. Besides, people still bought the Pentium D more than AMD's CPUs. Intel has always had the majority market share... probably always will.
  14. The Pentium Ds are all worthless; they are basically Pentium III chips that have been rebadged. Only the big suckers bought them. If you are a non-technical person, then you were likely to believe the hype and buy a Pentium D, but at the same time you probably don't need anything more powerful than a P3, so I guess it all works out.
  15. I chose the Wii as most influential because it shows that innovation will always trump raw power. Not only do gamers love it, but people who never touched a game in their lives are fascinated with its ability to simulate real experiences. On top of all that, it only costs $250!! Think about that. That's less than most iPods, C2D CPUs, or video cards out there.

    Sure, the C2D is amazing, but in reality all it does is more FLOPs with less power than AMD. The Wii is loads of entertainment right out of the box.

    Now can we get Nintendo to add HD support?

    Another thing: stop trying to push MP3 players other than the iPod, Tom's. That one you picked (I already forgot its name) is weak compared to many other technologies that came into their prime this year. How about fast-response gaming LCD screens? Or even HDTVs with HDMI and over-the-air broadcasts in 1080i via ATSC? What about the $100 laptop? MP3 players are nothing new and are overpriced, IMO. I got an Audigy 4 for 70 bucks that can decode DVD-Audio. Now put that kind of tech in a handheld and I'll be voting it #1.

    On a side note, one tech that I think isn't fully realized yet but could be next year is the Cell processor/XDR memory. I know it seems off now, but it will rock if certain people/companies use it the right way.
  16. I remember when the first P4s came out, and even then it was obvious they were a bad move, as the clock speed was so much higher for no more performance compared to the P3. It was only when they doubled the cache with Northwood that the P4 became a reasonable performer.

    Intel banked on getting them to ridiculously high clock speeds to make up for the architecture, but didn't reckon with the power and heat issues. I recall reading about 5 GHz P4s, but it never happened.

    As Stranger says, Core 2 is an evolution of Pentium 3 / Pentium M, not P4.

    P4 was like the Neanderthals - an evolutionary dead end.

    Face it.
  17. I voted for the Core 2 Duo series of processors. It's with these processors that it all came together for Intel. My second vote would go to the Nvidia 8800 series GPUs.

    Talking about the NetBurst architecture, I wouldn't say it was all that bad an architecture, at least up until some point. And it had some benefits we can now also enjoy on the Core 2 Duos: it made Intel develop a faster front-side bus, better prediction mechanisms, and Enhanced SpeedStep Technology (originally to fix the Prescott's misbehaving), and it gave them the necessary experience to fine-tune the number of pipeline stages, the clock speed, and ultimately the performance they can achieve. (For example, the Core 2 Duos have slightly more pipeline stages than the Core Duos (notebooks). This, when done in a balanced way (unlike with the Prescott), can keep the same performance while enabling future clock speed increases, which are still necessary.)

    It's no longer a gigahertz clock race, but the gigahertz will necessarily keep rising, although now a bit more slowly, because now we have the "optimization" factor that has to be handled before raising the clock speed.

    Don't get me wrong: I think the NetBurst architecture was a temporary path for Intel to find out some important things they are now combining with the Pentium III/M/Core (notebook) architecture to bring us the Core 2 Duo.

    Could they have done it sooner? I think yes; they could have skipped that huge mistake called Prescott and moved on to the Core microarchitecture.

    Talking about Pentium 4 performance, I have to say that you shouldn't forget that although the Willamette core was only worthwhile past 1.7 GHz compared to the PIII, and even then the AMD was a better choice, the Northwood core on the other hand was consistently considered the fastest processor compared to AMD's. It was so with the 2.2 GHz, 2.4 GHz, and 2.6 GHz parts, the "potent" 2.8 many magazines referred to, and finally the 3.0 GHz.

    At this time very few people were aware of the power efficiency of processors, and many home users weren't deciding based on that, but we were on the verge of a turning point. This is exactly where we are nowadays with graphics cards: at a turning point. At the end I will say a word on the graphics cards out there right now concerning this same matter.

    This is to say that at that time, although the Northwood wasn't exactly power efficient, nobody said it was a bad processor just because of that. It also wasn't as flagrant as the Prescott.

    It was at that point that Intel started to phase in the Prescott core, which in my opinion was the biggest and most stupid mistake they ever made. And they went even further with it by putting two of them together to create the Pentium D line, instead of using an updated Northwood core.

    Let's look at the facts: the Northwood core (in the disguised Extreme Edition Gallatin form) carried over to socket 775 and went up to 3.46 GHz on the 130nm process! (It was also the first processor to use the 1066 FSB the Core 2 Duo now uses.) And it was widely considered to be consistently better than the 3.73 GHz Prescott-based Extreme Edition that followed.

    The Prescott went against all the rules of a smaller manufacturing process, which should mean less heat and power consumption at the same clock speed.

    The Prescott somehow behaved exactly the opposite: it was hotter, consumed more than 20% more power, and was generally slower! Call this stupid. And it only went up to 3.8 GHz! The Northwood core on a 90nm process would have gone past that easily. A Northwood at 3.4 GHz had a TDP of 89 W; the Prescott at the same speed had a TDP of 103 W and sometimes was only as good as a 3.0/3.2 Northwood.

    To sum it up, Intel started their downfall by mid-2004 and through all of 2005, at which time they were already preparing the Core 2 architecture. But Intel is no fool; they didn't lose much market share in the consumer market. But they were not very far from that happening had they not reacted like they did.

    A word on the brand itself: the word "reliability" is of big importance. And while Intel may have done some wrong things in the past, this is something they have a special feeling for. Take the Pentium 4s, for example: even those without EIST (Northwoods), if you take out the heatsink, will insert wait cycles (or throttle back with EIST) to preserve the chip from burning, and eventually they shut down the system. The equivalent AMD at that time would just have had many more chances of burning out.

    It's because of small things like these that I prefer Intel. AMD is still building its reputation as an independent chip manufacturer. In 1995 they were still making clone 486s, so they're still young. They already have a say in the server market and are respected by gamers, so it will be interesting to see what more they can accomplish now that Intel is back in the game.

    A word regarding power efficiency in graphics cards: we could apply the very example I wrote above about the NetBurst architecture around 2003 to the graphics card segment. Comparatively, this is where we are right now regarding graphics cards: at a turning point.

    Almost nobody is complaining about the tremendous power requirements graphics cards have, just like nobody said the Northwood core was power hungry: it just wasn't a question yet in the minds of many people. And the Northwoods were considered the best processors compared to AMD, and nobody said, "Yes, but they consume more power, so the AMD is actually the better one."

    Nvidia did a great job with the 8800GTX, because it is better than two 7900GTXs and consumes only 8% more than one. Does having two power connectors make the 8800GTX a bad card? No, but in a year's time, probably. Before long, Nvidia and ATI/AMD will start to get complaints about this and will have to start implementing power-saving features like Intel's SpeedStep or AMD's Cool'n'Quiet, and adopt a better manufacturing process and even architecture. I surely hope to see ATI benefit from AMD's 65nm process and start applying it to their GPUs to reduce power consumption.


  18. You can't really argue against the C2D being number 1. It is simply the best computer product this year.

    I am, however, somewhat miffed that the nVidia 8800 series didn't make it into the top 3. The 8800 GTS/GTX is doing for graphics what the C2D is doing for CPUs.
  19. Anyone complaining about the C2D chips being placed in the #1 position based on the idea that they are an evolutionary development of ??? (whatever) is missing the point. All CPUs are an evolutionary development of the 1972 Intel 8008.

    I voted for the Conroe CPU because speed, computational power, thermal efficiency, availability, overclockability, and most importantly, price all converged into one product. Next year, who knows? But this year, there is no doubt.

    The same comments are true for the 8800 video cards: "Next year, who knows? But this year, there is no doubt."

    This year, I am 3 for 10: E6600, 8800 GTS, and Sansa e260 MP3 player.
  20. I was considering voting Core 2 Duo, but then realized that although the technology is a large leap forward and has incredible overclocking potential, I would not be able to make use of it.

    Although a faster processor will undoubtedly net faster gaming with a high-end video card and CPU, I will never be purchasing a high-end CPU OR video card, and therefore I am far better off investing in a value-segment video card every year or two. I am seeing less and less visual difference between processor generations, because the difference is generally more evident now in tedious processes like video format conversion, etc. If I were planning a new motherboard and CPU purchase currently, there is no doubt I would save my pennies for a low-end Core 2 Duo, but I only do that every 3 years or so. Is it any surprise that processors will be faster every 3 years? When I am ready, I will choose the appropriate brand at that time.

    Personally, the Wii will make a far bigger impact on my life, and therefore I have to rank it higher. It's not the most powerful, but I feel it is more innovative and creative, and I will actually see 100% of the benefit (even if I bought a Core 2 Duo, I doubt I would appreciate its impact as much).
  21. Has to be the C2D; it totally transformed the CPU market and took all the positives away from AMD.

    Whilst you could argue that the 8800 is an equal performance jump, the cost increase is too, whereas Intel released the C2D at very competitive prices.
  22. Quote:
    has to be the C2D; it totally transformed the CPU market and took all the positives away from AMD

    I totally agree that the C2D is the most significant CPU technology of our day. However, a faster CPU doesn't mean much. It's just a cycle perpetuating itself. One company makes CPUs more efficient and able to do more work faster, then the other company follows suit. This will be happening for ages (if nuclear war doesn't kill us all first).

    The top ten was supposed to include any technology, was it not? I may be missing the point. Does the number one slot always go to the best CPU of the year?

    This kind of frustrates me a bit, because it seems like Nintendo did so much more with so much less than any other technology out there. You could also argue that the Wii will be significant for a much longer period of time, if history is any indicator of how long a CPU architecture stays significant. In two years everyone will be replacing their C2D with something else, but I'll still be playing the Wii.
  23. Perhaps a tie, then. The Wii has shown the world how fun it can be to play games on a watered-down system. Intel has shown us that they have seen the light and moved on from NetBurst :P
  24. Actually, I think the most revolutionary technology of 2006 is LCDs. They save so much space. They are wonderful. I am glad they were made in 2006. I have 2. I vote for an LCD and not for the C2D.
  25. Quote:
    I totally agree that the C2D is the most significant CPU technology of our day

    I don't agree. I think moving to 32-bit processing was more significant... but I guess it depends on whose day you are talking about.

    And yes, while CPUs upgrade, change sockets, and get faster, smarter, and hotter, the consoles win out, being the most steadfast and unchanging while still being enjoyable 2 years down the line (or more).

    Xbox360 Q4 2005 x1 as fast
    Xbox365 Q1 2006 x1.2 as fast
    Xbox375 Q3 2006 x1.3 as fast
    Xbox400 Q1 2007 x1.5 as fast (all previous now outdated and non supported)
    Xbox450 Q2 2007 x1.9 as fast (xb400 declared beta and not supported)
    Xbox490 Q3 2007 x2.2 as fast
    Xbox590 Q4 2007 x2.6 as fast (all previous versions now outdated and not supported)

    Edit: Stranger... I was joking (use of broken grammar, short sentences... mocking the thread as well). Yar.
  26. How do you like your Dell? Work has us using 2 of those on our computers... and (aside from being overkill) they are great-looking monitors.
  27. Yeah, LCDs are pretty nice. I got a Samsung 19" widescreen LCD when I built my computer, and for Christmas I'm getting a Sceptre (sp?) 32" HDTV, HDCP, LCD widescreen! =). Now I just need more desktop monitors with HDCP support. Then again, I don't mind using the 32" for a computer monitor :P
  28. Will we actually need HDCP-supporting monitors to use an internal HD DVD SATA drive? I would think they would just do HDCP to your video card if anything, then have regular DVI/HDMI output?

    If I will need an HDCP cable run to my PC monitor for HD DVD from my internal player or video card, then I'm not going to spend any money on a monitor yet that will not even play them (despite having sufficient resolution). Time for another CRT?
  29. Just tell them that a bigger monitor will improve productivity 1000 percent, 'cause you will stop working until they get you the one you want.
  30. Quote:
    I remember when the first P4s came out, and even then it was obvious they were a bad move, as the clock speed was so much higher for no more performance compared to the P3. It was only when they doubled the cache with Northwood that the P4 became a reasonable performer...

    I remember that.

    There were two models, clocked at 1.4 GHz and 1.6 GHz. And the former could barely beat a P3 at 1 GHz. (In some cases it performed below the P3.)
  31. It's the Seagate Barracuda 7200 that will create shockwaves in the coming years.

    Wait and watch.