I voted for the Core 2 Duo series of processors. It's with these processors that everything came together for Intel. My second vote would go to the Nvidia 8800 series GPUs.
Talking about the Netburst architecture, I wouldn't say it was all that bad. At least up to a point. And it had some benefits we can now also enjoy on the Core 2 Duos: it made Intel develop a faster front side bus, better prediction mechanisms, and Enhanced SpeedStep Technology (originally to rein in Prescott's misbehaving), and it gave them the experience needed to fine-tune the number of pipeline stages, the clock speed, and ultimately the performance they can achieve. (For example, the Core 2 Duos have slightly more pipeline stages than the notebook Core Duos. Done in a balanced way, unlike with Prescott, this keeps the same performance while leaving room for future clock speed increases, which are still necessary.)
It's no longer a gigahertz race, but clock speeds will necessarily keep rising, although more slowly now, because there is now an "optimization" step that has to be done before raising the clock speed.
Don't get me wrong: I think the Netburst architecture was a temporary path for Intel to learn some important lessons they are now combining with the Pentium III/M/Core (notebook) architecture to bring us the Core 2 Duo.
Could they have done it sooner? I think so: they could have skipped that huge mistake called Prescott and moved straight on to the Core microarchitecture.
Talking about Pentium 4 performance, don't forget that although the Willamette core was only worthwhile past 1.7 GHz compared to the Pentium III, and even then the AMD was the better choice, the Northwood core, on the other hand, was consistently considered the fastest processor compared to AMD's offerings. It was so at 2.2 GHz, 2.4 GHz, 2.6 GHz, the "potent" 2.8 GHz many magazines referred to, and finally 3.0 GHz.
At that time very few people were aware of the power efficiency of processors, and many home users weren't deciding based on it, but we were on the verge of a turning point. This is exactly where we are nowadays with graphics cards: at a turning point. At the end I will say a word on the graphics cards out there right now concerning this same matter.
This is to say that at that time, although the Northwood wasn't exactly power efficient, nobody said it was a bad processor just because of that. Its inefficiency also wasn't as flagrant as Prescott's.
It was at that point that Intel started to phase in the Prescott core, which in my opinion was the biggest and most stupid mistake they ever made. And they went even further with it by pairing two of them to create the Pentium D line, instead of using an updated Northwood core.
Let's look at the facts: the Northwood core (in its disguised Extreme Edition Gallatin form) carried over to socket 775 and went up to 3.46 GHz on the 130 nm process! It was also the first processor to use the 1066 MHz FSB the Core 2 Duo now uses, and it was widely considered consistently better than the 3.73 GHz Prescott-based Extreme Edition that followed.
Prescott went against everything a smaller manufacturing process is supposed to deliver: less heat and lower power consumption at the same clock speed.
Prescott somehow behaved exactly the opposite way: it ran hotter, consumed more than 20% more power, and was generally slower! Call that stupid. And it only went up to 3.8 GHz! The Northwood core on a 90 nm process would have gone past that easily. A Northwood at 3.4 GHz had a TDP of 89 W; the Prescott at the same speed had a TDP of 103 W and sometimes performed only as well as a 3.0/3.2 GHz Northwood.
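(A quick sanity check on those numbers, going by the TDP figures alone: (103 W - 89 W) / 89 W is roughly a 16% increase. The "more than 20%" figure presumably refers to measured power draw under load rather than TDP, which for Prescott was even worse than the rated numbers suggest.)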
To sum it up, Intel's downfall ran from mid-2004 through all of 2005, by which time they were already preparing the Core 2 architecture. But Intel is no fool: they didn't lose a big share of the consumer market. They weren't far from it happening, though, had they not reacted the way they did.
A word on the brand itself: the word "reliability" is of great importance. And while Intel may have done some wrong things in the past, this is something they have a special feel for. Take the Pentium 4s, for example, even those without EIST (the Northwoods): if you take off the heatsink, they will insert wait cycles (or throttle back with EIST) to keep the chip from burning, and eventually they shut the system down. The equivalent AMD of that era would have had a much higher chance of burning out.
It's because of small things like these that I prefer Intel. AMD is still building its reputation as an independent chip manufacturer. In 1995 they were still making clone 486s, so they're still young. They already have a say in the server market and are respected by gamers, so it will be interesting to see what more they can accomplish now that Intel is back in the game.
A word regarding power efficiency in graphics cards: the example I gave above about the Netburst architecture around 2003 applies directly to the graphics card segment. Comparatively, that is where we are right now with graphics cards: at a turning point.
Almost nobody is complaining about the tremendous power requirements graphics cards have, just like nobody said the Northwood core was power hungry: it just wasn't a question yet in the minds of many people. The Northwoods were considered the best processors compared to AMD's, and nobody said "yes, but they consume more power, so the AMD is actually the better one".
Nvidia did a great job with the 8800GTX because it is better than two 7900GTXs and consumes only about 8% more than one. Does having two power connectors make the 8800GTX a bad card? No, but in a year's time, probably. Before long, Nvidia and ATI/AMD will start to get complaints about this and will have to start implementing power saving features like Intel's SpeedStep or AMD's Cool'n'Quiet, and adopt a better manufacturing process and even architecture. I surely hope to see ATI benefit from AMD's 65 nm process and start applying it to their GPUs to reduce power consumption.
Cheers,
Tpi2007