Why doesn't AMD integrate hyperthreading?

August 12, 2009 2:06:55 AM

I've been wondering why AMD doesn't use hyperthreading in their chips; if they did, the i7 would have a formidable competitor. Are there any plans for hyperthreading from AMD, or any other revolutionary (so to speak) plans coming up?
August 12, 2009 4:39:23 AM

Because Intel has the PATENT... simple as that.
August 12, 2009 9:33:20 AM

Because AMD hasn't sued Intel for the rights to use it, like they have for most everything else. :p
August 12, 2009 9:50:25 AM

Because they haven't finished their version of SMT yet; it's expected to be put into play sometime in the next 2-3 years.
August 12, 2009 12:41:42 PM

Yeah, I believe AMD's official story is that it's not necessary, but it will likely show up in their chips sooner or later. That does involve a significant architecture redesign, though, and remember AMD has significantly fewer R&D resources. Unlike Intel's tick-tock strategy, we see fewer architecture revamps from AMD; the next one should be the supposed Bulldozer architecture. In the time AMD goes from Phenom to Bulldozer, Intel has gone from Core 2 Duo to Nehalem, and Sandy Bridge should hit around the same time as Bulldozer. So that's three Intel architectures in the same time that we see two from AMD.
August 12, 2009 1:00:35 PM

AMD has their own SMT patents going back to the late 1990s (if not earlier). Their most recent patent filings relate to "Cluster-based Multi-threading" or CMT.

There are several issues at work here, the most important being that parallel processing on the GPU is far more efficient than the creation of virtual CPU cores.

To that end AMD has filed patents which describe the creation of the Unified Processor ('unified' being one of the most overused terms in the industry :)  ).

BUT, the rumblings on the internets postulate that a unified cluster may utilize more than 4 instructions per cycle (which would be 'smashing').

That would be highly dependent upon the instruction set.
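
None of us has AMD's design documents, but the "more than 4 instructions per cycle" point is easy to illustrate in software: how many instructions a core can actually retire per cycle depends on how much independent work the code exposes. Here's a minimal, hypothetical C++ sketch of my own (nothing from AMD's filings):

```cpp
// Illustrative sketch: sustained instructions-per-cycle depends heavily on
// how much independent work the instruction stream exposes to a wide core.
#include <cstdint>
#include <cstdio>

// One long dependency chain: each iteration must wait for the previous
// result, so a wide core cannot overlap these operations.
uint64_t dependent_chain(uint64_t n) {
    uint64_t x = 1;
    for (uint64_t i = 0; i < n; ++i)
        x = x * 3 + 1;            // next iteration depends on this result
    return x;
}

// Four independent accumulators: the core (or compiler) can overlap them,
// so several adds can retire in the same cycle.
uint64_t independent_sums(uint64_t n) {
    uint64_t a = 0, b = 0, c = 0, d = 0;
    for (uint64_t i = 0; i < n; i += 4) {
        a += i; b += i + 1; c += i + 2; d += i + 3;  // no cross dependencies
    }
    return a + b + c + d;
}

int main() {
    const uint64_t n = 100000000;
    std::printf("%llu %llu\n",
                (unsigned long long)dependent_chain(n),
                (unsigned long long)independent_sums(n));
}
```

Timing both loops (for instance with perf stat's instruction and cycle counters) should show the dependency chain stuck near one instruction per cycle while the independent version goes much higher; a wider cluster only pays off on code shaped like the second loop, which is why the claim is so dependent on the instruction mix.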
August 12, 2009 1:31:05 PM

For me, I think their smaller R&D budget would ideally focus on adding more cores, increasing IPC, or creating heterogeneous cores. SMT may not be the best thing right now, as it could require a lot of development time to make it work with the current Phenom II. Intel already had the advantage of having SMT on a previous product line, so adopting it again required smaller changes.

The Hyper-Threading initially introduced on Intel's P4 line made sense back then, as they had no dual-core solution. It was the "poor man's dual-core," but it did make using XP somewhat more responsive than a single-core Athlon. Now people have more cores than they can usually use (think surfing the net on a quaddie).
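
For what it's worth, the "virtual core" part is visible from software: with HT enabled the OS simply sees more logical processors than there are physical cores. A small, hedged C++ sketch (standard library only, nothing Intel- or AMD-specific):

```cpp
// Illustrative sketch: the OS reports logical processors, which is what
// Hyper-Threading adds. std::thread::hardware_concurrency() returns that
// logical count (or 0 if it cannot be determined); mapping logical CPUs
// back to physical cores needs OS-specific queries (e.g. /proc/cpuinfo on
// Linux or GetLogicalProcessorInformation on Windows), not shown here.
#include <iostream>
#include <thread>

int main() {
    unsigned logical = std::thread::hardware_concurrency();
    if (logical == 0)
        std::cout << "Logical CPU count not reported by this platform\n";
    else
        std::cout << "Logical CPUs visible to the OS: " << logical << "\n";
}
```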
August 12, 2009 3:28:02 PM

amnotanoobie said:
For me, I think their smaller R&D budget would ideally focus on adding more cores, increasing IPC, or creating heterogeneous cores. SMT may not be the best thing right now, as it could require a lot of development time to make it work with the current Phenom II. Intel already had the advantage of having SMT on a previous product line, so adopting it again required smaller changes.

The Hyper-Threading initially introduced on Intel's P4 line made sense back then, as they had no dual-core solution. It was the "poor man's dual-core," but it did make using XP somewhat more responsive than a single-core Athlon. Now people have more cores than they can usually use (think surfing the net on a quaddie).

^This
August 13, 2009 8:27:39 AM

I think that even now SMT has its place, like in server environments, where it does help, or in things like video encoding.

But normally I prefer real cores.

As for AMD, who knows. If they try an approach like an ATI GPU, I'm not sure it could truly work: yes, GPUs are great at running the same type of instruction on lots of data at once (hence F@H), but at what CPUs do, they suck.
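
To put that contrast concretely, here's a small, hypothetical C++ sketch of the two kinds of work being described; the names and data are made up for illustration:

```cpp
// Illustrative sketch: GPUs shine when every element gets the same
// arithmetic (top function), while typical CPU work is branchy and chases
// pointers (bottom function), which maps poorly onto wide SIMD hardware.
#include <cstddef>
#include <vector>

// GPU-friendly: same multiply-add applied to every element, no branches,
// no dependencies between iterations -- the F@H-style workload.
void scale_and_offset(std::vector<float>& data, float a, float b) {
    for (std::size_t i = 0; i < data.size(); ++i)
        data[i] = a * data[i] + b;
}

// CPU-style: a linked-list walk with a data-dependent branch. Each step
// depends on the previous load, so there is nothing for SIMD lanes to do;
// this is where caches, branch prediction, and SMT earn their keep.
struct Node { int value; Node* next; };

int count_positive(const Node* head) {
    int count = 0;
    for (const Node* p = head; p != nullptr; p = p->next)
        if (p->value > 0)      // unpredictable, data-dependent branch
            ++count;
    return count;
}

int main() {
    std::vector<float> v(1024, 1.0f);
    scale_and_offset(v, 2.0f, 0.5f);

    Node c{ -3, nullptr }, b{ 7, &c }, a{ 1, &b };
    return count_positive(&a);   // returns 2 for this tiny list
}
```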