Intel Kills Itanium, Last Chip Will Ship in 2021

Itanium 9500-series chip architecture

On Thursday, Intel notified its partners that it's discontinuing the Itanium 9700-series Kittson processors, the last processors on the market to use Intel's Itanium architecture. Intel plans to stop shipping the Itanium-based Kittson processors by mid-2021.

The impact on customers is expected to be minimal, as Hewlett Packard Enterprise (HPE) is the only Itanium customer left. HPE is expected to support its Itanium-based servers until late 2025. HPE must place its final Itanium processor orders by January 30, 2020, and Intel will ship the last Itanium CPUs by July 29, 2021.

Intel introduced the Kittson processors in 2017, announcing at the same time that they would be the last chips the company makes using the IA-64 architecture. Kittson wasn't a brand-new microarchitecture design, but rather a higher-clocked version of the 2012 9500-series Poulson microarchitecture. Poulson was supposed to sit a performance class above Intel's Xeon processors, featuring a 12-instruction-per-cycle issue width, four-way Hyper-Threading and multiple reliability, availability and serviceability (RAS) capabilities.

Intel also hasn't moved Itanium manufacturing to a new process node since 2010; the chips are still made on Intel's 32nm planar process from that year.

Although the end of life for the Itanium chips has been planned since at least 2017, the fact that Hyper-Threading was found to be vulnerable to at least two attacks last year may also have played a role in Intel sunsetting its Hyper-Threading-heavy Itanium processors.

Itanium, the Future that Never Was

Itanium was a 64-bit architecture built on the EPIC (Explicitly Parallel Instruction Computing) design that HP started developing before Intel joined the effort. The tech was supposed to replace the 32-bit x86 architecture once operating systems and applications started to need 64-bit support.

The first Itanium processors were supposed to ship in 1998, but even though Microsoft and other operating system vendors committed to supporting it, Itanium's very long instruction word (VLIW)-style design proved too difficult to implement (and to write compilers for) while staying competitive on price and performance with other architectures, including 32-bit x86.
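To make that compiler burden concrete, here is a rough C sketch of the idea (an illustration for this article, not code from Intel or HP): an EPIC/VLIW compiler has to find independent operations at compile time and bundle them into explicit instruction groups, and when the dependencies only become visible at run time, there is little for it to bundle.

```c
/* Illustrative only: why static (compile-time) scheduling is hard for VLIW/EPIC.
   The compiler, not the hardware, must find independent operations to issue together. */
#include <stdio.h>

/* Easy case: the four multiplies are independent, so a static scheduler
   can place them in the same wide instruction group and issue them in parallel. */
static void scale4(const int *in, int *out, int k) {
    out[0] = in[0] * k;
    out[1] = in[1] * k;
    out[2] = in[2] * k;
    out[3] = in[3] * k;
}

/* Hard case: a pointer chase. Each load depends on the previous one, so there
   is nothing for the compiler to bundle; an out-of-order x86 core faces the
   same dependency chain, but it can dynamically overlap other nearby work. */
struct node { int value; struct node *next; };

static int sum_list(const struct node *n) {
    int sum = 0;
    while (n) {
        sum += n->value;   /* depends on the load of n->value */
        n = n->next;       /* depends on the load of n->next  */
    }
    return sum;
}

int main(void) {
    int in[4] = {1, 2, 3, 4}, out[4];
    scale4(in, out, 10);
    struct node c = {3, NULL}, b = {2, &c}, a = {1, &b};
    printf("%d %d %d %d / %d\n", out[0], out[1], out[2], out[3], sum_list(&a));
    return 0;
}
```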

The first Itanium chip slipped to 2001 and failed to impress most potential customers, who stuck with their x86, Power and SPARC chips.

In 2003, AMD launched the 64-bit AMD64 instruction set architecture (ISA) extension to the x86 ISA, which the industry quickly adopted thanks to the easy migration path from the 32-bit x86 ISA. Under pressure from Microsoft, Intel adopted the AMD64 ISA too, and the rest is history: AMD64 became the de facto 64-bit ISA in the industry. Intel kept improving the Itanium architecture over the years, but at nothing like the rate of improvement x86 chips have seen over the past two decades.

HP itself may have single-handedly kept Itanium alive after the AMD64 ISA took over. Documents unsealed during HP's lawsuit against Oracle revealed that HP paid Intel $440/£336 million to support the development of Itanium processors between 2009 and 2014.

HP and Intel then signed a second contract promising Intel a further $690/£528 million for Itanium development through 2017, when the last Itanium processor was built.

12 comments
  • jimmysmitty
    The biggest loss is that we never went to a pure 64-bit uArch and are still tied down to the aging x86 design. While it's nice to have for migration, it doesn't help us move to a pure 64-bit world. Many programs are still written in 32-bit, which means they don't take advantage of the biggest thing 64-bit gave us: more than 4GB of total system memory.
  • joeblowsmynose
    jimmysmitty said:
    The biggest loss is that we never went to a pure 64-bit uArch and are still tied down to the aging x86 design. While it's nice to have for migration, it doesn't help us move to a pure 64-bit world. Many programs are still written in 32-bit, which means they don't take advantage of the biggest thing 64-bit gave us: more than 4GB of total system memory.


    If the main feature of a 64-bit world is the ability to use more than 4GB of RAM (which I agree it is), and many programs still don't require more than 4GB, and if your program does need more than that then programming it in 64-bit would be your choice (if HD caching can't be used due to performance restrictions), I am not sure where the loss really is ...

    Would there be an advantage if we removed the 32-bit instruction sets from everything? If so, where might we find that?
  • richardvday
    That's not true.
    The operating system vendors are almost unanimously killing off 32-bit versions of their OSes anyway.
    Just because an application is 32-bit does not limit the operating system to only 4GB.
    That 32-bit application is limited in how much memory it can access, but most applications don't need that much memory anyway, and the applications that do are written in 64-bit so they can access the memory they need.
    I'm not saying we should stick to 32-bit by any means; the move to 64-bit needs to continue, and most companies are writing their applications in 64-bit by default now anyway.
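To illustrate the point being debated in the comments above (a sketch added for clarity, not code from any commenter): a 32-bit build caps each individual process at a 4GB virtual address space because its pointers are 32 bits wide, while a 64-bit OS underneath can still manage far more memory overall.

```c
/* Illustrative sketch of the per-process limit discussed above: a 32-bit
   process is capped at a 4GB virtual address space because its pointers are
   32 bits wide; the 64-bit OS hosting it is not limited in the same way. */
#include <stdio.h>
#include <stdint.h>
#include <stdlib.h>

int main(void) {
    /* A pointer's width fixes the process's virtual address space:
       4 bytes -> at most 2^32 = 4GB, 8 bytes -> far more than any current RAM. */
    printf("pointer size: %zu bytes\n", sizeof(void *));

    /* A single 5GB request cannot even be expressed as a size_t in a
       32-bit build, regardless of how much RAM the 64-bit OS manages. */
    uint64_t five_gb = 5ull * 1024 * 1024 * 1024;
    if (five_gb > (uint64_t)SIZE_MAX) {
        printf("5GB allocation: not representable in this address space\n");
    } else {
        void *p = malloc((size_t)five_gb);
        printf("5GB allocation: %s\n", p ? "succeeded" : "failed");
        free(p);
    }
    return 0;
}
```

Built as 32-bit (e.g. with gcc -m32) the 5GB request can't even be expressed as a size_t; built as 64-bit it can at least be attempted, subject to available memory.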