Detailed specs for a dozen Intel Arrow Lake desktop CPUs leak ahead of touted October 10 launch date

Intel Raptor Lake processors for desktops
(Image credit: Intel)

Intel’s next-gen desktop chips are rumored to launch this October, and a couple of hardware leakers have revealed the alleged specs of 12 of the 14 SKUs. Twitter/X user @harukaze5719 was the first to post a table of Intel’s Arrow Lake-S processors, but @jaykihn0 soon replied with their own more detailed data set. 

These chips are the successors to the troubled Intel 14th-gen Raptor Lake Refresh desktop processors, and, interestingly, will drop the long-established hyperthreading technology. The leaked specifications for the new Intel chips are tabulated below, but we remind you to please take this information with a pinch of salt:

| Model | Cores | Base Clock (GHz, P / E) | Boost Clock (GHz, P / E) | Base TDP | Integrated GPU | Xe Count |
| --- | --- | --- | --- | --- | --- | --- |
| Intel Core Ultra 9 285K | 8P + 16E (24) | 3.7 / 3.2 | 5.4 / 4.6 | 125W | Yes | 64 |
| Intel Core Ultra 9 285 | 8P + 16E (24) | 2.5 / 1.9 | 5.3 / 4.6 | 65W | Yes | 64 |
| Intel Core Ultra 9 285T | 8P + 16E (24) | 1.4 / 1.2 | 4.7 / 4.5 | 35W | Yes | 64 |
| Intel Core Ultra 7 265K | 8P + 12E (20) | 3.9 / 3.9 | 5.2 / 4.6 | 125W | Yes | 64 |
| Intel Core Ultra 7 265KF | 8P + 12E (20) | 3.3 / 3.3 | 5.2 / 4.6 | 125W | No | N/A |
| Intel Core Ultra 7 265 | 8P + 12E (20) | 2.4 / 1.8 | 5.1 / 4.6 | 65W | Yes | 64 |
| Intel Core Ultra 7 265F | 8P + 12E (20) | 1.5 / 1.2 | 4.6 / 4.5 | 65W | No | N/A |
| Intel Core Ultra 7 265T | 8P + 12E (20) | 2.4 / 1.8 | 5.0 / 4.5 | 35W | Yes | 64 |
| Intel Core Ultra 5 245K | 6P + 8E (14) | 4.2 / 3.6 | 5.0 / 4.6 | 125W | Yes | 64 |
| Intel Core Ultra 5 245KF | 6P + 8E (14) | 4.2 / 3.6 | 5.0 / 4.6 | 125W | No | N/A |
| Intel Core Ultra 5 245 | 6P + 8E (14) | - | - | 65W | - | - |
| Intel Core Ultra 5 235 | 6P + 8E (14) | - | - | 65W | - | - |
| Intel Core Ultra 5 225 | 6P + 4E (10) | 3.3 / 2.7 | 4.7 / 4.4 | 65W | Yes | 32 |
| Intel Core Ultra 5 225F | 6P + 4E (10) | 3.3 / 2.7 | 4.7 / 4.4 | 65W | No | N/A |

This table is, so far, the most detailed listing of Intel’s upcoming chips. However, given that these are leaks and not official data from Intel, we cannot say with 100% certainty that these numbers will be accurate.

If the October 10 release date is correct, we’re still about a couple of months away from the official launch and subsequent availability of these chips. So, Intel might still make some last-minute changes, although this is unlikely.

Intel says that these chips will be significantly more power efficient than the 13th- and 14th-gen Raptor Lake family of chips and that they won’t be affected by the instability issues plaguing their forebears.

The company needs to launch these chips soon, as its profitability has taken a massive hit recently and it needs to reassure both enthusiasts and investors. Furthermore, it must prove to its client base that it can recover from the Raptor Lake fiasco by releasing power-efficient and stable processors that deliver leading performance without any self-inflicted damage during their service life.

Jowi Morales
Contributing Writer

Jowi Morales is a tech enthusiast with years of experience working in the industry. He has been writing for several tech publications since 2021, covering tech hardware and consumer electronics.

  • Neilbob
    I really do hope that the power consumption of these is much improved, and is at least comparable to AMD; I'm here for that. I am very tired of the ridiculous pursuit of performance-at-any-cost angle Intel has been taking the last few years (AMD too, to a lesser extent in case anyone thinks I'm being biased). They started going all in with the 9900K, and it's been downhill ever since.

    And in case some people emerge to blather about Raptor Lake, yes, I know that efficiency can be drastically improved by tweaking clock/voltage settings and yada yada, etc, but my interest in faffing about with such settings has evaporated in recent years - and the enormous majority of users wouldn't even know if or how. If some people want to jigger about with overclocking and voltages then let them, but default, out-of-the-box characteristics are far more important, and I'd much rather take a 5% performance hit in order to save 30% or more power (for example. I don't know the numbers).

    I don't want to see yet more outrageous power consumption bars in the reviews. Crossed fingers I'm not being naïve here.
    Reply
  • JRStern
    Integrated GPU means what? Graphics? NPU? Both? Neither? What?

    Even the -5 level chips seem like plenty to me. What are they thinking?
    Reply
  • Dustyboy1492
    Neilbob said:
    I really do hope that the power consumption of these is much improved, and is at least comparable to AMD; I'm here for that. I am very tired of the ridiculous pursuit of performance-at-any-cost angle Intel has been taking the last few years (AMD too, to a lesser extent in case anyone thinks I'm being biased). They started going all in with the 9900K, and it's been downhill ever since.

    And in case some people emerge to blather about Raptor Lake, yes, I know that efficiency can be drastically improved by tweaking clock/voltage settings and yada yada, etc, but my interest in faffing about with such settings has evaporated in recent years - and the enormous majority of users wouldn't even know if or how. If some people want to jigger about with overclocking and voltages then let them, but default, out-of-the-box characteristics are far more important, and I'd much rather take a 5% performance hit in order to save 30% or more power (for example. I don't know the numbers).

    I don't want to see yet more outrageous power consumption bars in the reviews. Crossed fingers I'm not being naïve here.
    It's a new node 20A, or ~2nm, should be close to same density as TSMC's leading node, I would expect efficiency to be considerably improved over 7nm that was used for 12-14th gen.
    Reply
  • bit_user
    The article said:
    Can Intel rise up from the ashes with its Arrow Lake chips?
    Ashes??? They might be on fire, but they're certainly not a heap of ashes!

    Neilbob said:
    I really do hope that the power consumption of these is much improved,
    ...
    I don't want to see yet more outrageous power consumption bars in the reviews. Crossed fingers I'm not being naïve here.
    If not, there's always Bartlett Lake. Said to launch with up to 12P cores, in early 2025.

    BTW, if we do see something like a climb-down from high TDPs, perhaps it'll be more due to things like practical limits on thermal density holding a lid on clock speeds, than either company deciding to go Eco.
    Reply
  • bit_user
    JRStern said:
    Integrated GPU means what? Graphics? NPU? Both? Neither? What?
    Good question. The F models normally lack graphics, but whether or not they'll retain the NPU remains to be seen.

    If I had to guess, I'd say probably no NPU. The reason being that you pretty much only buy a F model if you've got a dGPU and those have even higher AI performance. At that point, the integrated NPU becomes unnecessary. So, it would make sense for Intel to lump together all dies with defects in either the iGPU or NPU as "F" models.

    JRStern said:
    Even the -5 level chips seem like plenty to me. What are they thinking?
    These core counts match those of Raptor Lake S. The thread counts are lower, due to the loss of hyperthreading. So, they certainly can't afford to step back on core counts, or else they'd really be hurting on MT performance.
    Reply
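The thread-count argument in the comment above can be checked with a line of arithmetic: Raptor Lake P-cores run two threads each via hyperthreading, while the leaked Arrow Lake parts reportedly drop it. A quick sketch (editor's illustration; `threads` is a made-up helper, not anything from Intel):

```python
# Compare thread counts for an 8P + 16E flagship with and without
# hyperthreading/SMT on the P-cores (E-cores are single-threaded either way).
def threads(p_cores: int, e_cores: int, smt: bool) -> int:
    """Total hardware threads for a given core configuration."""
    return p_cores * (2 if smt else 1) + e_cores

raptor_lake_i9 = threads(8, 16, smt=True)   # 14900K-style: 32 threads
arrow_lake_u9  = threads(8, 16, smt=False)  # leaked 285K: 24 threads
print(raptor_lake_i9, arrow_lake_u9)        # prints: 32 24
```

So at matching core counts, the new flagship gives up 8 of 32 threads, which is why stepping back on cores as well would hurt multithreaded performance.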
  • Amdlova
    Intel cpus without graphics it's no go for me.

    - You can drive a FULLHD display at 165hz with less than 1w
    - Using the IGPU you can shove almost 10w from the GTX 4060Ti 16gb from 17w to 7w (idle)
    - AMD or NIVIDIA can't Display at 165hz or more with out BREAK the POWER! 40w or more for nothing
    - Works with some softwares to record your game play, removing some CPU usage
    - Can Display multiple video wallpapers without Drain your house down over the wall
    - You can use Intel quick sync to speed up some steve jobs

    PS - FOR AMD REASON and only AMD reasons Don't WORK WELL WITH AMD CARDS.
    Reply
  • bit_user
    Amdlova said:
    Intel cpus without graphics it's no go for me.

    - You can drive a FULLHD display at 165hz with less than 1w
    I'm sure that if you measured your PC power at the wall in display power saving vs. showing desktop, the delta would be more than 1 W. The reason you shouldn't look at just package power or self-reported GPU power is that it doesn't capture all of the components that measuring a GPU would, including RAM and PHY.

    That's about what I'm seeing, even on a lowly Alder Lake N97. In fact, I'm seeing about a 1.3 - 1.5 W delta and it has just single-channel DDR5-4800 and is currently hooked up to a 1080p monitor @ 60 Hz.

    Amdlova said:
    - AMD or NIVIDIA can't Display at 165hz or more with out BREAK the POWER! 40w or more for nothing
    Well...
    Makaveli said:
    meh out of all the titles they have listed for the RT I only have 2 and that is both Spiderman Remastered and Miles and they both play fine for me at 3440x1440 UW with RT on.

    "4080 Super used just 16.5W while idle, 20.8W while playing back a 4K AV1 video using VLC, and 31.0W when playing the same video directly on YouTube. The 7900 XTX used 19.1W while idle, 57.8W while playing our test video in VLC, and 96.8W when viewing the video on YouTube."
    idle power draw will depend on monitor setup.

    I idle at 7 watts on my 7900XTX and single display not the 19 watts you are seeing in this review.

    Youtube video play back for me sits at 32 watts not 3x the amount you are seeing at 96.8 watts....

    VLC video play back for me on my system is 26 watts.

    Jarred subsequently clarified that his 19.1W figure from the RX 7900 XTX was for a 4K display running at 144Hz. Makaveli said the 7 W figure was for 3440x1440 @ 144hz.

    Both are far below your claim of 40 W, and both are above the pixel scanout rate of 2560x1440 @ 165 Hz. That's also a big GPU with lots of GDDR6 and L3 cache. The number should scale down pretty well, for smaller GPUs.
    Reply
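The scanout-rate comparison in the comment above is easy to sanity-check with raw pixels-per-second arithmetic (editor's sketch only; this ignores blanking intervals and says nothing directly about power draw):

```python
# Raw pixel scanout rate (width * height * refresh) for the three display
# modes discussed: Jarred's 4K test, Makaveli's ultrawide, and the
# 2560x1440 @ 165 Hz mode from the original claim.
def pixel_rate(width: int, height: int, hz: int) -> int:
    """Pixels scanned out per second, ignoring blanking intervals."""
    return width * height * hz

uhd_144  = pixel_rate(3840, 2160, 144)  # ~1.19 Gpixels/s
uw_144   = pixel_rate(3440, 1440, 144)  # ~0.71 Gpixels/s
qhd_165  = pixel_rate(2560, 1440, 165)  # ~0.61 Gpixels/s

for name, rate in [("3840x2160 @ 144 Hz", uhd_144),
                   ("3440x1440 @ 144 Hz", uw_144),
                   ("2560x1440 @ 165 Hz", qhd_165)]:
    print(f"{name}: {rate / 1e9:.2f} Gpixels/s")
```

Both measured setups push more pixels per second than 2560x1440 @ 165 Hz, which is the point being made: if those idle figures stay well under 40 W, the lower-rate mode should too.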
  • rluker5
    I think the large increase in base clocks at the same power with a reduction in boost clocks is evidence that the power consumption of the unlocked chips will be significantly lower at stock settings and max usage..

    Not proof, just supporting evidence.
    Reply
  • thestryker
The thing I'll be looking for when these come out is the performance scaling as the power increases. While I would never let a CPU off its leash, RPL is interesting in that once you reach a certain point, performance scales linearly with increased power consumption (it's not what I would consider worthwhile, as it is awful efficiency-wise). If ARL can maintain higher clocks within power limits similar to what AMD has done with Zen 4/5 then it ought to be a winner.
    Reply
  • Thunder64
    Amdlova said:
    Intel cpus without graphics it's no go for me.

    - You can drive a FULLHD display at 165hz with less than 1w
    - Using the IGPU you can shove almost 10w from the GTX 4060Ti 16gb from 17w to 7w (idle)
    - AMD or NIVIDIA can't Display at 165hz or more with out BREAK the POWER! 40w or more for nothing
    - Works with some softwares to record your game play, removing some CPU usage
    - Can Display multiple video wallpapers without Drain your house down over the wall
    - You can use Intel quick sync to speed up some steve jobs

    PS - FOR AMD REASON and only AMD reasons Don't WORK WELL WITH AMD CARDS.

I never understood why someone would save $30 on an F CPU when you give up so much. An iGPU is serviceable if you are without a video card for some reason, or if you end up selling the chip or giving it away. Also, Quick Sync: maybe not as useful these days as in the past, but still a nice feature for only a little more money.
    Reply