Core i7-13700K, i5-13600K Show Surprising Minimum FPS Gains in New Benchmarks

(Image credit: Intel)

According to a new video from a content creator on Bilibili, Intel's Raptor Lake architecture looks to be much better at gaming than Alder Lake -- but not in the way you might think. In the video, a pair of Raptor Lake engineering samples were benchmarked in several video games against their Alder Lake predecessors. The benchmarks reveal modest average and maximum FPS gains but a substantial boost in minimum frame rates. For more details, check out the full frame rate data here.

For the uninitiated, the minimum frame rate is one of the most important performance metrics for gaming: it captures your worst frame rates in the game or scene being benchmarked.

Improving this result doesn't look impressive on paper, since maximum and average FPS don't get much (if any) higher. But better minimums play a significant role in performance stability: the frame rate fluctuates less, giving you a smoother gaming experience.
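
To make the distinction concrete, here's a minimal sketch of how max, average, and minimum FPS fall out of a frame-time log. The sample frame times are invented for illustration; they aren't taken from the Bilibili benchmarks.

```python
# Minimal sketch: deriving max / average / minimum FPS from a log of
# per-frame render times in milliseconds. The sample numbers are made up
# for illustration -- they are not taken from the Bilibili video.

frame_times_ms = [6.9, 7.1, 7.0, 7.2, 6.8, 14.5, 7.0, 7.1, 13.9, 7.0]

# Each frame's instantaneous FPS is the reciprocal of its frame time.
fps_per_frame = [1000.0 / t for t in frame_times_ms]

max_fps = max(fps_per_frame)  # best-case single frame
min_fps = min(fps_per_frame)  # worst-case single frame: the "minimum FPS"

# Average FPS is total frames divided by total elapsed time,
# not the mean of the per-frame values.
avg_fps = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)

print(f"Max: {max_fps:.1f}  Avg: {avg_fps:.1f}  Min: {min_fps:.1f}")
# -> Max: 147.1  Avg: 118.3  Min: 69.0
```

Just two slow frames out of ten drag the minimum down to roughly half the average, which is why better minimums translate into a noticeably smoother experience even when the averages look similar.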

Benchmarks

i5-13600K Raptor Lake vs. i5-12600K Alder Lake (DDR5 Only)

| Resolution - Frame Rate Analysis | i5-12600K DDR5 | Core i5-13600K QS DDR5 |
| --- | --- | --- |
| 4K Max FPS | 100% | 103.26% |
| 4K Average FPS | 100% | 102.36% |
| 4K Minimum FPS | 100% | 112.23% |
| 1440p Max FPS | 100% | 107.99% |
| 1440p Average FPS | 100% | 104.64% |
| 1440p Minimum FPS | 100% | 110.86% |
| 1080p Max FPS | 100% | 112.11% |
| 1080p Average FPS | 100% | 111.50% |
| 1080p Minimum FPS | 100% | 114.13% |
i7-13700K Raptor Lake vs. i7-12700K Alder Lake (DDR5 Only)

| Resolution - Frame Rate Analysis | i7-12700K DDR5 | Core i7-13700K QS DDR5 |
| --- | --- | --- |
| 4K Max FPS | 100% | 107.84% |
| 4K Average FPS | 100% | 104.06% |
| 4K Minimum FPS | 100% | 113.04% |
| 1440p Max FPS | 100% | 109.84% |
| 1440p Average FPS | 100% | 105.30% |
| 1440p Minimum FPS | 100% | 111.04% |
| 1080p Max FPS | 100% | 106.52% |
| 1080p Average FPS | 100% | 106.65% |
| 1080p Minimum FPS | 100% | 111.38% |
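
One note on reading the tables: every Alder Lake result is pinned at 100%, and each Raptor Lake result is expressed relative to it. Here's a trivial sketch of that normalization, with raw FPS values invented purely so the output lines up with the 4K minimum row above:

```python
def relative_percent(new_fps: float, baseline_fps: float) -> float:
    """Express new_fps as a percentage of baseline_fps (baseline = 100%)."""
    return new_fps / baseline_fps * 100.0

# Hypothetical raw 4K minimum FPS readings for the i7 pairing:
i7_12700k_min = 92.0   # Alder Lake baseline -> 100%
i7_13700k_min = 104.0  # Raptor Lake QS sample

print(f"{relative_percent(i7_13700k_min, i7_12700k_min):.2f}%")  # -> 113.04%
```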

Several configurations were tested in this suite of benchmarks, including a Core i7-13700K with a maximum turbo of 5.3GHz, a Core i5-13600K boosting up to 5.1GHz, and a Core i7-12700KF and Core i5-12600K for comparison. Each chip was also paired with both DDR4 and DDR5 memory to measure how the two CPU architectures scale with memory performance; however, memory timings aren't listed, so take the memory-specific results with a grain of salt. A GeForce RTX 3090 Ti was used for all tests.

Thanks to @harukaze5719, who took the ten games' worth of benchmark data from the Bilibili video (not including 3DMark) and condensed it into four charts: one each for the 13700K and the 13600K, in both DDR4 and DDR5 configurations.

Looking at the DDR5 13700K charts, the Raptor Lake chip shows minimal performance gains over the 12700KF in average and maximum frame rates. The increases land between roughly +4% and +8% across 1080p, 1440p, and 4K, with the only exception being a nearly 10% gain in maximum FPS at 1440p.

However, the minimum FPS chart tells a different story: all three resolutions show gains of more than 10%, with about 13% at 4K and roughly 11% at both 1440p and 1080p. These gains are substantially larger than the corresponding average and maximum results.

The Core i5-13600K shows similar behavior, with significantly higher minimum frame rate gains at 4K and 1440p compared to the older 12600K. This is especially true at 4K, where the roughly 12% minimum FPS uplift dwarfs the 2-3% gains in average and maximum FPS.

But the 1080p results differ greatly from the 13700K's: the minimum, average, and maximum gains all land within roughly three percentage points of one another. Even so, the minimum frame rate gain is still the highest of the three.

The memory comparisons aren't that exciting: for both the Raptor Lake and Alder Lake CPUs, the DDR4 and DDR5 results mostly land within the margin of error of each other. The only exception is at 1440p, where gains of around 5% appeared.

Arguably, the most interesting results are the individual game benchmarks. Most games show minor gains (if any) across all four chips, but in the titles that do scale, the gains for the Raptor Lake parts are massive.

In PUBG, Naraka: Bladepoint, Apex Legends, and Red Dead Redemption 2, the 13700K gained anywhere from 25% to 44% over the 12700KF at various resolutions, in both minimum and maximum FPS -- but mainly in the minimums.

The 13600K's results were similar, albeit in a partly different selection of games: that chip gained an extra 20% to 35% in PUBG, Naraka: Bladepoint, Apex Legends, Monster Hunter Rise, Far Cry 6, and Horizon Zero Dawn.

It's unknown at this point why the Raptor Lake parts do so well specifically in minimum frame rates, but it's a promising sign for Intel's upcoming platform. The bias toward this metric could be related to Raptor Lake's larger and faster L2 and L3 caches, but that's just a guess for now.

Whatever the real reason, Raptor Lake is shaping up to be an excellent gaming CPU architecture, just like Alder Lake. These samples are also clocked relatively low for an architecture rumored to hit boost frequencies close to 6GHz, so there's a good chance full retail models will offer even more performance.

Aaron Klotz
Freelance News Writer

Aaron Klotz is a freelance writer for Tom's Hardware US, covering news topics related to computer hardware such as CPUs and graphics cards.

  • AgentBirdnest
    Interesting! Probably not the sexiest of stats for most people, but I'm really happy to see higher minimums.
    I wonder what is causing that. The larger caches theory mentioned in the article sounds plausible... I would love to know for certain though!
  • helper800
    AgentBirdnest said:
    Interesting! Probably not the sexiest of stats for most people, but I'm really happy to see higher minimums.
    I wonder what is causing that. The larger caches theory mentioned in the article sounds plausible... I would love to know for certain though!
It seems to be the most plausible answer to the question. The 5800X3D also has much higher minimums in some games, which can be directly attributed to its L3 cache increase.
  • AgentBirdnest
    helper800 said:
It seems to be the most plausible answer to the question. The 5800X3D also has much higher minimums in some games, which can be directly attributed to its L3 cache increase.
    Good point, I forgot about that. It's good to see this coming from both camps.