Earlier in the week, we reported how the new Celeron G6900 from Intel's 12th Generation Alder Lake family matched the Core i9-10900K (Comet Lake) in single-threaded performance. In addition, the Geekbench benchmark results highlighted the punch of the new Golden Cove cores. However, it is much better to see and test processor performance in the real world, and Steve at Random Gaming in HD (RGIHD) has obliged with a nine-game test suite.
To recap the Intel Celeron G6900's specifications, it has two Golden Cove cores and no Hyper-Threading. It runs at 3.4 GHz, with 4 MB of Smart Cache and 2.5 MB of L2 cache. The processor supports up to DDR4-3200 or DDR5-4800 memory. Intel's UHD Graphics 710 iGPU has up to a 1.3 GHz boost and 16 EUs. This Celeron has a 46W TDP.
As he was testing the base model in the Alder Lake lineup, which costs £55 (MSRP $42 in the US), Steve saw fit to pair it with the cheapest Intel LGA1700 motherboard he could find, the Gigabyte H610M S2H DDR4, for which he paid £80 (about $110). The RAM quantity wasn't skimped on for a budget build, with 16 GB (2 x 8 GB DIMMs) of DDR4-3000.
He didn’t say anything about the storage, but the GPU in the system for game testing was the Nvidia T1000, a creator-focused card with very similar specifications to the GTX 1650. In a previous video, RGIHD said he picked up this T1000 as it had better availability/pricing than the gamer/consumer-focused GTX 1650. So it might be the same for you, depending on your region.
The system, as mentioned earlier, was tasked with a decent range of popular games, new and old, using sensible quality settings.
The results showed The Witcher 3 at 1080p high averaging 43 FPS, marred by lengthy intermittent freezes that were far too long to qualify as mere stutters (1% low of 0 FPS).
In GTA San Andreas at 1080p high, the system achieved an average of 58 FPS, and the glitches and slowdowns were much more bearable (1% low of 15 FPS).
For Cyberpunk 2077 at 1080p low in the Badlands, we saw an average of 47 FPS (1% low of 11 FPS). However, this game has a bigger problem: it will not load save games when this lowly 2C/2T processor is your CPU. It is hard to complain about this “bug,” as the Celeron is two cores short of the minimum-spec CPU.
The YouTuber ran CS:GO at 1080p low settings on the Dust II map. The average in-game performance was 120 FPS (1% low of 42 FPS), a much more respectable showing.
Steve went on to test Fortnite, GTA 5, Forza Horizon 5 (hangs at the loading screen), and Far Cry 6 (so slow it is unplayable), while Red Dead Redemption 2 ran at an average of 34 FPS using 1080p “console settings.”
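For readers unfamiliar with the 1% low figures quoted above: they capture worst-case smoothness rather than average speed. One common definition, sketched below as a minimal Python example (this is an illustration of the metric, not RGIHD's capture tooling), averages the frame rate of the slowest 1% of frames recorded during a run.

```python
def fps_stats(frame_times_ms):
    """Return (average FPS, 1% low FPS) from a list of per-frame times in ms."""
    # Convert each frame time to an instantaneous frame rate.
    fps = [1000.0 / t for t in frame_times_ms]
    average = sum(fps) / len(fps)
    # 1% low: mean FPS of the slowest 1% of frames (at least one frame).
    n = max(1, len(fps) // 100)
    one_percent_low = sum(sorted(fps)[:n]) / n
    return average, one_percent_low

# 99 smooth frames (10 ms each) plus one 100 ms hitch:
avg, low = fps_stats([10.0] * 99 + [100.0])
print(round(avg, 1), round(low, 1))  # 99.1 10.0
```

This is why a game can post a healthy average, as in The Witcher 3's 43 FPS, while a 1% low near 0 FPS still signals freezes that ruin the experience.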
A common observation throughout the testing was that the Celeron held this system back in gaming. If you flick through the video, you will often see CPU utilization at or near 100% while the GPU isn't really under duress (except in Cyberpunk 2077). The system is not balanced for gaming.
To conclude his insightful look at the G6900, Steve indicated that there isn't much point in choosing this latest-gen Celeron over any previous-gen entry-level Intel processor still on sale, given the currently higher cost of LGA1700 motherboards. However, on an optimistic note, RGIHD intends to look at the new Pentium Gold G7400, which Steve will be testing with the iGPU as well as an appropriate discrete GPU. The Pentium has several upgrades over the Celeron, including its 2C/4T configuration, a faster clock speed of 3.7 GHz, and 50% more Smart Cache (6 MB vs. 4 MB).