Zhaoxin's new ZX C-1190 iGPU performs like decade-old integrated graphics

Zhaoxin's KX-7000 CPU.
(Image credit: Zhaoxin)

Although Zhaoxin's KX-7000 processors, based on the company's latest Century Avenue microarchitecture, promise a significant performance uplift over their predecessors in general-purpose applications, their integrated graphics do not impress. According to Benchleaks data, performance is lower than that of a decade-old iGPU.

Zhaoxin's eight-core Kaixian KX-7000/8 processors integrate the company's latest ZX C-1190 iGPU design, which is said to be DirectX 12, OpenGL 4.6, and OpenCL 1.2 capable and to provide four times the performance of its predecessor. That may not be enough to be competitive in the modern market, though, as it scores just 2,024 points in the Geekbench 5 OpenCL benchmark (via @Benchleaks).

To put that number into context: Intel's slowest current integrated GPU, the UHD Graphics 730, typically scores around 7,500 points, while AMD's lowest-end discrete desktop Radeon RX 6300 graphics card scores 26,127 points in the same test. Even Intel's decade-old HD Graphics 4400 manages between 2,500 and 3,000 points in this benchmark.

Of course, an OpenCL benchmark does not necessarily reflect a GPU's gaming capabilities, but it gives us an idea of how the compute capabilities of one graphics processor stack up against those of another. Perhaps Zhaoxin will be able to boost the ZX C-1190's performance with better drivers, but the current numbers are not encouraging.

Zhaoxin's integrated graphics processor is not meant to be a performance champion. It packs four compute units operating at 700 MHz, and while we do not have exact specifications for these CUs, it is unlikely that each one packs hundreds of stream processors.
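For readers who want to see what their own GPU reports, the short sketch below uses pyopencl (an assumption on our part; the clinfo command-line tool exposes the same data) to print each OpenCL GPU's compute-unit count and maximum clock, the two figures discussed above.

    import pyopencl as cl  # assumes the pyopencl package and a working OpenCL driver are installed

    # Enumerate every OpenCL platform and print what each GPU reports about itself.
    for platform in cl.get_platforms():
        for device in platform.get_devices(device_type=cl.device_type.GPU):
            print(f"Device:             {device.name}")
            print(f"OpenCL version:     {device.version}")
            print(f"Compute units:      {device.max_compute_units}")
            print(f"Max clock (MHz):    {device.max_clock_frequency}")
            print(f"Global memory (MB): {device.global_mem_size // (1024 * 1024)}")

On the ZX C-1190, such a query would presumably report the four compute units at 700 MHz cited above, though we have not run it on the hardware ourselves.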

While Zhaoxin's ZX C-1190 iGPU does not impress with OpenCL compute performance, it can still handle decoding and encoding of H.265/H.264 video at up to 4K, as well as driving DisplayPort, HDMI, and D-Sub/VGA outputs. Since Zhaoxin's CPUs are designed primarily for office PCs and workhorse notebooks, graphics capabilities are hardly something their users will ever think about, as long as the chip can draw windows fast enough.

Anton Shilov
Freelance News Writer

Anton Shilov is a Freelance News Writer at Tom’s Hardware US. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers and from modern process technologies and latest fab tools to high-tech industry trends.

  • gg83
    I was thinking about AI assistants today. Soon AI will fix coding errors on the fly. Like if a bug crashes a program, the AI will just fix it and relaunch. AI will also be able to maximize efficiency so maybe it won't need the most complex and bleeding edge technology?
    Reply
  • Notton
    As long as it can output 2x 4K60Hz signals, support hardware AV1, etc, and run efficiently, why not.
    Reply
  • ivan_vy
    enough for spreadsheets and word processor, thousands of office and schools will use this, like the old days of VIA SG3 chips.
    Reply
  • Gururu
    Moore's law still valid for them, so they will catch up in no time?
    Reply
  • salgado18
    ivan_vy said:
    enough for spreadsheets and word processor, thousands of office and schools will use this, like the old days of VIA SG3 chips.
    Reply
  • jlake3
    Gururu said:
    Moore's law still valid for them, so they will catch up in no time?
    They’re believed to be on a 7nm process, and are underperforming a 22nm iGPU that wasn’t even the top model of its generation. They’ve got some architecture work cut out for them.
    Reply
  • DavidC1
    gg83 said:
    I was thinking about AI assistants today. Soon AI will fix coding errors on the fly. Like if a bug crashes a program, the AI will just fix it and relaunch. AI will also be able to maximize efficiency so maybe it won't need the most complex and bleeding edge technology?
No it won't, and if you continue to think that way, you'll be very disappointed.

    It is so "smart" that image generator AI sometimes shows up watermarks embedded in the original pictures. It is a very sophisticated pattern recognition system that's all.

    Also, if you play with image generation AI out there, it shows messed up faces, especially those farther away in the back. And it is really bad with generating fingers.
    Reply
  • Joseph_138
    China is trying to ramp up their chipmaking capabilities. It's going to be slow, at first, but I think they'll catch up quickly, and put a real scare into the west.
    Reply
  • gg83
    DavidC1 said:
No it won't, and if you continue to think that way, you'll be very disappointed.

    It is so "smart" that image generator AI sometimes shows up watermarks embedded in the original pictures. It is a very sophisticated pattern recognition system that's all.

    Also, if you play with image generation AI out there, it shows messed up faces, especially those farther away in the back. And it is really bad with generating fingers.
I wasn't talking about image generators, or current day AI. It's kinda funny. Just today, Tom's has an article with Jensen saying leave coding to AI. Basically what I was thinking about. AI fixing coding errors on the fly because AI coded the original.

    https://www.tomshardware.com/tech-industry/artificial-intelligence/jensen-huang-advises-against-learning-to-code-leave-it-up-to-ai#xenforo-comments-3837759
    Reply