TechRadar
Craig Hale

Arm unveils new Lumex AI-focused smartphone CPUs with some impressive stats

Arm Lumex.
  • Arm’s Lumex chips promise huge improvements to on-device AI
  • Its CPUs offer up to 5x better AI performance
  • CPUs are seen as the powerhouse for AI

Arm has taken the wraps off its next-generation Lumex chip designs, optimized to run some local AI workloads on mobile devices.

Its architecture allows for four different design types, ranging from energy-efficient cores for wearables to high-performance cores for flagship smartphones.

Citing accelerated product cycles, which leave tighter timescales and less margin for error, Arm says its integrated platforms combine CPU, GPU and software stacks to speed up time-to-market.

Arm’s Lumex could be used in your next smartphone

Arm described Lumex as its “new purpose-built compute subsystem (CSS) platform to meet the growing demands of on-device AI experiences.”

The Armv9.3 C1 CPU cluster includes built-in SME2 units for accelerated AI, promising 5x better AI performance and 3x more efficiency compared with the previous generation.

Standard benchmarks see performance rise by 30%, with a 15% speed-up in apps and 12% lower power use in daily workloads compared with the prior generation.

The four CPUs on offer are C1-Ultra for large-model inferencing, C1-Premium for multitasking, C1-Pro for video playback and C1-Nano for wearables.

The Mali G1-Ultra GPU also enables 20% faster AI/ML inferencing than the Immortalis-G925, as well as gaming improvements such as 2x better ray tracing performance.

Lumex also offers G1-Premium and G1-Pro options – but no G1-Nano.

Interestingly, Arm positions CPUs as the universal AI engine given the lack of standardization in NPUs, even though NPUs are starting to earn their place in PC chips.

Launching with Lumex is a complete Android 16-ready software stack, SME2-enabled KleidiAI libraries and telemetry to analyze performance and identify bottlenecks, allowing developers to tailor Lumex to each model.

“Mobile computing is entering a new era that is defined by how intelligence is built, scaled, and delivered,” Senior Director Kinjal Dave explained.

Looking ahead, Arm notes that many popular Google apps are already SME2-enabled, meaning that they’re prepared to benefit from improved on-device AI features when next-generation hardware becomes available.
