Arm Lumex Platform Powers On-Device AI for Vivo & OPPO
The race for on-device AI supremacy is heating up, and Arm’s new Lumex platform is poised to shake things up. With smartphone giants like Vivo and OPPO already leveraging its capabilities, Lumex promises faster, smarter, and more personalized AI experiences directly on your device – a move that could redefine how we interact with our phones.

Launched in September 2025, the Lumex Platform represents Arm’s most ambitious foray into mobile AI. Combining the Arm C1 CPU cluster with the Arm Scalable Matrix Extension 2 (SME2) and the Arm Mali G1-Ultra GPU, Lumex aims to put cutting-edge intelligent computing power into the hands of billions.

The impact of Lumex is already being felt. Just a month after its release, Vivo and OPPO are among the first to integrate the SME2-enabled C1 CPUs and Mali G1-Ultra GPU into their flagship smartphones, delivering real-world AI enhancements to users.

Vivo’s X300 series, for instance, boasts a 20% speed increase in live translation and a 30% boost in image recognition. This is thanks to SME2’s low-precision matrix acceleration, which enables rapid, efficient, and private language translation and photo sorting.
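“Low-precision matrix acceleration” means doing matrix arithmetic on narrow integer types instead of floats: values are quantized to int8, multiplied with integer math, then rescaled. The pure-Python sketch below is illustrative only – on SME2 hardware this pattern runs in dedicated matrix instructions inside framework kernels, not in Python – but it shows the basic quantize-multiply-rescale idea:

```python
# Illustrative sketch of int8 quantized arithmetic, the kind of
# low-precision matrix math SME2 accelerates in hardware.

def quantize(values, scale):
    """Map floats to int8 by dividing by a scale factor and clamping."""
    return [max(-128, min(127, round(v / scale))) for v in values]

def int8_dot(a_q, b_q, scale_a, scale_b):
    """Integer dot product, rescaled back to a float result."""
    acc = sum(x * y for x, y in zip(a_q, b_q))  # wide integer accumulate
    return acc * scale_a * scale_b

a = [0.5, -1.2, 2.0]
b = [1.0, 0.25, -0.5]
sa, sb = 0.02, 0.01
approx = int8_dot(quantize(a, sa), quantize(b, sb), sa, sb)
exact = sum(x * y for x, y in zip(a, b))
print(round(approx, 3), round(exact, 3))  # → -0.8 -0.8
```

Because the multiplies and accumulations happen on small integers, hardware can pack many of them into a single matrix instruction – which is where the translation and image-recognition speed-ups come from.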

Meanwhile, OPPO’s Find X9 series demonstrates notable improvements in automatic speech recognition. This translates to faster voice control and transcription across various apps, making the user experience more seamless and responsive.

Even before its arrival on Android, SME2 was making waves on Apple platforms, accelerating on-device intelligence across a range of devices, including the iPhone 16 and 17, iPad Pro, Mac mini, MacBook Pro, and MacBook Air. SME2’s cross-platform reach benefits billions of Arm-based devices.

Lumex is about more than just faster devices; it’s about unlocking new possibilities for app developers and end-users. Developers are leveraging SME2 to accelerate large language model (LLM) inference, image processing, and even audio generation, resulting in faster, more responsive real-time experiences.

SME2 works transparently through Arm KleidiAI integrations in leading frameworks and runtimes, such as LiteRT, MediaPipe, MNN, ONNX Runtime, and XNNPACK, enabling developers to build and deploy apps across a wide range of Arm-based devices without complex rewrites.

Google behind the scenes

Google is already optimizing productivity and generative-AI features across its range of apps, including Gmail, Google Photos, and YouTube, on SME2-enabled hardware.

Google’s Gemma 3 model delivers 6x faster AI responses in chat interactions compared to devices without SME2. Additionally, Gemma 3 can summarize up to 800 words in under one second on a single CPU core with SME2 acceleration.

“We’re really excited about Arm’s new offering (SME2) because it brings additional compute to the CPU. For the first time, we can run these (GenAI) capabilities across a wide range of devices in the ecosystem,” states Oli Gaymond, Head of AI/ML Product, Android, Google.

Arm neural technology

SME2 is also being used in real-time computational photography to bring sharper, more vivid photos to life. Meanwhile, an Arm “AI Yoga Tutor” demo for mobile devices and TVs uses AI techniques to give users real-time conversational feedback tailored to their yoga poses, with SME2 enablement delivering a 2.5x speed-up in the AI pipeline.

The Arm Lumex platform isn’t just about incremental improvements; it’s about ushering in a new era of on-device AI. By empowering developers, OEMs, and end-users alike, Lumex is poised to redefine what’s possible with mobile technology, bringing advanced AI experiences to billions of people worldwide and raising the bar for intelligent performance.
