Samsung’s Exynos 2500 To Be Optimized For AI Models By Nota AI

Nov 26, 2025 at 11:39am EST
Samsung Exynos 2500 chip with '5G' logo on a blue background.

Samsung has just inked a significant agreement, which paves the way for its latest SoC, the Exynos 2500, to run on-device AI models much more efficiently.

Samsung's Exynos 2500 to leverage Nota AI's compression and optimization technology

Samsung has inked a strategic agreement with Nota AI, under which Nota AI will equip the Exynos 2500 chip with its AI model compression and optimization technology.


Samsung has already used Nota AI's technology to build the Exynos AI Studio, an in-house AI model optimization toolchain that lets advanced AI models run efficiently on the Exynos 2500's hardware without resorting to cloud-based computing.
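Nota AI's actual pipeline is proprietary, but the general idea behind model compression can be illustrated with one common technique: post-training int8 quantization, which shrinks a model's weights so they occupy less memory and map onto integer-friendly NPU hardware. The sketch below is purely illustrative and is not drawn from Nota AI's or Samsung's tooling:

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float32 weights to int8 values plus a single scale factor."""
    scale = np.abs(weights).max() / 127.0  # largest magnitude maps to +/-127
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights from the int8 representation."""
    return q.astype(np.float32) * scale

# A stand-in weight matrix, as a real model layer might contain
rng = np.random.default_rng(0)
w = rng.normal(0, 0.05, size=(256, 256)).astype(np.float32)

q, scale = quantize_int8(w)
restored = dequantize(q, scale)

print(f"original size:  {w.nbytes} bytes")   # float32 storage
print(f"quantized size: {q.nbytes} bytes")   # int8 storage, 4x smaller
print(f"max abs error:  {np.abs(w - restored).max():.6f}")
```

The 4x reduction in weight storage is the draw for on-device inference: less RAM traffic and integer arithmetic the NPU can execute natively, at the cost of a small, bounded rounding error per weight.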

Myungsu Chae, the CEO of Nota AI, noted:

"It's more than just supplying software—we've built a tightly integrated framework where AI hardware and software converge to deliver high-performance generative AI at the edge."

For the benefit of those who might not be aware, the Exynos 2500's architecture consists of:

  1. A 10-core CPU
    • 1x Cortex-X925 prime core clocked at 3.30GHz
    • 2x Cortex-A725 cores clocked at 2.74GHz
    • 5x Cortex-A725 cores clocked at 2.36GHz
    • 2x Cortex-A520 efficiency cores clocked at 1.80GHz
  2. A Samsung Xclipse 950 GPU based on AMD's RDNA architecture
  3. A dedicated Neural Processing Unit (NPU) capable of 59 trillion operations per second (TOPS)
  4. LPDDR5X RAM with up to 76.8 GB/s of bandwidth

Do note that while competent, the Exynos 2500's NPU is no match for the Hexagon NPU in Qualcomm's Snapdragon 8 Elite Gen 5, which is reportedly capable of reaching 100 TOPS.

Consequently, Samsung's latest agreement with Nota AI should boost the Exynos 2500's AI performance through better model optimization, allowing the chip's comparatively weaker NPU to tackle AI workloads much more efficiently.
