
NXP expands its scalable machine learning portfolio and capabilities

29th October 2020
Alex Lynn

NXP Semiconductors has announced that it is enhancing its machine learning development environment and product portfolio. Through an investment, NXP has established an exclusive, strategic partnership with Au-Zone Technologies to extend NXP’s eIQ Machine Learning (ML) software development environment with easy-to-use ML tools and to broaden its offering of silicon-optimised inference engines for Edge ML.

Additionally, NXP announced that it has been working with Arm as the lead technology partner in evolving the Arm Ethos-U microNPU (micro Neural Processing Unit) architecture to support applications processors. NXP will integrate the Ethos-U65 microNPU into its next generation of i.MX applications processors to deliver energy-efficient, cost-effective ML solutions for the fast-growing Industrial and IoT Edge.

“NXP’s scalable applications processors deliver an efficient product platform and a broad ecosystem for our customers to quickly deliver innovative systems,” said Ron Martino, senior vice president and general manager of Edge Processing business line at NXP Semiconductors. “Through these partnerships with both Arm and Au-Zone, in addition to technology developments within NXP, our goal is to continuously increase the efficiency of our processors while simultaneously increasing our customers’ productivity and reducing their time to market.

“NXP’s vision is to help our customers achieve lower cost of ownership, maintain high levels of security with critical data, and to stay safe with enhanced forms of human-machine-interaction.”

Au-Zone’s DeepView ML Tool Suite will augment eIQ with an intuitive graphical user interface (GUI) and workflow, enabling developers of all experience levels to import datasets and models, and to rapidly train and deploy neural network (NN) models and ML workloads across the NXP Edge processing portfolio. To meet the demanding requirements of today’s industrial and IoT applications, NXP’s eIQ-DeepView ML Tool Suite will provide developers with advanced features to prune, quantise, validate, and deploy public or proprietary NN models on NXP devices.
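
To give a rough sense of the kind of optimisation step such a flow automates, the sketch below performs post-training 8-bit quantisation of a trained Keras model with TensorFlow Lite, one of the open-source runtimes eIQ already supports. The model file, input shape, and calibration data are placeholders chosen for illustration; the DeepView tool suite exposes this kind of step through its own GUI and workflow rather than through this code.

    import numpy as np
    import tensorflow as tf

    # Load a trained Keras model (placeholder path; any float32 model works here).
    model = tf.keras.models.load_model("my_model.h5")

    # A small representative dataset lets the converter calibrate activation ranges.
    def representative_data():
        for _ in range(100):
            # Placeholder input shape; match it to the model's real input.
            yield [np.random.rand(1, 224, 224, 3).astype(np.float32)]

    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.representative_dataset = representative_data
    # Force full integer quantisation so the model suits fixed-point NPUs and MCUs.
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
    converter.inference_input_type = tf.int8
    converter.inference_output_type = tf.int8

    tflite_model = converter.convert()
    with open("my_model_int8.tflite", "wb") as f:
        f.write(tflite_model)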

Its on-target, graph-level profiling capability will provide developers with run-time insights to optimise NN model architectures, system parameters, and run-time performance. By adding Au-Zone’s DeepView run-time inference engine to complement the open-source inference technologies in NXP eIQ, users will be able to quickly deploy and evaluate ML workloads and performance across NXP devices with minimal effort. A key feature of this run-time inference engine is that it optimises system memory usage and data movement uniquely for each system-on-chip (SoC) architecture.
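
For a simple picture of on-target evaluation with the open-source engines already in eIQ, the snippet below times repeated invocations of a TensorFlow Lite model using the standard interpreter. The file name and dummy input are placeholders, and this is only a coarse latency check; DeepView’s graph-level profiler is a separate, richer capability.

    import time
    import numpy as np
    import tensorflow as tf

    # Load a quantised model such as the one produced above (placeholder file name).
    interpreter = tf.lite.Interpreter(model_path="my_model_int8.tflite")
    interpreter.allocate_tensors()

    input_details = interpreter.get_input_details()[0]
    output_details = interpreter.get_output_details()[0]

    # Dummy input matching the model's expected shape and dtype.
    dummy = np.zeros(input_details["shape"], dtype=input_details["dtype"])

    # Time a batch of invocations to get an average per-inference latency.
    runs = 50
    start = time.perf_counter()
    for _ in range(runs):
        interpreter.set_tensor(input_details["index"], dummy)
        interpreter.invoke()
        _ = interpreter.get_tensor(output_details["index"])
    elapsed = time.perf_counter() - start
    print(f"average latency: {1000 * elapsed / runs:.2f} ms")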

“Au-Zone is incredibly excited to announce this investment and strategic partnership with NXP, especially with its exciting roadmap for additional ML-accelerated devices,” said Brad Scott, CEO of Au-Zone. “We created DeepView™ to provide developers with intuitive tools and inferencing technology, so this partnership represents a great union of world-class silicon, run-time inference engine technology, and a development environment that will further accelerate the deployment of embedded ML features.

“This partnership builds on a decade of engineering collaboration with NXP and will serve as a catalyst to deliver more advanced machine learning technologies and turnkey solutions as OEMs continue to transition inferencing to the Edge.”

To accelerate machine learning in a wider range of Edge applications, NXP will expand its popular i.MX applications processors for the Industrial and IoT Edge with the integration of the Arm Ethos-U65 microNPU, complementing the previously announced i.MX 8M Plus applications processor with integrated NPU. The NXP and Arm technology partnership focused on defining the system-level aspects of this microNPU, which supports up to 1 TOPS (512 parallel multiply-accumulate operations at 1 GHz).
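
That headline figure follows directly from the numbers quoted: counting each multiply-accumulate as two operations, 512 MACs per cycle at 1 GHz gives roughly 1 TOPS, as this short check shows.

    # Rough check of the quoted throughput: 512 parallel MACs at 1 GHz,
    # with each multiply-accumulate counted as two operations.
    macs_per_cycle = 512
    ops_per_mac = 2
    clock_hz = 1e9
    tops = macs_per_cycle * ops_per_mac * clock_hz / 1e12
    print(f"{tops:.3f} TOPS")  # ~1.024 TOPS, i.e. "up to 1 TOPS"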

The Ethos-U65 maintains the MCU-class power efficiency of the Ethos-U55 while extending its applicability to higher-performance, Cortex-A-based SoCs. The Ethos-U65 microNPU works in concert with the Cortex-M core already present in NXP’s i.MX families of heterogeneous SoCs, resulting in improved efficiency.

“There has been a surge of AI and ML across industrial and IoT applications driving demand for more on-device ML capabilities,” said Dennis Laudick, Vice President of Marketing, Machine Learning Group, at Arm. “The Ethos-U65 will power a new wave of edge AI, providing NXP customers with secure, reliable, and smart on-device intelligence.”

The Arm Ethos-U65 will be available in future NXP i.MX applications processors. The eIQ-DeepView ML Tool Suite and the DeepView run-time inference engine, integrated into eIQ, will be available in Q1 2021. End-to-end software enablement, covering training, validating, and deploying existing or new neural network models for the i.MX 8M Plus and other NXP SoCs, as well as for future devices integrating the Ethos-U55 and U65, will be accessible through NXP’s eIQ Machine Learning software development environment.

Register for the joint NXP and Arm webinar on November 10th to learn more.
