Fast track to NPU accelerated embedded vision
congatec has extended its i.MX 8 ecosystem with a new starter set for AI accelerated intelligent embedded vision applications. Based on a SMARC Computer-on-Module with the NXP i.MX 8M Plus processor, the starter set’s sweet spot is the utilisation of the Neural Processing Unit (NPU) integrated in the new processor.
The NPU delivers up to 2.3 TOPS of performance for deep learning based artificial intelligence and can run inference engines and libraries such as Arm NN and TensorFlow Lite. The starter set also integrates Basler embedded vision software to give OEMs an application ready solution platform for the development of next-generation AI accelerated embedded vision systems.
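To illustrate the workflow, the following is a minimal sketch of running a TensorFlow Lite model from Python; the model file name is a placeholder and the availability of the tflite_runtime package on the target image is an assumption:

```python
import numpy as np
import tflite_runtime.interpreter as tflite

# Load a quantised classification model (placeholder file name)
interpreter = tflite.Interpreter(model_path="mobilenet_v2_quant.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy frame with the shape and dtype the model expects;
# in practice this would be a resized camera image
frame = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])

interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()
scores = interpreter.get_tensor(output_details[0]["index"])
print("Top class:", int(np.argmax(scores)))
```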
Typical applications are wide-ranging, from price sensitive automated checkout terminals in retail to building safety, and from in-vehicle vision for navigation to surveillance systems in buses. Industrial use cases include HMIs with vision based user identification and gesture based machine operation as well as vision supported robotics and industrial quality inspection systems.
“A dedicated processing unit for neural algorithms that is supported by open source AI software solutions such as TensorFlow is an efficiency accelerator for many vision based systems. And when all this is integrated as an application ready, hardware and software validated platform including Basler pylon Camera Software Suite, it puts developers on a fast track to designing NPU accelerated smart vision applications,” explained Martin Danzer, Director Product Management at congatec.
The Basler pylon Camera Software Suite delivers a unified SDK for BCON for MIPI, USB3 Vision and GigE Vision cameras, and enables camera access from source code, a GUI or third-party software. The high-performance pylon Viewer is ideal for camera evaluation.
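As an illustration, a minimal sketch of grabbing a single image through pylon’s Python wrapper (pypylon); the same SDK calls apply across BCON for MIPI, USB3 Vision and GigE Vision devices, though the presence of pypylon on the target image is an assumption:

```python
from pypylon import pylon

# Open the first camera found by the pylon transport layer factory
camera = pylon.InstantCamera(pylon.TlFactory.GetInstance().CreateFirstDevice())
camera.Open()

# Grab exactly one frame
camera.StartGrabbingMax(1)
while camera.IsGrabbing():
    result = camera.RetrieveResult(5000, pylon.TimeoutHandling_ThrowException)
    if result.GrabSucceeded():
        image = result.Array  # image data as a NumPy array
        print("Captured frame:", image.shape)
    result.Release()

camera.Close()
```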
Thanks to the integration in the congatec i.MX 8M Plus starter set for AI accelerated vision applications, engineers get instant access to important AI supported machine vision features such as triggering, individual image capture and highly differentiated camera configuration options, plus easy access to customised inference algorithms based on the Arm NN and TensorFlow Lite ecosystem.
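A sketch of how triggered single-image capture might be configured through pypylon is shown below; the GenICam node names (TriggerSelector, TriggerMode, TriggerSource) vary by camera model, so treat them as assumptions to be checked against the camera’s feature documentation:

```python
from pypylon import pylon

camera = pylon.InstantCamera(pylon.TlFactory.GetInstance().CreateFirstDevice())
camera.Open()

# Configure a software trigger for single-image capture
# (exact node names depend on the camera model)
camera.TriggerSelector.SetValue("FrameStart")
camera.TriggerMode.SetValue("On")
camera.TriggerSource.SetValue("Software")

camera.StartGrabbing(pylon.GrabStrategy_OneByOne)
if camera.WaitForFrameTriggerReady(1000, pylon.TimeoutHandling_ThrowException):
    camera.ExecuteSoftwareTrigger()

result = camera.RetrieveResult(5000, pylon.TimeoutHandling_ThrowException)
if result.GrabSucceeded():
    frame = result.Array  # single triggered frame as a NumPy array
result.Release()

camera.StopGrabbing()
camera.Close()
```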
The feature set in detail
The new starter set for AI accelerated vision applications contains the entire ecosystem developers need to instantly start designing applications on the basis of this next-generation platform, which offers highly efficient vision and AI integration. At the heart of the set is the new SMARC 2.1 Computer-on-Module conga-SMX8-Plus.
It features four powerful Arm Cortex-A53 cores, one Arm Cortex-M7 controller and the NXP NPU to accelerate deep learning algorithms, and comes with passive cooling. The 3.5” carrier board conga-SMC1/SMARC-ARM directly connects the 13 MP Basler dart daA4200-30mci BCON for MIPI camera with a 4 mm f/1.8 lens via MIPI CSI-2 without any additional converter modules.
In addition to MIPI CSI-2, USB3 Vision and GigE Vision cameras are also supported. On the software side, congatec provides a bootable SD card with a preconfigured boot loader, a Yocto OS image, matching BSPs, and processor-optimised Basler embedded vision software, enabling developers to immediately run AI inference on captured images and video sequences.
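On i.MX 8M Plus Yocto images, inference can typically be routed to the NPU through TensorFlow Lite’s external delegate mechanism. The delegate library name and path in the sketch below follow common NXP eIQ BSPs and are assumptions, not confirmed details of this starter set:

```python
import tflite_runtime.interpreter as tflite

# VX delegate shipped with typical NXP eIQ Yocto images (path is an assumption)
NPU_DELEGATE = "/usr/lib/libvx_delegate.so"

delegate = tflite.load_delegate(NPU_DELEGATE)
interpreter = tflite.Interpreter(
    model_path="mobilenet_v2_quant.tflite",  # placeholder model file
    experimental_delegates=[delegate],       # offload supported ops to the NPU
)
interpreter.allocate_tensors()
# ...set input tensors and call interpreter.invoke() as in a CPU-only run
```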