
Avnet to exhibit at the 2024 Embedded Vision Summit

10th May 2024
Harry Fowle

Avnet’s exhibit plans for the 2024 Embedded Vision Summit include new development kits supporting AI applications.

The Summit is the premier event for practical, deployable computer vision and Edge AI, aimed at product creators who want to bring visual intelligence to their products. This year’s Summit takes place 21st-23rd May in Santa Clara, California.

This annual event brings together a global audience of technology professionals from companies developing computer vision and Edge AI-enabled products, including embedded systems, cloud solutions and mobile applications. Avnet will showcase its new development kits, modules, and reference designs in Booth 404.

Show attendees will be among the first to see demos showcasing:

  • High-performance, Power-efficient Vision-AI Applications with the Qualcomm QCS6490 SoC - Avnet Embedded’s new QCS6490 SMARC SOM allows developers to quickly implement AI systems based on the QCS6490 and then rapidly transition to production with the production-ready SOM. This demonstration will preview the new QCS6490 SMARC SOM and show running examples on the Qualcomm RB3 Gen2 EVK.
  • AI-Driven Smart Parking Lot Monitoring System Using Avnet’s RZBoard V2L - Improving parking efficiency is an important goal in smart city and smart building applications. Using embedded vision and a combination of Edge AI and cloud connectivity, this demo shows how multiple parking spaces can be cost-effectively monitored, providing real-time feedback to drivers looking for those elusive open parking spaces. This demonstration uses the Avnet RZBoard V2L single-board computer, a custom CNN (convolutional neural network) model and Avnet’s IoTConnect to implement an example system. The occupied and free parking slots are identified using the energy-efficient RZ/V2L processor from Renesas, which runs Linux on its dual Arm Cortex-A55 CPUs while accelerating the AI model on its DRP-AI accelerator core. A simplified code sketch of this approach follows the list below.
  • The VE2302 System-on-Module: High-performance Edge AI Using the AMD Versal AI Edge Series - Get an early preview of Avnet’s newest SOM and development kit based on the AMD Versal AI Edge series SoC. The small 50 x 50 mm SOM is packed with processing power: the VE2302 device features 328K programmable logic cells, a dual-core Arm Cortex-A72 MPCore and a dual-core Arm Cortex-R5F MPCore, along with 4GB of Micron LPDDR4 memory and non-volatile boot options using the 64MB Micron OSPI flash or 32GB eMMC. The SOM provides access to eight of the Versal AI Edge GTYP transceivers plus abundant HDIO, PMC MIO, LPD MIO, and XPIO user IO. To help designers with their evaluation and prototyping needs, Avnet has created a versatile carrier board for the VE2302 SOM and packaged it into an affordable VE2302 Development Kit.
  • Low-power, Always-on, Edge AI Smart Sensing and Sensor Fusion - Using custom machine learning models developed by MACSO and cloud connectivity through Avnet’s IoTConnect and AWS, this demonstration illustrates the efficacy of sensor fusion by combining audio and IMU modalities to detect falls. What distinguishes a fall from standard displacement is the danger involved, which is often vocalised by the victim. Sensor fusion can therefore be utilised to ensure robustness against false positives that may be caused by day-to-day movements; a simple late-fusion sketch appears further below, after the filtered-classification example.
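
To make the parking-lot demo above more concrete, the minimal sketch below classifies fixed crop regions of a camera frame as occupied or free and publishes a summary to the cloud. It is a hypothetical illustration only: the model file name (parking_cnn.tflite), crop regions, MQTT topic and broker address are placeholders, a generic TensorFlow Lite interpreter and plain MQTT stand in for the DRP-AI runtime and IoTConnect, and it is not Avnet’s demo code.

```python
# Hypothetical parking-occupancy sketch; not Avnet's implementation.
# Assumes a binary occupied/free image classifier exported to TensorFlow Lite
# (float32 input, single sigmoid output) and a plain MQTT broker standing in
# for Avnet's IoTConnect service.
import json

import cv2                                             # pip install opencv-python
import numpy as np
from tflite_runtime.interpreter import Interpreter     # pip install tflite-runtime
import paho.mqtt.publish as publish                    # pip install paho-mqtt

# Placeholder pixel regions (x, y, w, h) for each monitored parking space.
PARKING_SPACES = [(40, 120, 96, 96), (150, 120, 96, 96), (260, 120, 96, 96)]

interpreter = Interpreter(model_path="parking_cnn.tflite")   # assumed model file
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

def space_is_occupied(frame, region):
    """Crop one parking space, run the classifier, and threshold the score."""
    x, y, w, h = region
    crop = cv2.resize(frame[y:y + h, x:x + w], (inp["shape"][2], inp["shape"][1]))
    tensor = np.expand_dims(crop.astype(np.float32) / 255.0, axis=0)
    interpreter.set_tensor(inp["index"], tensor)
    interpreter.invoke()
    return float(interpreter.get_tensor(out["index"])[0][0]) > 0.5

cap = cv2.VideoCapture(0)                 # camera attached to the board
ok, frame = cap.read()
if ok:
    occupancy = [space_is_occupied(frame, r) for r in PARKING_SPACES]
    payload = {"free": occupancy.count(False), "occupied": occupancy.count(True)}
    # Publish a simple occupancy summary; a real system would run this in a loop
    # and report through the IoTConnect SDK rather than a bare broker.
    publish.single("demo/parking/occupancy", json.dumps(payload),
                   hostname="broker.example.com")
cap.release()
```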

A second demonstration shows machine learning models capable of filtered classification. In this example, MACSO’s models can detect a specific drum sound amongst other drum sounds, thanks to their learned filtering capability.
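
As a rough illustration of the fall-detection fusion idea described in the list above, the sketch below only reports a fall when an IMU score and an audio score agree within the same time window. It is a simplified, hypothetical example: MACSO’s actual models, features and thresholds are not public, and the scores and thresholds shown here are made up for demonstration.

```python
# Hypothetical late-fusion sketch for fall detection; not MACSO's implementation.
# Each modality is scored independently, and a fall is reported only when both
# agree, which suppresses false positives from everyday movements that would
# trigger the IMU alone.
from dataclasses import dataclass

@dataclass
class ModalityScore:
    name: str
    probability: float   # model confidence that this window contains a fall cue

def fuse(imu: ModalityScore, audio: ModalityScore,
         imu_threshold: float = 0.8, audio_threshold: float = 0.6) -> bool:
    """Report a fall only when the IMU sees an impact-like motion AND the audio
    model hears a distress cue in the same window (thresholds are placeholders)."""
    return imu.probability >= imu_threshold and audio.probability >= audio_threshold

# Example windows: a dropped object spikes the IMU but produces no distress audio,
# so only the second window (IMU spike plus a cry for help) is reported as a fall.
windows = [
    (ModalityScore("imu", 0.92), ModalityScore("audio", 0.10)),   # dropped object
    (ModalityScore("imu", 0.95), ModalityScore("audio", 0.81)),   # genuine fall
]
for imu_score, audio_score in windows:
    print("fall detected" if fuse(imu_score, audio_score) else "no fall")
```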
