Series 16 – Episode 8 – AI at the Edge: memory’s role in a data-driven future
Paige West speaks with Axel Stoermann, VP & Chief Technology Officer, KIOXIA Europe, about AI and Edge computing.
As AI has evolved beyond high-profile tools like ChatGPT, its potential in Edge and endpoint computing has become increasingly apparent, sparking significant advancements in hardware, software, and data management.
Stoermann noted: “Eighteen months ago, many people recognised ChatGPT by name but were unclear about its applications. Today, AI is fundamentally reshaping technology landscapes, especially through data collection and memory management in Edge and endpoint scenarios.”
Traditionally, AI has been centralised within the Cloud, where massive data processing tasks benefit from extensive high-performance resources. However, according to Stoermann, Edge and endpoint AI applications are now driving new innovations and investments, particularly in sectors such as mobile technology and industrial automation.
“More and more, we’re moving away from relying solely on Cloud-based AI towards deploying AI at the Edge and endpoints,” he said. “Mobile devices, for example, have rapidly evolved, incorporating AI to deliver advanced features directly to users. In industrial settings, latency-sensitive applications demand real-time responses, making local processing essential.”
Edge computing allows data to be processed closer to the source, reducing latency and enabling faster decision-making – a crucial factor in fields like smart manufacturing and autonomous vehicles.
As AI becomes increasingly decentralised, the performance requirements for Edge computing rise, placing new demands on hardware. Stoermann described two primary stages of AI processing: model training, which typically occurs in centralised data centres, and model inference, which is suited to the Edge. Each stage has distinct needs for memory, bandwidth, and processing power.
As more data is processed locally, security and data privacy are increasingly pressing concerns, particularly in regions with stringent data protection laws like Europe. Stoermann emphasised that robust security measures must accompany AI’s decentralisation.
“Data privacy expectations vary by region, and in Europe, we have a more conservative approach, especially with sensitive applications in finance, public services, and justice,” he explained. “One solution we’re exploring is separating compute and storage physically, using high-speed optical links. This approach could allow secure data management even at the Edge.”
When asked about future advancements in Edge and endpoint hardware, Stoermann predicted that demand for local AI processing would only grow. “As AI applications move from the Cloud to the Edge and endpoints, we’ll see new challenges emerge, from power consumption to real-time processing and form factor constraints. Europe’s interest in automotive and ADAS (Advanced Driver Assistance Systems) is a prime example of where Edge AI will flourish in the coming years.”
Over the next five years, Stoermann expects further breakthroughs in Edge hardware to support applications like ADAS and mobile computing. “Edge devices will need to handle increasingly complex workloads, which will drive advances in power efficiency and real-time processing. ADAS in particular, with its high demands for accuracy and response time, will push Edge AI forward,” he said.
To hear more about Edge computing and much more, you can listen to Electronic Specifier’s interview with Axel Stoermann on Spotify or Apple Podcasts.