What will be the AI trends in 2019?

30th November 2018
Joe Bush

Jos Martin, Senior Engineering Manager at MathWorks, investigates the Artificial Intelligence (AI) trends likely to affect scientists and engineers in 2019

Applications that until recently were deemed ultra-modern are now at our fingertips, thanks to the intersection of technologies such as AI, deep learning, data analytics and IoT. As with any new concept, the scientists and engineers researching, developing and launching these applications are still finding their feet, discovering new functionalities, design workflows and the skills their evolving professions require.

For the real capabilities of AI to be felt, the technology needs to become more widely accessible, and more appropriate, for the engineers and scientists working across the range of industries that stand to benefit. Beyond the data scientist, it is these engineers and scientists who are likely to push the testing, trialling and acceptance of deep learning in industrial applications. Faced with larger datasets, embedded applications and growing development teams, solution providers will drive change by promoting improved collaboration, interoperability, higher-productivity workflows and less dependence on IT teams.

1. Living on the Edge
Where processing needs to stay local, edge computing will allow AI tools to run on or near the device itself. These real-time, high-performance and increasingly complex AI solutions will be enabled by improvements in sensors and low-power computing architectures.

In addition, where autonomous vehicles must understand the environment in which they operate and evaluate driving options in real time, edge computing plays a crucial role in safety. For remote locations that often have costly or limited internet access, such as deep-sea oil platforms, it also has the potential to generate enormous cost savings.

2. Interoperability
Interoperability will be essential for building complete AI solutions. As AI is still a relatively new technology, best-practice examples are limited and there are no set guidelines for its use. At present, individual deep learning frameworks concentrate on a small number of production platforms and applications, yet effective solutions need elements from a variety of workflows.

This mismatch restricts productivity and generates friction. Fortunately, there are initiatives, such as ONNX.ai, aiming to solve these interoperability issues so that developers can select the most appropriate tool, share their models simply, and distribute their solutions to a broader range of production platforms.
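
As a minimal sketch of the kind of interoperability ONNX is intended to provide (not taken from the article), the Python snippet below exports a model trained in one framework, PyTorch here, to the framework-neutral ONNX format and runs it with a separate runtime; the model choice and file names are illustrative placeholders.

# Illustrative only: export a PyTorch model to ONNX and run it with ONNX Runtime,
# so the training framework and the deployment tool no longer have to match.
import numpy as np
import torch
import torchvision
import onnxruntime as ort

# Any trained torch.nn.Module would do; a pretrained ResNet-18 stands in here.
model = torchvision.models.resnet18(pretrained=True).eval()

# Export to ONNX, using a dummy input to trace the model's shapes.
dummy_input = torch.randn(1, 3, 224, 224)
torch.onnx.export(model, dummy_input, "model.onnx",
                  input_names=["input"], output_names=["output"])

# A different tool (ONNX Runtime) can now load and execute the same model.
session = ort.InferenceSession("model.onnx")
scores = session.run(None, {"input": np.random.rand(1, 3, 224, 224).astype(np.float32)})
print(scores[0].shape)  # (1, 1000) class scores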

3. AI for all
The uptake of, and experimentation with, deep learning will be fuelled by engineers and scientists, not just data scientists. Automation tools, technical curiosity and the business imperative to reap the promise of AI will encourage more engineers and scientists to embrace it. The latest workflow tools are making AI accessible beyond data scientists by simplifying its use and automating data synthesis, labelling, tuning and deployment.

As a result, these techniques will be applied on a broader scale, not only to image and computer vision data but also to the time-series data, such as audio, signal and IoT data, that many engineers work with. This will impact a wide variety of industries, from unmanned aerial vehicles (UAVs) using AI for object detection in satellite imagery, to earlier disease detection through improved pathology diagnosis during cancer screening.
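
To give a flavour of how higher-level workflow tools lower the barrier for engineers who are not data scientists, the hedged sketch below uses the Keras API to reuse a pretrained image network and retrain only a small classification head; the three-class "defect" labels and the training arrays are hypothetical.

# Illustrative transfer learning: reuse a pretrained feature extractor so an
# engineer only has to train a small classifier on their own labelled images.
import tensorflow as tf

base = tf.keras.applications.MobileNetV2(weights="imagenet", include_top=False,
                                         pooling="avg", input_shape=(224, 224, 3))
base.trainable = False  # keep the pretrained weights fixed

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(3, activation="softmax"),  # e.g. three defect classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# train_images and train_labels are placeholders for the engineer's own data.
# model.fit(train_images, train_labels, epochs=5)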

4. Demands for domain specialisation
Industrial applications are a growing consumer of AI, but they bring new requirements for specialisation. As AI and IoT applications such as predictive maintenance, smart cities and Industry 4.0 move from idea to reality, a new set of criteria must be met. This can be seen in low-power, mass-produced and moving machines that demand particular form factors, safety-critical applications that require greater dependability and verifiability, and progressive mechatronics design approaches that combine electrical and mechanical elements.

Another barrier is that these specialised applications are normally developed by decentralised development and service teams rather than under centralised IT. The same underlying techniques nonetheless cross domains: in agriculture, machinery uses AI for weed detection and smart spraying, while in predictive maintenance AI helps ensure aircraft engines don't overheat.

5. Collaboration driven by complexity
The burgeoning use of machine learning and deep learning in complex systems will require additional collaboration and more participants. Bigger, decentralised teams are needed to cope with the greater scope and complexity of deep learning projects, driven by data gathering, synthesis and labelling. To deploy inference models to data centres, cloud platforms, and embedded architectures such as FPGAs, ASICs and microcontrollers, systems and embedded engineers will need greater flexibility.

They will also need knowledge of, and experience in, power management, component reuse and optimisation. As the sheer amount of training data increases, and lifecycle management of inference models is handed over to system engineers, new tools will be crucial for the engineers responsible for developing deep learning models.
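
As one hedged illustration of what handing an inference model over to embedded and systems engineers can involve, the sketch below converts a trained TensorFlow SavedModel into a size-optimised TensorFlow Lite flatbuffer of the kind deployed to microcontroller-class targets; the directory and file names are placeholders, and FPGA or ASIC flows would use different toolchains.

# Illustrative only: shrink a trained model for a resource-constrained target.
import tensorflow as tf

# "saved_model_dir" is a placeholder for a model exported by the training team.
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # enable post-training quantisation
tflite_model = converter.convert()

# The resulting flatbuffer can be bundled into firmware or flashed to the device.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)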
