Developing human activity recognition of workers with wearables
Deutsches Forschungszentrum für Künstliche Intelligenz (German Research Center for Artificial Intelligence, "DFKI") and Hitachi announced the joint development of AI technology for human activity recognition of workers using wearable devices. The AI technology performs real-time recognition of workers' activities by integrating technology that uses eye-tracking glasses to recognise gazed objects with technology that uses armband devices to recognise actions.
Recognition of each activity is achieved by having the AI learn, through Deep Learning, the tools and parts used at the production site as well as the anticipated actions. DFKI and Hitachi will use this newly developed AI technology to assist operations and prevent human error, contributing to enhanced quality and efficiency on the front line of manufacturing.
In line with initiatives such as Industry 4.0 in Germany and Society 5.0 in Japan, the manufacturing industry is accelerating efforts to innovate production using AI and robotics and to automate menial tasks.
At the same time, IoT technology is expected to collect and recognise the condition and movement of all things, including people and equipment, in order to assist operations and prevent human error. As a result, in recent years, camera-based monitoring systems have been developed for predictive diagnosis of inappropriate worker movements or equipment failures on production lines.
Researchers from the DFKI research department Smart Data & Knowledge Services and Hitachi developed AI for human activity recognition that recognises workers' activities from data collected through wearable devices rather than from camera images. The features of the developed AI are as follows:
1. Technology to recognise gazed objects using eye-tracking glasses
This technology recognises targeted objects such as a "screw" or "screwdriver" without being disturbed by the surrounding environment, such as the background or other objects. It extracts gaze-point data from the eye movements of workers wearing the eye-tracking glasses and applies Deep Learning-based image recognition (illustrated in the first sketch following this list).
2. Technology to recognise basic human actions through an armband device
This technology recognises basic human actions that involve arm movements, such as "twist" or "push". It extracts action-related data from the minute, instantaneous signals measured by sensors attached to the arms (illustrated in the second sketch following this list).
3. "Hierarchical activity-recognition model" that recognises workers' activities by integrating gazed objects and human actions
This technology integrates the two technologies mentioned above to develop "hierarchical activity-recognition model", which is to recognise activities such as "twisting a screw." As a result, recognising a variety of working activities is capable if all the actions and objects involved in the activities are learned in advance.
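The first sketch below illustrates, in broad strokes, how gaze-guided object recognition of this kind might work: an image patch is cropped around the gaze point reported by the eye-tracking glasses and classified by a small convolutional network, so the classifier only sees the object the worker is looking at. The object classes, patch size, and network layout are assumptions made for illustration, not the published model.

```python
import torch
import torch.nn as nn

# Hypothetical object classes; the real label set used by DFKI/Hitachi is not published.
OBJECT_CLASSES = ["screw", "screwdriver", "switch", "other"]

class GazedObjectClassifier(nn.Module):
    """Small CNN that classifies an image patch cropped around the gaze point."""
    def __init__(self, num_classes=len(OBJECT_CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, patch):                      # patch: (N, 3, 64, 64)
        x = self.features(patch)
        return self.classifier(x.flatten(1))

def crop_around_gaze(frame, gaze_xy, size=64):
    """Crop a size x size patch of the scene-camera frame centred on the gaze point."""
    _, h, w = frame.shape
    x = min(max(int(gaze_xy[0]) - size // 2, 0), w - size)
    y = min(max(int(gaze_xy[1]) - size // 2, 0), h - size)
    return frame[:, y:y + size, x:x + size]

# Example with a random frame and gaze point (stand-ins for real eye-tracker output).
frame = torch.rand(3, 480, 640)        # scene-camera image
gaze_xy = (320, 240)                   # gaze point in pixel coordinates
patch = crop_around_gaze(frame, gaze_xy).unsqueeze(0)
logits = GazedObjectClassifier()(patch)
print(OBJECT_CLASSES[logits.argmax(dim=1).item()])
```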
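A basic-action recogniser over armband sensor signals could likewise be sketched as a 1-D convolutional classifier over short windows of multi-channel measurements. The action labels, channel count, and sampling rate below are assumptions, not details from the announcement.

```python
import torch
import torch.nn as nn

# Hypothetical action labels; the actual set of basic actions is not published.
ACTIONS = ["twist", "push", "grab", "idle"]

class ArmActionClassifier(nn.Module):
    """1-D CNN over a short window of armband sensor channels
    (e.g. accelerometer/gyroscope/EMG), classifying the basic action."""
    def __init__(self, num_channels=8, num_classes=len(ACTIONS)):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(num_channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),               # pool over the time dimension
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, window):                     # window: (N, channels, samples)
        return self.classifier(self.encoder(window).squeeze(-1))

# Example: a 1-second window of 8 sensor channels sampled at 200 Hz
# (placeholder values standing in for real armband measurements).
window = torch.randn(1, 8, 200)
logits = ArmActionClassifier()(window)
print(ACTIONS[logits.argmax(dim=1).item()])
```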
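Finally, the hierarchical combination can be pictured as an upper layer that fuses the recognised basic action and gazed object into a work activity. The rule table below is an invented stand-in for the learned hierarchical model described in the announcement; it only shows the shape of the idea.

```python
# Upper layer of the hierarchy: map (action, gazed object) pairs to work activities.
# The pairs and activity names are illustrative assumptions, not the published model.
ACTIVITY_RULES = {
    ("twist", "screw"): "twisting a screw",
    ("push", "switch"): "pressing a switch",
}

def recognise_activity(action: str, gazed_object: str) -> str:
    """Fuse the recognised action and gazed object into a work activity,
    falling back to 'unknown activity' for pairs not learned in advance."""
    return ACTIVITY_RULES.get((action, gazed_object), "unknown activity")

print(recognise_activity("twist", "screw"))    # -> "twisting a screw"
print(recognise_activity("push", "switch"))    # -> "pressing a switch"
```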
Based on these technological developments, AI technology was realised that can recognise, in real time, activities such as "twisting a screw" or "pressing a switch" as part of an "inspection task". Using this newly developed AI, DFKI and Hitachi will advance technological development for assisting operations and preventing human error on the front line of manufacturing, where operation guidance and detection of inadequate actions are required.
DFKI and Hitachi will exhibit a part of this technology at "CeBIT 2017", a leading global exhibition of digital business, to be held from 20-24 March 2017 in Hannover, Germany.