Robots are becoming more human
Robotics has become a crucial part of the modern world, influencing employment, education, and exploration. People have long wondered when robotics will become advanced enough to blur the line between humans and machines. As Devin Partida explains, while the technology still has a long way to go, advances in robotics suggest a future where true humanoid robots and artificial life could become a reality.
This article originally appeared in the December '21 magazine issue of Electronic Specifier Design – see ES's Magazine Archives for more featured publications.
There has been an ongoing debate for decades about what makes something true artificial life, whether robot or AI. While there is still no firm universal consensus on the subject, there are some things that are generally accepted as stepping stones to creating more human-like robots.
These qualifications exist on essentially three levels: physical, emotional, and existential. These key areas form a framework for analysing the progress that robots have made over the years to become more like their human creators.
Connecting emotionally
Imitating human emotions is perhaps the most important aspect of establishing a human-like robot. Achieving this is still exceptionally complex, of course, but the capabilities of AI and machine learning are making artificially emotional robots far more of a reality today than was possible in the past. While mechanical motion can be difficult to achieve and sentience even more difficult, connecting emotionally with people on a basic level can happen with some code and a display.
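At its most basic, "some code and a display" can mean simple keyword matching: spot emotional language in what a person says and pick a sympathetic canned reply. The sketch below is an illustrative assumption, not how Stevie or any real social robot is implemented (those use far richer speech and language models); the emotion categories, keywords, and responses are invented for the example.

```python
# Illustrative sketch: keyword-based emotion recognition with canned
# responses. The categories, keywords, and replies are hypothetical.

EMOTION_KEYWORDS = {
    "sad": {"sad", "lonely", "miss", "tired", "down"},
    "happy": {"happy", "glad", "great", "wonderful", "love"},
    "worried": {"worried", "scared", "anxious", "afraid"},
}

RESPONSES = {
    "sad": "I'm sorry to hear that. Would you like to talk about it?",
    "happy": "That's lovely to hear!",
    "worried": "That sounds stressful. I'm here with you.",
    "neutral": "Tell me more.",
}

def detect_emotion(utterance: str) -> str:
    """Return the first emotion whose keywords appear in the utterance."""
    words = set(utterance.lower().split())
    for emotion, keywords in EMOTION_KEYWORDS.items():
        if words & keywords:  # any keyword present in the utterance
            return emotion
    return "neutral"

def respond(utterance: str) -> str:
    """Pick the canned response matching the detected emotion."""
    return RESPONSES[detect_emotion(utterance)]
```

A robot pairing this logic with an animated face on a screen can appear to "listen" and react, which is why even modest code can create a basic emotional connection.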
A prime example of this is the healthcare robot Stevie, developed by scientists at Trinity College Dublin in Ireland. The robot was designed to help care for seniors in nursing homes by monitoring their vitals, providing living assistance, and socialising with them.
The robot may not be self-aware or even have legs, but it has been able to create positive emotional connections with seniors during its testing process. Stevie can recognise and respond to emotional language from humans – something that would not have been possible even a decade ago.
Similar to Stevie is Moxie, a socialisation robot for children. This little robot is designed to be a companion for children, helping to educate them and teach them social skills. Like Stevie, Moxie uses an animated display to show emotions.
It can recognise and remember names and faces, allowing it to build artificial relationships with children. In a world where more and more children are attending school online, robots like Moxie could become valuable tools for making sure kids learn the necessary social skills to interact with others.
Performing more human jobs
One of the more welcome strides robots have made over the decades is the ability to perform human jobs with increasing autonomy and functionality. Robots have been part of the workplace since the first industrial robots of the 1950s, and with many industries (such as agriculture and construction) experiencing crippling labour shortages, they are becoming a bigger part of the workforce than ever before. This is driving innovation further still.
Robots are taking over jobs that either can’t be filled or aren’t desirable or safe for humans. For example, leading robotics developer Bill Lovell has engineered scalable robotic arm technology that can do tasks as mundane as shovelling horse stalls or as monumental as cleaning up after a hurricane. Robotic arms have been a popular form of industrial robotics, gaining particular popularity on assembly lines.
Recently, though, industrial robotics has been expanding to new fields that would have been unimaginable not too long ago. A research team at Monash University in Australia is developing an autonomous apple-picking robot that uses AI to scan apples for ripeness, pick them, and even ‘zap’ weeds along the way.
As robots become capable of taking on more and more traditionally human jobs, many researchers predict (though the claim is hotly debated) a future in which robots could replace myriad human employees.
Motion and facial expressions
Creating robots that are truly human-like in nature relies on enabling robots to have genuinely humanoid physical forms. Fine motor skills are extremely difficult to replicate artificially because the mechanical parts driving that motion have to fit into a human-like form.
A robotic hand, for example, needs finely tuned, extremely small motors in each joint, replicating the function of muscles in biological systems. Even more complex is replicating human emotions without a screen. Engineers are overcoming both challenges, however.
One of the longest-running humanoid robot programmes is Honda’s ASIMO, which remains among the most advanced humanoid robots built to date. ASIMO, short for Advanced Step in Innovative Mobility, is capable of an impressive array of physical tasks, from walking and running to carrying objects and jumping. It also responds to its own name and understands spoken commands. ASIMO is still clearly a robot, though, unlike a more recent humanoid, Sophia.
Created by Hong Kong-based Hanson Robotics, Sophia has a strikingly lifelike face. She can move, gesture with her hands, make nuanced facial expressions, follow eye contact, and hold conversations with humans. She was even granted citizenship in Saudi Arabia and was named the United Nations Development Programme’s first Innovation Champion.
Digital life
On the edge of advancements in humanoid robotics are the beginnings of true digital life. As described above, a key element that has enabled modern developments in robotics is artificial intelligence. Scientists of the 20th century could only imagine robots capable of the things they can do today. The capabilities of AI-driven virtual beings can serve as an indicator of the direction in which humanoid robotics is advancing in the years and decades ahead.
Most people have encountered an AI chatbot at one time or another. Chatbots have become popular for online customer service, but are typically only able to offer pre-programmed responses to standard questions. Alexa, Siri, and other AI personal assistants are more advanced, capable of understanding speech and holding limited conversations. The AI that Samsung’s STAR Labs is developing takes artificial intelligence to an entirely new level.
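The "pre-programmed responses to standard questions" approach can be sketched in a few lines: match the customer's question against a fixed FAQ table and fall back to a stock reply otherwise. The topics and answers below are made up for illustration and do not reflect any particular vendor's chatbot.

```python
# Illustrative sketch of a canned-response customer-service chatbot.
# The FAQ topics and answers are hypothetical.

FAQ = {
    "opening hours": "We are open 9am-5pm, Monday to Friday.",
    "return policy": "Items can be returned within 30 days.",
    "shipping": "Standard shipping takes 3-5 business days.",
}

FALLBACK = "Sorry, I don't understand. Let me connect you to an agent."

def answer(question: str) -> str:
    """Return a canned answer if any known topic appears in the question."""
    q = question.lower()
    for topic, reply in FAQ.items():
        if topic in q:  # substring match against each known topic
            return reply
    return FALLBACK
```

The gap between this table lookup and an assistant like Alexa or Siri, which must parse free-form speech, is exactly the gap the article describes.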
Samsung’s ‘Neons’ are computationally generated ‘artificial humans’ with unique appearances, voices, and personalities. These virtual beings are designed to one day perform distinct jobs and roles, such as teaching, counselling, or simply offering companionship. While the Neons are still in development, Samsung hopes to eventually make them widely available to the public, where they will be able to form relationships and memories with the humans they interact with.
Virtual personas with their own digital appearances are becoming more common. For example, the virtual Instagram influencer ‘Miquela’ has amassed over a million followers and become a fashion icon. Meanwhile, the virtual pop star ‘Hatsune Miku’, voiced by singing-synthesis software, has become a music phenomenon in Japan and around the world, with her holographic avatar performing to sold-out arenas.
Where robots and humans meet
Robots have been part of human culture and society for decades and have become a hallmark of human concepts of the future. Efforts to create robots that are truly human in nature have revealed the incredible complexity of human biology, pushing the boundaries of scientific innovation.
Today’s robots have come a long way from the humble robotic arms of the 1950s, as have the capabilities of computers, motors, and artificial intelligence.
It has become clear that humans won’t stop innovating and inventing until we can walk alongside robots, in a future where both can work together to build a better world.