Osaka University researchers tackle indoor AR challenges
Augmented Reality (AR) is a technology that blends the digital world with the physical one by overlaying virtual elements, such as images, animations, or information, onto a real-world view. In simple terms, AR adds extra layers of information or visuals to what you can see in real life.
Unlike Virtual Reality (VR), which immerses you in a completely digital world, AR enhances the environment you're already in. It works through devices like smartphones, tablets, or specialised AR glasses, using sensors, cameras, and software to align the virtual elements with the real world.
With smartphones and similar devices now in most people's hands, AR has captured the imagination of the masses, to the point where apps that overlay virtual elements onto real-world environments are becoming part of everyday life. Indoors, however, AR faces some tricky hurdles, as it struggles to function without clear GPS signals.
Researchers from Osaka University, curious about the specific limitations of AR indoors and how best to overcome them, have shed light on these issues through extensive experimentation, and they now believe they have a potential solution to make AR more effective in indoor spaces.
The rise of AR
AR was once the stuff of science fiction. A quick search of its history tells us that it was conceptualised in the late 20th century, when early experiments used head-mounted displays to overlay computer-generated images onto real-world views. However, because of hardware limitations and high costs, it remained a niche technology. In the 2010s, the landscape began to shift: smartphones started to come equipped with cameras, accelerometers, and gyroscopes, which opened the door for AR apps to reach a wider audience and transformed the technology from a novelty into a mainstream tool.
As AR became a tool for both entertainment and utility, its adoption soared, and apps like Pokémon GO demonstrated how it could create engaging, location-based experiences. Beyond gaming, AR is now used in retail, manufacturing, healthcare, and education, with applications ranging from visualising surgical procedures to assisting with assembly-line tasks and enhancing remote learning.
Osaka University's breakthrough
Despite these advances, AR still struggles in indoor environments because it cannot rely on GPS for localisation. For AR to function, it needs to know where the device is located and how it is moving, a process achieved through two main systems (sketched in code after the list):
- Visual sensors: cameras and LiDAR identify landmarks like QR codes or patterns in the surroundings
- Inertial Measurement Units (IMUs): these sensors detect motion to track the device's movement
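For readers who like to see the idea in code, here is a minimal, hypothetical sketch in Python (not the researchers' actual pipeline) of how the two systems complement each other: the IMU advances the estimated position between camera frames, and a visual landmark fix, whenever one is available, pulls the estimate back toward the truth.

```python
import numpy as np

def track_position(imu_velocities, landmark_fixes, dt=0.02, alpha=0.9):
    """Toy 2-D tracker: IMU dead reckoning corrected by occasional
    visual landmark fixes (a complementary-filter-style blend).

    imu_velocities : (N, 2) array of velocity estimates from the IMU
    landmark_fixes : dict mapping step index -> (x, y) position measured
                     when a camera or LiDAR spots a known landmark
    """
    position = np.zeros(2)
    trajectory = []
    for i, velocity in enumerate(imu_velocities):
        # Between landmark sightings, integrate IMU motion (dead reckoning).
        position = position + velocity * dt
        # When a landmark is visible, blend the visual fix into the estimate.
        if i in landmark_fixes:
            fix = np.asarray(landmark_fixes[i])
            position = alpha * fix + (1 - alpha) * position
        trajectory.append(position.copy())
    return np.array(trajectory)
```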
The Osaka University research team sought to pinpoint the exact causes of these issues through an ambitious series of experiments. Over 113 hours of testing, involving 316 scenarios, participants interacted with AR environments such as a virtual classroom in a lecture hall. By disabling sensors and altering conditions like lighting and landmark availability, the researchers identified the weaknesses of current AR systems.
Their findings showed that indoors, cameras struggle in low-light settings and with far-off or obscured landmarks, while LiDAR can falter on reflective or featureless surfaces. IMUs, meanwhile, are prone to errors that accumulate over time, causing virtual elements to "drift" out of alignment, which can lead to a frustrating or even nauseating user experience.
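The drift problem is easy to demonstrate in a few lines of code. In this simplified illustration (the numbers are assumed, not data from the study), a device that is actually motionless but has a small, constant accelerometer bias appears to wander metres away after only ten seconds, because the bias gets integrated twice, once into velocity and again into position.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 0.01                          # 100 Hz IMU samples
samples = 1000                     # 10 seconds of data
true_accel = np.zeros(samples)     # the device is actually motionless
bias, noise = 0.05, 0.02           # small sensor bias and noise, in m/s^2

measured = true_accel + bias + noise * rng.standard_normal(samples)

# Double integration: acceleration -> velocity -> position.
velocity = np.cumsum(measured) * dt
position = np.cumsum(velocity) * dt

# The apparent displacement grows roughly with the square of elapsed time.
print(f"apparent drift after 10 s: {position[-1]:.2f} m")
```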
To address these issues, the team proposed using radio-frequency (RF) localisation technologies, such as ultra-wideband (UWB). Unlike visual systems, UWB operates similarly to Wi-Fi or Bluetooth and is unaffected by lighting, distance, or line-of-sight obstructions. It offers precise positioning, even in challenging indoor environments.
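To illustrate the kind of calculation involved, the sketch below estimates a device's position from distance measurements to fixed UWB anchors using a simple least-squares fit. The room layout, anchor positions, and noise level are invented for the example, and real UWB systems are considerably more sophisticated.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical UWB anchors mounted around a room (x, y in metres).
anchors = np.array([[0.0, 0.0], [8.0, 0.0], [8.0, 6.0], [0.0, 6.0]])
true_position = np.array([3.0, 2.5])

# Simulated range measurements from each anchor, with a little noise.
rng = np.random.default_rng(1)
ranges = np.linalg.norm(anchors - true_position, axis=1)
ranges += 0.05 * rng.standard_normal(len(anchors))

# Find the position whose distances to the anchors best match the ranges.
def residuals(position):
    return np.linalg.norm(anchors - position, axis=1) - ranges

estimate = least_squares(residuals, x0=np.array([4.0, 3.0])).x
print(f"estimated position: {estimate.round(2)} m")
```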
What this means for the future of AR
With the integration of RF-based localisation, AR could work seamlessly indoors, which would offer new possibilities for interactive navigation in indoor environments. It could also enhance training simulations, indoor gaming, and virtual design tools.
With AR becoming more human-centric, the discovery from Osaka University suggests that using AR to simplify tasks and save time when accessing information or interacting with the world may one day feel natural, easy, and not so dangerous, or nauseating.