How motion capture technology is shaping the future of drone swarms
Drone swarms have been growing in popularity alongside the rise of drone technology itself; now, motion capture technology is being used to further this research.
To learn more, Electronic Specifier’s Harry Fowle spoke with Eric Ewing, Visiting Professor at Brown University, who has been utilising Vicon’s motion capture technology to advance his research on drone swarms.
Ewing and his project
Ewing is a specialist in multi-robot systems, focusing his PhD on how to better control and coordinate swarms of drones. “I’ve done a lot of work with these ‘micro-copters’ we have called ‘Crazyflies,’ amongst others, and how we can coordinate them together to complete a plethora of different tasks.
“I worked heavily on motion planning during my research, but also around education and outreach,” he says. It’s here that Ewing has been able to develop systems for coordinating drone swarms for unique tasks, with Vicon’s motion capture technology being integral to this process.
“We have the Vicon camera system set up in our drone lab so that it’s feeding us low latency information at very high speeds, very accurately, on the state of our drones and where they are. This information is then fed back to the drones so that they can act accordingly. What this does is ‘cut out’ some of the issues that robotics and drones face, such as state estimation.” It is this removal of the state estimation burden that is the real highlight of the approach.
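The loop Ewing describes, in which externally tracked pose data stands in for onboard state estimation, can be sketched roughly as follows. This is an illustrative Python sketch, not Vicon's or Ewing's actual code: `get_mocap_pose` is a hypothetical stand-in for a motion capture query, and the gain value is an arbitrary example.

```python
import numpy as np

def get_mocap_pose(drone_id):
    """Hypothetical stand-in for a motion-capture query: returns the
    drone's position (metres) as measured by the external cameras."""
    return np.zeros(3)  # placeholder: drone assumed at the origin

def position_controller(drone_id, target, gain=1.5):
    """Simple proportional controller that trusts the external mocap
    pose directly, skipping onboard state estimation entirely."""
    pos = get_mocap_pose(drone_id)   # near-ground-truth state from the cameras
    error = target - pos             # where we want to be vs. where we are
    return gain * error              # proportional velocity command

cmd = position_controller(drone_id=0, target=np.array([1.0, 0.0, 0.5]))
```

Because the mocap state arrives with negligible noise and latency, the controller itself becomes the only component under test, which is the point of the setup.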
Vicon’s vision solutions
Vicon has been able to offer Ewing and his team ultra-wideband instruments and ‘lighthouse stations’ that perform triangulation alongside high-accuracy sensors onboard the drones. Ewing describes these lighthouse stations as similar to the base stations you would find with an HTC Vive. Utilising these technologies, Ewing and his team are able to track the drones directly, in real time, with high levels of accuracy.
Another integral part of Vicon’s offering has been data centralisation. All of the tracking information for every drone in a swarm can be centralised on one computer, which makes data collection considerably easier. As Ewing explained: “Traditional localisation methods that don’t use a centralised system mean that the drones themselves know where they are, but we don’t have a centralised system that does, unless we are directly communicating with each of the drones. That transmission of information comes at a cost, a cost that comes with complexities. If every drone has to communicate its position back, and we have a lot of drones, we’re going to overwhelm the radio. Through this new process we are able to avoid that potential issue.”
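The radio bottleneck Ewing raises scales linearly with swarm size, which a back-of-the-envelope calculation makes concrete. The update rate, packet size, and link speed below are assumed illustrative figures, not measurements from Ewing's lab.

```python
def radio_load(n_drones, update_hz, bytes_per_packet, link_bps):
    """Fraction of a shared radio link consumed if every drone has to
    report its own position back over that link."""
    bits_per_second = n_drones * update_hz * bytes_per_packet * 8
    return bits_per_second / link_bps

# Assumed figures for illustration: 100 Hz position updates, 32-byte
# packets, a 2 Mbit/s shared link.
small_swarm = radio_load(n_drones=10, update_hz=100, bytes_per_packet=32, link_bps=2_000_000)
large_swarm = radio_load(n_drones=100, update_hz=100, bytes_per_packet=32, link_bps=2_000_000)
```

Under these assumptions, ten drones use about 13% of the link, while a hundred drones would need more capacity than the link has; centralised camera tracking sidesteps the problem because positions never need to cross the radio at all.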
This, along with the precision and latency perks, is the main benefit of using motion capture technology within drone swarms. Through the use of this method, Ewing noted: “If you look at the noise in the marker tracking positions, we’re getting only about 0.2mm fluctuations in the readings, which is crazy precise for what we aim to use it for. On top of this, the latency is very low, low enough that it’s not the part of our system that causes any issues.” This low latency is essential to drone swarms, as latency causes well-known problems, particularly in control theory. “It’s great because now we don’t have to worry about that side of the equation. The lack of any noise gives us the easiest starting point possible when developing new control algorithms. If we can’t design a control algorithm when there is no noise or latency, then we don’t have a good control algorithm.”
By relying on precise state data, developers can focus on refining control algorithms without the added complexity of managing noise or latency in state estimation. This approach gives developers a solid starting point; if an algorithm can’t work under ideal conditions, it can be deemed ineffective much sooner. Then, when noise is eventually added, the motion capture system can still provide valuable data that shows how far actual performance deviates from the ideal state. This process can then allow for on-board estimation comparison, aiding in the development process.
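That comparison step, reintroducing noise and measuring how far performance drifts from the mocap-supplied ideal, might look something like the sketch below. The noise level and the `inject_noise` / `estimation_error` helpers are hypothetical illustrations, not part of the team's published tooling.

```python
import numpy as np

rng = np.random.default_rng(0)

def inject_noise(true_positions, sigma=0.01):
    """Degrade near-perfect mocap positions with Gaussian noise
    (sigma in metres) to mimic a realistic onboard estimator."""
    return true_positions + rng.normal(0.0, sigma, true_positions.shape)

def estimation_error(estimates, mocap_truth):
    """RMS deviation of estimated positions from mocap ground truth,
    i.e. how far actual performance strays from the ideal state."""
    return float(np.sqrt(np.mean(np.sum((estimates - mocap_truth) ** 2, axis=1))))

truth = np.zeros((100, 3))          # 100 timesteps of ground-truth positions
noisy = inject_noise(truth)          # simulated onboard estimates
rms = estimation_error(noisy, truth)
```

Because the mocap track serves as ground truth throughout, the same metric can also score a real onboard estimator against the cameras.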
Ewing, who has worked on high-level planning such as path planning to avoid collisions, appreciates this approach: “It’s really nice to have perfect state estimation so that I don’t need to worry about that side of things and focus solely on the planning side of things.” For Ewing, this more modular approach means that development can be considerably more streamlined, enabling focus on complex tasks by removing the need to solve state estimation at every step.
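With state estimation out of the way, the planning problem Ewing focuses on reduces to questions like whether two planned paths bring drones too close together. A minimal sketch of such a check, with an assumed safety radius (the 0.3 m separation is not a figure from the article):

```python
import numpy as np

def paths_collide(path_a, path_b, min_sep=0.3):
    """Check two time-synchronised waypoint paths (N x 3 arrays of
    positions in metres) for any timestep where the drones come
    closer than min_sep, an assumed safety radius."""
    dists = np.linalg.norm(path_a - path_b, axis=1)
    return bool(np.any(dists < min_sep))

# Two drones swapping ends of a corridor at 1 m altitude:
a = np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 1.0], [2.0, 0.0, 1.0]])
b = np.array([[2.0, 0.0, 1.0], [1.2, 0.0, 1.0], [0.0, 0.0, 1.0]])
print(paths_collide(a, b))  # True: they pass within 0.2 m at the midpoint
```

A planner would run a check like this over candidate paths and replan any pair that fails, without ever touching the lower-level estimation and control layers.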
Practical applications and challenges
Whilst research is great, the real bread and butter comes in the form of practical applications of drone swarm technologies, which is precisely what Ewing and his team are working to achieve despite the challenges that have arisen.
Ewing can see a future where this technology is applied particularly to high-level problem-solving in fields like automated warehousing, search and rescue, and environmental monitoring. In this regard, the focus lies in developing “planning algorithms” and adaptive “policies” for robotic swarms to function in unpredictable environments. We are already seeing the early fruits of drone swarms in the field, with swarms being developed in the UK to fight wildfires; these new approaches only increase the likelihood of more such projects.
Automated warehouses that feature hundreds or thousands of robots sorting and picking are nothing new; they are already in full swing, yet in their current state they are limited by path planning. Ewing describes the current process: “Right now, these robots in warehouses don’t use a camera-based motion capture system of any kind. What they instead do is use fiducial markers, or little ‘QR codes,’ or AprilTags to map the whole floor.” Whilst this approach is fine for ground robots completing repetitive tasks, it fails completely for more advanced applications that demand adaptability, both inside and outside the warehouse.
Robotics research at present is aiming to support a wide array of different applications, but the biggest issue with this is adaptability. “The hardest part is the flexibility that we demand from robots and that development of policies for robots that can adapt to a wide range of circumstances,” Ewing says.
In scenarios like search and rescue, the focus is on the ‘planning’ rather than state estimation or control, as this flexibility is paramount for situational adaptability. You can’t fully plan ahead in these circumstances, so developing policies that can adapt to a wide range of different scenarios is key.
Whilst Ewing’s current approach using the Vicon motion capture technology means some of his work is not immediately transferable to the real world, the key is that it improves the process of getting to that ‘real-world application’ phase. As Ewing puts it: “If everybody had to work on state estimation and control algorithms and that side of things, we wouldn’t actually ever be able to work on the stuff that actually makes it applicable to the real world.
“The fact we have this Vicon system means that we get to work on the high-level challenges, rather than being bogged down on the lower-level side of robotics.”
Looking ahead
Looking to the future, Ewing is optimistic about motion capture in robotics, especially within research and accessibility. For the most part, from a precision and latency standpoint, the technology is there. The limitation is no longer the camera system; the bottleneck has shifted to the radio. Ewing explains that it is the radio becoming overwhelmed which is the limiting factor in this approach, so that is what needs to be addressed moving forward.
Yet, even now, the key next step will be getting this technology into as many hands as possible to accelerate research efforts in all sorts of different fields that are applicable to the real world. “For the future, we just want other people to have it, because it’s a very impactful technology for research,” concludes Ewing.