Robots working together tap cloud data for navigation
Researchers have shown that a team of autonomous robots can navigate unknown terrain faster together than individually. Their new approach lets the robots pool data gathered in real time, analyze it collectively and plot an unmapped path over hazardous terrain in the dark as a team.
The robot navigation system is based on a centralized cloud housing data gathered in real time from vision systems in all the robots. Each robot draws on that data and applies algorithms to help it dynamically triangulate paths less likely to contain obstacles and communicate that information to the others.
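The pooling idea can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it assumes a hypothetical shared store (`CloudMap`) where every robot publishes obstacle detections on a grid, so a robot can avoid a cell it has never seen itself because a teammate reported it.

```python
from collections import defaultdict

class CloudMap:
    """Hypothetical centralized store: grid cell -> obstacle reports from any robot."""
    def __init__(self):
        self.obstacle_counts = defaultdict(int)

    def report_obstacle(self, cell):
        # Any robot publishes an obstacle detection for a grid cell.
        self.obstacle_counts[cell] += 1

    def risk(self, cell):
        # More independent reports mean higher estimated risk for every robot.
        return self.obstacle_counts[cell]

def choose_step(cloud, candidates):
    """Pick the candidate cell with the fewest pooled obstacle reports."""
    return min(candidates, key=cloud.risk)

cloud = CloudMap()
# Robot A and robot B each report what their own sensors saw.
cloud.report_obstacle((1, 0))
cloud.report_obstacle((1, 0))
cloud.report_obstacle((0, 1))
# Robot C, which has observed none of these cells itself, still avoids them.
print(choose_step(cloud, [(1, 0), (0, 1), (1, 1)]))  # (1, 1)
```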
The researchers took a three-pronged approach to their robot team's decision-making capacity. "We presented a solution of combined dead reckoning, data transferring and machine vision, based on our research group's original laser-based real-time technical vision system," said Mykhailo Ivanov, an engineer at the Universidad Autónoma de Baja in Mexico and one of the study's lead authors. Previous efforts, he added, tackled each problem separately.
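Dead reckoning, the first of those three prongs, simply integrates each robot's own motion into a running pose estimate. The sketch below shows the textbook version under simplified assumptions (planar motion, ideal odometry); the paper combines such estimates with shared machine-vision data to correct the drift dead reckoning accumulates on its own.

```python
import math

def dead_reckon(x, y, heading, distance, turn):
    """One dead-reckoning update: fold a measured turn and travel distance
    into the robot's estimated pose (illustrative textbook form only)."""
    heading += turn
    x += distance * math.cos(heading)
    y += distance * math.sin(heading)
    return x, y, heading

pose = (0.0, 0.0, 0.0)
# Drive 1 m forward, then turn 90 degrees and drive 1 m again.
pose = dead_reckon(*pose, distance=1.0, turn=0.0)
pose = dead_reckon(*pose, distance=1.0, turn=math.pi / 2)
print(round(pose[0], 6), round(pose[1], 6))  # 1.0 1.0
```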
Basic four-wheeled robots traveled across obstacle courses designed to give each unit unique blind spots. The robots' "vision" came from a simple laser paired with two sensors that evaluated the reflected light to gauge position and distance.
The robots' ability to move through difficult terrain could make them useful in disaster response, where they could be sent into small or dangerous areas to collect data on victims or a building's structural integrity.
The next step in this research, according to Ivanov, will be to improve the robotic vision, which would make their cloud-based navigation teamwork system potentially useful across a wider array of industries and applications.