DARPA OFFSET

Controlling drone swarms with VR

Standard practice for piloting a drone has been one operator for each unmanned vehicle. But what if a single operator could control tens or hundreds of drones? That's a question the Defense Advanced Research Projects Agency is working on with its Offensive Swarm-Enabled Tactics (OFFSET) program.

As leader of one of the two OFFSET integrator teams selected in October 2017, Raytheon BBN Technologies is creating a virtual reality interface that allows a single user to control large groups of inexpensive unmanned vehicles. Northrop Grumman, the other lead integrator, is designing, developing and deploying an open architecture for swarm technologies that uses a game-based approach to enable the design and integration of swarm tactics, according to DARPA.

Raytheon has tested swarms with as many as 50 drones and plans to grow that number, according to Shane Clark, a scientist at Raytheon and principal investigator for the company's OFFSET efforts.

“The goal is to allow a single user to actually control, in real time, up to hundreds of air and ground vehicles that have different capabilities or are different models,” Clark told GCN.

To manage the swarm, Raytheon developed a VR interface. In testing, the drones communicate with the "swarm tactician" using a laptop over Wi-Fi, though the plan is to remain communications-platform-agnostic. The tactician interacts with the environment with an HTC Vive and a pair of controllers. Right now, the data is used only for real-time decision-making, so there isn't a storage component, though one could be added in the future, Clark said.

Because interacting with hundreds of individual drones would be complicated, they're managed in groups. Grouped drones can show their target area, assigned task, battery life and status of the communications link. Operators can drill down to see information on the individual drones too, Clark said.
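The grouping idea can be sketched as a simple rollup of per-drone status into a group-level view. This is a minimal illustration only; all class, field and function names here are invented, not Raytheon's actual software.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: per-drone status (battery, comms link) rolled up
# into the group-level summary the article describes.

@dataclass
class DroneStatus:
    drone_id: str
    battery_pct: float
    comms_ok: bool

@dataclass
class DroneGroup:
    name: str
    task: str
    members: list = field(default_factory=list)

    def min_battery(self) -> float:
        """Worst-case battery level across the whole group."""
        return min(d.battery_pct for d in self.members)

    def comms_healthy(self) -> bool:
        """True only if every member's communications link is up."""
        return all(d.comms_ok for d in self.members)

group = DroneGroup(
    name="alpha",
    task="survey northeast block",
    members=[
        DroneStatus("uav-01", 82.0, True),
        DroneStatus("uav-02", 64.5, True),
    ],
)
print(group.min_battery())    # 64.5
print(group.comms_healthy())  # True
```

An operator's "drill down" would then just be reading the individual `DroneStatus` records behind the group summary.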

The swarm itself is designed to act as a mobile ad-hoc network, with each drone acting as a link connecting the entire swarm together.
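In a mobile ad-hoc network like the one described above, the swarm stays usable only while the link graph remains connected, with each drone relaying for its neighbors. A toy model, assuming drones as graph nodes and radio links as edges (all identifiers invented):

```python
from collections import deque

def is_connected(links: dict) -> bool:
    """Breadth-first search: can every drone be reached from any one drone?"""
    if not links:
        return True
    start = next(iter(links))
    seen = {start}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in links.get(node, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return len(seen) == len(links)

# uav-03 relays between the two halves of the swarm; losing it
# would partition the network.
links = {
    "uav-01": ["uav-02", "uav-03"],
    "uav-02": ["uav-01"],
    "uav-03": ["uav-01", "uav-04"],
    "uav-04": ["uav-03"],
}
print(is_connected(links))  # True
```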

The interface is currently capable of simple commands, like selecting a subset of the swarm and tasking it to move to a particular area, or asking drones to spin in place to get a sustained view of the surrounding environment. Many of the drones will be outfitted with electro-optical cameras capable of image recognition, and others will be equipped with lidar to allow for 3-D mapping.
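The two commands described above can be sketched as a tiny tasking API: filter the swarm down to a subset, then issue move or spin orders to it. This is a hypothetical illustration; the function names, order format and battery threshold are all invented.

```python
# Sketch of selecting a subset of the swarm and tasking it,
# per the commands the article names. Not an actual Raytheon API.

def select_subset(swarm, predicate):
    """Pick the drones matching a condition, e.g. enough battery left."""
    return [d for d in swarm if predicate(d)]

def task_move_to(drones, area):
    """Issue a move order to each drone; orders are plain dicts here."""
    return [{"drone": d["id"], "command": "move_to", "area": area} for d in drones]

def task_spin_in_place(drones):
    """Ask drones to rotate in place for a sustained 360-degree view."""
    return [{"drone": d["id"], "command": "spin_in_place"} for d in drones]

swarm = [
    {"id": "uav-01", "battery": 82},
    {"id": "uav-02", "battery": 18},
    {"id": "uav-03", "battery": 64},
]
# Only task drones with enough charge for the job.
healthy = select_subset(swarm, lambda d: d["battery"] > 25)
orders = task_move_to(healthy, area="northeast block")
print([o["drone"] for o in orders])  # ['uav-01', 'uav-03']
```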

Raytheon is working on a capability that would allow the operator to use the VR environment to draw around an area to be mapped, select the drones to complete the task and then issue a voice command to map the area, Clark said.

OFFSET is being conducted in a series of "sprints," and groups of "sprinters" will be selected through the solicitation process to develop applications in several technology areas. The first sprint, released last year, addressed advancements in swarm tactics for a mixed swarm of 50 air and ground robots operating in an urban environment over 15 to 30 minutes. Relevant capabilities include mapping abilities, locating entry and exit points, deploying sensor networks and maintaining connectivity for warfighters, according to the broad agency announcement.

There have been three contracts awarded this year as part of the OFFSET program -- to Soar Technology, Charles River Analytics and Lockheed Martin -- according to updates on the original BAA.

The VR environment will be leveraged by the different "sprinters," who will use it along with AirSim, the open source flight simulator developed by Microsoft, to test the tactics they've been chosen to develop.

“In simulation it's easy to postulate a new sensor that gives you a particular capability and see how that might inform what sorts of tactics you could accomplish,” Clark said. “One of the examples they gave in the BAA was what if you had a camera that could see through walls, how would that change things?"

"We’re really interested in understanding some of that more speculative technology," he said, "so that we’re not just building a pile of parts that will become obsolete, but be forward-looking.”

About the Author

Matt Leonard is a reporter/producer at GCN.

Before joining GCN, Leonard worked as a local reporter for The Smithfield Times in southeastern Virginia. In his time there he wrote about town council meetings, local crime and what to do if a beaver dam floods your back yard. Over the last few years, he has spent time at The Commonwealth Times, The Denver Post and WTVR-CBS 6. He is a graduate of Virginia Commonwealth University, where he received the faculty award for print and online journalism.

Leonard can be contacted at mleonard@gcn.com or followed on Twitter @Matt_Lnrd.
