As NSF announces drone funding, experts preview the technology's potential
- By Matt Leonard
- Aug 04, 2016
The White House announced this week a wide range of initiatives in the field of unmanned aerial systems research. The goal is to figure out how emerging UAS technology will fit into the current communication and aerospace ecosystem while becoming a vital part of the emerging Internet of Things.
Standing in the lobby of the Newseum on Aug. 2, Sanjiv Singh, a professor of robotics at Carnegie Mellon University, told a crowd of onlookers about a drone that sat in a netted cage a few feet away. Its six blades, red-and-black color scheme, blinking lights and size separate it in appearance from most consumer drones. But the truly important bit sat on the bottom of the drone, in a silver cylinder: a laser scanner and small computer.
“It’s like a lighthouse,” Singh said. “It spins around 10 times a second, taking 300,000 measurements per second.”
The drone can use those measurements to build a map of its surroundings in real time -- a feature Singh demonstrated on a video screen above the crowd.
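As a rough illustration of how such a mapping step can work (this is a simplified sketch, not Singh's actual system -- the function and parameter names are hypothetical), each range reading from the spinning scanner can be projected from the drone's pose into world coordinates and accumulated into an occupancy grid:

```python
import math

def update_occupancy_grid(grid, pose, ranges, angle_step, cell_size):
    """Mark grid cells hit by laser returns from one scanner revolution.

    grid       -- dict mapping (ix, iy) cell indices to hit counts
    pose       -- (x, y, heading) of the drone, in meters and radians
    ranges     -- list of range readings from one revolution
    angle_step -- angular spacing between readings, in radians
    cell_size  -- grid resolution in meters
    """
    x, y, heading = pose
    for i, r in enumerate(ranges):
        theta = heading + i * angle_step
        hx = x + r * math.cos(theta)   # world coordinates of the return
        hy = y + r * math.sin(theta)
        cell = (int(hx // cell_size), int(hy // cell_size))
        grid[cell] = grid.get(cell, 0) + 1
    return grid

# One simulated revolution: four readings at 90-degree spacing, 2 m range
grid = update_occupancy_grid({}, (0.0, 0.0, 0.0),
                             [2.0, 2.0, 2.0, 2.0],
                             math.pi / 2, 0.5)
```

At 300,000 measurements per second, a real system repeats this projection continuously while also estimating the drone's own pose, which is the harder part of building a map in real time.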
The National Science Foundation is investing $35 million over the next five years into researching how machines like this can be used. The money will be used to study design, control and potential applications. New York state announced that it would be investing $5 million into research as well.
The Federal Aviation Administration and NASA, meanwhile, are already working on how to regulate UAS and integrate them with the national airspace.
In June, the FAA announced Part 107, which set the baseline rules for small UAS. It requires that drones weigh less than 55 pounds, that operators register and that flights be restricted to daylight hours, among other limitations. The FAA plans to release its next set of rules for public comment this winter; these will focus on using UAS near crowds, according to the agency.
But industry experts agree that as the technology improves, the rules will change along with it.
Dave Vos, who leads Project Wing at Google[X], told GCN that what needs to be done now is collect data through a lot of test flights.
“Data is hard to argue with,” he said.
In panel discussions that accompanied the announcements from NSF and other agencies, there was clear consensus on how the technology needs to evolve. First, public- and private-sector experts said, there must be more automation within the vehicles. Then come the improvements needed for safe beyond-line-of-sight operation.
John Hansman, a professor of aeronautics and engineering systems at the Massachusetts Institute of Technology, said greater automation would likely help with issues surrounding spectrum and the ability to ensure constant communication between a drone and its operator. Loss of spectrum often results in grounding in current testing with military UAS, Hansman said.
But with more automation, a drone could be programmed to know where to land if it loses spectrum.
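That kind of lost-link behavior amounts to a simple failsafe rule. The sketch below is purely illustrative -- the names, thresholds and zone list are hypothetical, not from any real autopilot -- but it captures the idea of a drone that already knows where to put down when the link drops:

```python
import math

def nearest_safe_landing(position, safe_zones):
    """Pick the closest preloaded landing zone to the current position."""
    px, py = position
    return min(safe_zones,
               key=lambda z: math.hypot(z[0] - px, z[1] - py))

def on_telemetry(position, seconds_since_last_packet, safe_zones,
                 link_timeout=5.0):
    """Decide each tick whether to continue or divert to a landing zone."""
    if seconds_since_last_packet > link_timeout:
        return ("LAND_AT", nearest_safe_landing(position, safe_zones))
    return ("CONTINUE", None)

# Two preloaded zones; the link has been silent for 9 seconds
zones = [(0.0, 0.0), (120.0, 40.0)]
action = on_telemetry((100.0, 50.0), 9.0, zones)
print(action)  # diverts to the nearer zone, (120.0, 40.0)
```

With rules like this on board, a communications dropout triggers a controlled landing rather than the grounding Hansman described in military testing.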
Intel CEO Brian Krzanich, meanwhile, defined three key areas where technology and automation must be improved: collision avoidance, communication and multidrone operation.
Technology that performs collision avoidance with high accuracy already exists, Krzanich said. As 5G becomes a reality, he predicted, it will help with communication and spectrum. Multidrone operation is also in use -- as an example, Krzanich showed a video of drones that flew in a coordinated pattern to orchestrate a light show in the sky, eventually spelling out the Intel logo. More practical uses of multidrone operations include search and rescue missions and infrastructure inspection, he said.
Combine all three initiatives, Krzanich said, and it boils down to the fact that drones must keep getting smarter.
Singh, the Carnegie Mellon professor, compared automated drones to self-driving cars, which he worked on in the early years of the technology. It took decades for cars to get to the point where they are today, he noted. But the transition for drones will be quicker because much of the technology is already there or within reach.
People at the Department of the Interior, the National Oceanic and Atmospheric Administration and in the electric industry, among others, are ready for the technology too. Drones are already being used to inspect power lines. Interior uses them for a multitude of situations including fire management, and NOAA has tested drones of all shapes and sizes as tools for gathering environmental imagery and other data.
DOI will have a training program for how to use UAS in search and rescue missions by 2018, the administration announced. By fiscal year 2019, the department will have a workflow in place for rapid data processing using the cloud. And by next year, agency officials hope to use drones to provide near-real-time wildfire information.
NOAA plans to research how to use UAS to improve data collection on ships, and the agency wants to look into how the technology can replace manned aircraft in roles that include gravity measurement.
Back in the Newseum lobby, Steven Krukowski, a Stanford PhD candidate, showed the crowd a video of a drone he helped design that can land on a moving target using onboard sensors.
“I know that UAVs and quadcopters are mainly viewed as flying cameras,” Krukowski said, “but with the systems that we’ve demonstrated today and their high-powered onboard computers, they’ve become so much more. A lot of them have become highly intelligent robots.”
Matt Leonard is a former reporter for GCN.