Smart computers for the battlefield
- By Mark Pomerleau
- May 26, 2015
Computers are more efficient than humans, especially when it comes to performing calculations at lightning speed, while humans can reason and adapt -- a task still beyond most algorithms. But researchers are developing tools to improve man-machine interfaces -- to the benefit of both.
Currently in the development pipeline is a way for soldiers to communicate with computers using brainwaves. Jean Vettel, a neuroscientist with the Army Research Lab, recently discussed headgear lined with brainwave-sensing electroencephalography (EEG) sensors. The sensors would feed data into a tablet or smartphone worn on the soldier's body, making the soldier a kind of "noisy sensor," Vettel said.
On an experienced soldier, the sensors could detect a “gut instinct” of heightened anxiety, fear and tension and transmit that information to the squad’s less-experienced teammates, making the whole team aware at the same time that danger is afoot.
Vettel is also using the EEG sensors to teach computers to identify images of threats.
The military’s drones, meanwhile, collect millions of images – more than can be processed by human analysts – so the Army wants to train computers to be able to recognize and differentiate among the images collected. In order to perform this task, however, computers need a database of threatening and benign images that an algorithm can learn from. That’s where the humans come in.
Soldiers will be able to mentally “tag” photographs as they appear in front of them while wearing the EEG headgear. The computer, in turn, will learn which images are threatening based on the mental tagging of soldiers.
“And then when we have images labeled, we can take those images and give it to a machine learning algorithm that can learn to distinguish between threatening or non-threatening images,” Vettel said. “We'll have used soldier expertise to train the algorithm.”
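The training loop described above can be sketched in miniature. Everything here is hypothetical for illustration: the feature vectors stand in for image descriptors, the +1/-1 tags stand in for soldiers' EEG-derived labels, and a simple perceptron stands in for whatever learning algorithm the Army would actually use.

```python
# Hypothetical sketch: learn a threat/benign decision rule from
# EEG-tagged training examples. Features and labels are invented.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Learn a linear decision rule from (feature, label) pairs."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):  # y is +1 (threat) or -1 (benign)
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * score <= 0:             # misclassified: nudge the boundary
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def classify(w, b, x):
    """Apply the learned rule to a new image's feature vector."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Toy "image features" (e.g. edge density, brightness) with EEG-derived tags.
features = [[0.9, 0.8], [0.8, 0.9], [0.1, 0.2], [0.2, 0.1]]
tags     = [1, 1, -1, -1]   # +1 = a soldier's EEG response flagged a threat

w, b = train_perceptron(features, tags)
print(classify(w, b, [0.85, 0.9]))   # → 1: resembles the tagged threats
```

The point of the sketch is the division of labor the article describes: the human supplies only the labels, and the algorithm generalizes from them to images no one has tagged.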
In related research, scientists from the Navy and the University of California, San Diego have created an algorithm to speed up the identification of mines on the ocean floor.
Researchers developed algorithms that could sift through a database of images and flag those that featured changes in pixel intensity between neighboring regions of an image, indicating a possible mine. Once the algorithms reduced the number of possible mine-like objects, humans viewed the filtered dataset while wearing EEG headgear that recorded whether their brain signals indicated they had seen anything of interest. Humans were able to identify mines more quickly once the computer algorithm narrowed the dataset.
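The pre-screening step can be sketched as a crude change-in-intensity detector. This is an illustrative guess at the idea, not the Navy's actual algorithm: the grid, block size, and threshold are all invented.

```python
# Hypothetical sketch: flag image regions whose mean pixel intensity
# differs sharply from a neighboring region -- a possible mine-like object.

def block_means(image, size):
    """Mean intensity of each size x size block in a 2D intensity grid."""
    rows, cols = len(image) // size, len(image[0]) // size
    means = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            total = sum(image[r * size + i][c * size + j]
                        for i in range(size) for j in range(size))
            means[r][c] = total / (size * size)
    return means

def flag_candidates(image, size=2, threshold=50):
    """Return coordinates of blocks whose mean intensity differs from a
    neighbor's by more than the threshold."""
    means = block_means(image, size)
    rows, cols = len(means), len(means[0])
    flagged = set()
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((0, 1), (1, 0)):      # right and down neighbors
                nr, nc = r + dr, c + dc
                if (nr < rows and nc < cols
                        and abs(means[r][c] - means[nr][nc]) > threshold):
                    flagged.update({(r, c), (nr, nc)})
    return sorted(flagged)

# A mostly uniform seafloor patch with one bright anomaly.
seafloor = [[10, 10, 10, 10],
            [10, 10, 10, 10],
            [10, 10, 200, 200],
            [10, 10, 200, 200]]
print(flag_candidates(seafloor))   # → [(0, 1), (1, 0), (1, 1)]
```

Only the flagged blocks would be passed along for human review, which is how the filtering cut the workload before the EEG-assisted viewing stage.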
"Computers are very good at finding subtle, but mathematically precise patterns, while people have the ability to reason about things in a more holistic manner, to see the big picture. We show here that there is great potential to combine these approaches to improve performance," said Ryan Kastner, professor of computer science at the University of California, San Diego.
Machines are also being developed to learn from their surroundings rather than from direct human interaction. Nano chips developed by HRL Laboratories’ Center for Neural and Emergent Systems – and funded by DARPA – were fitted to tiny drones weighing less than 3.3 ounces. The chip acts much like a human brain in that it processes what it sees and learns from its environment. During a demonstration, a drone was flown in an enclosed space and traveled through three rooms. The nano aircraft was capable of recognizing when it was in a new room and could identify familiar objects from its optical, ultrasound and infrared sensors. According to Defense Systems, this program stems from DARPA’s Systems of Neuromorphic Adaptive Plastic Scalable Electronics, or SyNAPSE, program, which was launched in 2008 to develop brain-like chips that could draw information from video feeds and other sensor data, and provide decision support.
Meanwhile, cognitive computing at the larger, higher end of the “smart” computing spectrum is being used in health care. IBM’s Watson computer – of Jeopardy! and now cookbook fame – has been used by doctors to narrow down options and pick the best treatments for patients, much like the military applications described above. IBM notes that doctors are still making the decisions, but cognitive computing helps humans make sense of large amounts of data – something that benefits both civilian workers and those on the front lines of national security, where time can be a matter of life or death.
Mark Pomerleau is a former editorial fellow with GCN and Defense Systems.