Disruptive tech: Sometimes you're the windshield ...
New systems and tools can enhance operational processes, foil adversaries
- By Henry Kenyon
- May 03, 2011
Throughout history, events often have been determined by disruptive technologies. The question for organizations and nations is whether they're on the right or wrong side of those technologies.
“If there’s going to be a disruption, the disruption can be done by you, or it can be done to you,” David McQueeney, vice president of software research at IBM Corp., said May 3 at the Department of Defense Intelligence Information Systems conference in Detroit. New technologies can be used to enhance operational processes, and they can also serve to disrupt adversaries, he said.
McQueeney proposed a new Moore’s Law for the 21st century. An updated measurement is necessary, he said, because the world is poised on the brink of major change: developments in ubiquitous sensor technology will drive the rise of the “smarter planet.” Although this process is already under way in the DOD, it will spread to the civilian government as sensors and actuators move to the very edges of systems, creating a smart infrastructure capable of providing data and of maintaining itself.
This new law reflects the reality that data is the real state of the world, he said. Data drives user understanding of battlefields, the actions of adversaries and the movement of markets. However, while computers will have more processing power in coming years, McQueeney said that the focus for system design will shift to managing data. Much of this data will be unfiltered and unstructured. “It isn’t about processing and bringing data to the processor,” he said.
IBM researchers are looking at four areas of computing developments:
- Using miniaturization to build entire systems on microchips.
- Optimizing systems for specific jobs/workloads.
- Managing and processing large datasets and developing machines that can scale to petabytes and exabytes of information.
- Natural language computing to improve how computers interact with humans.
Many of these capabilities will become important as government and commercial organizations continue to move to cloud-based models. Virtualization remains a key challenge for cloud technologies. Although the technique has been in use for some time, layering it over other processes will create difficulties, McQueeney said.
For example, enterprise firms are overwhelmed with virtual machine image files, he said. Not only must these images be stored and processed, but many contain legacy code that can be challenging to update or interpret. Another challenge involves updating legacy operating environments, a process that must be automated to work with modern software applications. “That’s very heavy lifting,” he added.
Among the upcoming developments, processing big data sets will be important to efforts such as signals intelligence. McQueeney said that faster computers capable of managing large amounts of data will open up new opportunities for development and require new types of analytics.
Massive data processing will have applications in other areas such as cybersecurity. The U.S. Air Force is working with IBM on a program to monitor network traffic to find a cyberattack’s command-and-control patterns to disrupt it within milliseconds, he said.
IBM also is focusing on cognitive and language capabilities for computers. The company’s recent work on the Watson language computing system is an attempt to develop a system capable of learning natural language. The computer system was designed to compete against human contestants on the "Jeopardy!" quiz show to demonstrate its language and syntax analysis capabilities.
Developing Watson was challenging because the system had to be able to ask questions, keep pace with human contestants and deliver precise answers. Because the system relies on a deep analytic approach to language, it performs better than a search engine at finding information. “It seems like an easy problem. In fact, it’s a devilishly difficult one,” he said.