4 technologies that will power future NASA missions

Despite the conclusion of the space shuttle program, NASA shows no signs of slowing down. The space agency’s recently released draft 2015 Technology Roadmaps outline NASA’s plans across 15 technology areas, ranging from nanotechnology to exploration systems.

Underlying all of NASA’s technology goals are the cross-cutting computing capabilities that support modeling and simulation, big data and machine intelligence. Supercomputing, efficient use of the cloud, and quantum and cognitive computing will all support NASA’s scientific advancements while providing efficiency and flexibility.

The high-performance computing NASA needs is not yet available, however. The challenges include:

Supercomputing. While NASA’s supercomputers are among the largest in the world, the space agency admits that its largest single computations can use only 70,000 processors and achieve just roughly 10 percent of those processors’ peak performance. Moreover, limits on system communication performance, programmability, reliability and power efficiency force NASA’s processors to be restarted from time to time. NASA’s goal is to achieve at least 1,000 times greater application performance, 10 times greater mean time between application failures and 500 times better energy efficiency.
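The gap between peak and sustained performance can be illustrated with back-of-the-envelope arithmetic. The 70,000-processor count and 10 percent efficiency figure come from the roadmap; the per-core peak rate below is a hypothetical placeholder, not a published NASA figure.

```python
# Illustrative arithmetic: sustained vs. theoretical peak performance.
cores = 70_000                 # processors used in NASA's largest single runs
peak_per_core_gflops = 10.0    # assumed per-core peak in GFLOP/s (hypothetical)
efficiency = 0.10              # roughly 10 percent of peak, per the roadmap

peak_tflops = cores * peak_per_core_gflops / 1000
sustained_tflops = peak_tflops * efficiency

# The roadmap's goal of 1,000x greater application performance would
# scale the sustained figure accordingly.
target_tflops = sustained_tflops * 1000

print(f"Theoretical peak: {peak_tflops:.0f} TFLOP/s")   # 700 TFLOP/s
print(f"Sustained:        {sustained_tflops:.0f} TFLOP/s")  # 70 TFLOP/s
print(f"1,000x target:    {target_tflops:.0f} TFLOP/s")  # 70000 TFLOP/s
```

Under these assumed numbers, only a small fraction of the hardware’s capability reaches the application, which is why the roadmap targets application performance rather than raw peak.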

Quantum computing. As more research organizations and universities explore quantum computing to solve challenging computational problems much faster than current models can manage, NASA has made advances through its partnership with D-Wave on a now-operational, 512-qubit specialized quantum computing device. A quantum computer could optimize mission planning and scheduling, resulting in greater productivity and lower risk. The challenge, NASA said, is to produce a quantum computer that can maintain coherence across 1,024 qubits.
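Annealers like D-Wave’s accept optimization problems in QUBO form: binary variables with an objective to minimize. The toy below is a hypothetical three-task scheduling conflict, solved by brute force in place of the annealer, purely to show the shape of the problem class, not NASA’s actual formulation.

```python
import itertools

# Toy QUBO: x[i] = 1 means task i is scheduled in the contested slot.
# Diagonal terms reward scheduling a task; off-diagonal terms penalize
# scheduling two conflicting tasks together. Values are illustrative.
Q = {
    (0, 0): -1.0, (1, 1): -1.0, (2, 2): -1.0,  # reward each scheduled task
    (0, 1):  2.0,                              # tasks 0 and 1 conflict
    (1, 2):  2.0,                              # tasks 1 and 2 conflict
}

def energy(x):
    """QUBO objective: sum of Q[i,j] * x[i] * x[j]; lower is better."""
    return sum(v * x[i] * x[j] for (i, j), v in Q.items())

# Exhaustive search stands in for the quantum annealer on 3 variables.
best = min(itertools.product([0, 1], repeat=3), key=energy)
print(best, energy(best))  # (1, 0, 1) -2.0
```

The minimum-energy assignment schedules tasks 0 and 2 and drops the conflicting middle task; a quantum annealer searches for such low-energy states over thousands of variables at once.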

Cloud supercomputing. While some applications have already been moved to the cloud, broader use requires more security and transparency so that NASA can use the cloud for “surge supercomputing,” which sends computation to public clouds when needed. Currently, it is too labor-intensive and expensive to move supercomputer computations to the cloud, and the maximum surge capacity is too small to make a meaningful difference. Within five years, NASA aims to automatically package computations for the cloud, increasing the agency’s computing capacity by 50 percent.
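The surge idea can be sketched as a simple overflow policy: jobs run in-house until local capacity is exhausted, and the remainder are packaged for a public cloud. The capacity figure and job sizes below are hypothetical; the real packaging step is what NASA says is currently too labor-intensive to automate.

```python
# A minimal sketch of "surge supercomputing" as an overflow policy.
LOCAL_CAPACITY = 100  # node-hours available in-house (hypothetical)

def schedule(jobs):
    """Split jobs (node-hour costs) into local and cloud-surge batches."""
    local, cloud, used = [], [], 0
    for job in jobs:
        if used + job <= LOCAL_CAPACITY:
            local.append(job)
            used += job
        else:
            cloud.append(job)  # packaged and sent to the public cloud
    return local, cloud

local, cloud = schedule([40, 30, 50, 20])
print("local:", local, "cloud:", cloud)  # local: [40, 30, 20] cloud: [50]
```

In this sketch only the 50-node-hour job overflows to the cloud; automating the “packaged and sent” step is the five-year goal the roadmap describes.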

Cognitive computing. While other government agencies and businesses are funding synaptic, brain-like processors, NASA noted that a cognitive computing programming paradigm has been developed for brain-like systems that bring “memory, processors and communication into close proximity to emulate the brain’s computing efficiency, size and power usage.” But cognitive computers will need up to 40 million times more neurons and synapses than current models, and far lower power consumption, before they can be widely deployed. Still, cognitive computing offers a way for machines to learn from examples and observation, interact verbally, find complex relationships in data and adapt to circumstances without reprogramming, all of which supports the big data analysis that will enable future missions.
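The “learn from examples rather than be reprogrammed” idea can be illustrated with the simplest trainable unit, a perceptron. The AND-gate training data is a toy stand-in, not a NASA workload, and this classical sketch only gestures at what neuromorphic hardware does with millions of neurons.

```python
# A minimal perceptron: weights are adjusted from labeled examples
# instead of being explicitly programmed.
def train(samples, epochs=10, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            out = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = target - out          # learning signal from the example
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

# Toy task: learn the AND function from four labeled examples.
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(samples)
predict = lambda x: 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
print([predict(x) for x, _ in samples])  # [0, 0, 0, 1]
```

After a few passes over the examples the unit reproduces the target behavior without anyone writing an AND rule, which is the adaptation-without-reprogramming property the roadmap highlights.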

About the Author

Mark Pomerleau is a former editorial fellow with GCN and Defense Systems.

