NASA’s systems for sharing code

NASA has been creating code for decades and boasts more than 300 public open-source projects. The agency’s challenge is not getting buy-in for open source so much as it is managing the enthusiasm for it.


Jason Duley, NASA’s open-data program manager, said the first tool for doing that is a departmentwide internal GitHub instance designed to get “those source codes to the widest audience within NASA as possible.”

“We’ve seen a fair amount of ad hoc collaboration,” he added. “This ultimately saves the taxpayer money … and boosts productivity and the quality of our software internally.”

However, NASA “is like a large corporation with a bunch of little franchises” that have been writing software for decades, Duley said. “Everyone has their own way of developing code.”

So although the internal GitHub deployment is available to all, there are “dozens of code-sharing instances … spread out all over the agency,” he said. Some teams use Subversion, others are attached to Mercurial or their own Git repositories, and they “have varying degrees of visibility.”

Mandating that everyone migrate to the internal GitHub instance was not practical, Duley said, so the agency developed a federated code-sharing system: “a set of policies and technologies that logically integrate these disparate code” repositories.

The result is a search function that allows any developer in the agency “to see what sort of software is available inside of NASA, irrespective of where it’s physically housed. We can’t expect our folks to know where all the code is, so we’re trying to level the playing field and improve discoverability and reusability.”

Most of the software repositories are private, Duley said, so not all code is freely shared “cube-to-cube.” But by collecting metadata on the private projects, the federated system can at least offer “some basic project info and a point of contact.”
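The federated approach Duley describes can be illustrated with a small sketch. The record fields, host names and the private-project redaction below are hypothetical, not NASA's actual schema; the point is simply that metadata harvested from many repository systems can be merged into one keyword-searchable index, with private projects reduced to basic info and a point of contact:

```python
from dataclasses import dataclass

@dataclass
class RepoRecord:
    """Minimal metadata harvested from one code-hosting instance."""
    name: str
    host: str          # e.g. "internal-github", "svn-center-a"
    description: str
    contact: str
    visibility: str    # "public" or "private"

def search(index, keyword):
    """Return records whose name or description mentions the keyword.

    Private projects are returned too, but stripped down to a name
    and a point of contact, mirroring the federated system above.
    """
    kw = keyword.lower()
    hits = [r for r in index
            if kw in r.name.lower() or kw in r.description.lower()]
    return [
        r if r.visibility == "public"
        else RepoRecord(r.name, r.host, "(private project)", r.contact, r.visibility)
        for r in hits
    ]

# Sample index merged from two different hosting systems.
index = [
    RepoRecord("orbit-sim", "internal-github",
               "Orbital mechanics simulator", "a@nasa.example", "public"),
    RepoRecord("telemetry-db", "svn-center-a",
               "Telemetry ingest pipeline", "b@nasa.example", "private"),
]
results = search(index, "telemetry")
```

A developer searching for “telemetry” would find the private project exists and whom to ask about it, without seeing its code.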

“Part of this is the policy side,” he said. “We’re working with our policy people to put down just exactly how folks should be doing this ... to reach the benefits down the road in terms of reuse and cost savings. We’re trying to make it as lightweight as possible.”

That centralized information also simplifies efforts when NASA makes code available to the general public in a growing online catalog. Many if not most of those projects are now community-driven and live in public repositories, though Duley said some software is simply listed as “available to be licensed.”

And in some cases, a larger project might be kept inside NASA, but a particular module with broader applications would be open sourced and publicly released. In those situations, Duley said, that module can be “rehomed” to the public-facing repository; all the version history is maintained, and NASA developers can continue to work seamlessly on the overall application.

“That’s the beauty of Git,” he said. “From a DevOps standpoint, you can mix and match different repositories to fit how you do those builds.”
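One way to “rehome” a subdirectory with its history intact is Git's subtree split, which rewrites the commits touching one path onto a standalone branch that can be pushed to a public repository. The sketch below builds a throwaway repo to demonstrate; the paths and module name are made up, and this shows one common technique rather than NASA's specific workflow:

```python
import pathlib
import subprocess
import tempfile

def run(cmd, cwd):
    """Run a git command in the given repo and return its stdout."""
    result = subprocess.run(cmd, cwd=cwd, check=True,
                            capture_output=True, text=True)
    return result.stdout.strip()

# Build a throwaway repository containing a module/ subdirectory.
repo = tempfile.mkdtemp()
run(["git", "init", "-q"], repo)
run(["git", "config", "user.email", "dev@example.com"], repo)
run(["git", "config", "user.name", "Dev"], repo)
pathlib.Path(repo, "module").mkdir()
pathlib.Path(repo, "module", "util.py").write_text("VERSION = 1\n")
run(["git", "add", "."], repo)
run(["git", "commit", "-q", "-m", "add module"], repo)

# Split module/'s history onto its own branch; the commit that touched
# the module is preserved, and the branch can be pushed to a public repo.
sha = run(["git", "subtree", "split", "--prefix=module",
           "-b", "module-public"], repo)
print(run(["git", "log", "--oneline", "module-public"], repo))
```

Developers keep committing to the main repository, and the public branch can be re-split or merged back, which is the “mix and match” flexibility Duley alludes to.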

Duley also said bigger benefits are still to come. The federated code-sharing system is only a year old, and as the dataset improves, it “enables us to do a few things to proactively improve the software that NASA produces.”

License management is one example. “What are the third-party licenses or dependencies being pulled into existing projects that could have licensing that could be restrictive or could deny NASA proper rights to do things that they need to do with that software?” he asked. A central repository makes such dependencies much easier to spot and manage.
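A dependency-license audit of the kind Duley raises can be sketched in a few lines. Everything in this example is illustrative: the project names, the package `copyleft-lib` and the license map are invented, and a real audit would read manifests and license metadata from the repositories themselves:

```python
# Hypothetical dependency lists, as might be harvested from package
# manifests across the agency's repositories.
PROJECT_DEPS = {
    "orbit-sim": ["numpy", "copyleft-lib"],
    "telemetry-db": ["requests"],
}

# Hypothetical package -> SPDX license identifier map.
DEP_LICENSES = {
    "numpy": "BSD-3-Clause",
    "requests": "Apache-2.0",
    "copyleft-lib": "GPL-3.0-only",   # made-up package name
}

# Licenses the legal team wants reviewed before reuse or release.
RESTRICTIVE = {"GPL-3.0-only", "AGPL-3.0-only"}

def audit(deps, licenses, watchlist):
    """Map each project to its dependencies that need license review."""
    flagged = {}
    for project, packages in deps.items():
        hits = [p for p in packages if licenses.get(p) in watchlist]
        if hits:
            flagged[project] = hits
    return flagged

findings = audit(PROJECT_DEPS, DEP_LICENSES, RESTRICTIVE)
```

With the sample data, only `orbit-sim` is flagged, for its GPL dependency; run against a central index, the same pass would surface restrictive licenses agencywide.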

Looking for vulnerabilities in NASA-developed code is another. “The goal there would be to set up a set of tools to do static analysis on software and proactively look at code repositories that exist in the agency [and] scan those for any vulnerabilities based on a weak library,” Duley said.
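The “weak library” check reduces to comparing each project's pinned dependencies against an advisory feed. The sketch below uses a hardcoded advisory table and an invented package name for illustration; production scanners pull this data from feeds such as OSV or the NVD:

```python
# Illustrative advisory data: package -> versions with a known
# vulnerability. "parselib" is a hypothetical package.
ADVISORIES = {
    "parselib": {"1.0.2", "1.0.3"},
}

def scan(pinned):
    """Return (package, version) pairs that match a known advisory."""
    return [(pkg, ver) for pkg, ver in pinned.items()
            if ver in ADVISORIES.get(pkg, set())]

# Pinned dependencies, as parsed from a project's requirements file.
pinned = {"parselib": "1.0.2", "numpy": "1.26.4"}
findings = scan(pinned)
```

Pointed at every repository in the federated index, a scan like this is what would let the agency catch a vulnerable library proactively rather than project by project.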

Those efforts remain a work in progress. “We haven’t gotten funding yet, but what we do have is a collection of tools that are available across the agency that we’re able to leverage,” he said. And NASA is working on “new business processes and policies to be able to do that more routinely.”

The agency has “been doing this conceptually for years,” Duley said, pointing to the teams that actively scan NASA websites for vulnerabilities. “Why not do the same thing for software projects and try to proactively catch as many of these issues as we can early on?”

About the Author

Troy K. Schneider is editor-in-chief of FCW and GCN, as well as General Manager of Public Sector 360.

Prior to joining 1105 Media in 2012, Schneider was the New America Foundation’s Director of Media & Technology, and before that was Managing Director for Electronic Publishing at the Atlantic Media Company. Schneider also helped launch political news sites in the mid-1990s and worked on the earliest online efforts of the Los Angeles Times and Newsday. He began his career in print journalism and has written for a wide range of publications, including The New York Times, Slate, Politico, National Journal and Governing.

Schneider is a graduate of Indiana University, where his emphases were journalism, business and religious studies.

Connect with him on Twitter: @troyschneider.

