NOAA’s weather data calls for clouds
- By Zach Noble
- Jul 02, 2015
Producing 20 terabytes of data every day, the National Oceanic and Atmospheric Administration needs extensive storage and computing power to make all that information accessible. With its Big Data Project, NOAA is pairing its data with private-sector cloud infrastructure to deliver its data products to the public.
Making data public is part of NOAA’s mission, but that doesn’t mean the organization is fully equipped to disseminate that data on its own, Alan Steremberg, NOAA’s big data-focused Presidential Innovation Fellow, said at AFCEA Bethesda’s June 30 Data Symposium.
Steremberg would know.
He helped start Weather Underground and later went to work for the Weather Channel after it bought Weather Underground, giving him a firsthand look at two commercial ventures with business models based on packaging NOAA’s information.
Between real-time data and archived data, “these things are just really big and hard to move around,” Steremberg said. NOAA’s challenge became, “can we put compute alongside this data?”
In April, NOAA inked deals with Amazon Web Services, Google, IBM, Microsoft and the Open Cloud Consortium to do just that.
There has been “tremendous” industry uptake on NOAA’s data, Max Peterson, general manager at AWS Worldwide Public Sector, told FCW, GCN’s sister site.
“Data’s interesting, there’s tremendous value in data, but the second half of that is having a partner ecosystem that turns around and makes that data relevant to something that somebody cares about,” said Peterson.
The AWS partner ecosystem pairs NOAA’s data with companies that can use it – say, Esri for geospatial mapping – and Peterson said AWS’s cloud infrastructure will deliver ever more compute power to wrangle the data without breaking the bank.
AWS’s Elastic File System, previewing this summer, enables users to pay only for the storage they use, Peterson touted. “It’s elastic, it’s on-demand, it’s scalable.” And AWS Lambda enables users to pay only when events come through and code runs, on the order of 20 cents per million events.
That model has big implications for Internet of Things deployments such as NOAA’s, with myriad connected devices reporting constantly, and the overall flexibility of cloud means that vast data sets can be analyzed in their entirety rather than broken into sample-set chunks.
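As a rough illustration of the pay-per-event model Peterson describes, a Lambda-style function runs only when a report arrives, so cost accrues per invocation rather than for idle servers. The sketch below is hypothetical – the station names, field names, and threshold are invented for illustration and are not NOAA’s actual data format or pipeline:

```python
import json

def handler(event, context=None):
    """Hypothetical event-driven handler: invoked once per incoming
    sensor report, so compute is billed only when data arrives."""
    reading = json.loads(event["body"])
    # Flag readings outside a plausible surface-temperature range
    # (illustrative threshold, not an operational QC rule).
    status = "ok" if -90.0 <= reading["temp_c"] <= 60.0 else "suspect"
    return {
        "statusCode": 200,
        "body": json.dumps({"station": reading["station"], "status": status}),
    }

# Example invocation with a made-up station report:
sample = {"body": json.dumps({"station": "KSEA", "temp_c": 18.5})}
print(handler(sample)["statusCode"])  # → 200
```

Because each invocation is independent and stateless, the same handler can absorb a trickle of reports or a burst from thousands of devices without pre-provisioned capacity – the flexibility Peterson credits with making whole-data-set analysis affordable.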
“All of a sudden on AWS they can spin up 1,000 cores, or 5,000 cores, or however much compute they want to acquire to run their job,” Peterson said. “You can now run 100 percent of your data through whatever your analytic program is.”
From Steremberg’s perspective, private partnerships help ensure a “customer-first approach” at NOAA, with a focus on getting NOAA data on the “things that really drive our economy,” from weather data to ocean currents, packaged right.
“We’ll hopefully learn that there’s a model where everyone can make lots of money,” Steremberg added, saying startups and established companies alike could benefit from the cloud revolution.
In a landscape punctuated by costly mistakes and outright failures, NOAA Data Asset Portfolio Analyst David McClure offered some of the highest praise a federal technology program can get.
“What I’ve been struck with [is] things have been unfolding the way we expected they would,” McClure said of the Big Data Project at AWS’s annual Public Sector Symposium last week. “In a way, that’s surprising – that it’s working the way it’s supposed to.”
Amanda Ziadeh contributed to this report.
Zach Noble is a former FCW staff writer.