The House of Representatives has officially jumped on the open source bandwagon. A June 25 announcement declared that U.S. representatives, committees and staff would be able to procure open source software, participate in open source software communities and contribute code developed with taxpayer dollars to open source repositories.
Uncertainty had hung over the question of whether open source software, communications and code contributions were permitted within Congress because of restrictions relating to soliciting gifts. It has now been determined that — in general — members and staff in the House, when conducting official business, have a choice between using proprietary technology and open source solutions, according to the joint announcement by the OpenGov Foundation, the Sunlight Foundation and the Congressional Data Coalition.
Within Congress, support for open source software has been growing. In the next few weeks, Rep. Blake Farenthold (R-Texas) and Rep. Jared Polis (D-Colo.) plan to launch a House Open Source Caucus.
“We now have clear guidance on the use of open source software in the House of Representatives,” said Rep. Darrell Issa (R-Calif.). “Members of Congress and the open source community can work collaboratively to improve online access to the Congress and bring the institution more in line with other flexible, modern organizations that use open source solutions to realize cost savings and greater efficiency.”
In October 2014, the OpenGov Foundation, Sunlight Foundation and Congressional Data Coalition jointly called for rules changes that would permit the use and publication of open source software by House offices.
Posted on Jun 29, 2015 at 12:52 PM
Michigan has launched a geographic information systems open data website to let the GIS community search, preview, browse and download geospatial datasets or view them on Esri ArcGIS maps.
The site provides access to updated geospatial data on boundaries, geology, demographics, public health and other categories to help those in natural resources, public safety, environment, health and human services, transportation and tourism make more informed decisions.
The data can be downloaded as Esri shapefiles, spreadsheets or KML files, as well as accessed via API.
“This new site is a key piece of our overall efforts to make information open and available to citizens,” said David Behen, director of Michigan’s Department of Technology Management and Budget. “Pulling it all together in one place will improve the overall experience for everyone.”
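For the API access mentioned above, Esri-backed open data sites typically expose feature-service query endpoints that can return GeoJSON. The sketch below only illustrates that general pattern: the base URL, layer number and field names are made-up placeholders, not Michigan's actual service addresses.

```python
# Sketch: building a query URL against an Esri-style feature service.
# The endpoint and parameters are illustrative assumptions, not the
# real addresses of Michigan's GIS open data site.
from urllib.parse import urlencode

def build_geojson_query(base_url, layer_id, where="1=1", out_fields="*"):
    """Return a feature-service query URL requesting GeoJSON output."""
    params = {
        "where": where,          # attribute filter, e.g. "COUNTY='Wayne'"
        "outFields": out_fields, # comma-separated list of fields to return
        "f": "geojson",          # ask the server for GeoJSON instead of Esri JSON
    }
    return f"{base_url}/{layer_id}/query?{urlencode(params)}"

url = build_geojson_query(
    "https://example.com/arcgis/rest/services/Boundaries/FeatureServer",
    0,
    where="COUNTY='Wayne'",
)
```

The resulting URL could then be fetched with any HTTP client; the same endpoint style also backs the shapefile and KML downloads the site offers.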
Posted on Jun 24, 2015 at 1:41 PM
When BP’s Deepwater Horizon rig ruptured in the Gulf of Mexico in 2010, oil gushed into the water faster than agencies could respond. And the problem wasn’t just stopping the leak; it was informing the public about the extent of the damage and the progress on fixing it.
“The public imaging of this really wasn’t a home run for the Coast Guard at day one,” Adm. Paul Zukunft, Commandant of the U.S. Coast Guard, admitted in a recent keynote address at the Center for Strategic and International Studies.
So the Coast Guard worked with the National Oceanic and Atmospheric Administration to develop the Emergency Response Management Application (ERMA), an online mapping tool that integrates both static and real-time data in an easy-to-use format for environmental responders and decision makers.
By putting the data “out on the Internet,” Zukunft said, “people could navigate through it and not wait for the next CNN newscast” to find out what was happening with the oil spill.
Before long, the joint mapping application exploded. “[W]ithin 12 hours we had 200,000 hits…The next day it was two-and-a-half million. And then the public trust level went up as transparency of information went up as well,” he said.
The application was subsequently adapted for Alaskan oil spills in 2012.
"Arctic ERMA builds on the lessons we learned on usability, data management and data visualization from the Deepwater Horizon/BP disaster," said Amy Merten, then with NOAA’s Office of Response and Restoration.
Beyond visualization of oil spills, NOAA’s Data Integration, Visualization, Exploration and Reporting tool, or DIVER, manages and integrates data from the myriad sources that collected information during the five years following the Deepwater Horizon spill.
“NOAA pledged from the start of the Deepwater event to be as transparent as possible with the data collected,” said NOAA Administrator Kathryn D. Sullivan. “The DIVER data warehouse approach builds upon that original pledge and is another significant step in making NOAA’s environmental data available for the research community, resource managers and the general public.”
Posted on Jun 22, 2015 at 1:41 PM
Even the National Security Agency is using software defined networking these days.
Bryan Larish, the NSA's technical director for enterprise connectivity and specialized IT services, spoke at the recent Open Networking Summit in Silicon Valley, and said the intelligence agency is deploying an OpenFlow SDN for its internal operations.
OpenFlow is one of the leading SDN protocols that allows centralized control and easy reprogramming of the packet-moving decisions on a network.
“We as an enterprise need to be able to control our network,” Larish told CIO.com. “OpenFlow centralized control seemed the only viable way to do this from a technical perspective. We are all in on OpenFlow.”
NSA is testing an OpenFlow SDN at both its main campus and branch offices, CIO.com reports. At NSA headquarters, the deployment is limited to a small section of the network for development. The agency is also using OpenStack in its data centers, Larish said, and NSA is looking to other commercially available products to address its network integration and management needs.
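The centralized-control idea behind OpenFlow can be illustrated with a toy model: a single controller pushes match/action rules into the flow tables of every switch it manages, and switches forward packets by table lookup, punting unmatched traffic back to the controller. This is only a conceptual sketch; real OpenFlow is a binary wire protocol (flow_mod messages with priorities, counters and timeouts) spoken between a controller and hardware or virtual switches.

```python
# Toy model of OpenFlow-style centralized control. Not the actual
# protocol -- just the match/action, controller-programs-switches idea.

class Switch:
    def __init__(self):
        self.flow_table = []  # (match_fn, action) pairs installed by the controller

    def install_rule(self, match_fn, action):
        self.flow_table.append((match_fn, action))

    def handle_packet(self, pkt):
        for match_fn, action in self.flow_table:
            if match_fn(pkt):
                return action
        return "send_to_controller"  # table miss: punt to the controller

class Controller:
    """Central point of control: pushes rules to every managed switch."""
    def __init__(self, switches):
        self.switches = switches

    def push_rule(self, match_fn, action):
        for sw in self.switches:
            sw.install_rule(match_fn, action)

sw1, sw2 = Switch(), Switch()
ctrl = Controller([sw1, sw2])
# Block telnet (TCP port 23) network-wide from one place:
ctrl.push_rule(lambda p: p.get("dst_port") == 23, "drop")
```

The payoff Larish describes is exactly this shape: a policy change is made once at the controller rather than box-by-box across the enterprise.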
Posted on Jun 19, 2015 at 9:04 AM
The Environmental Protection Agency has added air pollution information to its database of compliance history. With this upgrade, users can now view and compare air quality data and facility compliance information on one web page, as opposed to searching through four different websites and databases.
The addition of air pollution data to the Enforcement and Compliance History Online (ECHO) database gives users a way to combine data and facility compliance information from the Toxic Release Inventory, National Emissions Inventory, Greenhouse Gas Reporting Program and Acid Rain Program.
Users can search by location, facility type, environmental conditions, year, emission amount, pollutant type, and enforcement/compliance actions and violations. Subcategories exist as well, helping users find exactly what they need.
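Combining search criteria like these amounts to filtering records on every supplied field at once. The sketch below shows that pattern in miniature; the field names and sample records are invented for illustration and do not reflect ECHO's actual schema or parameter names.

```python
# Toy sketch of multi-criteria facility search, in the spirit of ECHO's
# filters. Field names and records are made up for illustration.

facilities = [
    {"state": "MI", "type": "power_plant", "pollutant": "SO2", "year": 2014},
    {"state": "MI", "type": "refinery",    "pollutant": "NOx", "year": 2013},
    {"state": "OH", "type": "power_plant", "pollutant": "SO2", "year": 2014},
]

def search(records, **criteria):
    """Return records matching every supplied field=value criterion."""
    return [r for r in records
            if all(r.get(k) == v for k, v in criteria.items())]

hits = search(facilities, state="MI", pollutant="SO2")
```

Each additional keyword narrows the result set, which is how stacking location, facility type, year and pollutant filters zeroes in on exactly the facilities a user needs.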
This new feature adds to a string of recent improvements to ECHO:
- Dashboards featuring interactive graphs, charts and easy-to-analyze data have been added to illustrate facility compliance with pesticide information, as well as public water system violations and compliance data surrounding the Safe Drinking Water Act.
- ECHO widgets and web services let developers embed ECHO data and reports in their own websites.
- A new mapping tool lets users create customized maps showing the compliance status of EPA-regulated facilities, using current data.
Posted on Jun 15, 2015 at 9:32 AM