The Obama administration made expansion of broadband networks one of its top technology agenda items in the belief that access to fast Internet connections would drive economic development, especially in rural and low-income communities.
A key piece of the strategy was to lower barriers for broadband providers to build out networks on federal property, roads and rights of way. And there’s a lot of land available: the federal government owns nearly a third of all property in the United States, on which sit about 10,000 buildings, according to a notice on the White House blog.
Last week the administration made available a set of tools to help companies choose sites to set up high-speed Internet access, particularly in underserved communities. The tools include:
An online mapping tool that displays all General Services Administration-owned buildings and lands, including contact information for assistance, and pointers to where commercial antennas might be best situated.
The map has interactive features highlighting information to help locate such sites, including the location of National Parks and other protected wilderness areas. The map was built with open government data, displayed in a new way to make it easier for carriers to take advantage of federal assets in planning or expanding their networks.
Another tool, the “dig once” guide, offers tips and policies to help telecom carriers schedule broadband and other network installations at the same time. According to the guide, coordinating the timing of network construction projects can cut costs by 90 percent.
The administration has also set up a “permitting dashboard” that can make it easier for companies to locate and complete paperwork surrounding a broadband project, including construction permits, lease agreements and other broadband application materials.
GSA is working to prepare a single master application for deploying broadband on federal land, which would help streamline the process for wireless and wireline network builds. The Agriculture Department has a similar streamlining tool under development, according to the White House.
In the next few weeks, the White House said, it would also launch an online broadband projects tool, to be located on the Transportation Department’s Federal Infrastructure Projects Permitting Dashboard, to help agencies identify their broadband projects and track their status for the public.
Posted on Sep 24, 2013 at 11:36 AM
Hurricanes are never good news, but they do make history. The National Oceanic and Atmospheric Administration has put a lot of that history in one place, with its Historical Hurricane Tracks website, which puts more than 170 years of global hurricane data into an interactive map.
The site serves up data on global hurricanes as they made landfall going back to 1842, long before hurricanes were given names, and provides links to information on tropical cyclones in the United States since 1958, and other U.S. storms dating back to 1851. The most recent addition to the site provides details on last year’s Hurricane Sandy.
Visitors to the site can search by location, storm name or ocean basin and select the search radius (in nautical miles, statute miles or kilometers). Selecting Miami, for example, will display a map of south Florida criss-crossed by the tracks of many a hurricane.
The tracks are color-coded to show each storm’s strength on the Saffir-Simpson Hurricane Wind Scale and how that strength changed over its course. Hovering the cursor over a track brings up the storm’s name, if it has one, in a table on the left. Clicking on the track or on the name in the table isolates that storm, so its track appears alone on the map, with the table showing the wind speed and air pressure when it hit land.
So weather fans can follow the track of 1992’s Andrew through south Florida, 2005’s Katrina when it hit New Orleans or the unnamed marauder that swept through Galveston, Texas, in 1900, still the deadliest hurricane in U.S. history.
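For readers curious about what that kind of lookup involves, here is a minimal sketch of a location search and Saffir-Simpson classification step, written in Python. The track points are invented for illustration (loosely modeled on Andrew and the 1900 Galveston storm), and the data layout is an assumption, not the site’s actual format.

```python
from math import radians, sin, cos, asin, sqrt

# Saffir-Simpson Hurricane Wind Scale thresholds (1-minute sustained winds, in knots)
SAFFIR_SIMPSON = [(137, 5), (113, 4), (96, 3), (83, 2), (64, 1)]

def category(wind_kt):
    """Return the Saffir-Simpson category for a wind speed, or 0 below hurricane strength."""
    for threshold, cat in SAFFIR_SIMPSON:
        if wind_kt >= threshold:
            return cat
    return 0

def distance_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in nautical miles (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * asin(sqrt(a)) * 3440.065  # mean Earth radius in nautical miles

def storms_near(tracks, lat, lon, radius_nm=65):
    """Yield the names of storms whose tracks pass within radius_nm of the search point."""
    for name, points in tracks.items():
        if any(distance_nm(lat, lon, p_lat, p_lon) <= radius_nm for p_lat, p_lon, _ in points):
            yield name

# Invented track points: (latitude, longitude, wind speed in knots)
tracks = {
    "ANDREW 1992": [(25.4, -78.0, 130), (25.5, -80.3, 145), (25.6, -81.2, 115)],
    "UNNAMED 1900": [(27.0, -93.0, 115), (29.1, -95.1, 125)],
}

for storm in storms_near(tracks, 25.77, -80.19):  # search around Miami
    peak = max(wind for _, _, wind in tracks[storm])
    print(f"{storm}: peak category {category(peak)}")
```

The real site, of course, runs this kind of search against NOAA’s full historical database and draws the results on an interactive map rather than printing them.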
Users can zoom in or out of the maps, select views by county and click links to details on a storm as well as NOAA’s report on that storm. The site also has information on population changes along U.S. coastal counties from 1900 to 2000, indicating the growing number of people and infrastructure at risk from hurricanes.
The site, which was developed by the NOAA Coastal Services Center along with the agency’s National Hurricane Center and National Climatic Data Center, offers a fairly comprehensive and easily customizable tool for checking a hurricane’s history. As hurricane season gets into its busiest months, it’s not a bad time to look back.
Posted on Sep 23, 2013 at 12:25 PM
IT security managers are under the gun, and lack the analytics tools necessary to neutralize – or even notice – serious threats to their networks, according to a recent survey on the use of security intelligence tools in a variety of organizations.
A study of 600 IT pros by SolarWinds, an IT management software vendor, and the SANS Institute found that most managers wanted “greater security visibility and context” but were operating with limited budgets for information security and compliance tools.
And though most respondents said they planned to invest in these tools, half of them were spending 20 percent or less of their IT budgets on security. The survey was designed to gauge how organizations use security analytics and intelligence to counter those threats.
Most reported having a problem with targeted attacks that were missed by antivirus and other point solutions. Forty-five percent of respondents said they had been hit in the last two years with one or more attacks that were “difficult to detect.” Another 20 percent said they lacked the visibility into their networks to answer the question at all.
The survey showed that such “difficult to detect” attacks took about a week to uncover, with the delays attributed to poor visibility or a failure to collect the right operational and security data to identify the threat.
The data used most often included log data from networks and servers, network monitoring data and data from applications and access control systems, according to the survey results.
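As a rough illustration of the kind of correlation such data makes possible, the sketch below scans parsed authentication events for a source that fails logins across several hosts, a pattern a single per-host tool might miss. The event format and threshold are invented for the example; they are not drawn from the survey or any particular product.

```python
from collections import defaultdict

# Invented, already-parsed events from server auth logs and network monitoring.
# A real deployment would pull these from a log collector or SIEM.
events = [
    {"source": "10.0.4.17", "host": "web01", "result": "fail"},
    {"source": "10.0.4.17", "host": "web02", "result": "fail"},
    {"source": "10.0.4.17", "host": "db01",  "result": "fail"},
    {"source": "10.0.9.3",  "host": "web01", "result": "ok"},
]

def flag_suspicious_sources(events, host_threshold=3):
    """Flag sources whose logins fail on several distinct hosts."""
    hosts_by_source = defaultdict(set)
    for event in events:
        if event["result"] == "fail":
            hosts_by_source[event["source"]].add(event["host"])
    return [src for src, hosts in hosts_by_source.items() if len(hosts) >= host_threshold]

print(flag_suspicious_sources(events))  # ['10.0.4.17']
```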
Organizations looking to acquire new security intelligence tools in the next year want to incorporate data from endpoint and server monitoring tools, as well as data associated with virtual and cloud systems. They are also looking for training, along with vulnerability management and security information and event management technology, according to the survey.
Security threats have become so pervasive that “it’s important for all IT pros to be equipped to tackle security challenges,” not just security experts, said SolarWinds vice president Sanjay Castelino.
Posted on Sep 20, 2013 at 12:15 PM
The New York City Department of Transportation is reusing data generated by the state’s popular RFID-enabled E-ZPass toll-paying system to feed other traffic management and analytics applications across the city, although critics say the agency has not made the extent of its traffic data sharing well known to the public.
According to a report in Forbes magazine, the E-ZPass data is being fed to Midtown in Motion, a traffic management program announced by the mayor’s office in 2011 that uses 100 microwave sensors, 32 video cameras and E-ZPass readers at 23 intersections to gauge traffic congestion in the heart of the city.
E-ZPass data and data from the other sources are gathered over the New York City Wireless Network and processed by the city’s Traffic Management Center in Long Island City, where they are used to highlight traffic choke points, adjust traffic light timing and ultimately help reduce the city’s greenhouse gas emissions, according to a 2011 announcement by the mayor’s office.
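In outline, the analytics are straightforward: match a tag seen at one reader with the same tag seen at the next reader, and the difference in timestamps is a travel time. The sketch below, with invented reader names and reads, shows one way that matching could look; the hashing step is only meant to suggest the short-lived, anonymized handling described for the program, not TransCore’s actual implementation.

```python
import hashlib

def anonymize(tag_id, salt="rotated-often"):
    """Hash the raw tag ID so it is never stored; in practice the salt would be
    rotated and the hashes held in memory only long enough to pair up reads."""
    return hashlib.sha256((salt + tag_id).encode()).hexdigest()[:12]

# Invented reads: (reader location, anonymized tag, timestamp in seconds)
reads = [
    ("34th-and-6th", anonymize("TAG123"), 1000.0),
    ("42nd-and-6th", anonymize("TAG123"), 1160.0),
    ("34th-and-6th", anonymize("TAG456"), 1010.0),
    ("42nd-and-6th", anonymize("TAG456"), 1400.0),
]

def segment_times(reads, start, end):
    """Pair each anonymized tag seen at both readers and return travel times in seconds."""
    first_seen = {}
    times = []
    for reader, tag, ts in sorted(reads, key=lambda r: r[2]):
        if reader == start:
            first_seen[tag] = ts
        elif reader == end and tag in first_seen:
            times.append(ts - first_seen.pop(tag))
    return times

times = segment_times(reads, "34th-and-6th", "42nd-and-6th")
print(f"average travel time on the segment: {sum(times) / len(times):.0f} seconds")
```

Aggregated over thousands of reads, segment times like these are what let traffic engineers spot choke points and retime signals.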
The total cost for installation of the system was $1.6 million, with $1 million in city funding and $600,000 in funding from the Federal Highway Administration. The DOT says the system resulted in a 10 percent improvement in travel speeds and reduced pollution in its first year.
Word of the expanded use of E-ZPass data, and of the terms and conditions under which it is reused and stored, has not been broadly circulated, Forbes reported. At the same time, the data itself is not kept around for very long.
TransCore, a company that makes the RFID readers, told Forbes that the tag ID data is encrypted and only held in memory for several minutes. The system “cannot identify the tag reader and does not keep any record of the tag sightings,” said a company spokesperson.
That does not satisfy some, including a Defcon conference attendee who aired his concerns about the future of open-ended personal data sharing policies and practices. “If NYDOT can put up readers,” he told Forbes, “other agencies could as well.”
Posted on Sep 17, 2013 at 12:25 PM
The prospects for using an abandoned stretch of the TV spectrum to bring wireless service to rural areas will get an extensive test in the months ahead, as the Gigabit Libraries Network pilots Super Wi-Fi at public libraries in six states.
Super Wi-Fi uses unlicensed, low-frequency bands of the radio spectrum known as TV white space, which the Federal Communications Commission opened up in 2010 after TV broadcasters switched from analog to all-digital signals. The lower frequencies limit throughput but greatly extend range compared with established Wi-Fi, letting signals travel several miles and pass through walls and buildings. The technology is seen as a potential solution for bringing wireless service to underserved, mostly rural, areas.
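The range claim follows from basic propagation math: in free space, path loss rises with frequency, so a lower-frequency signal goes farther on the same loss budget. Here is a back-of-the-envelope comparison in Python, using free-space loss only and 600 MHz as a stand-in for a UHF white-space channel; real-world gains from wall and foliage penetration are harder to model.

```python
from math import log10

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB for a distance in kilometers and a frequency in MHz."""
    return 20 * log10(distance_km) + 20 * log10(freq_mhz) + 32.44

wifi_mhz, tvws_mhz = 2437.0, 600.0   # a 2.4 GHz Wi-Fi channel vs. a UHF white-space channel

# The loss incurred over 100 meters at 2.4 GHz ...
budget_db = fspl_db(0.1, wifi_mhz)

# ... is only reached at roughly 4x the distance at 600 MHz, because for a fixed
# loss budget the reachable distance scales inversely with frequency.
reach_km = 0.1 * (wifi_mhz / tvws_mhz)

print(f"{budget_db:.1f} dB at 100 m on 2.4 GHz equals the loss at about {reach_km * 1000:.0f} m on 600 MHz")
```

That factor of roughly four is before accounting for the UHF signal’s better penetration of walls and terrain, which is where much of the practical advantage comes from.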
GLN put out the call July 1 for libraries interested in forming a consortium to test the technology and received submissions from more than 50 library systems, the group said in an announcement. It accepted proposals from Delta County, Colo.; Pascagoula, Miss.; Skokie, Ill.; Humboldt County, Calif.; eight libraries in New Hampshire; and four locations in Kansas: Kansas City, Lawrence, Manhattan and Topeka/Shawnee.
The library systems will deploy Super Wi-Fi access points on e-bookmobiles and in other publicly accessible places, GLN said.
Libraries, as a traditional source of public information, are a logical place to test the technology. About 15,000 libraries around the country currently have Wi-Fi access, but their short-range signals require people to be on premises. And another 1,500 libraries have no wireless access at all.
The national pilot, which grew out of a local initiative in Kansas City, will be “extremely important” in assessing Super Wi-Fi’s ability to help bridge the digital divide, Don Means, GLN coordinator, told Government Technology.
Super Wi-Fi projects have been somewhat slow to develop since the FCC freed up the spectrum, but pilots began cropping up this year. In January, Wilmington, N.C., and its surrounding New Hanover County launched the first municipal Super Wi-Fi network. And in July, West Virginia University deployed the first such network on a university campus.
Super Wi-Fi is not technically Wi-Fi, since it doesn’t conform to the IEEE 802.11 standards that the Wi-Fi Alliance certifies as Wi-Fi, but it operates on the same interoperable principles. GLN said the national pilot is an attempt to show how combining Wi-Fi compatibility with the far-reaching signals of the TV white spaces can deliver free wireless service.
Posted on Sep 16, 2013 at 1:28 PM