By GCN Staff

Amtrak moves real-time route maps to the Google cloud

Many urban public transportation systems offer riders GPS-based apps that show arrival times for the next bus or subway. Now Amtrak is joining them, striking a deal to have its national mapping data hosted in the Google cloud.

Amtrak said it would tap Google’s Maps Engine to offer an interactive train locator map, giving its 31 million customers a way to check a train’s position and see when it will arrive at the station. After buying tickets, checking train arrival times is the second most popular online activity at Amtrak.

The system works by taking near real-time train location data from GPS devices on each train. As a train passes by sensors near the tracks, location information is pushed into Google Maps Engine, along with station data from Amtrak’s content management system.
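The article describes a simple data flow: a GPS ping from a trackside sensor is combined with station metadata and pushed into a hosted map layer. As a rough sketch of that merge step (the record shape, field names and `stations` lookup here are illustrative assumptions, not Amtrak's or Google's actual schema), a ping could be turned into a GeoJSON-style feature like this:

```python
from dataclasses import dataclass

@dataclass
class TrainPing:
    train_id: str
    lat: float
    lon: float
    timestamp: int  # Unix epoch seconds reported by the trackside sensor

def build_map_feature(ping, stations):
    """Merge one GPS ping with station metadata into a GeoJSON-style
    Feature -- the kind of record a hosted map layer could ingest."""
    return {
        "type": "Feature",
        # GeoJSON orders coordinates longitude-first
        "geometry": {"type": "Point", "coordinates": [ping.lon, ping.lat]},
        "properties": {
            "train": ping.train_id,
            "reported": ping.timestamp,
            # Station data comes from a separate system, keyed by train
            "next_station": stations.get(ping.train_id, "unknown"),
        },
    }
```

In a real pipeline the resulting features would be batched and uploaded to the mapping service; the sketch only shows how the two data sources (train position and station content) meet in one record.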

Steve Alexander, Amtrak’s creative director of e-commerce, said in a blog post that with Google handling the cloud infrastructure, Amtrak’s e-commerce team will be freed up to develop "more ways to make our map traveler-friendly, like adding information about local transit, restaurants and nearby tourist attractions."

Posted on Oct 10, 2013 at 10:26 AM

4 takeaways from HealthCare.gov launch

Call them glitches or failures, errors or slowdowns, but the Patient Protection and Affordable Care Act health exchanges are off to a bumpy start. James Turner, writing for O’Reilly Programming, put together a list of what developers can learn from the launch of HealthCare.gov:

Load testing: Because of the scale of the traffic, developers “need to really bang on the core functionality of the site, and tune the heck out of it.”

Functional design: Developers using JavaScript and AJAX for transitions and requests need to be very tolerant of intermittent failures on the back end.

Validation logic: Keep the client code, server code, error messages and instructions in sync.

User experience: Test, test, test. 
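The "tolerant of intermittent failures" advice boils down to a retry loop around every back-end call. As a minimal sketch (the function and parameter names are illustrative, not from any of the HealthCare.gov code), a client could retry a flaky request with exponential backoff rather than surfacing the first transient error to the user:

```python
import time

def call_with_retries(request_fn, attempts=4, base_delay=0.5):
    """Retry a flaky back-end call with exponential backoff,
    raising the last error only after every attempt fails."""
    last_error = None
    for attempt in range(attempts):
        try:
            return request_fn()
        except ConnectionError as exc:
            last_error = exc
            if attempt < attempts - 1:
                # Back off: 0.5s, 1s, 2s... before the next try
                time.sleep(base_delay * (2 ** attempt))
    raise last_error
```

The same pattern applies on the JavaScript/AJAX side Turner describes; the point is that each request assumes the back end may fail intermittently and degrades gracefully instead of breaking the page.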

Posted on Oct 09, 2013 at 1:23 PM

Power transformers a weak point in the electric grid

Solar weather storms and terrorist attacks are two of the greatest potential threats to the U.S. power grid, but vulnerabilities in large power transformers (LPTs) may also pose a high risk, according to panelists at the recent Data Center World conference.

LPTs help utilities manage the transmission of electricity and adjust the electric voltage on each segment of the grid from generation to the end user, according to a report on the conference by Data Center Knowledge.

They can be a particularly vulnerable part of the power grid because LPTs are custom-designed, cost millions of dollars to replace and, at 100 to 400 tons apiece, are difficult to transport. According to a 2012 Energy Department report, replacing LPTs could take 20 months or more given the complex procurement rules for the technology and the fluctuating price of copper and electrical steel in recent years. 

Consequently, damage to one or more LPTs could expose utilities to significant downtime and further threats. “Most utilities have few spare transformers,” Tom Popik, founder of the Foundation for Resilient Societies, told the conference. 

Popik recommended data center managers engage with public utilities and elected officials to make them aware of this and other threats to the power grid, according to the Data Center Knowledge report. 

Posted on Oct 08, 2013 at 7:26 AM


West Virginia, Kansas City data centers on track for government business

DC Corp is launching its first data center in Martinsburg, W.Va., with an eye toward government customers. With 22 federal agencies already operating in West Virginia, Chuck Asbury, the company’s CEO, said he hopes the facility will appeal to government clients.

Only 90 miles from Washington, D.C., but “outside the blast zone,” the Tier 3 facility will provide backup, disaster recovery or live hosting services as well as the option for organizations to build out the space as they see fit. 

With its access to dark fiber and Internet2, the company’s main focus will be federal customers and higher education institutions, Asbury told Data Center Knowledge. Groundbreaking will be in October with the first section of the facility coming online in the second quarter of 2014.

Meanwhile, in Kansas City, Mo., Hunt Midwest Real Estate Development announced that it is breaking ground on the first phase of SubTropolis Technology Center (STC), an underground, mission-critical data center. LightEdge Solutions, a cloud computing, colocation and consulting company, will be the anchor tenant for STC and will open the first phase of its 60,000-square-foot underground operation, built to Tier 3 standards, in the first quarter of 2014.

With LightEdge as the “proof of concept,” Hunt Midwest will market STC to government agencies and larger enterprise users who want to operate their own data centers, according to Data Center Knowledge.

SubTropolis can provide a secure location for government data centers because of the underground facility’s ability to withstand natural disasters, including F5 tornadoes, and other security threats, according to a report in the Kansas City Business Journal.

SubTropolis Technology Center is served by dark fiber capable of carrying 80 10-gigabit/sec waves, far beyond the 1 gigabit service Google is bringing to Kansas City’s residential customers via Google Fiber, Mike Bell, general manager for Hunt Midwest’s industrial/commercial development division, told the Business Journal. 
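The capacity gap Bell cites is easy to make concrete: 80 waves at 10 gigabit/sec each works out to 800 gigabit/sec of aggregate fiber capacity, or 800 times a single residential Google Fiber connection. A quick check of the arithmetic:

```python
waves = 80
gbps_per_wave = 10
fiber_capacity = waves * gbps_per_wave  # aggregate dark-fiber capacity, Gbit/sec

residential_gbps = 1  # one Google Fiber residential connection
ratio = fiber_capacity // residential_gbps

print(fiber_capacity, ratio)  # 800 Gbit/sec, 800x a residential line
```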

Posted on Sep 24, 2013 at 10:14 AM


White House toolkit helps deliver federal broadband

The Obama administration made expansion of broadband networks one of its top technology agenda items in the belief that access to fast Internet connections would drive economic development, especially in rural and low-income communities.

A key piece of the strategy was to lower barriers for broadband providers to build out networks on federal property, roads and rights of way. And there’s a lot of land available: the federal government owns nearly a third of all property in the United States, on which sit about 10,000 buildings, according to a notice on the White House blog.

Last week the administration made available a set of tools to help companies choose sites to set up high-speed Internet access, particularly in underserved communities. The tools include:

An online mapping tool that displays all General Services Administration-owned buildings and lands, including contact information for assistance, and pointers to where commercial antennas might be best situated.

The map has interactive features highlighting information to help locate such sites, including the location of National Parks and other protected wilderness areas. The map was built with open government data, displayed in a new way to make it easier for carriers to take advantage of federal assets in planning or expanding their networks.

Another tool, the “dig once” guide, includes tips and policies for helping telecom carriers schedule broadband and network installations at the same time. According to the guide, coordinating close timing of network construction projects can cut costs by 90 percent.

The administration has also set up a “permitting dashboard” that can make it easier for companies to locate and complete paperwork surrounding a broadband project, including construction permits, lease agreements and other broadband application materials.

GSA is working to prepare a single master application for deploying broadband on federal land, which would help streamline the process for wireless and wireline network builds. The Agriculture Department has a similar streamlining tool under development, according to the White House.

In the next few weeks, the White House said, it would also launch an online broadband projects tool, to be located on the Transportation Department’s Federal Infrastructure Projects Permitting Dashboard, to help agencies identify their broadband projects and track their status for the public.

Posted on Sep 24, 2013 at 11:36 AM