Pulse

By GCN Staff


Power transformers pose a weakness in the electric grid

Solar storms and terrorist attacks are two of the greatest potential threats to the U.S. power grid, but vulnerabilities in large power transformers (LPTs) may also pose a high risk, according to panelists at the recent Data Center World conference.

LPTs help utilities manage the transmission of electricity and adjust the electric voltage on each segment of the grid from generation to the end user, according to a report on the conference by Data Center Knowledge.

LPTs can be a particularly vulnerable part of the power grid because they are custom-designed, cost millions of dollars to replace and, at 100 to 400 tons apiece, are difficult to transport. According to a 2012 Energy Department report, replacing an LPT could take 20 months or more, given the complex procurement rules for the technology and the fluctuating prices of copper and electrical steel in recent years.

Consequently, damage to one or more LPTs could expose utilities to significant downtime and further threats. “Most utilities have few spare transformers,” Tom Popik, founder of the Foundation for Resilient Societies, told the conference. 

Popik recommended data center managers engage with public utilities and elected officials to make them aware of this and other threats to the power grid, according to the Data Center Knowledge report. 

Posted on Oct 08, 2013 at 7:26 AM



West Virginia, Kansas City data centers on track for government business

DC Corp is launching its first data center in Martinsburg, W.Va., with an eye toward government customers. With 22 federal agencies already operating in West Virginia, Chuck Asbury, the company's CEO, said he hopes the new facility will appeal to government agencies.

Only 90 miles from Washington, D.C., but “outside the blast zone,” the Tier 3 facility will provide backup, disaster recovery or live hosting services as well as the option for organizations to build out the space as they see fit. 

With its access to dark fiber and Internet2, the company’s main focus will be federal customers and higher education institutions, Asbury told Data Center Knowledge. Groundbreaking will be in October with the first section of the facility coming online in the second quarter of 2014.

Meanwhile, in Kansas City, Mo., Hunt Midwest Real Estate Development announced that it is breaking ground on the first phase of SubTropolis Technology Center (STC), an underground, mission-critical data center. LightEdge Solutions, a cloud computing, colocation and consulting company, will be the anchor tenant for STC and will open the first phase of its 60,000-square-foot underground operation, built to Tier 3 standards, in the first quarter of 2014.

With LightEdge as the “proof of concept,” Hunt Midwest will market STC to government agencies and larger enterprise users who want to operate their own data centers, according to Data Center Knowledge.

SubTropolis can provide a secure location for government data centers because of the underground facility’s ability to withstand natural disasters, including F5 tornadoes, and other security threats, according to a report in the Kansas City Business Journal.

SubTropolis Technology Center is served by dark fiber capable of carrying 80 10-gigabit/sec waves, far beyond the 1-gigabit/sec service Google is bringing to Kansas City's residential customers via Google Fiber, Mike Bell, general manager for Hunt Midwest's industrial/commercial development division, told the Business Journal.
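For a sense of scale, a quick back-of-the-envelope calculation using only the figures Bell cites shows the gap between the two services:

```python
# Back-of-the-envelope comparison of the capacity figures cited above.
waves = 80                # wavelengths the dark fiber can carry
gbps_per_wave = 10        # gigabits/sec per wave
fiber_capacity = waves * gbps_per_wave    # 800 Gbit/sec aggregate

google_fiber = 1          # Google Fiber residential service, in Gbit/sec

print(f"Dark fiber aggregate: {fiber_capacity} Gbit/sec")
print(f"Roughly {fiber_capacity // google_fiber}x Google Fiber's residential service")
```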

Posted on Sep 24, 2013 at 10:14 AM



White House toolkit helps deliver federal broadband

The Obama administration made expansion of broadband networks one of its top technology agenda items in the belief that access to fast Internet connections would drive economic development, especially in rural and low-income communities.

A key piece of the strategy was to lower barriers for broadband providers to build out networks on federal property, roads and rights of way. And there's a lot of land available: the federal government owns nearly a third of all property in the United States, on which sit about 10,000 buildings, according to a notice on the White House blog.

Last week the administration made available a set of tools to help companies choose sites to set up high-speed Internet access, particularly in underserved communities. The tools include:

An online mapping tool that displays all General Services Administration-owned buildings and lands, including contact information for assistance, and pointers to where commercial antennas might be best situated.

The map has interactive features highlighting information to help locate such sites, including the location of National Parks and other protected wilderness areas. The map was built with open government data, displayed in a new way to make it easier for carriers to take advantage of federal assets in planning or expanding their networks; a sketch of that kind of data filtering appears after the list of tools.

Another tool, the "dig once" guide, offers tips and policies to help telecom carriers schedule broadband and other network installations at the same time. According to the guide, closely coordinating the timing of network construction projects can cut costs by 90 percent.

The administration has also set up a “permitting dashboard” that can make it easier for companies to locate and complete paperwork surrounding a broadband project, including construction permits, lease agreements and other broadband application materials.

GSA is working to prepare a single master application for deploying broadband on federal land, which would help streamline the process for wireless and wireline network builds. The Agriculture Department has a similar streamlining tool under development, according to the White House.

In the next few weeks, the White House said, it would also launch an online broadband projects tool, to be located on the Transportation Department’s Federal Infrastructure Projects Permitting Dashboard, to help agencies identify their broadband projects and track their status for the public.
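As a rough illustration of what a carrier could do with open data like the GSA building inventory behind the mapping tool, here is a minimal sketch that shortlists federally owned sites by state from a CSV export. The file name and column names are illustrative assumptions, not the actual dataset's schema:

```python
# Hypothetical sketch: filter an open-data CSV of federally owned sites by
# state to shortlist candidate antenna locations. The file name and the
# column names ("state", "site_name", "lat", "lon") are assumptions for
# illustration, not the actual GSA schema.
import csv

def federal_sites_in_state(path, state):
    """Yield (name, lat, lon) for each federally owned site in a state."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["state"] == state:
                yield row["site_name"], float(row["lat"]), float(row["lon"])

# Example: shortlist candidate sites in West Virginia.
for name, lat, lon in federal_sites_in_state("gsa_buildings.csv", "WV"):
    print(f"{name}: ({lat:.4f}, {lon:.4f})")
```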

Posted on Sep 24, 2013 at 11:36 AM


[Image compiled from NOAA Historical Hurricane Tracks website]

NOAA puts 170 years of hurricane history into one interactive site

Hurricanes are never good news, but they do make history. The National Oceanic and Atmospheric Administration has put a lot of that history in one place, with its Historical Hurricane Tracks website, which puts more than 170 years of global hurricane data into an interactive map.

The site serves up data on global hurricanes as they made landfall, going back to 1842, long before hurricanes were given names, and provides links to information on tropical cyclones in the United States since 1958, as well as other U.S. storms dating back to 1851. The most recent addition to the site provides details on last year's Hurricane Sandy.

Visitors to the site can search by location, storm name or ocean basin and select the search radius (in nautical miles, statute miles or kilometers). Selecting Miami, for example, will display a map of south Florida criss-crossed by the tracks of many a hurricane.

Hover the cursor over any of the tracks, which are color-coded to indicate each storm's strength on the Saffir-Simpson Hurricane Wind Scale and how that strength changed over its course, and a table on the left will show the name of that storm, if it has one. Clicking on the track or the name in the table isolates that storm, so its track appears alone on the map, with the table showing the wind speed and air pressure when it hit land.
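The wind-speed thresholds behind that color coding are simple to express in code. Below is a minimal sketch of the Saffir-Simpson categories (one-minute sustained winds, in mph); the function name is ours, for illustration:

```python
# Saffir-Simpson Hurricane Wind Scale thresholds (one-minute sustained
# winds, in mph) that underlie the track color coding.
def saffir_simpson_category(wind_mph):
    """Return the storm classification for a given sustained wind speed."""
    if wind_mph < 39:
        return "Tropical Depression"
    if wind_mph < 74:
        return "Tropical Storm"
    if wind_mph <= 95:
        return "Category 1"
    if wind_mph <= 110:
        return "Category 2"
    if wind_mph <= 129:
        return "Category 3"
    if wind_mph <= 156:
        return "Category 4"
    return "Category 5"

# Andrew's 1992 Florida landfall, with sustained winds near 165 mph:
print(saffir_simpson_category(165))   # Category 5
```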

So weather fans can follow the track of 1992's Andrew through south Florida, 2005's Katrina when it hit New Orleans or the unnamed marauder that swept through Galveston, Texas, in 1900, which is still the deadliest hurricane in U.S. history.

Users can zoom in or out of the maps, select views by county and click links to details on a storm as well as NOAA's report on it. The site also has information on population changes in U.S. coastal counties from 1900 to 2000, indicating the growing number of people and amount of infrastructure at risk from hurricanes.

The site, which was developed by the NOAA Coastal Services Center along with the agency’s National Hurricane Center and National Climatic Data Center, offers a fairly comprehensive and easily customizable tool for checking a hurricane’s history. As hurricane season gets into its busiest months, it’s not a bad time to look back.

Posted on Sep 23, 2013 at 12:25 PM



Survey finds IT managers ill-equipped to face cyber threats

IT security managers are under the gun, lacking the analytics tools necessary to notice, much less neutralize, serious threats to their networks, according to a recent survey on the use of security intelligence tools in a variety of organizations.

A study of 600 IT pros by SolarWinds, an IT management software vendor, and the SANS Institute found that most managers wanted "greater security visibility and context" but were operating with limited budgets for information security and compliance tools.

And though most respondents said they planned to invest in these tools, half were spending 20 percent or less of their IT budgets on security. The survey was designed to gauge how organizations use security analytics and intelligence to reduce such threats.

Most reported having a problem with targeted attacks that were missed by antivirus and other point solutions. Forty-five percent of respondents said they had been hit in the last two years with one or more attacks that were "difficult to detect." Another 20 percent said they lacked the visibility into their networks to even answer the question.

The survey showed such "difficult to detect" attacks took about a week to discover, with the delays attributed to poor visibility into networks or a failure to collect the right operational and security data to identify the threat.

The data used most often included log data from networks and servers, network monitoring data and data from applications and access control systems, according to the survey results.
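As a simple illustration of the kind of analysis that log data supports, here is a hedged sketch that counts failed logins per source address to surface brute-force attempts that no single log line would reveal. The log file name, line layout and threshold are assumptions for illustration, not any particular product's format:

```python
# Sketch: aggregate authentication failures per source address from a
# syslog-style log. The file name, line layout and threshold are assumed
# for illustration; a real deployment would parse its own log format.
from collections import Counter

def failed_logins_by_source(lines, threshold=10):
    """Return {source: count} for sources with at least `threshold` failures."""
    counts = Counter()
    for line in lines:
        if "authentication failure" in line:
            counts[line.split()[-1]] += 1   # assume source is the last field
    return {src: n for src, n in counts.items() if n >= threshold}

with open("auth.log") as f:
    for src, n in failed_logins_by_source(f).items():
        print(f"{src}: {n} failed logins")
```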

Organizations looking to acquire new security intelligence tools in the next year want to incorporate data from endpoint and server monitoring tools, as well as data associated with virtual and cloud systems. They are also looking for training, vulnerability management technology and security information and event management tools, according to the survey.

Security threats have become so pervasive that "it's important for all IT pros to be equipped to tackle security challenges," not just security experts, said SolarWinds vice president Sanjay Castelino.

Posted on Sep 20, 2013 at 12:15 PM