Data center administrators, more than most IT managers, deal in trade-offs: they keep an eye on the cost of efficiency and weigh energy spent against performance. One dial on the dashboard they watch closely is the data center's temperature as they look to deliver performance at a reasonable cost.
According to a recent Computerworld report, the General Services Administration has recommended raising data center temperatures from 72 degrees Fahrenheit to as high as 80 degrees F. For every additional degree of temperature in a data center's server inlet space, GSA said, operators can save 4 percent to 5 percent in energy costs.
Those numbers square with 2008 recommendations by the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE), which put the recommended temperature of data centers between 64.4 degrees F and 80.6 degrees F, along with a caveat that staying within that range does not ensure the data center is operating at top energy efficiency.
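Taken at face value, those percentages compound quickly. Here is a back-of-the-envelope sketch, assuming GSA's savings apply per degree and compound across the full 72-to-80 degree range (the compounding is an assumption, not something GSA specified):

baseline_temp_f = 72        # the common setpoint cited in the report
target_temp_f = 80          # GSA's suggested ceiling
savings_per_degree = 0.045  # midpoint of GSA's 4-5 percent estimate

# Each degree multiplies the remaining cooling cost by (1 - savings rate)
cost_fraction = (1 - savings_per_degree) ** (target_temp_f - baseline_temp_f)
print(f"Cooling cost at 80 F: {cost_fraction:.0%} of the 72 F baseline")
# Cooling cost at 80 F: 69% of the 72 F baseline, roughly a 31 percent savings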
So, given the GSA and industry guidelines and caveats, what is the trend line on temperature ranges preferred by most data center operators?
According to a recent survey by the Uptime Institute, few data centers are being managed anywhere near the GSA limits. About half of the more than 1,000 data centers surveyed around the world keep their temperatures in a spring-like range of 71 to 75 degrees F, according to the Computerworld report.
The survey did pick up a small surge toward hotter, low-cost environments, with 7 percent of data center operators keeping temperatures above 75 degrees, a jump from only 3 percent the previous year. At the same time, fewer operators are maintaining data center temperatures at the lower end of the ASHRAE range: only 6 percent compared to 15 percent in 2011.
And if you do decide to turn up the data center thermostat, it pays to go slow, Computerworld reported. "In order to implement hotter [temperatures], you need to do it gradually, and make sure you're not causing problems in other parts of the data center," said Uptime Institute content director Matt Stansberry.
Posted on Jul 18, 2013 at 12:58 PM
In high-frequency securities trading, milliseconds – even meters – can mean money. Algorithms that govern the trading process can move transactions so quickly that a split-second jump on market information can translate into a financial advantage for buyers and sellers. That’s why investment firms in New York are snapping up office space in the city’s financial district and converting it to data centers.
And according to a report by CNBC, a similar phenomenon is taking place in the nation’s capital, where market-moving economic data is released on a daily basis.
Firms that trade on government economic data are paying for server space on K Street, converting what was once an address for high-powered lobbyists into a home for high-powered analytical data centers.
CoreSite, a company that operates data centers around the country, including a data center on K Street, offers financial traders "co-located" computers right in the heart of Washington. From there, it can provide split-second access to a steady stream of economic indicators from key financial offices, including the Department of Labor’s monthly employment report, released from the agency’s Constitution Ave. headquarters; changes to the Fed funds rate, announced from the Treasury building on Pennsylvania Ave.; and other economic big data generated by the departments of Commerce and Justice along the National Mall, according to CNBC.
As long as three years ago, CoreSite said its Washington, D.C., data center could offer “more than a millisecond advantage over suburbs such as Ashburn, Virginia,” according to a report from Data Center Knowledge about CoreSite’s low-latency hub.
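The physics behind that millisecond is easy to sketch. Light in optical fiber travels at roughly two-thirds of its vacuum speed, so distance converts directly into delay; the 50-kilometer straight-line distance and the 2x route detour factor below are illustrative assumptions, not CoreSite figures.

SPEED_OF_LIGHT_KM_S = 299_792
FIBER_FRACTION = 2 / 3   # glass slows light to about two-thirds of c
distance_km = 50         # rough straight-line distance, K Street to Ashburn
route_factor = 2.0       # real fiber paths are far from straight lines

one_way_ms = distance_km * route_factor / (SPEED_OF_LIGHT_KM_S * FIBER_FRACTION) * 1000
print(f"One-way propagation delay: {one_way_ms:.2f} ms")
# One-way propagation delay: 0.50 ms; a round trip to Ashburn and back
# already costs about a millisecond before any routing overhead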
Posted on Jul 17, 2013 at 10:14 AM
Super Wi-Fi systems, which became possible when TV stations abandoned analog for all-digital broadcasting, haven’t exactly taken the country by storm. But some public-sector organizations are starting to take advantage of the opportunity.
In January, Wilmington and New Hanover County in North Carolina launched the first municipal Super Wi-Fi, or “white spaces,” network. And this week, West Virginia University became the first university to deploy a Super Wi-Fi network on campus, providing free wireless access on its Personal Rapid Transit platforms, whose trams carry about 15,000 riders a day.
WVU worked with the AIR.U (Advanced Internet Regions University) consortium to build the network, which uses white spaces in the radio frequency spectrum left open by the TV broadcasting shift and freed up in 2010 by the Federal Communications Commission. WVU officials called the system a test site for Super Wi-Fi that could pave the way for bringing broadband connectivity to rural areas.
Rural and other areas that lack wireless broadband access are the target for the technology, which does have a somewhat misleading name. For one thing, it’s not really Wi-Fi, since it falls outside of the specific set of interoperable IEEE 802.11 standards designated as Wi-Fi and managed by the Wi-Fi Alliance. The alliance, in fact, has publicly objected to the term Super Wi-Fi.
However, because it operates at much lower frequencies than Wi-Fi, its signals carry much farther and penetrate deeper into buildings, allowing a single access point to cover a far larger area, which makes the technology well suited to rural settings.
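The free-space path loss formula makes the frequency advantage concrete. The sketch below compares a representative 600 MHz white-space channel with the 2,400 MHz band used by conventional Wi-Fi; both frequencies are illustrative choices, not figures from any particular deployment.

import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB for a distance in km and a frequency in MHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

loss_gap = fspl_db(1, 2400) - fspl_db(1, 600)
print(f"Wi-Fi loses {loss_gap:.1f} dB more over the same 1 km path")
# Wi-Fi loses 12.0 dB more over the same 1 km path; with an equal link
# budget, the 600 MHz signal reaches about four times as far in free space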
That’s the focus of AIR.U, which aims to bring wireless connectivity to rural campuses.
“Colleges in rural areas will be the greatest beneficiaries of Super Wi-Fi networks because they are located in communities that often lack sufficient broadband, their needs are greater and there is typically a large number of vacant TV channels outside the biggest urban markets,” said Michael Calabrese, director of the Wireless Future Project at the New America Foundation’s Open Technology Institute. “This combination of factors makes them ideal candidates for utilizing Super Wi-Fi spectrum to complement existing broadband capabilities.”
AIR.U is a New America initiative, whose founding partners also include Microsoft, Google, the Appalachian Regional Commission and Declaration Networks Group, an organization recently established to plan, deploy and operate Super Wi-Fi technologies.
Posted on Jul 11, 2013 at 11:46 AM
SGI has completed installation of the ICE X high-performance computing system that powers the Defense Department’s Spirit supercomputer, the 14th fastest supercomputer in the world and the fastest dedicated system within DOD.
The SGI ICE X has been deployed as part of DOD's High Performance Computing Modernization Program (HPCMP), which provides compute resources for the Air Force Research Laboratory at the DOD Supercomputing Resource Center.
Named after the B-2 stealth bomber, Spirit is already being used for research such as quantum mechanical simulations with computational time that, SGI says in an announcement, “scales linearly with respect to the number of atoms.”
"Spirit is significantly faster than our previously available platform for running these linear-scaling calculations, which are becoming viable for production level work," said Gary Kedziora, an HPCMP computational materials scientist. This lets scientists “model larger and more complex materials using predictive quantum mechanical methods on thousands of SGI ICE X processor cores."
The ICE X system powers Spirit with 144 TB of memory and one of the largest and fastest pure compute InfiniBand clusters, SGI said. Running on the standard Red Hat Enterprise Linux operating system, Spirit is housed in 32 racks and includes 2,304 compute blades with cold-sink technology. Its 9,216 sockets hold 73,728 cores, powered by Intel Xeon E5 processors operating at 2.6 GHz.
It can achieve a peak performance of over 1.5 petaflops (quadrillion floating point operations per second). Spirit also has 6.72 petabytes of SGI InfiniteStorage 5500 storage.
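Those numbers are easy to sanity-check. The arithmetic below assumes eight double-precision floating-point operations per core per cycle, consistent with the AVX units in that generation of Xeon E5 processors; the core count and clock rate come from SGI's figures.

cores = 73_728
clock_hz = 2.6e9
flops_per_cycle = 8  # assumed: 4 adds + 4 multiplies per cycle via AVX

peak_pflops = cores * clock_hz * flops_per_cycle / 1e15
print(f"Theoretical peak: {peak_pflops:.2f} petaflops")
# Theoretical peak: 1.53 petaflops, matching the "over 1.5 petaflops" figure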
The system is already seeing a lot of use. "Our customers are flocking to the fastest system in the Department of Defense, finding that their applications are performing significantly better on the new system," stated Jeff Graham, the director of the Air Force Research Lab, who added that Spirit has boosted performance on DOD applications by more than 27 percent on average.
Posted on Jul 09, 2013 at 10:49 AM
A partnership between the National Information Sharing Consortium and Esri could improve the sharing of geospatial information among federal, state and local government agencies during natural disasters or emergencies.
Esri’s ArcGIS Online for Organizations (AGOL) will be deployed as a GIS platform to support the Homeland Security Department’s Virtual USA Program (vUSA), which provides interactive maps that display the location and status of critical assets, including helicopter landing sites, evacuation routes, shelters, gas supplies, water lines and power grids, according to a release.
AGOL is a cloud-based mapping portal that lets emergency management personnel share maps and data with each other and the general public from any device, Web browser or desktop application. The deployment of an AGOL portal that is compliant with vUSA will significantly enhance state and local agencies’ ability to participate in the vUSA program, since AGOL has been widely adopted by these agencies, NISC officials said.
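For a sense of what that sharing can look like in practice, here is a minimal sketch of a script pulling shared asset data from an ArcGIS Online hosted feature layer through Esri's documented REST query interface. The layer URL, attribute filter and field names are hypothetical placeholders, not details of the vUSA deployment.

import requests

# Hypothetical shared layer of shelter locations; only the query endpoint
# pattern and parameters follow Esri's REST API conventions
LAYER_URL = ("https://services.arcgis.com/EXAMPLE_ORG/arcgis/rest/services/"
             "Shelters/FeatureServer/0/query")

params = {
    "where": "STATUS = 'OPEN'",    # hypothetical attribute filter
    "outFields": "NAME,CAPACITY",  # hypothetical field names
    "f": "geojson",                # return features as GeoJSON
}
response = requests.get(LAYER_URL, params=params, timeout=30)
response.raise_for_status()
for feature in response.json()["features"]:
    print(feature["properties"])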
Esri officials endorsed the vUSA initiative in 2010, expressing their commitment to make certain the geospatial company’s technologies supported the goals of the program. The partnership with NISC will accelerate the establishment of shared situational awareness and information sharing capabilities, Esri officials said.
Six pilot projects involving over 35 states have been conducted since 2009 to demonstrate vUSA’s ability to support near real-time information sharing. U.S. federal, state and local governments; Canadian provincial and federal governments; non-governmental organizations; and private-sector partners have participated in vUSA since the first pilot was conducted, NISC officials said.
NISC, in collaboration with the First Responder Group of DHS’ Science and Technology Directorate and the first responder community, is using the pilot programs as the basis for transitioning vUSA technologies from prototypes to platform components that will be interoperable both with existing systems and with information sharing efforts already in progress, officials said.
Other GIS-based public-sector IT sharing projects are in development. The National Geospatial-Intelligence Agency and the geospatial community are creating a cloud infrastructure to demonstrate how a coalition of organizations can share geospatial information as they respond to natural disasters around the world.
NGA officials want to see how industry can deliver open, standards-based geospatial data to first responders via multiple, interoperable cloud infrastructures, Todd Myers, NGA’s lead global compute architect, told GCN during a recent interview.
Posted on Jul 02, 2013 at 11:12 AM