The U.S. Department of Energy last month opened the Energy Systems Integration Facility, a $135 million research center designed to test how power grids, data centers and other IT systems can be made more energy-efficient. In fact, the center itself, located in Golden, Colo., might be the most energy-efficient data center in the world.
Its research agenda may include the use of robots for energy management and conservation. IBM and EMC recently developed robots designed to rove data centers, collecting temperature, power usage and other data that can affect the performance of data center IT systems.
Cooling alone can account for 60 to 70 percent of data center power costs, according to EMC officials, and those costs mount as organizations buy more capacity than they need and overcool their systems. EMC officials also estimate that roughly 85 percent of data centers mismanage the provisioning of infrastructure, which increases energy consumption.
The EMC Data Center Robot helps combat these problems by patrolling for temperature fluctuations, humidity and system vibrations and locating sources of cooling leaks and other vulnerabilities.
EMC’s DC Robot collects data via digital sensors and sends it through a Wi-Fi connection for processing. An algorithm converts the temperature data into a thermal map, which can be used to identify anomalous hot and cold spots in data center aisles.
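EMC hasn't detailed the mapping algorithm, so the following is only a minimal sketch of the idea: given (aisle, position, temperature) readings such as a roving robot might log, build a grid and flag cells whose temperature deviates sharply from the overall mean. The data format, the z-score test and the threshold are all assumptions for illustration, not EMC's method.

```python
# Hypothetical sketch: turn roving-robot temperature readings into a
# simple thermal map and flag anomalous hot/cold spots per aisle cell.
from statistics import mean, stdev

def build_thermal_map(readings):
    """readings: list of (aisle, position, temp_f) tuples logged by the robot."""
    grid = {}
    for aisle, pos, temp in readings:
        grid[(aisle, pos)] = temp
    return grid

def find_anomalies(grid, z_threshold=2.0):
    """Flag cells whose temperature is far from the data center mean."""
    temps = list(grid.values())
    mu, sigma = mean(temps), stdev(temps)
    anomalies = []
    for cell, temp in grid.items():
        z = (temp - mu) / sigma
        if abs(z) >= z_threshold:
            anomalies.append((cell, temp, "hot" if z > 0 else "cold"))
    return anomalies

readings = [
    ("A1", 0, 68.2), ("A1", 1, 68.5), ("A1", 2, 82.0),   # suspected hot spot
    ("A2", 0, 67.9), ("A2", 1, 55.0), ("A2", 2, 68.1),   # overcooled cold spot
]
for cell, temp, kind in find_anomalies(build_thermal_map(readings), z_threshold=1.3):
    print(cell, temp, kind)
```

A real system would interpolate between sample points to render a continuous map; this sketch only shows the anomaly-flagging step.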
Most data centers rely on networks of fixed sensors to track temperature and other energy consumption indicators, an expense that can run into the millions of dollars. That cost is the "low hanging fruit" that helped justify the investment in the DC Robot, EMC officials say.
While the DC Robot was one of the first data center energy-focused robots, IBM has developed a similar model, which it offers as part of an energy management troubleshooting service.
The firm's Measurement and Management Technologies unit will use the robo-tool to create a "robotic cooling assessment": three-dimensional temperature and humidity maps that help organizations identify energy sinks and other problem spots in their data centers. The assessment establishes a data center's baseline and high-level cooling capacity.
A third energy diagnostics tool, from Purkay Labs, is a simple portable unit that collects energy and environmental data over short- or long-term intervals. The unit consists of an adjustable carbon fiber rod that measures air quality at three different heights.
The unit is not mobile, and so technically not a robot. "It's a product that we've developed so you can get the temp across the entire aisle," said CEO Indra Purkayastha.
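Purkay Labs hasn't published its data format, but the three-height measurement suggests a simple aisle profile. The sketch below is an illustration only; the Fahrenheit readings, field names and 5-degree stratification threshold are all assumptions.

```python
# Illustrative sketch (not Purkay Labs' actual output format): sample
# temperature at three heights on an aisle rod and report the
# top-to-bottom spread, a common sign of hot-air recirculation.
def aisle_profile(samples):
    """samples: {'bottom': t, 'middle': t, 'top': t} in degrees F."""
    spread = samples["top"] - samples["bottom"]
    return {"spread_f": spread, "stratified": spread > 5.0}

print(aisle_profile({"bottom": 66.0, "middle": 70.0, "top": 75.5}))
```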
Posted on Aug 30, 2013 at 11:31 AM
The U.S. Navy operates one of the world's largest business enterprises, a floating office park whose equipment and personnel must be ready to ship anywhere in the world on a moment's notice. The service recently developed a Web-based tool that helps logistics planners tighten Navy supply lines by identifying unused space on the thousands of military and commercial flights and ships traveling the globe on any given day.
The Transportation Exploitation Tool (TET), whose development was sponsored by the Office of Naval Research, is a cloud-based software tool that helps speed the delivery of spare parts, personnel and other supplies via the quickest available route. ONR provides science and technology support to the Navy and Marine Corps.
“This system is truly revolutionary,” ONR program manager Bob Smith told Armed with Science, a Defense Department blog. “TET uses advances in technology to provide outstanding optimization of available flights and ship routes, saving our logisticians enormous amounts of time — and that can literally mean saving lives.”
The tool, developed at the Naval Supply Systems Command (NAVSUP), has saved the Navy more than $30 million in transportation costs to date, an amount the service estimates will grow to $200 million over 10 years, the blog reported.
The search software enables a planner to enter a description of the cargo that needs to be shipped and where it’s going. Then, “Expedia-like,” it shows all the potential routes where space is available and offers recommendations on the most efficient options.
Without the tool, managers had to search multiple databases via multiple interfaces, which often forced them to book additional flights and caused long delays, according to the blog post.
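The Navy hasn't published TET's internals, but as a hedged illustration the "Expedia-like" search can be approximated as enumerating chains of scheduled legs that have enough unused space for the cargo, then ranking candidates by arrival time. The leg schedule, field names and capacity units below are invented for the example.

```python
# Hypothetical sketch of a TET-style search: chain together scheduled
# transport legs with enough free space and rank routes by arrival time.
LEGS = [
    # (origin, dest, depart_hr, arrive_hr, free_cubic_ft)
    ("Norfolk", "Rota",    0,  9, 400),
    ("Norfolk", "Bahrain", 2, 16,  50),
    ("Rota",    "Bahrain", 12, 20, 300),
]

def find_routes(origin, dest, cargo_cuft, legs=LEGS):
    """Depth-first enumeration of leg chains with enough free space."""
    routes = []
    def extend(node, time, path):
        if node == dest:
            routes.append((time, path))
            return
        for leg in legs:
            o, d, dep, arr, free = leg
            # A leg qualifies if it departs after we arrive and has room.
            if o == node and dep >= time and free >= cargo_cuft:
                extend(d, arr, path + [leg])
    extend(origin, 0, [])
    routes.sort(key=lambda r: r[0])  # earliest arrival first
    return routes

for arrive, path in find_routes("Norfolk", "Bahrain", cargo_cuft=200):
    print(arrive, [f"{o}->{d}" for o, d, *_ in path])
```

With 200 cubic feet of cargo, the direct flight (only 50 free) is excluded and the search recommends the two-leg route through Rota; a smaller load would surface the direct flight first.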
The TET tool was developed by several ONR teams, led by NAVSUP and including the Expeditionary Maneuver Warfare and Combating Terrorism Department, SwampWorks and Technology Insertion Program for Savings.
Vice Adm. Philip Cullom, deputy chief of Naval Operations for Fleet Readiness and Logistics, said the tool addresses a critical need for the Navy. Last month, Cullom presented Greg Butler, who led the development of the tool at NAVSUP, with the 2012 Adm. Stan Arthur Award, which recognizes excellence in logistics planning.
“There has been a real need to get things to the fleet faster and more efficiently,” Butler said, “and without breaking the bank in this austere fiscal environment. The naval services continue to work on ways to save money and give our sailors and Marines every advantage we can.”
Posted on Aug 23, 2013 at 1:39 PM
The Interior Department has awarded a set of 10 individual indefinite delivery indefinite quantity contracts to accelerate its move to the cloud.
The first project is for SAP application hosting, according to the agency. Additional services will include virtual machines, storage, database hosting, secure file transfers and Web hosting, as well as development and test environments. These contracts will not only move these apps to the cloud but move them in a well-planned, methodical way, said Andrew Jackson, deputy assistant secretary for technology, information and business services at Interior.
The individual projects will be awarded via task orders, one for each project, following a separate competition for each project among the 10 selected vendors: Aquilent, AT&T, Autonomic Resources, CGI, GTRI, IBM, Lockheed Martin, Smartronix, Unisys and Verizon.
An early cloud adopter, Interior experimented with the application service provider model for managing its Freedom of Information Act requests in 2001, and in May 2012 the agency consolidated all of its email services into a single cloud-based system using Google Apps for Government.
The new cloud hosting services will allow the agency to begin closing or consolidating hundreds of DOI data centers, Jackson said. The current hosting environment, which focuses on managing servers in-house, will be able to "transition to a modern cloud-based environment, supporting the 25-point Implementation Plan to Reform Federal IT, the Federal Data Center Consolidation Initiative and the Cloud-First Policy outlined by the federal chief information officer," he added.
Posted on Aug 19, 2013 at 1:06 PM
New York State is moving all of its executive agencies to Microsoft Office 365, consolidating 27 different email, word and data processing systems into a common cloud-based platform, Gov. Andrew Cuomo has announced.
By moving 120,000 employees to a common platform for email, office productivity applications and calendaring, the state expects to save approximately $3 million annually in license fees, hardware, maintenance, energy and personnel costs, Cuomo said in a release.
The agreement, the first of its kind in New York, is the result of the governor’s strategic sourcing and IT transformation projects. In April 2011, Cuomo appointed the Sage Commission, a 20-member team comprising public- and private-sector representatives, to perform a comprehensive review of New York State government and identify structural and operational changes that would help make it more modern, accountable and efficient. Consolidating the state’s email systems was one of the commission’s recommendations. The state currently manages a consolidated Microsoft Exchange email system used by 26 agencies, but more than 50 other agencies manage their own stand-alone email systems.
“This system of multiple email platforms is a barrier to statewide collaboration, results in higher operating costs and presents upgrade challenges, as many systems lag behind more modern services,” said Brian Digman, the state’s CIO.
Moreover, maintaining email systems is not a core business of a state agency and is an unnecessary expense for individual agencies. Resources such as people, money, technology and time are better spent on providing key services, Digman said.
Over the remainder of the year, 120,000 state executive email boxes will be progressively moved to the Office 365 system, officials said. The move is expected to be completed by the end of 2013. Once the consolidation is complete, Office 365 will provide the following benefits to state agencies:
- Standardized email, document creation, calendaring, contacts and the ability to share files more efficiently across state executive agencies.
- Secure access to email and files anywhere, anytime, from multiple devices including laptops, smartphones and tablets.
- A universal address book for all executive agency state employees and some commissions, task forces and the state Education Department.
- The most current versions of Office programs available on multiple platforms.
- Automatic updates of all programs, without additional costs for new licenses.
- Uniform archiving and increased storage.
- Standardized antivirus, anti-spam and encryption security tools.
- Full system redundancy and backup.
Other municipalities and government organizations are looking to save money and operate more efficiently by moving email and office productivity applications to Microsoft Office 365 or other cloud-based services such as Google Apps. San Jose, Calif., will be using Microsoft Office 365, Windows Azure and StorSimple for more than 5,000 city employees, Microsoft reported.
Microsoft’s cloud services also are being deployed by San Francisco; Chicago; Kansas City, Mo.; Seattle; Santa Clara County, Calif.; California; Texas; and Washington state as well as the Veterans Affairs Department and the Environmental Protection Agency.
Posted on Aug 16, 2013 at 11:24 AM
CyberFETCH, or the Cyber Forensics Electronic Technology Clearinghouse, is a Web-based repository of digital forensic tools, technology, and information managed by the Homeland Security Department’s Science & Technology Directorate. It is an unclassified program that does not contain sensitive data, and it went live in May 2012.
The free, collaborative platform is available to forensic practitioners from the public sector, private industry and academia who are citizens of the United States and associated with a U.S.-based government, industry or academic institution. Subject matter experts can post blogs sharing their lessons learned and retrieve news and scholarship on cyber forensics. The portal also provides glossaries and listings of upcoming training and events. Members can categorize the material they contribute and network with peers via a wiki, chat, document sharing and other features.
But the forensic information collected and shared on the CyberFETCH website is private and can only be accessed by users who have authenticated themselves. They must fill out a registration form that DHS uses to verify their identity and the validity of their request. DHS is now seeking comments on the information requested in those forms.
While discussions on CyberFETCH began earlier, the system didn’t begin ramping up until 2011. There are now plans to build a public site for agencies and users that would like their information available to the general public.
DHS is accepting comments on revising the information collected from users until Oct. 1, 2013, and is particularly interested in comments that 1) evaluate the necessity of the information collected; 2) evaluate the accuracy of the estimated burden of collecting it; 3) suggest ways to improve the quality, utility and clarity of the information collected; and 4) suggest ways to minimize the burden of collecting it.
Posted on Aug 16, 2013 at 10:14 AM