CIA chief technology officer Gus Hunt told a tech industry audience March 20 that the agency had a nearly limitless demand for big data surrounding its intelligence targets.
Hunt’s remarks might help explain the impetus behind news broken by Federal Computer Week two days earlier that CIA had cut a $600 million deal with Amazon Web Services to build a private cloud infrastructure to help manage its big data demands.
"Since you can't connect dots you don't have … we fundamentally try to collect everything and hang on to it forever,” Hunt told a GigaOm conference audience, according to a Huff Post Tech report.
The CIA has made no secret of its aim to become a more data-driven organization, a goal Hunt has been citing for several years. “We are going to have to get analytics and visualization [tools] that are so dead-simple easy to use, anybody can take advantage of them, anybody can use them,” he said at a recent conference.
Posted on Mar 22, 2013 at 9:39 AM
The Securities and Exchange Commission has signed a $17.5 million contract with IO Government Services, making good on plans made last year to outsource the data center services that support its EDGAR financial records database operations, according to a report in Data Center Knowledge.
EDGAR, the SEC’s Electronic Data Gathering, Analysis, and Retrieval system, is currently run out of an SEC data center in Alexandria, Va. SEC chairwoman Mary Schapiro said in a letter to Congress last July that outsourcing and shutting down the facility would save the agency $18 million.
The giant database, which supports the financial reporting of U.S. public companies, will be outsourced to IO Government, a unit of data center innovator IO.
IO offers a set of small, modular data centers that are convenient to house and designed to make it easier for customers to add computing resources as needed.
The company’s IO.Anywhere line of data centers is offered in a range of campus, enterprise and service-provider modules that include their own cooling systems and backup power. The modules run on the IO.OS data center operating system, which is designed to offer customers the control and flexibility to scale services as demand dictates.
IO says its “Data Center 2.0” approach “marks a shift from large real estate-based infrastructures to flexible and sustainable modular installations.”
Last fall, IO landed another blue chip deal, signing a long-term contract with Goldman Sachs for modular data centers and services. Goldman said that, in addition to delivering capital savings, the IO systems would help it meet its power and energy usage goals.
Posted on Mar 19, 2013 at 9:39 AM
The importance of keeping data centers cool has been getting some attention lately, particularly in light of some innovative new ways of keeping servers running at optimal temperatures.
Microsoft this week experienced what can happen when things get too hot. Users of its Hotmail and new Outlook.com e-mail services, along with its SkyDrive file-hosting service, were out of luck for 16 hours March 12-13 during a service disruption the company blamed on a hot data center.
The outage hit at 4:35 p.m. EDT March 12 after the company performed a routine firmware update in its data center facility, according to a blog post by Microsoft Vice President Arthur de Haan. Although the update had been done before without a hitch, de Haan wrote, it “failed in this specific instance in an unexpected way.”
The result: “a rapid and substantial temperature spike in the data center,” he wrote.
It got hot enough to trigger the company’s safeguards, which cut off access to mailboxes and blocked automatic failovers for a large number of servers in the part of the data center that houses the infrastructure for Hotmail, Outlook.com and SkyDrive, he said. A data center team got to work on the problem and gradually restored service, but full restoration took until 8:43 a.m. EDT March 13. De Haan said restoration atypically required human intervention in addition to the infrastructure software, which is why it took so long.
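De Haan’s account amounts to a simple protective state machine: a temperature spike past a trip point locks mailbox access and suspends automatic failover, and only manual intervention clears the lockout once temperatures recover. A minimal sketch of that kind of safeguard (the threshold, class and method names here are invented for illustration, not Microsoft’s actual systems):

```python
# Hypothetical thermal safeguard, loosely modeled on the behavior de Haan
# describes; the trip point and names are illustrative assumptions.
SAFE_MAX_C = 35.0  # assumed temperature trip point, in Celsius

class RackSafeguard:
    def __init__(self):
        self.mailbox_access = True
        self.auto_failover = True

    def observe(self, temp_c):
        # A spike past the trip point locks the rack into a safe state.
        if temp_c > SAFE_MAX_C:
            self.mailbox_access = False
            self.auto_failover = False

    def manual_reset(self, temp_c):
        # Recovery requires a human confirming temperatures are back in
        # range, mirroring the manual restoration described in the post.
        if temp_c <= SAFE_MAX_C:
            self.mailbox_access = True
            self.auto_failover = True

guard = RackSafeguard()
guard.observe(52.0)           # firmware fault causes a spike
print(guard.mailbox_access)   # False
guard.manual_reset(27.0)
print(guard.mailbox_access)   # True
```

The key design point is that `observe` can only lock the state down; nothing in the automatic path re-enables service, which is consistent with a restoration that needed people as well as software.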
The outage prompted a flurry of activity on Twitter, naturally, with some tweeters putting the blame at the company’s feet.
Microsoft itself has been one of the companies developing new ways of running cooler, more energy-efficient data centers, including a new outdoor data center it is building in Virginia. It’s hard to say whether even new cooling techniques would have made a difference in this case, however, since it appears the firmware update gave the servers something of a fever.
The company is in the process of switching its Hotmail users to the new Outlook.com, which has a cleaner interface and more social media integration. The full switch will come this summer, though users can move to Outlook.com now.
Hotmail, which Microsoft bought in 1997, was the pioneering Web e-mail system and dominated the market for years, but steadily lost ground and was eventually overtaken in the United States by Gmail and Yahoo mail. (Hotmail is second to Gmail worldwide, and third behind Yahoo and Gmail among U.S. users.)
The move to Outlook.com, which of course shares the name of the dedicated e-mail client so many people use, could boost the company’s Web mail prospects, but Hotmail’s travails were not lost on Twitter users during the outage.
In his post, de Haan said the restoration effort has given the company an understanding of why the outage happened, and that the team is working to ensure it doesn’t happen again.
Posted on Mar 14, 2013 at 9:39 AM
Amazon Web Services is making access to its Virtual Private Cloud (VPC) automatic for new users of the Elastic Compute Cloud (EC2) service, giving them access to features such as multiple IP addresses and expanded security controls.
VPC, which lets customers create virtual networks of EC2 instances and virtual private network connections to their own data centers, has until now been a separate AWS offering. Now, new users will get a VPC by default, Amazon said in a blog post.
The service is being rolled out region by region, starting with Amazon’s Asia Pacific Region, based in Sydney, Australia, and its South America Region, based in São Paulo, Brazil, with others to be added one at a time, the company said.
The automatic access to VPCs applies only to new users. Current customers, including hundreds of U.S. government and other public-sector agencies, will have to either sign up for a new account or launch a service in a region they haven’t used before. (There are four regions in the United States, including those in Virginia, Oregon and California, and the GovCloud region, designed for sensitive government workloads.)
Regardless of how the service is launched, the VPC comes at no extra charge, AWS said. Once it is launched, customers will get features such as “assigning multiple IP addresses to an instance, changing security group membership on the fly and adding egress filters to your security groups,” according to AWS’ blog.
AWS, which in 2011 was accredited under the Federal Information Security Management Act, has proved popular for public-sector agencies moving services to the cloud, with reportedly more than 300 government and 1,500 educational customers.
The availability of VPCs to customers is, according to TechCrunch’s Alex Williams, another indication of Amazon’s further push into the enterprise.
Posted on Mar 12, 2013 at 9:39 AM
The Office of Management and Budget might want to take note: Online mega auctioneer eBay has developed a new system of metrics that can reveal how even subtle changes in its vast data center and IT operations can affect the cost of a single online auction.
According to eBay, the dashboard-type system can show how many kilowatt-hours of energy eBay data centers use to process an auction; how many auctions it runs per server; or revenues per kilowatt hour of energy consumed.
Should the company want to know how many metric tons of carbon dioxide it emitted per transaction, the system could show it.
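The ratios eBay describes are straightforward to compute once the raw inputs are instrumented. A hedged sketch of dashboard math along these lines (all figures and the grid emission factor below are invented illustrations, not eBay’s numbers):

```python
# Hypothetical DSE-style ratios; every input value here is an invented
# illustration, including the assumed grid emission factor.
def dse_metrics(kwh_consumed, transactions, servers, revenue_usd,
                kg_co2_per_kwh=0.5):  # assumed emission factor, kg CO2/kWh
    """Return miles-per-gallon-style ratios for a data center workload."""
    return {
        "kwh_per_transaction": kwh_consumed / transactions,
        "transactions_per_server": transactions / servers,
        "revenue_per_kwh": revenue_usd / kwh_consumed,
        # kg -> metric tons via /1000, then normalized per transaction
        "tonnes_co2_per_transaction":
            kwh_consumed * kg_co2_per_kwh / 1000 / transactions,
    }

m = dse_metrics(kwh_consumed=50_000, transactions=1_000_000,
                servers=2_000, revenue_usd=250_000)
print(m["kwh_per_transaction"])   # 0.05
print(m["revenue_per_kwh"])       # 5.0
```

Because every ratio shares the same inputs, tuning one (say, consolidating servers to cut kWh) visibly moves the others, which is the interdependence the Rubik’s Cube analogy in eBay’s paper is getting at.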
The micro-metrics are the product of a methodology eBay has developed over the last 18 months called Digital Service Efficiency (DSE) that allows the company to map the interconnection of performance factors that support its business services.
eBay says DSE provides a miles-per-gallon-type measurement of its data center operations, IT infrastructure and carbon footprint. DSE “dynamically tunes [eBay’s] infrastructure engine by systematically exposing the multi-dimensional knobs that developers, engineers and operators can turn to optimize all layers of the infrastructure stack,” according to an eBay white paper on the project.
“Tuning these variables in tandem is like solving a Rubik’s Cube,” the paper’s authors write. “Imagine each color as representing an independent variable (for example cost, performance, environmental impact and revenue), yet each is dependent on the others. It’s easy to solve the same color on one side of the cube independently, but solving all sides at the same time is difficult.”
“We can see more clearly now than ever before that our designs, purchases and operating decisions have real, tangible effects on the key indicators important to running a business: cost, performance, environmental impact and, ultimately, revenue,” the authors said.
While DSE is based on eBay operations, the methodology can be adapted to other organizations and data center operations.
Agencies have been pursuing energy efficiency through data center consolidation and other measures. Even the world’s largest supercomputers are using low-power architectures.
Technology such as DSE, which provides fine-grained metrics on power use, could help further those efforts.
Posted on Mar 08, 2013 at 9:39 AM