When Congress tightened financial data reporting rules in the wake of the Wall Street securities trading scandals, few would have predicted that, once those rules were in place, regulators still wouldn’t be able to find the London Whale.
That’s the nickname given to a JP Morgan trader who became infamous for making bets upwards of $1 billion on whether companies would default on their debt, exactly the kind of trades the new rules were designed to police.
But while the Dodd-Frank Act called for increased public access to data on credit swaps and other risky instruments, it left regulators on their own to manage the data deluge and the management challenges that came with it. The problem comes down to one many public-sector agencies can relate to: a mash of incompatible formats that resist search.
And that’s what brought Commissioner Scott O’Malia, a member of the Commodity Futures Trading Commission (CFTC), which regulates futures and swaps, in front of a legal society audience March 19 with a warning that “big data is the commission’s biggest problem.”
Dodd-Frank provided for greater transparency of financial trades so the commission could “look into the market and identify large swap positions that could have a destabilizing effect on our markets.”
However, the commission’s progress in understanding and using the data, O’Malia said, “is not going well,” according to a CFTC transcript of his remarks.
“The problem is so bad that staff have indicated that they currently cannot find the London Whale in the current data files,” O’Malia reported.
In its rush to promote the new reporting rules, CFTC “failed to specify the data format parties must use when sending their [records to the database],” O’Malia said. “In other words, the commission told the industry what information to report, but didn’t specify what language to use. This has become a serious problem.”
And one that’s apparently mushrooming. That’s because each type of swap identified by 70-plus swap dealers will be reported in more than 70 different data formats.
“The permutations of data language are staggering,” O’Malia told the lawyers, adding, “doesn’t that sound like a reporting nightmare?”
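The reporting nightmare O’Malia describes is easy to sketch. In the hypothetical example below (the dealer names, field names and record formats are invented for illustration, not drawn from actual CFTC filings), two dealers report the same swap in different formats, and a regulator must write a separate normalizer for each before a single search can run across both:

```python
# Hypothetical sketch: two dealers report the same swap in different formats.
# Without a mandated data standard, the regulator must write one normalizer
# per dealer before the records become searchable at all.
import csv
import io
import json

def normalize_dealer_a(raw_json: str) -> dict:
    """Dealer A reports swaps as JSON with its own field names."""
    rec = json.loads(raw_json)
    return {
        "dealer": rec["reporting_entity"],
        "notional_usd": float(rec["notional"]),
        "counterparty": rec["cpty"],
    }

def normalize_dealer_b(raw_csv: str) -> dict:
    """Dealer B reports the same information as a one-line CSV."""
    row = next(csv.DictReader(io.StringIO(raw_csv)))
    return {
        "dealer": row["Dealer"],
        "notional_usd": float(row["NotionalUSD"]),
        "counterparty": row["Counterparty"],
    }

a = normalize_dealer_a('{"reporting_entity": "Dealer A", "notional": "1e9", "cpty": "Fund X"}')
b = normalize_dealer_b("Dealer,NotionalUSD,Counterparty\nDealer B,1000000000,Fund X\n")

# Only after normalization does one query work across both feeds.
large_positions = [r for r in (a, b) if r["notional_usd"] >= 1e9]
```

Multiply that per-dealer translation effort by 70-plus dealers, each with its own format for each type of swap, and the scale of the commission’s problem comes into focus.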
To make matters worse, CFTC anticipates additional incoming data streams once major swap participants and end-users begin filing. “The commission now receives data on thousands of swaps each day,” the commissioner said. “So far, however, none of our computer programs load this data without crashing.”
Looking ahead, CFTC must “significantly improve its own IT capability,” said O’Malia. “Until such time, nobody should be under the illusion that promulgation of the reporting rules will enhance the Commission’s surveillance capabilities.”
In the meantime, O’Malia said, he would use his position as chairman of CFTC’s technology advisory committee, “to leverage the expertise of this group to assist in any way I can.”
Posted on Mar 27, 2013 at 1:57 PM
CIA chief technology officer Gus Hunt told a tech industry audience March 20 that the agency had a nearly limitless demand for big data surrounding its intelligence targets.
Hunt’s remarks might help explain the impetus behind news broken by Federal Computer Week two days earlier that CIA had cut a $600 million deal with Amazon Web Services to build a private cloud infrastructure to help manage its big data demands.
"Since you can't connect dots you don't have … we fundamentally try to collect everything and hang on to it forever,” Hunt told a GigaOm conference audience, according to a Huff Post Tech report.
The CIA has made no secret of its aim to become a more data-driven organization, a goal Hunt has been citing for several years. “We are going to have to get analytics and visualization [tools] that are so dead-simple easy to use, anybody can take advantage of them, anybody can use them,” he said at a recent conference.
Posted on Mar 22, 2013 at 12:52 PM
The Securities and Exchange Commission has signed a $17.5 million contract with IO Government Services, making good on plans made last year to outsource the data center services that support its EDGAR financial records database operations, according to a report in Data Center Knowledge.
EDGAR, the SEC’s Electronic Data Gathering and Retrieval system, is currently run out of an SEC data center in Alexandria, Va. SEC chairwoman Mary Schapiro said in a letter to Congress last July that outsourcing and shutting down the facility would save the agency $18 million.
The giant database, which supports the financial reporting of U.S. public companies, will be outsourced to IO Government, a unit of data center innovator IO.
IO offers a set of small, modular data centers that are convenient to house and designed to make it easier for customers to add computing resources as needed.
The company’s IO.Anywhere line of data centers are offered in a range of campus, enterprise and service-provider modules that include their own cooling systems and backup power. The modules run on the IO.OS data center operating system, which is designed to offer customers control and flexibility to scale services as the demand dictates.
IO says its “Data Center 2.0” approach “marks a shift from large real estate-based infrastructures to flexible and sustainable modular installations.”
Last fall, IO landed another blue-chip deal, signing a long-term contract with Goldman Sachs for modular data centers and services. Goldman Sachs said that, in addition to capital savings, the IO systems would help it meet its power and energy usage goals.
Posted on Mar 19, 2013 at 7:51 AM
The importance of keeping data centers cool has been getting some attention lately, particularly in light of some innovative new ways of keeping servers running under optimal temperatures.
Microsoft this week experienced what can happen when things get too hot. Users of its Hotmail and new Outlook.com e-mail services, along with its SkyDrive file-hosting service, were out of luck for 16 hours March 12-13 during a service disruption the company blamed on an overheated data center.
The outage hit at 4:35 p.m. EDT March 12 after the company performed a routine firmware update in its data center facility, according to a blog post by Microsoft Vice President Arthur de Haan. Although the update had been done before without a hitch, de Haan wrote, it “failed in this specific instance in an unexpected way.”
The result: “a rapid and substantial temperature spike in the data center,” he wrote.
It got hot enough to trigger the company’s safeguards, which cut off access to mailboxes and suppress automatic failover, for a large number of servers in the part of the data center that houses the infrastructure for Hotmail, Outlook.com and SkyDrive, he said. A data center team got to work on the problem, gradually restoring service, but full restoration took until 8:43 a.m. EDT March 13. De Haan said the restoration, atypically, required human intervention in addition to the infrastructure software, which is why it took so long.
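The safeguard de Haan describes can be sketched in miniature. The thresholds, names and structure below are illustrative assumptions, not details of Microsoft’s actual system; the point is only the mechanism of racks crossing a critical temperature being fenced off and their automatic failover suppressed, so overheated hardware isn’t hammered further and humans restore service deliberately:

```python
# Hypothetical sketch of a thermal safeguard of the kind described: racks
# above a critical temperature stop accepting requests and have automatic
# failover disabled. All values and names here are illustrative only.
CRITICAL_TEMP_C = 40.0  # assumed trip point, for illustration

class Rack:
    def __init__(self, name: str, temp_c: float):
        self.name = name
        self.temp_c = temp_c
        self.accepting_requests = True
        self.auto_failover_enabled = True

def apply_safeguards(racks):
    """Fence off racks above the critical temperature; return their names."""
    tripped = []
    for rack in racks:
        if rack.temp_c >= CRITICAL_TEMP_C:
            rack.accepting_requests = False
            rack.auto_failover_enabled = False  # humans restore service manually
            tripped.append(rack.name)
    return tripped
```

Suppressing failover sounds counterintuitive, but it explains the long restoration window: once the safeguard trips, recovery becomes a manual, rack-by-rack process rather than an automatic one.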
The outage prompted a flurry of activity on Twitter, naturally, with some tweeters putting the blame at the company’s feet.
Microsoft actually has been one of the companies developing new ways of running cooler, more energy-efficient data centers, building a new outdoor data center in Virginia. It’s hard to say whether even new cooling techniques would have made a difference in this case, however, since it appears the firmware update gave the servers something of a fever.
The company is in the process of switching its Hotmail users to the new Outlook.com, which has a cleaner interface and more social media integration. The full switch will come this summer, though users can move to Outlook.com now.
Hotmail, which Microsoft bought in 1997, was the pioneering Web e-mail system and dominated the market for years, but steadily lost ground and was eventually overtaken in the United States by Gmail and Yahoo mail. (Hotmail is second to Gmail worldwide, and third behind Yahoo and Gmail among U.S. users.)
The move to Outlook.com, which of course shares the name of the dedicated e-mail client so many people use, could boost the company’s Web mail prospects, but Hotmail’s travails were not lost on Twitter users during the outage.
In his post, de Haan said the restoration effort gave the company an understanding of why the outage happened, and that the team was working to ensure it doesn’t happen again.
Posted on Mar 14, 2013 at 8:29 AM
Amazon Web Services is making access to its Virtual Private Cloud automatic for new users of the Elastic Compute Cloud (EC2) service, giving them access to features such as multiple IP addresses and expanded security controls.
VPC, which lets customers create virtual networks of EC2 instances and virtual private network connections to their own data centers, has until now been a separate service from AWS. Now, new users will get access to a VPC by default, Amazon said in a blog post.
The service is being rolled out by regions, starting with Amazon’s Asia Pacific Region, based in Sydney, Australia, and South America Region, based in São Paulo, Brazil, with others to be added one at a time, the company said.
The automatic access to VPCs applies only to new users. Current customers, including hundreds of U.S. government and other public-sector agencies, would have to either sign up for a new account or launch a service in a region they haven’t used before. (There are four regions in the United States, including those in Virginia, Oregon and California, and the GovCloud region, designed for sensitive government workloads.)
Regardless of how the service is launched, the VPC comes at no extra charge, AWS said. Once launched, customers will get features such as “assigning multiple IP addresses to an instance, changing security group membership on the fly and adding egress filters to your security groups,” according to AWS’ blog.
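The egress filters AWS mentions are worth unpacking: each rule allows outbound traffic to a destination address range on a port range, and anything no rule matches is denied. The sketch below is a minimal local model of that idea, with illustrative rule values; it is not the AWS API and the names are invented:

```python
# Minimal model of security-group egress filtering: a rule allows outbound
# traffic to a CIDR block on a range of ports. Rule values are illustrative
# assumptions, not AWS defaults, and this is not the AWS API itself.
from ipaddress import ip_address, ip_network

EGRESS_RULES = [
    {"cidr": "10.0.0.0/16", "ports": range(1, 65536)},  # anywhere inside the VPC
    {"cidr": "0.0.0.0/0",   "ports": range(443, 444)},  # only HTTPS to the internet
]

def egress_allowed(dest_ip: str, dest_port: int) -> bool:
    """Return True if any rule permits outbound traffic to dest_ip:dest_port."""
    return any(
        ip_address(dest_ip) in ip_network(rule["cidr"]) and dest_port in rule["ports"]
        for rule in EGRESS_RULES
    )
```

Under these example rules, an instance could reach any port on hosts within the VPC’s own address space but could only speak HTTPS to the outside world, which is the kind of outbound control classic EC2 security groups (inbound-only) did not offer.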
AWS, which in 2011 was accredited under the Federal Information Security Management Act, has proved popular for public-sector agencies moving services to the cloud, with reportedly more than 300 government and 1,500 educational customers.
The availability of VPCs to customers is, according to TechCrunch’s Alex Williams, another indication of Amazon’s further push into the enterprise.
Posted on Mar 12, 2013 at 12:54 PM