The Securities and Exchange Commission plans to open an office to help develop data analytics tools that can identify and track investment and management risk among the financial institutions it regulates.
The office will operate within the agency’s Division of Economic and Risk Analysis (DERA) to coordinate efforts to create “data-driven risk assessment tools and models” to support its regulatory activities, the SEC announced.
DERA has produced a number of risk assessment tools since it was created in 2009. That year it developed the Aberrational Performance Inquiry tool, which seeks out and flags atypical hedge fund performance. The SEC's enforcement division has used the tool to assess private funds, leading to eight enforcement actions, according to the commission.
DERA has also developed a risk assessment tool that helps the SEC allocate resources by measuring a broker-dealer's riskiness relative to its peer group.
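The peer-group comparison described above can be illustrated with a simple outlier test. This sketch is not the SEC's actual model; the firm names, the leverage metric and the modified z-score threshold are all hypothetical, chosen only to show how a firm that deviates sharply from its peers might be flagged.

```python
from statistics import median

def robust_outliers(metrics, threshold=3.5):
    """Flag firms whose metric sits far from the peer-group median,
    using the modified z-score (based on median absolute deviation),
    which is not distorted by the outliers it is trying to find."""
    values = list(metrics.values())
    med = median(values)
    mad = median(abs(v - med) for v in values)
    return {
        firm: round(0.6745 * (value - med) / mad, 1)
        for firm, value in metrics.items()
        if abs(0.6745 * (value - med) / mad) > threshold
    }

# Hypothetical leverage ratios for a peer group of broker-dealers.
leverage = {"Firm A": 12.1, "Firm B": 11.8, "Firm C": 12.4,
            "Firm D": 11.9, "Firm E": 30.5}
print(robust_outliers(leverage))  # only Firm E is flagged
```

A median-based score is a common choice here because a mean-and-standard-deviation test can be masked by the very outlier it is meant to detect.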
The division is also working with the Enforcement Division's financial reporting and audit task force and the Division of Corporation Finance on a tool to help spot financial reporting irregularities that may signal fraud.
“The Office of Risk Assessment will build on the existing expertise of DERA’s staff, which includes economists, accountants, analysts and attorneys, to provide sophisticated assessments of market risks,” said DERA Deputy Director Scott W. Bauguess, who oversees the division’s risk assessment projects.
“The establishment of this new office reflects the Commission’s ongoing focus on deploying data-driven analytics to assist in routing scarce resources to areas of the greatest risks to the market,” he added.
Posted on Sep 19, 2014 at 9:48 AM | 1 comment
The World Bank recently launched the Open Government Contracts Platform, an open data tool designed to help governments and businesses search, manage and monitor government contracts and procurement opportunities globally.
Developed in partnership with government business intelligence vendor Govini and the larger Open Contracting community, a pilot platform currently displays 44,000 real-time contract records totaling $7.3 billion from India, a country with a high number of English-language, machine-readable government contracts.
The platform is accessible for public use and free of charge, the World Bank said in its announcement. Users can find information on opportunities, clients, competitors, governance and industries.
According to Govini, the platform will also include industry coverage in the U.S. federal, state and local markets as well as foreign governments. Both raw datasets and search results are fully exportable in CSV format and reusable by others.
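The search-then-export workflow the announcement describes can be sketched in a few lines. The record fields and agency names below are invented for illustration and do not reflect Govini's actual schema; the point is only how filtered results become reusable CSV.

```python
import csv
import io

# Hypothetical contract records in the shape such a platform might
# return; the field names are illustrative, not the real schema.
records = [
    {"id": "IN-001", "agency": "Ministry of Railways",
     "industry": "Construction", "value_usd": 1200000},
    {"id": "IN-002", "agency": "Ministry of Health",
     "industry": "Pharmaceuticals", "value_usd": 450000},
    {"id": "IN-003", "agency": "Ministry of Railways",
     "industry": "IT Services", "value_usd": 980000},
]

def export_csv(rows, fieldnames):
    """Serialize search results to CSV text so they can be reused."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Filter to one client, then export -- a search-and-download flow.
railways = [r for r in records if r["agency"] == "Ministry of Railways"]
print(export_csv(railways, ["id", "agency", "industry", "value_usd"]))
```

Returning plain CSV, as the platform reportedly does, keeps the data usable in spreadsheets and downstream analysis tools without any special client.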
With a more comprehensive view of the global marketplace, government agencies will be just a few clicks away from identifying the right vendors, the Bank said.
Posted on Sep 17, 2014 at 11:35 AM | 0 comments
The Intelligence Advanced Research Projects Activity will host its first IARPA Day on Oct. 29-30 in the College Park, Md., area.
IARPA performs high-risk, high-payoff research to address challenges in the intelligence community. IARPA Day will provide a unique look at the breadth and depth of the agency's research through briefings, discussions and demonstrations of several current programs, including:
- The Aggregative Contingent Estimation program, which seeks to improve forecasting of world events through the wisdom of crowds.
- The Trusted Integrated Chips program, which addresses supply-chain security and intellectual property with a new chip-fabrication approach.
- The Strengthening Human Adaptive Reasoning and Problem-solving program, which explores ways to enhance analysts' ability to reason through complex and ambiguous problems.
- The Babel program, which is creating technology to provide robust search tools for human speech in any language in the world.
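The wisdom-of-crowds idea behind a program like Aggregative Contingent Estimation can be illustrated with a toy aggregator. This is not IARPA's method, just a minimal sketch: the analyst probabilities are invented, and the "extremizing" exponent is one common heuristic for sharpening an under-confident crowd average.

```python
from statistics import mean

def aggregate_forecast(probabilities, a=2.0):
    """Combine individual probability forecasts for a yes/no event.

    The plain mean of a crowd is often under-confident, so one common
    trick is to 'extremize' it: p^a / (p^a + (1-p)^a) pushes the
    average toward 0 or 1 while leaving p = 0.5 unchanged.
    """
    p = mean(probabilities)
    return p ** a / (p ** a + (1 - p) ** a)

def brier_score(forecast, outcome):
    """Squared error against the 0/1 outcome; lower is better."""
    return (forecast - outcome) ** 2

crowd = [0.70, 0.65, 0.80, 0.60, 0.75]   # five hypothetical analysts
p = aggregate_forecast(crowd)
print(round(p, 3), round(brier_score(p, 1), 3))
```

Scoring aggregated forecasts against realized outcomes, for instance with the Brier score above, is how such tournaments measure whether the crowd actually beats individual forecasters.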
At IARPA Day, attendees can speak directly with agency leadership and IARPA's program managers about their work and experiences, as well as career opportunities at the research agency.
The event spans two days: the first (Oct. 29) is open only to attendees with Top Secret//SCI clearances and will be held at IARPA, while the second (Oct. 30) is entirely unclassified and will be held at a facility close to IARPA.
For more information on registration and attendance, visit the FedBizOpps announcement.
Posted on Sep 11, 2014 at 10:48 AM | 0 comments
The Department of Energy is investing in a project to speed the development of quantum encryption technology designed to protect the country's power grid from cyberattack.
Under the DOE's Cybersecurity for Energy Delivery Systems program, the nation's top program for grid security, San Diego startup Qubitekk was awarded $3 million to develop practical quantum security for the nation's power grid in partnership with Oak Ridge National Laboratory, Pacific Northwest National Laboratory, the University of Texas at Austin, Sandia National Laboratories and Pacific Gas & Electric.
Qubitekk, founded in 2012 to commercialize technology required to speed the adoption of quantum computing, recently announced the availability of the world's first plug-and-play entangled photon generator, the QES1. Like the transistors at the heart of classical computers, the QES1 enables the flow of information through quantum computers and quantum encryption products – both of which the company is currently developing.
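To give a flavor of how quantum encryption resists interception, here is a toy simulation of BB84, the canonical quantum key distribution protocol (entangled-photon schemes of the kind the QES1 targets rest on similar principles). This is a classical sketch of the protocol's logic, not Qubitekk's technology: real systems use photons, and an eavesdropper's measurements would corrupt a detectable fraction of the key bits.

```python
import secrets

def random_bits(n):
    return [secrets.randbelow(2) for _ in range(n)]

def bb84_key(n=64):
    """Toy BB84 exchange: Alice encodes random bits in randomly chosen
    bases; Bob measures in his own random bases; both then publicly
    compare bases and keep only the positions where they matched."""
    alice_bits = random_bits(n)
    alice_bases = random_bits(n)   # 0 = rectilinear, 1 = diagonal
    bob_bases = random_bits(n)

    bob_bits = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if a_basis == b_basis:
            bob_bits.append(bit)                   # same basis: certain result
        else:
            bob_bits.append(secrets.randbelow(2))  # wrong basis: random result

    sifted_alice = [b for b, x, y in zip(alice_bits, alice_bases, bob_bases) if x == y]
    sifted_bob = [b for b, x, y in zip(bob_bits, alice_bases, bob_bases) if x == y]
    return sifted_alice, sifted_bob

ka, kb = bb84_key()
print(len(ka), ka == kb)   # with no eavesdropper the sifted keys agree
```

The security argument lives in the step this toy omits: an eavesdropper who measures in-flight qubits guesses the basis wrong about half the time, disturbing the state, so Alice and Bob can sacrifice a few sifted bits to check for the resulting errors and detect the intrusion.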
Meanwhile, Google is planning to build its own quantum computer. The Quantum Artificial Intelligence team at Google is launching a hardware initiative to design and build new quantum information processors based on superconducting electronics, according to a Google+ post by the lab team. Google also announced that John Martinis and his team at UC Santa Barbara will join Google in the initiative.
Google has been working with D-Wave Systems, maker of the quantum computer being tested by the Quantum Artificial Intelligence Lab at NASA’s Ames Research Center. Martinis will try to make his own versions of the kind of chip inside a D-Wave machine.
The Google Quantum AI team will test “new designs for quantum optimization and inference processors based on recent theoretical insights as well as our learnings from the D-Wave quantum annealing architecture,” Google said. The company will continue to work with D-Wave scientists and to experiment with the 512-qubit “Vesuvius” machine at NASA Ames that will be upgraded to a 1000 qubit “Washington” processor.
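Quantum annealers of the D-Wave type are built to minimize QUBO (quadratic unconstrained binary optimization) objectives. As a point of comparison, the same problem class can be attacked classically with simulated annealing, which uses thermal rather than quantum fluctuations to escape local minima. The tiny QUBO below and the cooling schedule are invented for illustration.

```python
import math
import random

def qubo_energy(x, Q):
    """Energy of binary vector x under QUBO matrix Q: x^T Q x."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def anneal(Q, steps=5000, t_start=2.0, t_end=0.01, seed=0):
    """Classical simulated annealing on a QUBO: accept any downhill
    move, and uphill moves with probability exp(-delta/T) as the
    temperature T cools geometrically toward t_end."""
    rng = random.Random(seed)
    n = len(Q)
    x = [rng.randrange(2) for _ in range(n)]
    e = qubo_energy(x, Q)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)
        i = rng.randrange(n)
        x[i] ^= 1                      # propose flipping one bit
        e_new = qubo_energy(x, Q)
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new                  # accept the move
        else:
            x[i] ^= 1                  # reject: flip the bit back
    return x, e

# Toy QUBO whose unique minimum is x = [1, 1, 0] with energy -3:
# it rewards x0 and x1 together and penalizes x2.
Q = [[-1, -1, 1],
     [-1, 0, 0],
     [0, 0, 2]]
print(anneal(Q))
```

The open research question behind Google's hardware push is whether quantum tunneling lets an annealer beat well-tuned classical methods like this on problems of practical size.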
Posted on Sep 10, 2014 at 11:10 AM | 0 comments
The National Science Foundation is looking for some big ideas about big data.
According to a notice in the Federal Register, NSF seeks input from all parts of the big data ecosystem on the formation of new Big Data Regional Innovation Hubs that will generate the kinds of activities and partnerships established by the National Big Data R&D Initiative over the past three years. The hubs would also stimulate, track and help sustain new regional and grassroots partnerships around big data.
According to the notice, the hubs could:
- Accelerate big data solutions to specific global and societal challenges by convening stakeholders across sectors to partner on programs and projects.
- Act as a matchmaker between academic, industry and community stakeholders to help drive successful pilot programs.
- Increase the speed and volume of technology transfer between universities, public and private research centers and laboratories, large enterprises and small businesses.
- Facilitate discussions with opinion and thought leaders on the societal impact of big data technologies.
- Support the education and training of the entire big data workforce, from data scientists to managers to data end-users.
NSF wants comments from stakeholders across academia, state and local government, industry and non-profits working with big data. Comments are due Nov. 1, 2014.
Posted on Sep 09, 2014 at 11:39 AM | 1 comment