Pulse



By GCN Staff


DOE, Google back quantum computing research

The Department of Energy is investing in a project to speed the development of quantum encryption technology, billed as unhackable, that would protect the country's power grid from cyberattack.

Under the DOE's Cybersecurity for Energy Delivery Systems program, the nation's top program for grid security, San Diego startup Qubitekk was awarded $3 million to work in partnership with Oak Ridge National Laboratory, Pacific Northwest National Laboratory, the University of Texas at Austin, Sandia National Laboratories and Pacific Gas & Electric to develop practical quantum security for the nation's power grid.

Qubitekk, founded in 2012 to commercialize technology required to speed the adoption of quantum computing, recently announced the availability of the world's first plug-and-play entangled photon generator, the QES1. Like the transistors at the heart of classical computers, the QES1 enables the flow of information through quantum computers and quantum encryption products – both of which the company is currently developing.
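To illustrate why entangled photons matter for encryption (this is a toy model, not Qubitekk's actual protocol), entanglement-based key distribution in the style of the BBM92 protocol can be sketched in a few lines: each party measures its half of an entangled pair in a randomly chosen basis, and only the rounds where the bases happen to match yield perfectly correlated bits, which become the shared secret key.

```python
import secrets

# Toy sketch of entanglement-based quantum key distribution (BBM92-style).
# Assumption: an ideal, noise-free channel; real systems must also detect
# eavesdropping and correct errors, which this model omits.

def qkd_round():
    """Simulate one entangled-pair measurement round, or None if bases differ."""
    alice_basis = secrets.randbelow(2)   # 0 = rectilinear, 1 = diagonal
    bob_basis = secrets.randbelow(2)
    if alice_basis == bob_basis:
        # Same basis: entanglement guarantees identical outcomes.
        bit = secrets.randbelow(2)
        return bit, bit
    # Different bases: outcomes are uncorrelated, so the round is discarded.
    return None

def sift_key(rounds=1000):
    """Keep only the rounds where both measurement bases matched."""
    key_a, key_b = [], []
    for _ in range(rounds):
        result = qkd_round()
        if result is not None:
            a, b = result
            key_a.append(a)
            key_b.append(b)
    return key_a, key_b

key_a, key_b = sift_key()
assert key_a == key_b  # sifted keys always agree in this noise-free model
print(len(key_a), "shared key bits from 1000 pairs")  # roughly half survive sifting
```

The security argument, loosely, is that an eavesdropper measuring photons in transit disturbs the correlations, which the two parties can detect by comparing a sample of their sifted bits.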

Meanwhile, Google is planning to build its own quantum computer. The Quantum Artificial Intelligence team at Google is launching a hardware initiative to design and build new quantum information processors based on superconducting electronics, according to a Google+ post by the lab team. Google also announced that John Martinis and his team at UC Santa Barbara will join Google in the initiative.

Google has been working with D-Wave Systems, maker of the quantum computer being tested by the Quantum Artificial Intelligence Lab at NASA’s Ames Research Center. Martinis will try to make his own versions of the kind of chip inside a D-Wave machine.

The Google Quantum AI team will test “new designs for quantum optimization and inference processors based on recent theoretical insights as well as our learnings from the D-Wave quantum annealing architecture,” Google said. The company will continue to work with D-Wave scientists and to experiment with the 512-qubit “Vesuvius” machine at NASA Ames, which will be upgraded to a 1,000-qubit “Washington” processor.

Posted on Sep 10, 2014 at 11:10 AM


NSF seeks feedback on big data innovation hubs

The National Science Foundation is looking for some big ideas about big data.

According to a notice in the Federal Register, NSF seeks input from all parts of the big data ecosystem on the formation of new Big Data Regional Innovation Hubs that will generate the kinds of activities and partnerships established by the National Big Data R&D Initiative over the past three years. The hubs would also stimulate, track and help sustain new regional and grassroots partnerships around big data.

According to the notice, potential roles for the hubs include:

  • Accelerate big data solutions to specific global and societal challenges by convening stakeholders across sectors to partner on programs and projects.
  • Act as a matchmaker between academic, industry and community stakeholders to help drive successful pilot programs.
  • Increase the speed and volume of technology transfer between universities, public and private research centers and laboratories, large enterprises and small businesses.
  • Facilitate discussions with opinion and thought leaders on the societal impact of big data technologies.
  • Support the education and training of the entire big data workforce, from data scientists to managers to data end-users.

NSF wants comments from stakeholders across academia, state and local government, industry and non-profits working with big data. Comments are due Nov. 1, 2014.

Posted on Sep 09, 2014 at 11:39 AM


Government expands adoption of critical security controls

A majority of government organizations responding to a recent survey by security education provider SANS Institute said they have adopted the Critical Security Controls (CSCs), a roadmap of 20 best practices for computer security developed by a public-private consortium.

The CSC project was initiated in 2008 as a response to extreme data losses experienced by U.S. defense firms.

This year’s survey found 90 percent of organizations used the roadmap, with government and financial services organizations leading the pack. The results run well ahead of a similar 2013 SANS survey, which showed a 73 percent adoption rate, according to SANS.

"Organizations across a broad range of industries are making steady progress toward adopting, integrating and automating the CSCs," said SANS analyst James Tarala, author of the survey results paper.

Even so, problems limit adoption of the full set of controls, he said. Staffing shortages, lack of budget and silos that restrict communication between IT security and operations remain barriers for adopters, according to Tarala.

These are key problems identified in last year's survey that haven't gone away, according to the Institute.

Not all organizations have adopted every control, nor do all follow the prescribed 1-20 order. But among those able to measure improvement, 16 percent said the controls improved their risk posture and 11 percent said they improved their ability to detect advanced attacks.

Tony Sager, director of the SANS Innovation Center and chief technologist for the Council on CyberSecurity, said the organization was working on guidelines and case studies, a resource requested by two-thirds of the survey respondents. 

"The Controls are not about having the best list of things to do – they are about members of a community helping each other improve their security," according to Sager.

Full results of the survey will be shared during a Sept. 9, 2014, webcast at 1 p.m. EDT.

Posted on Sep 08, 2014 at 9:41 AM


SANS Institute offers updated security policy templates

Security education provider SANS Institute released 27 updated information security policy templates that government agencies can use to ensure their security policies are practical, up-to-date and grounded in real-world experience.

The refreshed policy library removes policies that are no longer needed, adds policies covering new technologies and threats, and updates existing ones to reflect changes in practice.

The update was produced by a team of security industry professionals chaired by Michele D. Guel, a senior security architect at Cisco Systems and a 26-year veteran of the cybersecurity industry.

The templates can be downloaded from the SANS Security Policy Project.

For general policies, titles include Acceptable Use, Acceptable Encryption, Password Construction, Password Protection, Email Use, Disaster Recovery Plans, and Security Response Plans.

In the network security arena, users will find templates for policies on Remote Access, Router and Switch Security, Wireless Communications and Standards, and the Assessment of Potential Acquisitions.

Server security templates include policies covering Database Credentials, Technology Equipment Disposal, Lab Security, and Software Installation. The template database also includes a Web Application Security Policy template.

The templates are often generalized versions of policies developed for and used by government agencies and corporations.

"The Policy Project site allows organizations to create better policies, faster, by starting from a proven set of templates,” said Alan Paller, director of research at the SANS Institute. “It also helps ensure their own policies have sufficient scope and depth relative to those included in the library.”

Posted on Sep 05, 2014 at 7:59 AM


AT&T cloud storage offers beefed up security

As agencies increasingly migrate to the cloud in search of security and savings, their potential industry partners are stepping up to supply the increased security features demanded by federal customers.

This week, AT&T announced Synaptic Storage as a Service (STaaS) for Government, a multi-tenant, community cloud that has the same features as AT&T's commercial cloud storage offering plus additional security, the company said in its announcement.

Among the security enhancements are:

  • Storage towers that are physically separated from other users' towers in the data center.
  • Separate logical cloud for government data so that government customer data will not co-exist with commercial data.
  • A separate cloud portal partition for government agencies.
  • RSA hard tokens for two-factor authentication, assigned to all government agency customers and their authorized users.
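The hard tokens in the last item generate short-lived one-time codes. RSA SecurID uses a proprietary algorithm, but the open TOTP standard (RFC 6238) illustrates the same idea in a few lines: a secret shared between the token and the server, combined with the current 30-second time window, yields a six-digit code that the server can independently recompute and verify.

```python
import hmac, hashlib, struct, time

# Illustrative sketch only: this is standard TOTP (RFC 6238), not RSA's
# proprietary SecurID algorithm. The shared secret below is a made-up value.

def totp(secret: bytes, at: float, step: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time password from a shared secret."""
    counter = int(at) // step                      # index of the 30-second window
    msg = struct.pack(">Q", counter)               # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                     # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = b"server-and-token-shared-secret"         # provisioned out of band
now = time.time()
print(totp(secret, now))                           # six-digit code, valid ~30 seconds
# The server verifies a login by recomputing the code for the current window.
assert totp(secret, now) == totp(secret, now)
```

Because the code depends on the time window, a captured code becomes useless within seconds, which is what makes such tokens a meaningful second factor alongside a password.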

"Federal agencies want the mobility, collaboration, information sharing and efficiency that cloud offers, but they can't afford to adopt cloud solutions that sacrifice performance, reliability and, above all, security," said Kay Kapoor, president of AT&T Government Solutions.

"Our new STaaS for Government offer delivers the key attributes federal buyers require and allows them to move to the cloud with ease and confidence."

Posted on Sep 03, 2014 at 10:22 AM