In the wake of embarrassing leaks by Edward Snowden about the National Security Agency’s domestic and international intelligence gathering, the agency is trying to figure out how it lost control of this information and how to prevent it from happening again.
As to how it happened, NSA Director Gen. Keith Alexander has a pretty good idea, at least at a high level: Too many people with access.
Alexander told the House Permanent Select Committee on Intelligence on June 18 that NSA now has at least 1,000 systems administrators, a growing number of them contractors, like Snowden. Administrators are defined by their privileges on IT systems, their ability to access, define and change just about anything they want. One thousand is a lot of administrators to keep track of. Many people, Alexander included, think it is too many by at least one.
“Clearly the system did not work as it should have,” the general said in a June 23 appearance on ABC News’ This Week. “He betrayed the trust and confidence we had in him.”
The problem of administrative creep is not a new one, nor is it unique to the NSA or government.
“It’s a common audit finding that organizations have too many administrative personnel,” said Dave Frymier, chief information security officer at Unisys. Unisys faced the same problem when it found one day it had more than 100 Microsoft administrators. That number eventually was reduced to fewer than 15. “It just shows that they’re human,” he said of the NSA.
Alexander offered some ideas on how NSA plans to deal with the problem of trust. “We are now putting in place actions that would give us the ability to track our system administrators, what they're doing, what they're taking. A two-man rule,” he told ABC’s George Stephanopoulos.
The “two-man” rule requires two people with separate sets of credentials for access to sensitive resources. It can be expensive in terms of manpower and is not fool-proof, but most in the security community think it is a good idea, especially in an environment as sensitive as the NSA.
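To make the idea concrete, here is a minimal sketch of a dual-control gate in Python. It is an illustration only, not any agency's actual mechanism; the class name, credential IDs and threshold are invented for the example.

```python
# Minimal sketch of a "two-man rule": a sensitive action proceeds only
# when two distinct, independently presented credentials sign off.
# Illustrative only; not any agency's real implementation.

class TwoPersonGate:
    def __init__(self, required=2):
        self.required = required
        self.approvals = set()   # distinct credential IDs seen so far

    def approve(self, credential_id):
        # Each approver must present a separate credential.
        self.approvals.add(credential_id)

    def authorized(self):
        # Repeated approvals from the same credential count once,
        # so a lone administrator cannot satisfy the rule.
        return len(self.approvals) >= self.required

gate = TwoPersonGate()
gate.approve("admin-alice")
gate.approve("admin-alice")   # replayed credential: still one approver
assert not gate.authorized()
gate.approve("admin-bob")
assert gate.authorized()      # two distinct credentials: access granted
```

The expense Frymier mentions comes from exactly this property: every sensitive operation now ties up two people instead of one.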
“I fall into the category of people who wonder why they hadn’t been doing this all along,” Frymier said. “It’s expensive, but it’s one of the better solutions to the problem.”
It is not the only solution, of course. The first — and most obvious — fix is to minimize the number of systems administrators. As with many simple solutions, however, this is easier said than done. While having a lot of administrators can be a security risk, it also helps to lighten workloads and make it easier to keep systems up and running. People tend to care about security risks only after an incident, but they care about having their systems running seven days a week, so convenience often trumps security.
A second solution is to reduce the privileges given to each administrator. Not all of them need all of the privileges all of the time. A system to grant privileges as required and to revoke them when a task is completed can make it easier to manage the managers. Microsoft has a more fine-grained administrative environment than Unix, and there are off-the-shelf tools to help with this process in a Microsoft environment. Unfortunately, NSA appears to be largely a Unix shop, Frymier said. But with its resources, the agency probably could develop its own administrative tools.
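As a rough illustration of that grant-and-revoke pattern, the sketch below issues a privilege with an expiry and revokes it when the task is done. The broker class, user names and time-to-live values are all assumptions made for the example, not an off-the-shelf product's API.

```python
# Sketch of just-in-time privilege granting: each admin privilege is
# issued for a specific task with an expiry, then revoked (or allowed
# to lapse) when the task completes. Names and values are illustrative.
import time

class PrivilegeBroker:
    def __init__(self):
        self._grants = {}   # (user, privilege) -> expiry timestamp

    def grant(self, user, privilege, ttl_seconds=3600):
        # The privilege is scoped to one task and expires on its own.
        self._grants[(user, privilege)] = time.time() + ttl_seconds

    def revoke(self, user, privilege):
        self._grants.pop((user, privilege), None)

    def has(self, user, privilege):
        expiry = self._grants.get((user, privilege))
        return expiry is not None and time.time() < expiry

broker = PrivilegeBroker()
broker.grant("sysadmin-7", "restart-mailserver", ttl_seconds=900)
assert broker.has("sysadmin-7", "restart-mailserver")
broker.revoke("sysadmin-7", "restart-mailserver")   # task finished
assert not broker.has("sysadmin-7", "restart-mailserver")
```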
Another good security practice is to log all administrator activity. The problem with this is that logs often are looked at only after an incident, and administrators often have the ability to alter logs. This is where a system for real-time monitoring and alerting for suspicious activity would come in handy.
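A bare-bones version of that kind of real-time check might look like the following. The rule set and log fields are invented for illustration; a real deployment would write to an append-only store administrators cannot alter and route alerts to a security team or SIEM rather than a console.

```python
# Sketch of real-time monitoring of admin activity: stream log records
# through a rule check and alert immediately, instead of reading logs
# only after an incident. Rules and field names are assumptions.
from datetime import datetime

SUSPICIOUS_ACTIONS = {"clear_log", "bulk_export", "grant_self_privilege"}

def alert(record):
    # In practice this would page a security team or feed a SIEM; the
    # log store itself should be append-only so admins cannot alter it.
    print(f"ALERT {datetime.now().isoformat()}: {record}")

def monitor(stream):
    for record in stream:
        if record["action"] in SUSPICIOUS_ACTIONS:
            alert(record)

monitor([
    {"user": "admin-42", "action": "restart_service"},
    {"user": "admin-42", "action": "bulk_export"},   # triggers an alert
])
```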
There are any number of other steps for segregating and protecting sensitive data, but none of them is fool-proof. Eventually you have to trust someone with sensitive information. “At some point the problem is a human resource problem rather than a technical one,” Frymier said.
So the Cold War saw of “trust but verify” makes sense. “The two-man rule really is the best solution to the problem,” Frymier said. “It’s a good way to get a vast improvement.”
Posted by William Jackson on Jul 02, 2013 at 10:19 AM
Silent Circle, the company that provides end-to-end BYOD encryption, has introduced a Web-based management console to support large deployments of crypto licenses. It was developed largely in response to government demand for a tool to manage enterprisewide licensing, said CEO Mike Janke.
Government was always a primary market for Silent Circle, but the speed of adoption has caught the company by surprise.
“We had no idea that government customers would need a thousand subscriptions,” said Janke, a former Navy SEAL. “We didn’t see any of this coming. We envisioned 10 special ops guys, reporters in Sudan or some individuals around the world.”
Silent Circle’s secure voice, text, mail and video communications have gone in less than a year from being a point-to-point solution to an enterprise tool. There has been strong adoption in the financial industry and with oil companies, but “most of it was from [the Defense Department] and other government agencies,” Janke said.
The company has benefited from current events, particularly recent revelations about the National Security Agency’s surveillance of Internet and telephone communications. Growth, already a strong 100 percent month-over-month, rocketed to 420 percent in the last two-and-a-half weeks. Agencies that were buying 50 subscriptions now are buying hundreds as concerns grow not only about government snooping, but also about government leaking.
Encrypted communication is not new. What Silent Circle has done is make it practical for bring-your-own-device environments by harnessing the computing power of smart phones for crypto key management, cutting the middleman out of the security equation. Keys remain in the hands of the end users rather than on a server, eliminating the need for trust in a third party.
Secure peer-to-peer connections with Silent Circle Android and iOS apps use the Zimmermann Real Time Transport Protocol (ZRTP), a crypto key agreement protocol for voice over IP that uses the Diffie-Hellman key exchange and the Secure Real Time Transport Protocol. Encryption is done with NSA Suite B cryptography, a public, interoperable set of crypto tools that includes the Advanced Encryption Standard, Secure Hash Algorithm 2 and elliptic curve digital signature and key agreement algorithms. The company operates its own network with SIP servers and codecs, but all encryption and security remain on endpoint devices.
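For a feel of the endpoint-side pattern the article describes, here is a minimal sketch of an ephemeral elliptic curve Diffie-Hellman exchange followed by AES-GCM encryption, using the third-party Python "cryptography" package. This is not Silent Circle's code, and ZRTP adds considerably more (commitment, short authentication strings, key continuity) than shown; it only illustrates how two endpoints can agree on a key no server ever holds.

```python
# Ephemeral ECDH key agreement plus AES-GCM encryption, entirely on the
# endpoints. P-384 and SHA-384 are Suite B choices; the rest of the
# flow is a simplified illustration, not ZRTP itself.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Each endpoint generates its own ephemeral key pair; no server sees them.
alice = ec.generate_private_key(ec.SECP384R1())
bob = ec.generate_private_key(ec.SECP384R1())

# Both sides compute the same shared secret from the peer's public key.
shared = alice.exchange(ec.ECDH(), bob.public_key())
assert shared == bob.exchange(ec.ECDH(), alice.public_key())

# Derive a 256-bit AES session key from the shared secret.
key = HKDF(algorithm=hashes.SHA384(), length=32, salt=None,
           info=b"session-key").derive(shared)

# Encrypt a message with AES-GCM; only the two endpoints hold the key.
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"hello, securely", None)
print(AESGCM(key).decrypt(nonce, ciphertext, None))
```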
Just 35 percent of the company’s business is in North America, with the rest of it off-shore in countries where security has long been a bigger issue than here. “We look at things in a bit of a bubble here compared to the rest of the world,” Janke said. People in Europe and Asia not only have to worry about NSA snooping, but also about their own intelligence agencies.
Although it is available in time to take advantage of the post-PRISM boom in secure communications, the new console was in the works well before the NSA leaks. “It took five months for our team to create this,” Janke said, primarily because of the security required for the portal. The console is a business management tool only and has nothing to do with encryption. It does not hold or manage keys and does not have access to message content. “It in no way, shape or form touches the technology.”
Despite the unexpected growth, Janke said Silent Circle is holding to its course for releasing new products this year, several of which, requested by government customers, now are in beta. These include encrypted file transfer from desktops, secure video conference calling and encrypted voice mail.
Posted by William Jackson on Jun 28, 2013 at 9:41 AM
Security is evolving from a do-it-yourself operation — loading software on a device or attaching a box to a network — to managed, hosted services leveraging the anytime/anywhere scalability of the cloud for large-scale analytics that were not practical before.
No one yet is seriously suggesting getting rid of firewalls and antivirus detection, but it has been painfully obvious for some time that by themselves, they are not adequate protection. Intelligence-based security is being touted as the way to counter more complex attacks against high-value targets, and the emergence of cloud computing now offers a way to gather enough intelligence and analyze big data fast enough to effectively spot malicious activity.
“We do not look for malware, we do not look for exploits,” said Dmitri Alperovitch, CTO of CrowdStrike, which has announced its first cloud-based security offering. “We look at what is being done, rather than how.”
The CrowdStrike Falcon Platform is one of the latest in a growing number of services offering security from the cloud, rather than security for the cloud. Another recent announcement in this field is the integration of global attack data into Risk I/O's cloud-based platform, which uses big data and predictive analytics to help prioritize vulnerability data. Other companies with cloud-based security services include Appthority, Check Point, Fortinet, Okta, Symantec, Veracode and Zscaler.
Moving security out of the box and even out of the enterprise can help to address a new generation of adversaries using layered attacks to methodically find weaknesses, penetrate systems, escalate privileges and then quietly observe and export data. Intelligence is needed not just to detect these attacks, but to respond to them.
In the past, knowing who you were up against wasn’t necessary for security. You spotted the attack, and you blocked it. But, “if you are being targeted by a determined adversary, they are not going to stop because you block them,” Alperovitch said. “They are going to keep it up until they get in. They can spend years at it.”
CrowdStrike’s approach to active defense has a decidedly military and intelligence flavor. It takes a strategic view with an emphasis on knowing your enemy, not just the weapon. Most of the more than 4,000 organizations tracked for its Adversary Intelligence database are nation-sponsored. Its goal is not to stop every malicious attempt.
“You can’t block every attack,” Alperovitch said. “And sometimes blocking is not the best strategy.” If you spot and identify someone engaged in spying or espionage, the best strategy might be to string him along and watch him, “to better understand his tradecraft.”
The goal is to raise the bar for attackers, making their craft more difficult and expensive. This can mitigate one of the great advantages attackers have; it is dramatically cheaper to launch an attack than it is to defend against it, resulting in a very high return on investment for successful attacks. Recognizing sophisticated techniques “doesn’t eliminate all activity, but it dramatically raises the cost of intrusion,” Alperovitch said.
It is too early to say what impact the cloud and big data analytics will have on security, and it’s a pretty safe bet that it won’t solve every problem. But it is an attractive option for concentrating resources where they are most needed.
Posted by William Jackson on Jun 20, 2013 at 6:02 AM
A bill updating federal information security requirements has passed unanimously in the House and now awaits action in the Senate, raising the possibility that Congress might actually enact some kind of cybersecurity legislation.
The Federal Information Security Amendments Act of 2013 would require agencies to take a risk-based approach to information security, using automated tools for continuous monitoring of civilian, military and intelligence IT systems. It essentially would bring the Federal Information Security Management Act into line with the best practices agencies already are adopting.
Like the current FISMA, it would require annual reports to Congress, and it would be congressional oversight that ultimately would determine its success in improving federal cybersecurity. The question is: Will Congress continue to grade agency performance based on paperwork compliance, or will it measure actual security?
The bill was introduced by Rep. Darrell Issa (R-Calif.) with five bipartisan cosponsors to “provide a comprehensive framework for ensuring the effectiveness of information security controls,” and “effective governmentwide management and oversight of the related information security risks,” for both civilian and national security systems.
It is technology agnostic, leaving the selection of the appropriate hardware and software up to each agency based on guidance and standards developed by the National Institute of Standards and Technology. It defines “adequate security” as “security commensurate with the risk and magnitude of the harm resulting from the unauthorized access to or loss, misuse, destruction or modification of information.”
The bill gives a nod to cloud computing by including services in its definition of systems. NIST would develop standards in cooperation with security agencies, including the National Security Agency, “to assure, to the maximum extent feasible, that such standards and guidelines are complementary with standards and guidelines developed for national security systems,” although the Defense Department and CIA would continue to oversee their own systems. Each agency would have a chief information security officer, either the CIO or a senior official reporting directly to the CIO.
None of this is radically different from FISMA as it now stands, and nothing in the current law prohibits the use of these tools and processes. But FISMA has remained mired in paperwork documenting compliance with the letter of the law rather than improving cybersecurity. And much of the fault for that lies with Congress.
In the early days of FISMA there was a lot of basic and remedial work to be done. Agencies had to create accurate inventories of IT systems, determine their condition and OK their operation. That approval did not certify that systems were secure, only that the agency understood the risks of operating them and accepted those risks.
These were necessary tasks and important steps toward effective security. But FISMA has struggled to get past this stage because it is easier to measure paperwork compliance than security status. Harried administrators and security teams worked diligently to keep Congress off their backs and devoted what resources were left to improving security.
A focus on establishing priorities and automating processes has improved security in recent years, although agencies still struggle to keep up with the bad guys. Codifying these efforts could help if Congress can find a way to measure results rather than process.
Posted by William Jackson on Jun 14, 2013 at 9:39 AM