Getting harder to trust Alexander's NSA

“I need you,” National Security Agency Director Gen. Keith Alexander said several times to his audience at the National Press Club Wednesday. He needs the support of industry and the public in order to protect the nation from cyberattacks and terrorism in the face of growing concern over his agency’s wholesale collection of domestic data.

Alexander also spoke about the need to migrate the Defense Department’s 15,000 network enclaves to a more defensible architecture based on a thin, virtual cloud environment and about the need for legislation spelling out clear rules of engagement for protecting civilian cyber infrastructure and for cyber threat information sharing. But most of his talk focused on troublesome media leaks that threaten to hogtie the agency.

Data culled from the nation’s telephone and Internet carriers is crucial to thwarting foreign attacks, he said, but these programs are being threatened by what he called sensationalized and inflamed stories coming from the leaks.

“Talk about the facts,” he pleaded. “We need to get the facts out about why we need these tools.”

He then proceeded to give his latest version of the facts. But it is getting harder to trust him when his version has to be updated every month in the wake of new revelations about NSA activities. This is a shame, because it is getting in the way of the NSA’s genuinely important work of gathering foreign intelligence and protecting the government’s cyber infrastructure.

“I promise you the truth,” Alexander said back in July during his opening keynote address at the Black Hat Briefings. One of those truths was that “no one at NSA has ever gone outside the boundaries we’ve been given” in its collection and analysis of domestic data.

Well, not exactly. Two months later, speaking at the Billington Cybersecurity Summit in Washington on Sept. 25, he acknowledged 12 willful violations of the agency’s legal authority. However, “we held ourselves accountable and we reported it,” he said. But not to the American people or to Congress until after it was publicly reported in August.

And then there were the 2,776 “incidents” that came out in the August release of declassified secret court records. These were just mistakes, he said in September, and “if we make a mistake, we self-report it in every case.”

“Self-reporting” at the NSA apparently means reporting to itself, because it didn’t report this to the public or to Congress.

Alexander, as usual, mentioned his 15 grandchildren during his talk. If you can’t trust a guy with 15 grandchildren, who can you trust? But he seemed unusually subdued. It could have been the recent dental surgery that had left one side of his jaw a little swollen. But it also might have been the three months of stress from the drip, drip, drip of revelations from those leaks. Publicly defending an agency that has spent decades in the shadows must be unnerving.

“We do the right thing in every case,” he said. “We’re trying to be more transparent.” That would be easier to believe if he didn’t have to update his version of the truth every month.

Posted by William Jackson on Sep 26, 2013 at 8:18 AM



Congress to IT security: Happy fiscal New Year

Priorities for securing government’s IT infrastructure for the coming fiscal year include defending against insider threats posed by unmanaged privileged access and expanded continuous monitoring to address the growing complexity of outsider threats. But these issues could be dwarfed by the challenge of just keeping the lights on come Oct. 1.

“Security is probably the biggest issue we’ve got, because it underlies so much of the other things we are trying to do,” said Paul Christman, public sector vice president at Dell Software. “It can’t go on hiatus.”

Yet the fools on the Hill see the world spinning ’round toward the new budget year without any serious plans for enacting a budget to support critical operations. No doubt essential personnel will remain at their desks in the event of a shutdown, but without updated technology to support them, security will suffer.

“We’re finding it very challenging to assess and predict priorities, because our customers cannot assess and predict their priorities,” Christman said. “Funding has become chaotic and erratic.”

If there is any budget for fiscal 2014, insider threats are likely to be top-of-mind for administrators. A steady drumbeat of stories raises the question of how to manage the physical and logical access given to people agencies have decided to trust. On the IT side, systems administrators and others with privileged accounts often have way too much freedom, putting systems and the information they contain at risk.

The first step in controlling this access is effective policy. Most agencies and offices probably already have a good policy in place, Christman said. But there often are few if any controls to enforce it. Technology must match policy with the ability to monitor, track and audit the activity of those who are given the keys to the kingdom. This has been driven home by the activities of Chelsea (nee Bradley) Manning and Edward Snowden. The National Security Agency, smarting from the Snowden leaks, has responded by reducing the number of systems administrators and instituting a two-man rule requiring separate sets of credentials for access to sensitive resources.
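For illustration, here is a minimal sketch, in Python, of what a two-person check looks like in code: access to a sensitive resource is granted only when two different authorized operators each present valid credentials. The operator names and credential store are hypothetical, and a real deployment would verify against an identity provider and write every decision to an audit log; this only shows the shape of the control.

```python
# Illustrative two-person rule: access requires two *distinct* authorized
# operators, each presenting a valid credential. Names, salts and passphrases
# are hypothetical placeholders for a real identity system.
import hashlib
import hmac

SALTS = {"admin_alice": b"salt1", "admin_bob": b"salt2"}
AUTHORIZED_OPERATORS = {
    "admin_alice": hashlib.sha256(b"salt1" + b"alice-passphrase").hexdigest(),
    "admin_bob": hashlib.sha256(b"salt2" + b"bob-passphrase").hexdigest(),
}


def _credential_valid(operator: str, passphrase: str) -> bool:
    """Check one operator's credential against the stored salted hash."""
    expected = AUTHORIZED_OPERATORS.get(operator)
    if expected is None:
        return False
    candidate = hashlib.sha256(SALTS[operator] + passphrase.encode()).hexdigest()
    # Constant-time comparison to avoid leaking information through timing.
    return hmac.compare_digest(expected, candidate)


def two_person_access(op1: str, pass1: str, op2: str, pass2: str) -> bool:
    """Grant access only if two different authorized operators both authenticate."""
    if op1 == op2:
        return False  # the same person cannot supply both credentials
    return _credential_valid(op1, pass1) and _credential_valid(op2, pass2)


if __name__ == "__main__":
    print(two_person_access("admin_alice", "alice-passphrase",
                            "admin_bob", "bob-passphrase"))      # True
    print(two_person_access("admin_alice", "alice-passphrase",
                            "admin_alice", "alice-passphrase"))  # False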

This process would be burdensome and unnecessary for most agencies, which could effectively monitor activity with software. But that requires money, and money requires a budget.

The government also is in the process of moving from static assessments of IT security to continuous monitoring -- or continuous diagnostics and mitigation. This process is necessary to respond to a rapidly evolving threat landscape, and suites of automated tools are available to enable it. The Homeland Security Department is offering continuous monitoring as a service through blanket purchase agreements. But here again, a budget will be necessary to allow agencies to take advantage of the service in fiscal 2014.

Budget uncertainties are being compounded by the attrition of experienced procurement personnel. Because of retirements and sequester-powered furloughs, there is a shortage of officials with the know-how to effectively wend their way through acquisition regulations to take advantage of needed technology.

“I think this is going to make the next two weeks really, really strange,” Christman said of the year-end rush to spend out 2013 budgets. “I don’t see it getting any better next year.”

Posted by William Jackson on Sep 20, 2013 at 12:07 PM



Encrypted communications give voice to dissidents

Information gathering by the National Security Agency – whether legal, extralegal or illegal – has dominated the news for the last two months, but it is worth noting that the United States is not the only government engaging in electronic eavesdropping.

Kaspersky Lab reported in August that the Chinese-language version of the website of the Central Tibetan Administration, which represents the Dalai Lama, had been corrupted, redirecting visitors to an exploit that installed a backdoor on their computers. Researchers assigned no blame for the hack, but the Dalai Lama, who fled from China to India in 1959, is not looked upon with favor by the Chinese government.

With issues such as this in mind, the encrypted communications company Silent Circle is reaching out to support dissident groups in Tibet and elsewhere with off-the-shelf technology that can evade Chinese or any other government surveillance.

The Human Rights Foundation announced Sept. 4 that Silent Circle has donated 200 subscriptions for its Silent Phone application to Tibetan groups that have run afoul of the Chinese government. The mobile applications and service subscriptions enable strong encryption for voice and video communications between iPhone and Android phones using the app.

This is its first such partnership to provide secure communications for whistleblowers, activists and dissidents, said Alex Gladstein of the HRF. “They [Silent Circle] are very interested in using the technology to do good,” he said. The foundation is advising on and facilitating the donations. “On our end of the partnership, we are selecting groups that need this sort of thing. This is just the first effort. We will be doing others.”

When the scheme for providing peer-to-peer encryption for mobile devices was first hatched by former Navy SEAL Mike Janke, his vision was to enable secure BYOD communications in the field for special operations personnel and for maybe a few journalists operating in dangerous areas. It turned out that the timing for such a service was right, with mobile devices having become powerful enough to handle the processing required and with privacy concerns coming to the fore for businessmen and consumers as well as the military and intelligence communities.

Spurred by the ability to do encryption and key management on user devices without leaving a trail of metadata on third-party servers, and with pricing beginning at less than $10 a month, the company now has a user base in the millions. Governments are among its earliest and largest customers, Janke said.
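To make the “no keys on the server” idea concrete, here is a generic sketch in Python using the third-party cryptography package. It is not Silent Circle’s actual protocol (Silent Phone’s voice encryption is built on Phil Zimmermann’s ZRTP); it only illustrates how two endpoints can agree on a session key locally, so any relay in the middle handles nothing but ciphertext.

```python
# Sketch of end-to-end key agreement done entirely on the endpoints.
# Requires the third-party "cryptography" package; not Silent Circle's
# actual protocol, only the general idea that keys never touch a server.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each phone generates its own key pair locally; only public keys are exchanged.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()


def session_key(own_priv, peer_pub) -> bytes:
    """Derive a 256-bit session key from the Diffie-Hellman shared secret."""
    shared = own_priv.exchange(peer_pub)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"demo session").derive(shared)


alice_key = session_key(alice_priv, bob_priv.public_key())
bob_key = session_key(bob_priv, alice_priv.public_key())
assert alice_key == bob_key  # identical keys, never sent over the wire

# Traffic is encrypted with the session key; a relay sees only ciphertext.
nonce = os.urandom(12)
ciphertext = AESGCM(alice_key).encrypt(nonce, b"hello from Alice", None)
print(AESGCM(bob_key).decrypt(nonce, ciphertext, None))  # b'hello from Alice'
```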

Much has been made of the fact that Silent Circle’s technology could make it impossible for law enforcement and intelligence agencies to listen in to calls or look at data, images and video being exchanged between secured phones (and even if reports that the National Security Agency has found a way around much commercial encryption prove true, that protection apparently still holds against everyone else). But instead of pushback, U.S. military and intelligence agencies have been early adopters. In contrast to the 1990s, when the U.S. government tried to stop the spread of strong commercial encryption, it now sees it as an economic tool and a weapon in grassroots struggles for democracy.

Cell phones and social media emerged as important tools during the Arab Spring uprisings of the last two years, but the open nature of these networks has sometimes left users exposed to the governments they are protesting or fighting. These groups now have access to consumer technology that puts them on an equal footing with governments. Bits and bytes might not stand up to bombs and bullets, but this helps to level the playing field.

Posted by William Jackson on Sep 10, 2013 at 11:02 AM



The NSA wants to be your backdoor man

Distrust of the National Security Agency has deep roots. As far back as 1976 many believed that the code-breaking agency had slipped a backdoor into the new Data Encryption Standard, the approved algorithm for government encryption. For years, the suspicions were met with stony silence. Then, 35 years later, the NSA came clean.

The agency contributed changes to the proposed design, but left no backdoors or other surprises, Richard “Dickie” George, then technical director of NSA’s information assurance directorate, told an audience at the RSA Conference in 2011. “We’re actually pretty good guys,” George said. “We wanted to make sure we were as squeaky clean as possible.”

Now some of the squeak is wearing off that clean. No one doubts that the NSA is good at breaking codes. But the latest revelations from the Snowden files seem to confirm what many have long suspected: The NSA knows that it is easier to break a code when someone gives you the keys. Documents published by the New York Times describe a Signals Intelligence program to “actively engage the U.S. and foreign IT industries to covertly influence and/or overtly leverage their commercial products’ designs.”

A goal of the program is to “insert vulnerabilities into commercial encryption systems, IT systems, networks and endpoint communications devices used by targets,” and to “influence policies, standards and specifications for commercial public-key technologies.”

In other words, to install backdoors in commercial products.

There is a lot of outrage about the disclosure, but little surprise. Few people have taken the NSA’s assertions about the sanctity of commercial products seriously. The NSA seems proud of its efforts at subverting the security of personal communications. The project is in line with the Comprehensive National Cybersecurity Initiative, NSA said in its 2013 budget request, because it invests in corporate partnerships and cuts costs by exploiting existing sources of intelligence.

Most of us assumed that the public-private partnerships advocated in the CNCI were intended to strengthen cybersecurity and privacy. Live and learn.

To Chris Wysopal, chief technology officer at the application security company Veracode, what is surprising about the latest revelations is not so much that the NSA apparently is tampering with products. Everyone expects them to do that, he said. “What is eye-opening is that they are tampering with standards.” That would weaken all technology built to those standards, including that used by the U.S. government.

Although the NSA has expressed its desire to weaken standards, there is little evidence to date that it has managed to do so, Wysopal said. But there may be some evidence. In 2007, weaknesses were found in Dual_EC_DRBG, a pseudorandom number generator developed at the NSA and included in a federal cryptographic standard for government use. It was immediately suspected that the flaw could have been intentional. Intentional or not, “in this case, it was detected and not used,” Wysopal said.
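Why a rigged random number generator matters is easy to demonstrate. The toy sketch below is not Dual_EC_DRBG; it simply uses Python’s ordinary, non-cryptographic generator with a seed the attacker is assumed to know, standing in for any generator whose internal state can be recovered. Once that state is known, every “random” key it produces is predictable, no matter how strong the cipher that uses it.

```python
# Toy illustration of a compromised random number generator: if the attacker
# knows (or can recover) the generator's internal state, the victim's "random"
# session key is fully predictable. Not Dual_EC_DRBG, just the principle.
import random

SECRET_SEED = 1234  # pretend the attacker has recovered this internal state

# The victim draws a 128-bit "session key" from the weak generator.
victim_rng = random.Random(SECRET_SEED)
session_key = victim_rng.getrandbits(128)

# An attacker with the same state reproduces the identical key bits,
# so the strength of the cipher built on top no longer matters.
attacker_rng = random.Random(SECRET_SEED)
print(attacker_rng.getrandbits(128) == session_key)  # True
```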

Since then there have not been similar discoveries in public crypto standards. And that underlines the greatest challenge in inserting backdoors through standards. As Dickie George told his audience of crypto professionals in 2011, “I don’t think we were good enough to sneak things in that you guys wouldn’t have found.”

Still, absence of evidence is not evidence of absence. We don’t know what we still don’t know.

Posted by William Jackson on Sep 06, 2013 at 11:58 AM