Would automated cloud security catch a 75-cent error?

Agencies moving applications to the public cloud will have to rely on providers for automated monitoring, threat detection and prevention, but might benefit from a human touch, Lee Badger, acting program manager of the National Institute of Standards and Technology’s cloud computing program, told a Washington audience recently.

To thwart future threats, automated intrusion detection systems and firewalls will have to be configured for behavioral-pattern threat detection. But the added value of human analysis, which enables system administrators to detect an intrusion and track down its source, could be missing from the process, Badger said.

He spoke during a panel discussion on security and virtual environments at the Cloud Computing & Virtualization Conference and Expo in Washington, D.C., Sept. 8. The conference was sponsored by 1105 Media, parent company of Government Computer News.

Related stories:

Cloud security fears outweigh savings, but perhaps not for long

Cloud security awaits encryption breakthroughs

Badger noted the example of Clifford Stoll, an astronomer and systems administrator with Lawrence Berkeley National Laboratory in California and author of the book “The Cuckoo’s Egg.”

In August 1986, Stoll’s supervisor asked him to resolve a 75-cent accounting error in the computer usage accounts. He traced the error to an unauthorized user, who had acquired root access to the LBL system by exploiting a vulnerability in the movemail function, a computer program developed by the GNU Project that moves mail from a user’s Unix mail spool to another file.

Over a 10-month period, Stoll tracked the intrusion to Markus Hess, a German citizen who was working for the KGB with the objective of securing U.S. military information for the Soviets. Hess was able to piggyback off the LBL system onto the ARPANET and MILNET to attack 400 military computers.

It wasn’t technology that caught Hess. It was the fact that Stoll became determined to resolve the accounting error, even if it was for 75 cents, Badger noted. “Only the fact that he got really engaged [in solving the problem] allowed him to catch the guy,” Badger said.

An intrusion like the one Stoll uncovered is hard to find in a large environment, said C.J. Moses, deputy chief information security officer with Amazon Web Services.

Users need to be informed as to how their internal IT organization and external cloud providers treat security from an end-to-end perspective, said Max Peterson, vice president and general manager for civilian and intelligence agencies with Dell Federal Systems. The company recently entered the public cloud market with Dell Cloud with VMware.

Many organizations that are compromised don’t even realize there has been an intrusion in their network, noted Steven Chabinsky, deputy assistant director of the Federal Bureau of Investigation’s Cyber Division.

“They have little way of knowing if systems have been altered” during an attack, he said.

There is a need for technology that addresses assurance and attribution. Tools that can help users react to changes in their data, hardware and software environment address the issue of assurance.

Tools that give administrators a better view of who is on the network and what they are doing address the area of attribution. The Internet by design is private and anonymous, allowing people to route through different IP addresses and protocols, Chabinsky noted.

The cloud could be a test bed to aid in the rapid deployment of these new solutions because of its scalability, he said.

Cloud platforms could serve as “new flexible, scalable environments to test highly secure systems on,” Chabinsky said in an interview after the panel discussion.

Assurance and attribution do compete with civil liberties and privacy. A public dialogue is needed to help foster the creation of alternative solutions, he said.

If there is a security breach in the network system that supports the electric power grid, for example, the attribution data could be encrypted, with the government gaining access to it only through the legal process, he suggested.

However, in many instances, if someone has inserted malware that alters an organization’s systems and then withdraws the malware, the evidence is gone by the time law enforcement gets to the scene of the crime.

Individuals are then “losing their constitutional right to be protected by government,” Chabinsky said.

Industry and the government have to move beyond focusing on vulnerability management and find ways to stop bad guys, he said. “We can’t win only on defense,” he said, noting that the networked world is a constantly evolving environment.

About the Author

Rutrell Yasin is a freelance technology writer for GCN.

