Trust issues

Agencies, industry grapple to define the meaning of trusted computing


Trusted or trustworthy computing sounds like a good idea. After all, who wants untrustworthy computing? Government and private sectors have been working on this concept, but the goal of consistently secure and trustworthy information technology systems remains elusive.

Part of the problem might be that it is difficult to precisely define trusted computing. If our system contacts a computer on another network, we want to make sure the output we get is valid and unaltered, and that the external agency's management fully recognizes the role that computer plays. But how do you gauge, or trust, such assertions?

'You can ask 10 people, and you'll get 10 different answers,' said Bud Wilson, IT director at TechTeam Government Solutions.

Most people in the industry perceive trustworthy computing to mean secure computing, Wilson said, but that is too broad to be a good definition. Microsoft has a trustworthy-computing initiative, which refers to a reliable, repeatable software development process.

To the Trusted Computing Group, trusted computing refers to security controls based on its specifications built into hardware platforms. This industry standards body has given us the Trusted Platform Module chip for storing cryptographic keys, passwords and digital certificates, which is becoming common in laptop and desktop PCs.

Then there is the trusted system according to the National Information Assurance Partnership, which refers to platforms that have been evaluated under the Common Criteria at Evaluation Assurance Level 4 or above for role-based access control, controlled access and labeled security protection profiles. So far, evaluated systems include Sun Microsystems' Trusted Solaris Operating System Version 8, Red Hat Enterprise Linux Version 5 and the XTS-400 Secure Trusted Operating Program from BAE Systems Information Technology.

'I'm not sure there is a generally accepted definition,' said Ron Ross, senior computer scientist at the National Institute of Standards and Technology.

Ross is struggling to write a definition of trustworthy systems for the upcoming Special Publication 800-39, 'Managing Enterprise Risk,' one in a series of NIST publications on computer security. It is expected to be available in October. The term trusted computing has evolved over time, he said. When the Orange Book, precursor to the internationally accepted Common Criteria, was developed by the Defense Department and the National Security Agency in the late 1970s and early 1980s, the focus was on operating systems in a mainframe environment.

The focus in trustworthy computing in government today is on enabling cross-domain data sharing so data on networks handling differing levels of security classification can be accessed from a single computer. This would help eliminate the need for multiple computers on a single desk and simplify data sharing within and among agencies. DOD and the intelligence community are working on a platform to enable this type of sharing among themselves with an eye toward the holy grail of trusted computing. 'We're going to converge at some point between the DOD and the civilian agencies,' Ross said.

'We are starting to work a lot smarter now' toward this end, he said, but major challenges remain. Evaluation of trusted systems has so far focused on individual components. Greater emphasis is needed now on developing and integrating entire systems and on security-engineering techniques to create a trustworthy whole.

'The industry has grown at warp speed, and the complexity is outstripping our ability to put these things together securely,' Ross said. 'Complexity and connectivity are going to be constant threats to our security.'

Trusted control

The Trusted Computing Group's (TCG) Trusted Platform Module is probably the most visible element in enabling cross-domain information sharing. The group ' consisting of industry heavyweights such as Advanced Micro Devices, Hewlett-Packard, Intel and Microsoft ' has developed a specification for building a secure microcontroller that can be added to laptops, desktop PCs or server motherboards. The controller generates cryptographic keys for signing documents and computer-based transactions. The microcontroller also provides a description of the computer's hardware, which can be a source of nearly irrefutable identification for that computer.

DOD sees the TPM as a primary tool for securing sensitive-but-unclassified information on portable devices. In July, a DOD directive required the encryption of all sensitive data on laptops, personal digital assistants and removable storage devices using Federal Information Processing Standard 140-2 compliant tools. The department requires that all servers, desktop PCs, laptops and PDAs purchased include the TPM chip.

Storing the keys and digital certificates for these functions on a dedicated piece of hardware keeps them more secure from external attacks and malicious code, the department said. TPM's hashing function can be used to ensure the integrity not only of documents stored on a computer but also of applications and other pieces of hardware on the computer, said Michael Willett, senior research director at the TCG. He called the TPM a security metric.

'Hashing is a way to take a cryptographic snapshot,' he said. A hashing algorithm creates a unique numerical digest of a document, a piece of software or the code on a computer chip. The original contents cannot be derived from this digest or hash, but any change in the content results in a different hash. Comparing before-and-after hashes can reveal alterations, enabling detection of unauthorized tampering with documents or applications.
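The before-and-after comparison Willett describes can be sketched in a few lines of Python. SHA-256 stands in here for whatever hash algorithm a given TPM implements; the strings are illustrative, not real binaries:

```python
import hashlib

def digest(data: bytes) -> str:
    """Return a SHA-256 hash, the 'cryptographic snapshot' of the data."""
    return hashlib.sha256(data).hexdigest()

baseline = digest(b"approved application binary")

# Any change to the content, however small, yields a completely different hash.
tampered = digest(b"approved application binary.")

print(baseline == digest(b"approved application binary"))  # True: unchanged
print(baseline == tampered)                                # False: altered
```

Storing the baseline hash in tamper-resistant hardware, as the TPM does, is what makes the comparison trustworthy: an attacker who alters the software cannot also alter the reference snapshot.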

Safe storage

The TPM also can be used as an interface for security functions being defined in specifications for trusted-storage devices. TCG has released a draft of the specifications for public comment.

The TPM focuses on the computing platform, which is only one half of the equation, Willett said.

'As a storage guy, to me that's the sound of one hand clapping,' he said. Storage devices are 'where data spends most of its useful life,' and that is where security belongs, he said. A working group began developing trusted storage specifications about three years ago and released the 230-page document in June.

Although the draft specifications are not expected to be finalized until late this year, TCG said they are complete, and storage and application vendors can begin using them to design secure products. They are intended for use with any type of storage device, including hard drives and flash, tape and optical devices.

Specifications are provided for cryptography, public-key cryptography and digital signatures, hashing, random-number generation, and secure storage.

The specifications define the creation of a Security Provider segment in a nonaddressable portion of the device's memory used for system functions. Applications would present credentials to trusted-storage devices through the TPM chip or some other trusted element in the host device, using a trusted-command interface negotiated by TCG with the SCSI and Advanced Technology Attachment standards committees.

Willett said the major hard-drive manufacturers who participated in development of the trusted-storage specification plan to incorporate the specifications in their products. The first application announced is full-disk encryption, which Willett called a no-brainer.

The encryption will use the Advanced Encryption Standard algorithm with a 256-bit key. A random-number generator in the Security Provider segment of the drive will create the key. Encryption will be done in hardware, and the key will never leave the device. The user will access the key with a password. Changing the key can provide a rapid-erase function, making data on the disk inaccessible.
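The rapid-erase property follows from the cryptography alone, and can be illustrated with a toy stream cipher. A SHA-256 counter-mode construction stands in here for the drive's hardware AES-256, which a sketch cannot reproduce; the point is only that destroying the key destroys access to the data:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy counter-mode stream cipher built on SHA-256.
    Illustration only; the drives described use AES-256 in hardware."""
    out = bytearray()
    for block in range((len(data) + 31) // 32):
        pad = hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        chunk = data[block * 32:(block + 1) * 32]
        out.extend(b ^ p for b, p in zip(chunk, pad))
    return bytes(out)

key = secrets.token_bytes(32)          # 256-bit key from a CSPRNG
plaintext = b"sensitive records"
ciphertext = keystream_xor(key, plaintext)

print(keystream_xor(key, ciphertext) == plaintext)   # True: same key decrypts

# Rapid erase: replace the key and the old ciphertext becomes unrecoverable.
key = secrets.token_bytes(32)
print(keystream_xor(key, ciphertext) == plaintext)   # False
```

Because the real key never leaves the drive, overwriting it inside the Security Provider segment instantly renders every sector unreadable, with no need to scrub the disk itself.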

Another secure-storage application likely to appear soon is application locking, which will tie disks or other devices, such as USB drives, to a single computer. Secure-storage devices and their host computers will authenticate one another through a handshake protocol that the TPM manages.

TCG said an estimated 250 million devices with TPM chips installed have been shipped, and another 50 million are expected this year.

'There are chips bolted to most laptops, and it is appearing in servers,' Willett said. The DOD mandate is expected to be a major driver in making the chips ubiquitous, and applications using the chip, such as BitLocker in Microsoft's Windows Vista operating system, are beginning to appear.

But there has so far been a paucity of applications using the chip, and awareness of the chip and its functionality is growing slowly.

'There are a lot of reasons for that,' Wilson said. 'It's becoming pervasive in the hardware space. The early adopters are the financial sector and the DOD. Beyond that, it's a little bit early.'

The chip is becoming common in hardware, but most software does not yet support it, although that is beginning to change with the introduction of operating systems such as Vista.

But even with approaching ubiquity, many users and privacy advocates have reservations about the TPM and about trusted computing in general. The big question for many users is, 'Whom are you trusting?'

The chip often is associated with digital-rights management schemes that many consumers see as overly restrictive and infringing on their freedom to use software and other products they have bought. They do not like the feeling that they are not in full control of their own computers or the applications and devices running on them.

'I'm not a big fan of trusted computing,' Wilson said. He added that its adoption makes sense within closed organizations such as DOD or a bank where close regulation is accepted, but consumers and other nonregulated users are likely to balk at it.

'How you use your system dictates how you feel about trusted computing,' he said. 'It has got an Orwellian, Big Brother feeling to it that bothers a lot of people.'

Secure trade-offs

He also speculated that online anonymity could be threatened. 'The problem with TPM is [that] they are going to know who you are,' because each chip is unique, he said. 'Will it be used that way? I don't know. That was not the intent.'

However, the possibility that it could be used to track activity worries some people.

Willett said he sees no downside to the technology. There are widespread concerns about relinquishing control of personal devices, but he said those concerns are unfounded.

'We gave ultimate control to the users early on by giving them the ability to turn the chip off,' he said. As for digital-rights management, DRM is a trade-off, Willett said. If users do not feel they are getting more value and functionality from DRM-protected products, they can choose not to use them. 'It's up to you,' he said.

But Wilson said he fears that if digital-rights management becomes ubiquitous, freedom of choice will be jeopardized; consumers will not have the option of using applications and devices without DRM and will be forced to accept restrictions the technology imposes.

Ed Hammersla, chief operating officer at Trusted Computer Solutions, is more charitable toward TPM and trusted-computing technology.

'It's a good and helpful effort to increase the level of trust in the general computing environment,' he said, and TPM is the secret sauce that can help enable cross-domain information sharing.

But that's a far cry from having a fully trusted computing environment based on this technology, he said.

'The technology is helping us, in limited quantities,' he said. 'There are places where it is working and making progress.' Using a football analogy, he said there is still a long way to go to achieve real trusted computing. 'We are on the 20- or 30-yard line with a big field in front of us.'

Ross points out that trusted computing ultimately depends on more than technology built into hardware and software. It depends on a trusted relationship between the parties sharing information and between the users and their systems. This requires some way for each to judge the other's trustworthiness. This, in turn, requires the ability to demonstrate a level of compliance with a set of security requirements: a matter not only of technology but also of policy.

Developers need to give more attention to software development and system-engineering processes, Ross said. Full trust can best be achieved when the applications and operating systems running on our trusted-hardware platforms have been built from the ground up to standards of trustworthiness rather than merely evaluated for compliance with a set of specifications at the end of the process.
'We have focused an awful lot on the evaluation side, and we haven't spent enough time on the development process for good software,' Ross said. 'You cannot evaluate your way to good software.'

Researchers create a random number generator

A source of truly random numbers has been one of the biggest challenges for computer science, yet such numbers are vital for securing computational devices. Programs that encrypt data require a robust source of random numbers. Computers alone are incapable of producing truly random numbers. Algorithms have been written that can help machines produce pseudo-random numbers, or numbers that statistically resemble random numbers but contain subtle, repeatable patterns. But such patterns can be used to decipher a message encrypted with those pseudo-random digits.
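The difference is easy to demonstrate in Python: seeding a pseudo-random generator reproduces its output exactly, the kind of repeatable pattern the article warns about, while the operating system's entropy pool does not repeat:

```python
import os
import random

# Pseudo-random: the same seed reproduces the exact same 'random' sequence.
rng_a = random.Random(42)
rng_b = random.Random(42)
print([rng_a.randint(0, 255) for _ in range(8)] ==
      [rng_b.randint(0, 255) for _ in range(8)])   # True: fully repeatable

# OS entropy source: draws on hardware and environmental noise, so two
# reads are (with overwhelming probability) different and unpredictable.
print(os.urandom(16) != os.urandom(16))            # True
```

An attacker who learns or guesses the seed of a pseudo-random generator can replay its entire output, which is why key generation demands a true entropy source.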

The good news is that the specifications for the Trusted Computing Group's Trusted Platform Module include a random-number generator, which should make computers easier to secure. And a trio of University of Massachusetts researchers has found an inexpensive way to produce truly random numbers for radio frequency identification tags. The technique also produces a unique fingerprint for each tag.

Daniel Holcomb, Wayne Burleson and Kevin Fu conducted the research, which the National Science Foundation funded. The RFID Consortium published the results in the most recent edition of the 'Proceedings of the Conference on RFID Security.'
Thomas Heydt-Benjamin, a colleague of the researchers, wrote on his blog that the technique involves reading the binary state of the RFID tag's memory cells just as the tag is powered on.

A typical Electronic Product Code Class 1 tag may have from 1,000 to 4,000 gates. Such memory is typically volatile: all information is lost when the memory loses power. Depending on how the manufacturer builds the tag, most of the gates will reliably either contain a charge or not when powered on again, representing either a 1 or a 0. However, each time a tag is powered on, a certain number of gates will fluctuate randomly between having and not having a residual charge. These fluctuations can be harnessed to supply a steady stream of random numbers.
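A simplified simulation captures the idea. The cell biases and read counts below are illustrative choices, not figures from the researchers' paper: most simulated cells power up to the same value almost every time, while a minority hover near 50/50:

```python
import random

random.seed(1)  # makes the simulation itself reproducible for the demo

NUM_CELLS = 1000
# Each simulated cell has a power-up bias: most reliably come up 0 or 1,
# while a small fraction fluctuate near 50/50.
bias = [0.5 if random.random() < 0.1 else random.choice([0.02, 0.98])
        for _ in range(NUM_CELLS)]

def power_up() -> list[int]:
    """One simulated power-on read of the tag's memory cells."""
    return [1 if random.random() < b else 0 for b in bias]

reads = [power_up() for _ in range(9)]

# Cells that disagreed across reads are the noisy ones: the entropy source.
noisy = {i for i in range(NUM_CELLS) if len({r[i] for r in reads}) > 1}

# Cells that always agreed form a stable pattern: the tag's fingerprint.
fingerprint = tuple(reads[0][i] for i in range(NUM_CELLS) if i not in noisy)

# A fresh power-up of the noisy cells supplies new random bits each time.
fresh = power_up()
random_bits = [fresh[i] for i in noisy]
print(len(noisy), "noisy cells;", len(fingerprint), "stable fingerprint bits")
```

The same power-up read thus yields both products at once: the unstable cells feed the random-number generator, and the stable majority identifies the tag.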

The researchers said the numbers produced by this method have passed the National Institute of Standards and Technology test for statistical randomness.
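The first and simplest of those NIST statistical tests, the frequency (monobit) test from Special Publication 800-22, can be written directly from its definition, which asks only whether the counts of ones and zeros are close enough to equal:

```python
import math
import os

def monobit_pass(bits, alpha=0.01):
    """NIST SP 800-22 frequency (monobit) test: are the counts of ones
    and zeros close enough to equal for the sequence to look random?"""
    n = len(bits)
    s_obs = abs(sum(2 * b - 1 for b in bits)) / math.sqrt(n)
    p_value = math.erfc(s_obs / math.sqrt(2))
    return p_value >= alpha

# Bits drawn from the OS entropy pool should pass; a constant stream fails.
sample = [b for byte in os.urandom(256) for b in map(int, f"{byte:08b}")]
print(monobit_pass(sample))        # True for real entropy (barring the ~1% of
                                   # honest runs any 0.01-level test rejects)
print(monobit_pass([1] * 2048))    # False: all ones is anything but random
```

Passing this test is necessary but far from sufficient; the full NIST suite applies more than a dozen further tests for runs, periodicity and compressibility.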

The researchers also found that each tag's gates vary enough from one tag to the next to uniquely identify, or fingerprint, each tag. Like fingerprints, no two are exactly alike.

Each tag may have different threshold voltages, the voltages that tip a cell from a noncharged to a charged state. Minor variations in the lithographic process that produced the tags also serve as identifiers.

Such fingerprints can be used to produce signatures, researchers say. By checking these signatures, the operator of the tag can be assured that information derived from that tag has not been altered by some other, possibly malicious, source.
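One way such a signature could work, sketched here as an assumption rather than the researchers' actual construction, is to treat the recovered fingerprint bits as a secret key for an HMAC over the tag's data. The fingerprint bytes and EPC value below are hypothetical:

```python
import hashlib
import hmac

# Hypothetical 64-bit fingerprint recovered from a tag's power-up state.
fingerprint = bytes([0b10110010, 0b01101001, 0b11100001, 0b00011110,
                     0b10101010, 0b01010101, 0b11001100, 0b00110011])

def sign(data: bytes) -> str:
    """Derive a signature over tag data, keyed by the tag's fingerprint."""
    return hmac.new(fingerprint, data, hashlib.sha256).hexdigest()

record = b"EPC:urn:epc:id:sgtin:0614141.107346.2017"   # illustrative value
signature = sign(record)

# The operator recomputes the signature; a mismatch means the data, or the
# tag it claims to come from, is not what it appears to be.
print(hmac.compare_digest(signature, sign(record)))          # True
print(hmac.compare_digest(signature, sign(record + b"!")))   # False
```

Because the fingerprint arises from physical manufacturing variation rather than stored secrets, a counterfeit tag cannot simply copy it out of memory.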

