Trusted computing
Trusted computing is a family of open specifications whose stated goal is to make personal computers more secure through the use of dedicated hardware. Critics, including academics, security experts, and users of free and open source software, contend, however, that the overall effect (and perhaps intent) of trusted computing is to impose unreasonable restrictions on how people can use their computers.
Synopsis
The basic system concepts in trusted computing, illustrated in the sketch following this list, are:
- The machine/CPU is uniquely identified using certificates;
- Encryption is performed in the hardware;
- Data can be signed with the machine's identification;
- Data can be encrypted with the machine's secret key.
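To make these concepts concrete, the following is a minimal Python sketch using the third-party cryptography package as a software stand-in for dedicated hardware. Names such as MACHINE_KEY are illustrative assumptions, not part of any TC specification; in real hardware the private key is fused into the chip and never leaves it.

    # A minimal sketch of machine identity and signing, assuming the
    # third-party "cryptography" package. In real trusted computing the
    # private key lives inside the chip; here it is merely a Python object.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    MACHINE_KEY = Ed25519PrivateKey.generate()  # unique per machine, certified by its maker
    MACHINE_ID = MACHINE_KEY.public_key()       # the machine's verifiable identity

    # Data signed with the machine's identification:
    document = b"contents of some file"
    signature = MACHINE_KEY.sign(document)

    # Anyone holding the certified public key can verify the origin;
    # verify() raises InvalidSignature if data or signature were altered.
    MACHINE_ID.verify(signature, document)

Encryption under the machine's secret key, the fourth concept, is sketched under sealed storage below.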
The nature of trust
Trust means something different to security experts than it does to laypersons. For example, the United States Department of Defense's definition of a trusted system is one that can break your security policy; i.e., "a system that you are forced to trust because you have no choice." Cryptographer Bruce Schneier observes, "A 'trusted' computer does not mean a computer that is trustworthy." By these definitions, a video card is trusted by its users to correctly display images. Trust in security parlance is always a kind of compromise or weakness: sometimes inevitable, but never desirable as such.
The main controversy over trusted computing centers on this meaning of trust. Critics characterize a trusted system as one you are forced to trust rather than one which is particularly trustworthy. Microsoft, in contrast, in adopting the term trustworthy computing, presumably intends to focus consumers' attention on the allegedly trustworthy aspects of trusted computing systems.
Critics of trusted computing are further concerned that they cannot look inside trusted computing hardware to see whether it is properly implemented or whether there are backdoors. The trusted computing specifications are open and available for anyone to review, but implementations generally are not. Many are also concerned that cryptographic designs and algorithms will become obsolete over time, which may result in the forced obsolescence of TC-enabled computers. For example, early versions of trusted computing hardware supported only RSA, while later specifications added (and require) AES.
While proponents claim that trusted computing increases security, critics believe that not only will security not be helped, but trusted computing will facilitate mandatory digital rights management (DRM), harm privacy, and impose other restrictions on users. Entrusting networked computers to controlling authorities rather than to individuals may create digital imprimaturs. Contrast trusted computing with secure computing, in which anonymity, not disclosure, is the main concern. Advocates of secure computing argue that the additional security can be achieved without shifting control of the computer from its users to superusers.
Proponents of trusted computing argue that privacy complaints are baseless since consumers will retain a choice between systems, based on their individual needs. Moreover, trusted computing advocates claim that some needs require changes to the current systems at the hardware level to enable a computer to act as a trusted client.
Related terms
The TCG project is known by a number of names. Trusted computing was the original one, and it is still used by the Trusted Computing Group (TCG) and IBM. The hardware device they developed is called the TPM, or Trusted Platform Module. Microsoft calls the project trustworthy computing, and Intel has recently begun calling it safer computing. Prior to May 2004, the TCG was known as the TCPA. Richard Stallman of the FSF has adopted the name treacherous computing.
Background
A variety of initiatives fall under the heading of trusted computing. Microsoft is working on a project called NGSCB. An industry consortium including Microsoft, Intel, IBM, HP, and AMD formed the Trusted Computing Platform Alliance (TCPA), now the Trusted Computing Group (TCG), which is designing a Trusted Platform Module (TPM). Intel is working on an implementation called LaGrande Technology (LT), while AMD's is called Secure Execution Mode (SEM). Essentially, the proposals provide four new features in hardware, which require new software (including new operating systems and applications) to take advantage of them. Each feature serves a different purpose, although they can be used together. The features are:
- Secure I/O
- Memory curtaining
- Sealed storage
- Remote attestation
Secure I/O
Secure input and output (I/O) uses checksums to verify that the software used to perform the I/O has not been tampered with. Malicious software injecting itself into this path could thereby be identified.
This would not defend against a hardware-based attack, such as a key-capture device placed physically between the user's keyboard and the computer.
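A toy Python sketch of the checksum idea, using only the standard library, is given below. The file name keyboard_driver.bin and the step of recording the digest at install time are illustrative assumptions; in real TC designs the measurement is taken by the hardware itself.

    # Measure an I/O software component and compare it against a digest
    # recorded when the component was known to be good.
    import hashlib

    def measure(path: str) -> str:
        """Return the SHA-256 digest of a software component on disk."""
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    # Recorded once, when the I/O software is known to be good.
    KNOWN_GOOD = measure("keyboard_driver.bin")

    def io_path_is_untampered() -> bool:
        # Re-measure before trusting keystrokes; any injected code
        # changes the digest and the check fails.
        return measure("keyboard_driver.bin") == KNOWN_GOOD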
Memory curtaining
Memory curtaining has the hardware keep programs from reading or writing each other's memory (the space where the programs store information they're currently working on). Even the operating system doesn't have access to curtained memory, so the information would be secure from an intruder who took control of the OS.
Something very similar can be achieved in software, but doing it in hardware is a more elegant and reliable solution.
Sealed storage
Sealed storage protects private information by allowing it to be encrypted using a key derived from the software and hardware being used. This means the data can be read only by the same combination of software and hardware.
For example, users who keep a private diary on their computer don't want other programs or other computers to be able to read it. Currently, a virus could search for the diary, read it, and send it to someone else. (The SirCam virus did something similar.) Even if the diary were protected by a password, the virus could try most common passwords; on a modern computer, this is quite fast. Or the virus could modify the user's diary software to have it leak the text once he unlocks his diary. With sealed storage, the diary is securely encrypted so that only the unmodified diary program on his computer can read it.
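A minimal sketch of sealing, again assuming the third-party cryptography package, is shown below. HARDWARE_SECRET stands in for a key fixed inside the chip, and the software measurement for a hash of the diary program; both names are illustrative. Changing either ingredient changes the derived key, so a modified program cannot decrypt the diary.

    import base64, hashlib, os
    from cryptography.fernet import Fernet

    HARDWARE_SECRET = os.urandom(32)  # fixed inside a real chip, never readable

    def sealing_key(software_measurement: bytes) -> Fernet:
        # The key depends on both the hardware and the measured software.
        raw = hashlib.sha256(HARDWARE_SECRET + software_measurement).digest()
        return Fernet(base64.urlsafe_b64encode(raw))

    good_program = hashlib.sha256(b"diary-program-v1").digest()
    sealed = sealing_key(good_program).encrypt(b"Dear diary: nothing happened today.")

    # The same program on the same hardware can unseal:
    print(sealing_key(good_program).decrypt(sealed))

    # A tampered program yields a different key, so decryption fails:
    bad_program = hashlib.sha256(b"diary-program-with-virus").digest()
    # sealing_key(bad_program).decrypt(sealed)  # raises cryptography.fernet.InvalidToken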
Remote attestation
Remote attestation allows changes to the user's computer to be detected by him and others. That way, he can avoid having private information sent to, or important commands sent from, a compromised or insecure computer. It works by having the hardware generate a certificate stating what software is currently running. The user can present this certificate to a remote party to show that his computer hasn't been tampered with.
Remote attestation is usually combined with public-key encryption so that the information sent can only be read by the programs that presented and requested the attestation, and not by an eavesdropper.
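The following sketch shows a simplified attestation exchange, assuming the same third-party cryptography package. ATTESTATION_KEY stands in for the hardware's vendor-certified key, and the fresh nonce prevents an old attestation from being replayed; these names are illustrative, not from any specification.

    import hashlib, os
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    ATTESTATION_KEY = Ed25519PrivateKey.generate()  # lives inside the hardware
    ATTESTATION_PUB = ATTESTATION_KEY.public_key()  # certified by the vendor

    def quote(software_measurement: bytes, nonce: bytes) -> bytes:
        # The hardware signs "this software is running" plus the
        # verifier's challenge nonce.
        return ATTESTATION_KEY.sign(software_measurement + nonce)

    # Verifier side: send a challenge, then check the signed measurement
    # against the hash of the software it expects to be running.
    expected = hashlib.sha256(b"diary-program-v1").digest()
    nonce = os.urandom(16)
    signature = quote(expected, nonce)
    ATTESTATION_PUB.verify(signature, expected + nonce)  # raises InvalidSignature if forged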
To take the diary example again, the user's diary software could send the diary to other machines, but only if they could attest that they were running a secure copy of the diary software. Combined with the other technologies, this provides a more secured path for the diary: secure I/O protects it as it's entered on the keyboard and displayed on the screen, memory curtaining protects it as it's being worked on, sealed storage protects it when it's saved to the hard drive, and remote attestation protects it from unauthorized software even when it is used on other computers.
Drawbacks
Opponents of trusted computing argue that the security features that protect computers from viruses and attackers also restrict the actions of their owners. This makes new anti-competitive techniques possible, potentially hurting people who buy trusted computers.
Cambridge cryptographer Ross Anderson has concerns that "TC can support remote censorship. In general, digital objects created using TC systems remain under the control of their creators, rather than under the control of the person who owns the machine on which they happen to be stored (as at present). So someone who writes a paper that a court decides is defamatory can be compelled to censor it—and the software company that wrote the word processor could be ordered to do the deletion if she refuses. Given such possibilities, we can expect TC to be used to suppress . . . writings that criticise political leaders." He goes on to state that:
- " . . . software suppliers can make it much harder for you to switch to their competitors' products. At a simple level, Word could encrypt all your documents using keys that only Microsoft products have access to; this would mean that you could only read them using Microsoft products, not with any competing word processor.
- "The . . . most important, benefit for Microsoft is that TC will dramatically increase the costs of switching away from Microsoft products (such as Office) to rival products (such as OpenOffice). For example, a law firm that wants to change from Office to OpenOffice right now merely has to install the software, train the staff and convert their existing files. In five years' time, once they have received TC-protected documents from perhaps a thousand different clients, they would have to get permission (in the form of signed digital certificates) from each of these clients in order to migrate their files to a new platform. The law firm won't in practice want to do this, so they will be much more tightly locked in, which will enable Microsoft to hike its prices."
Anderson summarizes the case by saying "The fundamental issue is that whoever controls the TC infrastructure will acquire a huge amount of power. Having this single point of control is like making everyone use the same bank, or the same accountant, or the same lawyer. There are many ways in which this power could be abused."
Users can't change software
In the diary example, sealed storage protects the diary from malicious programs like viruses, but it doesn't distinguish between those and useful programs, like ones that might be used to convert the diary to a new format, or provide new methods for searching within the diary. A user who wanted to switch to a competing diary program might find it would be impossible for that new program to read the old diary, as the information would be "locked in" to the old program. It could also make it impossible for the user to read or modify his diary except as specifically permitted by the diary software. If he were using diary software with no edit or delete option then it could be impossible to change or delete previous entries.
Remote attestation could cause other problems. Currently web sites can be visited using a number of web browsers, though certain websites may be formatted (intentionally or not) such that some browsers cannot decipher their code. Some browsers have found a way to get around that problem by emulating other browsers. For example, when Microsoft's MSN website briefly refused to serve pages to non-Microsoft browsers, users could access those sites by instructing their browsers to emulate a Microsoft browser. Remote attestation could make this kind of emulation irrelevant, as sites like MSN could demand a certificate stating the user was actually running an Internet Explorer browser.
Users don't control information they receive
One of the early motivations behind trusted computing was a desire to support stricter Digital Rights Management (DRM): technology to prevent users from sharing and using copyrighted or private files without permission. Microsoft has announced a DRM technology that it says will make use of trusted computing.
Trusted computing can be used for DRM. Take the example of downloading a music file from Metallica: First, Metallica will come up with some rules for how their music can be used. (For example, they might want you to be able to play the file only three times a day without paying more money.) Then they'll use remote attestation to send their music only to a music player that enforces their rules. Sealed storage prevents you from opening the file with another player that doesn't enforce the restrictions. Memory curtaining prevents you from making an unrestricted copy of the file while it's playing. Secure output prevents you from capturing what's sent to the speakers.
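As a sketch of how such a playback rule might be enforced, the Python snippet below shows a trusted player refusing a fourth play in one day. All names here are illustrative; a real scheme would bind the play counter to sealed storage so the user could not simply reset it.

    import datetime

    MAX_PLAYS_PER_DAY = 3
    play_log = {}  # track name -> list of dates on which it was played

    def may_play(track: str) -> bool:
        today = datetime.date.today()
        plays_today = [d for d in play_log.get(track, []) if d == today]
        if len(plays_today) >= MAX_PLAYS_PER_DAY:
            return False  # the trusted player refuses to decrypt the track
        play_log.setdefault(track, []).append(today)
        return True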
Without remote attestation, you wouldn't have this problem. You could simply download the song with a player that doesn't enforce Metallica's restrictions or one that lets you convert the song to an unrestricted format like MP3.
Users don't control their applications
If a user upgrades his computer, sealed storage could prevent him from moving all his music files to the new machine, forcing him to buy all the songs again. It could also enable spyware, with music files only given to users whose machines attest that they will tell the artist or record company every time the song is played.
TC opponents are alarmed at the prospect of these technologies being used as a form of remote control. For example, a newsmagazine could make it such that in order to download their news articles, one would need to attest that they use MS Reader. MS Reader could be programmed so as not to allow viewing of their news stories without asking the magazine's website if a change has been made. This could allow the magazine's editors to "rewrite history" by changing or deleting certain articles. Even if a user saved the original article on his computer, the software could refuse to let it be viewed once a change had been announced.
TC opponents, including Richard Stallman [1] (http://www.gnu.org/philosophy/can-you-trust.html), find this prospect and other aspects of TC reminiscent of George Orwell's 1984, where the government changed everything ever archived to make it seem like its predictions were always correct. It has also been noted that this issue, and trusted computing in general, applies only to closed source software and would not be an issue if everyone used free software, something which Stallman strongly supports.
Proposed owner override for TC
All these problems come up because trusted computing protects programs against everything, even the owner. A simple solution is to let the owner of the computer override these protections. This is called Owner Override, and it is currently only outlined as a suggested fix.
When you activate Owner Override, the computer uses the secure I/O path to make sure you are physically present and actually the owner, and then bypasses the protections. With remote attestation, for instance, you can force the computer to generate false attestations: certificates that say you're running Internet Explorer when you're really running Opera. Instead of reporting when your software has been changed, remote attestation will report when the software has been changed without your permission.
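Continuing the attestation sketch from above, Owner Override might look like the following. The function and parameter names are illustrative assumptions, since Owner Override is only a proposal.

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    ATTESTATION_KEY = Ed25519PrivateKey.generate()  # as in the attestation sketch

    def quote_with_override(true_measurement: bytes, nonce: bytes,
                            owner_present: bool = False,
                            claimed_measurement: bytes | None = None) -> bytes:
        # Without the physically present owner's approval, only the
        # true measurement is ever signed.
        if not owner_present or claimed_measurement is None:
            return ATTESTATION_KEY.sign(true_measurement + nonce)
        # With Owner Override: sign whatever the owner chooses to claim
        # (e.g. "Internet Explorer" while really running Opera), so
        # remote parties can no longer distinguish the two.
        return ATTESTATION_KEY.sign(claimed_measurement + nonce)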
While it might seem that Owner Override would be met with praise, some Trusted Computing Group members have instead decried it as the biggest potential downfall of the TC movement. Owner Override defeats remote attestation, the entire idea of being able to trust other people's computers. It continues to offer all of the security and enforcement benefits to an owner on his own machine, but loses any ability to ensure that another owner cannot waive rules or restrictions on his own computer. Once you send data to someone else's computer, whether it is your diary, a DRM music file, or a joint project, that person controls what security, if any, their computer will enforce on their copy of that data.
External links
- Trusted Computing Group (https://www.trustedcomputinggroup.org/home) (TCG) - Trusted computing standards body, previously known as the TCPA.
- 'Trusted Computing' Frequently Asked Questions (http://www.cl.cam.ac.uk/~rja14/tcpa-faq.html) - Anti-TC FAQ by Cambridge University professor Ross Anderson.
- Next-Generation Secure Computing Base (NGSCB) (http://www.microsoft.com/resources/ngscb/default.mspx) - Microsoft's trusted computing architecture
- Palladium and the TCPA (http://www.schneier.com/crypto-gram-0208.html) - from Bruce Schneier's Crypto-Gram newsletter.
- Against-TCPA (http://www.againsttcpa.com/)
- Interesting Uses of Trusted Computing (http://invisiblog.com/1c801df4aee49232/article/0df117d5d9b32aea8bc23194ecc270ec)
- Can you trust your computer? (http://www.gnu.org/philosophy/can-you-trust.html) essay by the FSF