Computer Security lecture notes
Copyright © 2008 Mark Dermot Ryan
Free use according to GPL allowed

Trusted Computing: concepts

This lecture in a nutshell

In order to provide better security and to prevent software and media piracy, a "locked-down" PC architecture is being developed which has hardware-level cryptographic keys for encryption and authentication. These keys are not under the control of the user. This can be used to enable remote servers and content providers to trust the PC platform; to prevent sharing of files, such as music and video content; or, in the business environment, to prevent the divulging or printing of classified documents. Such scenarios raise issues about censorship, ownership, and individual freedom. Trusted computing takes control away from the PC owner, and gives it to software companies and content providers. Trusted computing has been much criticised by respected commentators, some of whom have called it Treacherous Computing.
       This lecture explores the concepts and the possibilities, not present-day reality. Present-day reality is currently coordinated by the Trusted Computing Group, which has much more sensible proposals than some of the ones that have been written about. Those proposals are the subject of the next lecture.

The problem of insecure PCs

Today's desktop and laptop computers are essentially open platforms, giving the user-owner total choice about what software runs on them, and the power to read, modify or delete files stored on them. This freedom has led to problems, such as the spread of malware, software and media piracy, and the theft of confidential data.

Trusted Computing (TC)

Trusted Computing is a cluster of proposals and ideas for a locked-down PC architecture which can give guarantees about the application software it is running, and which allows applications to communicate securely with other applications and with servers. In its strongest form, pathways between the computer and peripherals such as the keyboard, mouse and monitor are encrypted. The encryption keys are built into the hardware, and are not available to the PC owner. The PC only runs the operating system if it can verify its identity and integrity, and the operating system communicates securely with remote servers to provide guarantees about identity and integrity of application software before running it (attestation).

TC requires hardware support, to enable hardware encryption keys, memory curtaining, secure execution, and tamper resistance. At boot time, control can be given to a small ROM program which verifies the hash value of the operating system code before loading and running it. This assures that the operating system is as expected. It in turn can verify the hash of application programs, to check their trustworthiness before running them.
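The boot-time check can be sketched as follows; this is a minimal model, not a real implementation, and the whitelisted hash value is illustrative (it is simply the SHA-256 hash of the stand-in image `b"test"`). A real chain of trust would record each measurement in hardware rather than in software.

```python
import hashlib

# Hypothetical whitelist of hash values for operating-system images that
# the platform vendor trusts (illustrative value: SHA-256 of b"test").
TRUSTED_OS_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def boot(os_image: bytes) -> str:
    """Model of the ROM verifier: hash the OS image and refuse to run
    it unless the hash matches a known-good value."""
    digest = hashlib.sha256(os_image).hexdigest()
    if digest not in TRUSTED_OS_HASHES:
        raise RuntimeError("OS image not recognised; refusing to boot")
    # In a real chain of trust, this measurement would be recorded in
    # hardware so it can later be reported during attestation.
    return digest
```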

Hardware-based cryptographic keys are used to perform secure operations. At manufacture time, a root cryptographic key is generated and stored within the hardware. This key is never transmitted to any other component, and the hardware is designed in such a way that it is extremely difficult to retrieve the stored key by reverse engineering or any other method, even for the owner. Applications can pass data encrypted with this key to be decrypted by the hardware, but it will only do so under certain strict conditions. Specifically, decrypted data will only ever be passed to authenticated, trusted applications, and will only ever be stored in curtained memory, making it inaccessible to other applications and the operating system.
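This sealed-storage behaviour can be modelled with a toy class (names illustrative, no real cryptography): the root key never leaves the object, and sealed data is released only to the application whose hash it was sealed against.

```python
import hashlib
import os

class ToyTPM:
    """Toy model of hardware sealed storage. The access-control logic is
    the point; real hardware would encrypt data under the root key and
    release plaintext only into curtained memory."""

    def __init__(self):
        self._root_key = os.urandom(32)  # generated at "manufacture", never exposed
        self._sealed = {}                # handle -> (data, hash of authorised app)

    def seal(self, data: bytes, app_hash: str) -> int:
        handle = len(self._sealed)
        self._sealed[handle] = (data, app_hash)
        return handle

    def unseal(self, handle: int, requesting_app: bytes) -> bytes:
        data, app_hash = self._sealed[handle]
        if hashlib.sha256(requesting_app).hexdigest() != app_hash:
            raise PermissionError("not the application this data was sealed to")
        return data
```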

Applications of TC

Digital rights management

Music and video. An early motivator for TC was digital rights management (DRM): music and video files are encrypted, and can only be played by recognised application software on a TC platform. The software will prevent users from making copies, and can restrict them in arbitrary other ways, e.g. by playing files only a certain number of times, or for a limited period. Early announcements of TC included much more draconian measures, such as software which would delete ordinary applications and media files if it detected copyright violations which took place outside the scope of TC.

The DRM enforced by TC is much stronger than DRM as we know it today. The DRM mechanisms we currently have, such as CSS for DVDs and software product-activation schemes, are all broken or breakable in principle. They cause inconvenience to the user and therefore buy time for the content provider, but eventually workarounds are introduced. TC purports to be unbreakable (modulo the tamper resistance and the cryptography), and to resist BORE (break-once-run-everywhere) attacks.

Of course users can always re-record the audio content, and even (but less conveniently) the video content. So DRM for music and video will probably never succeed, but for software TC can make DRM very effective.

TC enables DRM

TC puts control of files into the hands of their creators, rather than the owner of the PC on which they reside. Using TC, you can create an e-mail and set a condition that it may not be forwarded on. The e-mail itself is encrypted and contains the information about the rights you have associated with it. The recipient of the e-mail will only be able to view it when their TC chip agrees that they have the right specified in the e-mail. In this case, their TC software will display the e-mail in such a way that they will be unable to copy and paste the text into a new e-mail in order to forward it. The same principles will apply to all types of files, notably music and video files. You could create documents that can only be read during, say, the next week, after which point they become unusable.

Other kinds of content. Current motivations and applications for TC extend beyond entertainment content. Bill Gates: `We came at this thinking about music, but then we realized that e-mail and documents were far more interesting domains'. Email which cannot be printed or forwarded, and self-destructs after a specified period, opens up many possibilities. Similarly, document authors could enforce privacy by restricting the ways copies are made or extracts taken by cut-and-paste, or preventing them altogether. Organisations can enforce restrictive distribution policies on documents created by their staff, preventing leaking to journalists or competitors.

These restrictions are enforced by the software, backed up by the hardware. The TC version of Microsoft Word will check whether you have the right to copy-and-paste from the current document, before allowing you to do so. You will not be able to open the document with another application, because the document is encrypted and only MS Word can get the key from the hardware.
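As a sketch, the check such an application might make before permitting copy-and-paste could look like the following; the rights-dictionary fields ("copy", "expires_at") are hypothetical, and in a real system the rights would travel inside the encrypted document.

```python
import time

def may_copy(rights, now=None):
    """Decide whether the rights attached to a document permit
    copy-and-paste at time `now` (seconds since the epoch)."""
    now = time.time() if now is None else now
    if not rights.get("copy", False):
        return False                      # creator withheld the copy right
    expiry = rights.get("expires_at")
    return expiry is None or now < expiry  # right may also be time-limited
```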

Given that the data has to be perceived by the end-user, it is impossible to prevent forwarding the email by taking a photograph of the screen and sending that. But only very determined users will want to take the trouble to do that. Software piracy control mechanisms, by contrast, will be virtually unbreakable with TC.

Platform authentication

Network connection. Corporations can use TC to ensure that only authorised PCs can connect to their networks. Currently, the techniques for doing this (such as identification by MAC address, as used in SoCS) are woefully inadequate because they can be faked. With TC, network owners can ensure that only PCs with recognised hardware can connect. Using attestation protocols, they can also ensure that the platform is running approved software.  Public wireless networks in coffee shops, hotels, etc., can also be protected in this way.

PC integrity in an internet cafe. A user wishing to check email or access other confidential services from a public computer has no means to ensure that the public computer is running legitimate software. The public computer could steal the user's password and other authentication information, keep copies of email in transit, etc. Obviously, the fact that it looks like it is running known software is absolutely no guarantee that it is doing so. Using TC and attestation protocols, the public computer could prove to the user that it is trusted and that it is running known and approved software, thus increasing confidence that it will behave legitimately.

Printer at airport. TC could enable users to be assured that a printer at an airport which looks like a known model from a known manufacturer is really so; that it can be trusted not to keep a copy of the user's confidential document.

Distributed firewalls

Distributed firewalls represent another application of TC. Traditionally, firewalls assume that everyone on the inside of the network is trusted. However, the increased use of wireless access points, dial-ins, VPNs and tunnels breaks down the distinction between inside and outside. With a distributed firewall, every node in the network runs part of the firewall, protecting the host from the network and protecting the network from the host. But how to ensure that the distributed firewall is running according to the organisation's policy? This was easy on the centrally-managed firewall, but harder on the distributed firewall where every PC user can try to tamper with the rules. TC provides an answer, by making the host's part of the firewall attest its rule set. Other TC firewall features are also possible. For example, rate limitations for preventing denial-of-service attacks can also be enforced by an attested firewall. By limiting the rate machines can send email, we can also reduce the generation of spam.
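The admission decision can be sketched as follows, assuming the hash of the host's rule set arrives via an attestation protocol; the policy text and names are illustrative.

```python
import hashlib

# Hypothetical approved firewall policy; in practice the organisation
# distributes the policy and checks that each host attests the hash of
# the rule set it is actually running.
APPROVED_POLICY = b"deny in any port 23\nallow out any port 443\n"
APPROVED_POLICY_HASH = hashlib.sha256(APPROVED_POLICY).hexdigest()

def admit_host(attested_ruleset_hash: str) -> bool:
    """Admit a host to the network only if the hash of its (attested)
    firewall rule set matches the organisation's approved policy."""
    return attested_ruleset_hash == APPROVED_POLICY_HASH
```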

Rate limitation for DDoS prevention

Rate limiting can be used to address the problem of distributed denial of service (DDoS) attacks, by limiting the rate at which a machine can open a network connection. Similarly, one could also limit the rate at which emails can be sent, preventing spam attacks.
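The classic way to implement such a limit is a token bucket. The sketch below is generic rate-limiting logic, not a TC-specific API; the TC contribution is that an attested firewall can prove to the network that this code, with these parameters, is really what it is running.

```python
class TokenBucket:
    """Token-bucket limiter: at most `rate` actions per second on
    average, with bursts of up to `capacity` actions."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity   # bucket starts full
        self.last = 0.0          # time of the last request

    def allow(self, now: float) -> bool:
        # Refill in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0   # spend one token per connection/email
            return True
        return False
```

Each attempt to open a connection (or send an email) calls `allow()`; once the burst allowance is spent, further attempts are refused until tokens refill.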

Preventing cheating in multiplayer games

TC could also potentially be used to prevent most kinds of cheating in online games. Cheating by various means is currently prevalent in a number of multiplayer games and diminishes the enjoyment of those games by legitimate players. Common methods of cheating include aimbots that automate aiming, map hacks that reveal information the game intends to hide, and modified game clients.

Improving reputation reckoning

Many "community" systems such as eBay auctions, Amazon product reviews and P2P file sharing rely on the willingness of users to behave fairly, and this is often implemented by a mechanism for promulgating reputation. Such mechanisms can address problems such as P2P users disseminating "poisoned" content such as incomplete or damaged or misnamed files, vandalism by Wikipedia users, dishonest sellers on eBay, etc. Currently, users with low reputation can simply apply for a new identity. Without additional infrastructure there is no way to detect if two distinct identities represent the same entity. Trusted platforms can provide an improved means of building more robust reputation systems. Using TC, we can ensure that a single hardware platform can represent at most one identity.  This approach thwarts attacks on reputation systems in which a single platform registers thousands of identities. Additionally, trusted platforms can simplify distributed reputation systems because the platform can be used to track its own identity.  A problem with this scheme is that it locates user trustworthiness inside the hardware platform. If I buy a second-hand computer, I am stuck with the reputation its previous owner had. Biometrics might help push the reputation back to the user.
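The one-identity-per-platform rule can be sketched as a registry keyed by the platform's attested hardware public key; all names here are illustrative, and the attestation step that authenticates the key is elided.

```python
class ReputationRegistry:
    """Sketch of a reputation service that admits at most one identity
    per trusted platform."""

    def __init__(self):
        self._identity_of = {}  # platform public key -> registered identity

    def register(self, platform_pubkey: str, identity: str) -> bool:
        if platform_pubkey in self._identity_of:
            return False  # this hardware already holds an identity
        self._identity_of[platform_pubkey] = identity
        return True
```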

Third-party computing

Computing resources are increasingly borrowed, leased or donated. Examples include (1) using donated cycles for massively parallel scientific computations such as SETI@home; (2) using leased time on commercial computer farms for large-scale rendering and animation; (3) GRID computing allowing heavy users of computing power to pool and share their resources. Trusted computing can be used to help guarantee that the results produced by a remote computer are correct and have been kept confidential.

Anonymous service usage

TC can help users maintain their privacy, by using services such as Google, Amazon, the BBC, etc., anonymously. Suppose Google sets up a TC front end, whose job it is to anonymise requests to Google. A user making a query gets this front end to attest its system state. On completion of the attestation protocol, the user has a guarantee that the front end will behave correctly in order to preserve the user's anonymity. The user sends a query encrypted with a session key generated during the attestation. The front end decrypts the query and batches it with a number of other queries. It sends them all off to Google, and then returns the results to the respective users, each set encrypted with the appropriate session key. The purpose of batching the queries is to prevent Google from linking the times of the user's encrypted query with the resulting decrypted query.
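The batching logic can be sketched as follows; all names are hypothetical, and the attestation and per-user encryption steps are elided.

```python
class AnonymisingFrontEnd:
    """Sketch of the batching front end: queries accumulate until the
    batch is full, then are forwarded together, so the back end cannot
    correlate arrival times with individual users."""

    def __init__(self, batch_size: int, backend):
        self.batch_size = batch_size
        self.backend = backend   # callable: query -> result
        self.pending = []        # list of (user_id, query)

    def submit(self, user_id: str, query: str):
        self.pending.append((user_id, query))
        if len(self.pending) < self.batch_size:
            return None          # hold the query until the batch fills
        results = {user: self.backend(q) for user, q in self.pending}
        self.pending = []
        # Each result would be returned encrypted under that user's session key.
        return results
```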

Balancing security and privacy

In many applications, security and privacy may be in opposition to each other. For example, in mobile phone usage, "security" may mean that the authorities have full access to my usage data, while "privacy" may mean that they have none. Security is needed to help crime investigation and detection, while privacy is necessary to have a healthy and free society. Therefore, we should find a balance between the two needs. For example, it might be that lawmakers decide that records should be held for a certain period, and then destroyed. Customers whose records are examined should be notified within one year of such examination. How to implement systems that give the right guarantees? Trusted computing could help.

Data security

TC can help keep your data safe, via encryption supported by the hardware key. Passwords and other private keys can also be kept safe. Data on a stolen laptop remains protected, because the thief cannot extract the hardware key needed to decrypt it.

Product lock-in

Many printer cartridges now come with chips that authenticate them to the printer. Printers may refuse to work with third-party or refilled cartridges, or even with genuine cartridges that have passed an expiry date.

How TC works

TC-capable hardware must be manufactured with a public/private key pair, called the endorsement key (EK). The private key is held securely by the chip, and is never released. Ideally, the manufacturing process destroys all records of the private key. The chip is tamper-proof (it self-destructs rather than gives up its private key). Memory is curtained, to prevent debuggers and other software getting the private key, for example during signing operations. Applications authenticate themselves to a server, by sending the hardware's public key to the server, together with a digital fingerprint of the application. The server checks that both the hardware and the application are trusted, before sending the content (see the attestation protocol box). Servers need to know the set of valid public keys.

Memory curtaining is a strong, hardware-enforced memory isolation feature to prevent untrusted programs being able to read the memory allocated to trusted programs. TC-compliant hardware must also have secure IO, to address the threats posed by keycatchers and screen grabbers, and sound recording devices. (A keycatcher is a hardware device between the keyboard and the computer, which records what you type. A screen grabber records what is displayed on the screen.) Secure IO can also guarantee that input is provided by a physically-present user, as distinct from another program impersonating a user.


An attestation protocol (adapted from [2])

Attestation is the means by which a trusted computer assures a remote computer of its trustworthy status. The platform is manufactured with a public/private key pair built into the hardware. The public part of the hardware key is certified by an appropriate CA. Each individual platform has a unique hardware key. Using the private part of its hardware key, the system can guarantee assertions about the platform state. A remote computer can verify that those assertions have been guaranteed by a trusted computer.
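The core exchange can be sketched as follows. Python's standard library has no public-key signatures, so an HMAC under the private hardware key stands in for the hardware signature here, and `verify()` plays the role of public-key verification by the remote party; a real platform would use an asymmetric signature certified by a CA, as described above.

```python
import hashlib
import hmac
import os

class Platform:
    """Toy model of an attesting platform."""

    def __init__(self):
        self._hw_key = os.urandom(32)  # private hardware key, never released

    def attest(self, nonce: bytes, software: bytes):
        """Sign (nonce, fingerprint of the launched software)."""
        fingerprint = hashlib.sha256(software).digest()
        signature = hmac.new(self._hw_key, nonce + fingerprint,
                             hashlib.sha256).digest()
        return fingerprint, signature

    def verify(self, nonce: bytes, fingerprint: bytes, signature: bytes) -> bool:
        # Stand-in for public-key verification by the remote party.
        expected = hmac.new(self._hw_key, nonce + fingerprint,
                            hashlib.sha256).digest()
        return hmac.compare_digest(expected, signature)

def remote_check(platform, software, software_whitelist) -> bool:
    """The remote party sends a fresh nonce (preventing replay), then
    checks both the signature and that the software fingerprint is on
    its whitelist of approved applications."""
    nonce = os.urandom(16)
    fingerprint, signature = platform.attest(nonce, software)
    return (platform.verify(nonce, fingerprint, signature)
            and fingerprint in software_whitelist)
```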

An Application Key (AK) is a key pair created during attestation, for use by a particular application. At creation time, it gets tied to the platform's root key. Using an AK instead of using the platform root key directly has several benefits: (a) it reduces the load on the hardware, since only the hardware can use the root key while the CPU can use the AK; (b) it helps prevent cryptanalysis of the root key; (c) it somewhat addresses the privacy issues, since the AK is not directly associated with the hardware.

The AK in this protocol is signed by the hardware key, which means any verifier (service provider) can link the session with the unique identity of the PC. This means that all the activities of the PC user can be linked, and a profile of the activities can be built.

Limitations of attestation

Attestation tells a remote party exactly what executable code was launched on a platform and establishes a session key for future interaction with that software component on the platform [2]. Note, however, that attestation only identifies the code that was launched: it gives no guarantee that the code is free of bugs or vulnerabilities, and says nothing about how the code will actually behave.

Dual mode

NGSCB. Microsoft is one of the main drivers for TC software; its version is called Next Generation Secure Computing Base (NGSCB), formerly known as Palladium. NGSCB includes a mechanism for introducing TC in parallel with present-day open systems. Two modes run in parallel: the trusted mode and the untrusted mode. The untrusted mode will be like Windows is now, allowing near-complete freedom to the PC owner. The trusted mode will be the locked-down one. The owner need not use the trusted mode, but it will be necessary to do so in order to access certain kinds of content, such as emails and documents whose authors have imposed TC restrictions, and TC-managed media files. It will not be possible to export files from the trusted mode to the untrusted mode.

Criticisms of TC

Richard Stallman: "With a plan they call trusted computing, large media corporations, together with computer companies such as Microsoft and Intel, are planning to make your computer obey them instead of you."

Ross Anderson: "TC can support remote censorship. In its simplest form, applications may be designed to delete pirated music under remote control. This business model, called traitor tracing, has been researched extensively by Microsoft. . . .
In 2010 President Clinton may have two red buttons on her desk - one that sends the missiles to China, and another that turns off all the PCs in China - and guess which the Chinese will fear the most?"  [2003]

TC in full generality has been much criticised by respected commentators, and with good reason. It removes control of the PC from its owner/user, and gives the control to the software and content provider. This can easily be abused. For example, TC can enforce censorship; if someone writes a paper that a court decides is defamatory, the author can be compelled to censor it by withdrawing all access rights -- and the software company that wrote the word processor could be ordered to do the deletion if she refuses [3].

How does TC differ from CSS, or the XP activation mechanism?

DVDs have a protection scheme known as the Content Scrambling System (CSS). Although it appears complex and includes hardware authentication and session key establishment (see, e.g., Greg Kesden's tutorial on CSS), it is essentially a break-once-run-everywhere (BORE) technology. It was indeed broken, by a 16-year-old Norwegian programmer who wrote DeCSS. TC is very much stronger than something like CSS. TC avoids BORE by having different keys in every PC. Even if you succeed in breaking yours, that won't help anyone else.

The XP activation mechanism uses ideas closer to TC, by attempting to avoid BORE. During the product activation phase of XP, a fingerprint of your computer's hardware is registered by Microsoft as corresponding to a particular license. The fingerprint information includes the network card identity, the disk volume id and capacity, etc. Microsoft will not register different hardware against the same license, and XP won't run unless you register it. This stops you installing it on your friend's computer. ('Hacked' versions are simply versions built without the product activation feature, sold to companies who don't want the hassle of individual registrations.) But the hardware fingerprinting mechanism is a mess, because it can prevent users from legitimately upgrading their computer, and causes a headache for MS helpline staff. TC is much more reliable.
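The fingerprinting idea can be sketched as a hash over a canonical encoding of component identifiers; the component names below are illustrative, not Microsoft's actual scheme.

```python
import hashlib

def hardware_fingerprint(components: dict) -> str:
    """Toy version of an XP-style hardware fingerprint: hash a
    canonical (sorted) encoding of component identifiers."""
    canonical = "|".join(f"{name}={components[name]}"
                         for name in sorted(components))
    return hashlib.sha256(canonical.encode()).hexdigest()
```

Because changing any single component changes the whole fingerprint, a legitimate disk upgrade makes the machine look different, which is exactly the source of the helpline headaches described above.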

TC will also allow software companies to increase their monopolies. You may feel it is hard to migrate from MS Office today, because it is accepted as an industry standard. But in a TC world, this lock-in will be even harder to break. Companies will receive TC-Office documents, and will need TC-Office to read them. Moreover, they will need to keep paying the rent for TC-Office in perpetuity, if they want to continue to have access to their archives. Home users will need it too, in order to read their gas bill. "Trusted Computing" means PCs are more trustworthy from the point of view of software vendors and content providers, but less trustworthy from the point of view of their owners. It means your computer is working for other people, not for you. It gives them complete power over what your computer does, and it prevents you from even knowing in what ways the information you have provided is being used against you. Some people have dubbed it Treacherous Computing.

TC will grow gradually, in small ways which are hard to resist. Users' objections will be assuaged by the reassurance that TC can be turned off, so it needn't seem such a threat. But eventually the price of turning it off will be too great. At work, you will need it to read TC'd emails and documents that are being sent to you. At home, you will need TC to communicate with your bank, your city council, and your entertainment provider; and you will need it because it will become your company's policy for teleworkers. Increasingly, the peripheral hardware you buy for your computer will only work with the TC mode of your computer. The non-TC world will continue to exist, but soon it will be perceived as GNU/Linux is today: great because it gives you more freedom, but a pain because it gives you less choice.

How does TC compare with present-day use of encryption, e.g. with PGP?

When someone uses PGP (or the open-source equivalent, GPG) to send you an encrypted document, and you use PGP to decode it, the result is an unencrypted document that you can read, forward, copy, and even re-encrypt to send it securely to someone else. PGP leaves you completely in control. A TC application puts control with the originator of the document. It might let you read the words on the screen, but would not let you produce an unencrypted document that you could use in other ways. [4]

The counter-argument is persuasive too. People won't use it if it stops them doing what they want to do. Why should I rent music from Sony, and put up with all the TC restrictions, when I can have it for free, and without restrictions, from my friend who has produced an open MP3 version? The first bank that enforces TC will find its customers preferring to move to another bank. If TC is a way of making the Chinese pay for software, they won't use it. Nor will students, hobbyists and enthusiasts. If these people don't use it, it may fail. Even if you are prepared to try to live with the costs and restrictions that TC seeks to impose, the sheer difficulty of coping with its constant checking, attesting, and nannying may make you turn it off. Companies can't even tolerate the intrusion of Microsoft's XP activation feature, so still less will they tolerate attestations on the network and through the firewall every time their staff want to open a document.

What else is like TC?

There are reports that Microsoft is beginning to think that DRM is a lost battle, but TC is now much more than DRM.


What it comes down to is that "trusted computing," like computing itself, is a tool which can be used in many ways. There are both good and bad applications. TC can be used to help legitimate users as much as to put obstacles in their way. The important thing is to be aware of TC's capabilities and the issues surrounding it.


  1. Microsoft's papers, including some technical information.
  2. Tal Garfinkel, Mendel Rosenblum, and Dan Boneh. Flexible OS Support and Applications for Trusted Computing gives some detail on a possible protocol (described in these notes).
  3. Ross Anderson's Trusted Computing FAQ is a source of information and speculation, and has lots of links/references.
  4. Richard Stallman, Can you trust your computer?
  5. Seth Schoen, Trusted computing: promise and risk.


Updated 4 October 2008