Article for Synetrix Newsletter
Copyright © 2004 Mark Dermot Ryan, The University of Birmingham
Trusted Computing and NGSCB
This article in a nutshell
In order to provide better security and to prevent software and media
piracy, a "locked-down" PC architecture is being developed which has
hardware-level cryptographic keys for encryption and authentication. It
is tamper-resistant, has curtained memory, and communication with the
keyboard, mouse, monitor and printer may be encrypted. This can be used
to prevent sharing of files, such as music and video content; or, in
the business environment, to prevent divulging or printing of
classified
documents. But Trusted Computing raises issues about censorship,
ownership, and individual freedom. Trusted computing takes control away
from the PC owner, and gives it to software companies and content
providers. Trusted computing has been much criticised by respected
commentators, some of whom have called it Treacherous Computing.
The problem of insecure PCs
Today's desktop and laptop computers are essentially open platforms,
giving the user-owner total choice about what software runs on them,
and the power to read, modify or delete files stored on them. Using
firewalls and other tools, users can also determine what kinds of
communication their computers can have with the rest of the world.
This freedom has led to problems, such as
- insecurity for the user, since open platforms are prone to
infection by viruses and worms, and to the inadvertent installation of
spyware, denial-of-service agents, compromised software, keyboard
keycatchers, etc.
- insecurity for the network on which the computer is placed, since
it may have viruses and worms, denial-of-service attackers, etc., which
threaten other machines on the network.
- insecurity for software authors and media content providers,
since open platforms allow programs, music files, images etc. to be
copied without limit and without loss of quality.
Trusted Computing (TC)
Trusted Computing is a cluster of proposals for a locked-down
PC architecture which can give guarantees about the application
software it is running, and which allows applications to communicate
securely with other applications and with servers. In its strongest
form,
pathways between the computer and peripherals such as the keyboard,
mouse and monitor are encrypted. The encryption keys are built into the
hardware, and are not available to the PC owner. The PC only runs the
operating system if it can verify its signature, and the operating
system communicates securely with servers to authenticate application
software before running it (attestation).
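As a rough illustration, a minimal sketch of that signature check is given
below (in Python, using the cryptography package). The key format, padding
scheme and file names are assumptions chosen for the example, not NGSCB's
actual mechanism.

    # A minimal sketch of boot-time verification: the firmware holds the
    # vendor's public key and refuses to run an OS image whose detached
    # signature does not verify.  File names and the RSA/PKCS#1 choice
    # are illustrative assumptions.
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding
    from cryptography.exceptions import InvalidSignature

    def os_is_trusted(image_path, sig_path, vendor_pubkey_pem):
        public_key = serialization.load_pem_public_key(vendor_pubkey_pem)
        with open(image_path, "rb") as f:
            image = f.read()
        with open(sig_path, "rb") as f:
            signature = f.read()
        try:
            # Accept the image only if it was signed by the vendor's private key.
            public_key.verify(signature, image,
                              padding.PKCS1v15(), hashes.SHA256())
            return True
        except InvalidSignature:
            return False        # refuse to boot this image

The point, of course, is that the vendor's public key and the check itself
live in hardware which the owner cannot change.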
Microsoft is one
of the main drivers for TC; its version is called Next-Generation Secure
Computing Base (NGSCB),
formerly known as Palladium. NGSCB includes a mechanism for
introducing TC in parallel with present-day open systems. Future
versions of Microsoft Windows
incorporating NGSCB will have two modes, the trusted mode and the
untrusted mode. The untrusted mode will be like Windows is now,
allowing near-complete freedom to the PC owner. The trusted mode will
be the locked-down one. The owner need not use the trusted mode, but it
will be necessary to do so in order to access certain kinds of content,
such as emails and documents whose authors have imposed TC
restrictions, and TC-managed media files. It will not be possible to
export files from the trusted mode to the untrusted mode.
Open and closed platforms
TC offers a scenario somewhere between the completely open PC platform
we have today, and the completely closed platform represented by, for
example, mobile phones and games consoles. Closed platforms rely on
tamper-resistance to protect the embedded secret key and ensure
well-behaved operation. TC works by introducing a trusted virtual
machine running on the open-platform PC. The virtual machine
communicates directly with the hardware in order to guarantee security,
but allows the user some degree of seamlessness between the trusted
mode and untrusted mode, in order to increase usability.
TC will require several hardware changes, to provide tamper resistance,
curtained memory, and hardware-held encryption keys. Intel's LaGrande
Technology (LT) and AMD's Secure Execution Mode (SEM) provide the hardware
support for the major ideas of NGSCB. The Trusted Computing Group (TCG) is
an alliance of Microsoft, Intel, IBM, HP, AMD and other companies, formed
to coordinate these activities.
Applications of TC
The original motivation was digital
rights
management (DRM):
music and video files will be encrypted, and can only be played by
recognised
application software on a TC platform. The software will prevent you
from making copies, and can restrict you in arbitrary other ways, e.g.
by playing files only a certain number of times, or for a limited
period. Early announcements of TC included much more draconian
measures,
such as software which would delete ordinary applications and media
files if it detected copyright
violations which took place outside the scope of TC.
Current motivations and applications for TC extend way beyond DRM.
As Bill Gates put it: `We
came at this thinking about music, but then we realized that e-mail and
documents were far more interesting domains'. Email which cannot be
printed or forwarded, and self-destructs after a specified period,
opens
up many possibilities. Similarly, document authors could enforce
privacy
by restricting the ways copies are made or extracts taken by
cut-and-paste, or preventing them altogether. Organisations can enforce
restrictive distribution policies on documents created by their staff,
preventing leaks to journalists or competitors.
How TC can be used
TC puts control of files into the hands of their creators, rather than
the owner of the PC on which they reside. Using TC, you can create
an e-mail and set a condition that it may not be forwarded on. The
e-mail itself is encrypted and contains the information about the
rights
you have associated with it. The recipient of the e-mail will only be
able to view it when their TC chip agrees that they have the right
specified in the e-mail. In this case, their TC software will display
the e-mail in such a way that
they will be unable to copy and paste the text into a new e-mail in
order to forward it. The same principles will apply to all types of
files, notably music and video files. You could create documents
that can only be read in, say, the next week, after which point they
become unusable.
These restrictions are enforced by the software. The TC version of
Microsoft Word will check whether you have the right to copy-and-paste
from the current document, before allowing you to do so. You will not
be able to open the document with another application, because the
document is encrypted and only MS Word has the key (securely held, of
course).
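To make the mechanism concrete, here is a minimal sketch of the idea: the
content is encrypted, the rights policy travels with it as authenticated
data, and the viewing software consults the policy before permitting an
operation. The field names and the use of AES-GCM are assumptions for
illustration; the real NGSCB and Office formats are not public.

    # A sketch of a rights-managed document.  The policy is bound to the
    # ciphertext as authenticated data, so stripping or editing the
    # restrictions makes decryption fail.  Field names and the AES-GCM
    # choice are illustrative assumptions.
    import json, os, time
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def seal(plaintext, policy, key):
        nonce = os.urandom(12)
        header = json.dumps(policy).encode()
        ciphertext = AESGCM(key).encrypt(nonce, plaintext, header)
        return {"policy": policy, "nonce": nonce, "ciphertext": ciphertext}

    def open_for(operation, sealed, key):
        policy = sealed["policy"]
        if time.time() > policy["expires"]:
            raise PermissionError("document has expired")
        if operation not in policy["allowed"]:
            raise PermissionError(operation + " is not permitted")
        header = json.dumps(policy).encode()
        return AESGCM(key).decrypt(sealed["nonce"], sealed["ciphertext"], header)

    key = AESGCM.generate_key(bit_length=128)   # held by the TC software, not the user
    doc = seal(b"Quarterly figures ...",
               {"allowed": ["view"], "expires": time.time() + 7 * 24 * 3600}, key)
    open_for("view", doc, key)                  # permitted
    try:
        open_for("forward", doc, key)           # blocked by the policy
    except PermissionError as e:
        print("blocked:", e)

The role of the TC hardware is precisely to keep the key and this policy
check out of the owner's reach; in ordinary software nothing would stop the
user rewriting open_for.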
Distributed firewalls
represent another application of TC. Traditionally, firewalls assume
that everyone on the inside of the network is trusted. However, the
increased use of wireless access points, dial-ins, VPNs and tunnels
breaks down the distinction between inside and outside. With a
distributed firewall, every node in the network runs part of the
firewall,
protecting the host from the network and protecting the network from
the host. But how to ensure that the distributed firewall is running
according to the organisation's policy? This was easy on the
centrally-managed firewall, but harder on the distributed firewall
where every PC user can try to tamper with the rules. TC provides an
answer, by making the host's part of the firewall attest its rule set.
Other TC firewall features are possible too. For example, rate limits to
prevent denial-of-service attacks can be enforced by an attested firewall;
and by limiting the rate at which machines can send email, we can also
reduce the generation of spam.
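Attesting the rule set might look roughly like the following sketch: the
host hashes its current firewall rules, signs the hash with a key held by
the TC hardware, and the policy server checks both the signature and that
the hash matches the rules it expects. The interfaces below are invented
for illustration, not a real TC API.

    # A sketch of rule-set attestation for a distributed firewall.  The
    # "hardware key" here is an ordinary software key standing in for the
    # one held by the TC chip; all names are illustrative.
    import hashlib
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec
    from cryptography.exceptions import InvalidSignature

    hardware_key = ec.generate_private_key(ec.SECP256R1())

    def attest_rules(rules_text):
        digest = hashlib.sha256(rules_text.encode()).digest()
        signature = hardware_key.sign(digest, ec.ECDSA(hashes.SHA256()))
        return digest, signature

    def server_accepts(digest, signature, expected_rules, host_pubkey):
        # The host must be running exactly the rules the server pushed out ...
        if digest != hashlib.sha256(expected_rules.encode()).digest():
            return False
        # ... and the claim must come from the host's own hardware key.
        try:
            host_pubkey.verify(signature, digest, ec.ECDSA(hashes.SHA256()))
            return True
        except InvalidSignature:
            return False

    rules = "deny in to any port 25\nlimit out smtp 10/minute\n"
    digest, sig = attest_rules(rules)
    print(server_accepts(digest, sig, rules, hardware_key.public_key()))   # True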
How TC works
TC-capable hardware must be manufactured with a public/private key
pair. The private key is held securely by the chip, and is never
released. Ideally,
the manufacturing process destroys all records of the private key. The
chip is tamper-proof (it self-destructs rather than gives up its
private
key). Memory is curtained, to prevent debuggers and
other
software getting the private key, for example during signing
operations. Applications authenticate themselves
to a server, by sending the hardware's public key to the server,
together with a digital fingerprint of the application. The server
checks that both the hardware and the application are trusted, before
sending the content (see the attestation protocol box). Servers need to
know the set of valid public keys.
An attestation protocol (adapted from [2])
- The hardware has a public/private key pair, PK_h and SK_h.
- When an application A is started, it first generates a public/private
key pair PK_A and SK_A. The application requests the hardware to certify
its public key. The certificate C_A = {PK_A, #A}_SK_h returned by the
hardware includes a hash #A of the executable A.
- When the application wants to attest its validity to a remote server,
it sends the certificate chain (PK_h, C_A) to the server. The server
checks:
  - PK_h is not revoked.
  - The application hash embedded in C_A is on the server's list of
    applications it trusts.
- The application now authenticates itself by proving knowledge of SK_A.
For example, the application and the server can run a key exchange to
generate a session key.
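The toy model below walks through the same steps in Python. The Hardware
class stands in for the TC chip, the revoked and trusted-hash sets stand in
for the server's databases, and the final session-key exchange is omitted;
none of this is a real TC interface.

    # A toy model of the attestation protocol above.  SK_h never leaves the
    # "hardware"; the server knows which PK_h values are revoked and which
    # application hashes it trusts.
    import hashlib
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import ec
    from cryptography.exceptions import InvalidSignature

    def der(pubkey):
        return pubkey.public_bytes(serialization.Encoding.DER,
                                   serialization.PublicFormat.SubjectPublicKeyInfo)

    class Hardware:
        def __init__(self):
            self._sk_h = ec.generate_private_key(ec.SECP256R1())
            self.pk_h = self._sk_h.public_key()
        def certify(self, pk_a_der, executable):
            # C_A = {PK_A, #A}_SK_h : sign the app's public key plus its hash
            payload = pk_a_der + hashlib.sha256(executable).digest()
            return payload, self._sk_h.sign(payload, ec.ECDSA(hashes.SHA256()))

    def server_checks(pk_h, payload, signature, revoked, trusted_hashes):
        if der(pk_h) in revoked:                   # PK_h must not be revoked
            return False
        try:                                       # the certificate must verify
            pk_h.verify(signature, payload, ec.ECDSA(hashes.SHA256()))
        except InvalidSignature:
            return False
        return payload[-32:] in trusted_hashes     # #A must be a trusted hash

    hw = Hardware()
    sk_a = ec.generate_private_key(ec.SECP256R1())   # the application's PK_A / SK_A
    executable = open(__file__, "rb").read()         # stand-in for the app binary
    payload, sig = hw.certify(der(sk_a.public_key()), executable)

    trusted = {hashlib.sha256(executable).digest()}
    print(server_checks(hw.pk_h, payload, sig, revoked=set(), trusted_hashes=trusted))

In real TC hardware the equivalent of certify runs inside the chip, and
SK_h can never be extracted to forge certificates for tampered executables.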
Memory curtaining is a strong, hardware-enforced memory-isolation feature
that prevents untrusted programs from reading the memory allocated to
trusted programs. TC-compliant hardware must also have secure IO, to
address the threats posed by keycatchers, screen grabbers, and
sound-recording devices. A keycatcher is a hardware
device between the keyboard and the computer, which records what you
type. A screen grabber records what is displayed on the screen. Secure
IO can also guarantee that input is provided by a physically-present
user, as distinct from another program impersonating a user.
Why TC is a bad thing
TC has been much criticised by respected commentators, and with good
reason. It removes control of the PC from its owner/user, and gives the
control to the software and content provider. This can easily be
abused. For example, TC
can enforce censorship; if someone writes a
paper that a court decides is defamatory, the author can be compelled
to censor it by withdrawing all access rights -- and the software
company that wrote the word processor could be ordered
to do the deletion if she refuses [3].
How does TC differ from CSS, or the XP activation mechanism?
DVDs have a protection scheme known as Content Scrambling System.
Although it appears complex and includes hardware authentication and
session key establishment (see, e.g.,
Greg
Kesden's Tutorial on CSS), it is essentially a
Break-Once-Run-Everywhere technology (BORE). It was indeed broken, by a
16-year-old Norwegian programmer who wrote DeCSS. TC avoids BORE by
having different keys in every PC. Even if you succeed in breaking
yours, that won't help anyone else.
XP is closer to TC, in that it too attempts to avoid BORE. During the
product activation phase of XP, a
fingerprint of your computer's hardware is registered by Microsoft as
corresponding to a particular license. Microsoft will not register
different hardware against the same license, and XP won't run unless
you register it. This stops you installing it on your friend's
computer. (`Hacked' versions are simply versions built without the
product
activation feature, sold to companies who don't want the hassle of
individual registrations.)
TC will also allow software companies to increase their monopolies. You
may feel it is hard to migrate from MS Office today, because it is
accepted as an industry standard. But in a TC world, this lock-in will
be even harder to break. Companies will receive TC-Office documents,
and will need TC-Office to read them. Moreover, they will need to keep
paying the rent for TC-Office in perpetuity, if they want to continue
to
have access to their archives. Home users will need it too, in order to
read their gas bill.
"Trusted Computing" means PCs are more trustworthy from the point of
view of software vendors and content providers, but less trustworthy
from the point of view of their owners. It means your computer is
working for other people, not for you. It gives them complete power
over what your computer does, and it prevents you from even knowing how
the information you have provided is being used against you.
Will TC take off, or will it die?
Replacing the PC with a closed platform is obviously impossible,
because PC buyers precisely value the fact that they can run any
software they like, and they have control over their computer.
TC allows the freedom of the open platform to coexist with the security
of the closed platform. It allows the restrictions to be introduced
gradually. Users' objections will be assuaged by the reassurance that
TC can be turned off, so it needn't seem
such a threat. But eventually the price of turning it off
will be
too great. At work, you will need it to read TC'd emails and documents
that are being sent to you. At home, you will need TC to communicate
with your bank, your city council, and your entertainment provider; and
you will need it because it will become your company's policy for
teleworkers. Increasingly, the peripheral hardware you buy for your
computer will only work with the TC mode of your computer. The non-TC
world
will continue to exist, but soon it will be perceived as GNU/Linux is today:
great because it gives
you
more freedom, but a pain because it gives you less choice.
How does TC compare with present-day use of encryption, e.g. with
PGP?
When someone uses PGP (or the open-source equivalent, GPG) to send
you an encrypted document, and you use PGP to decode it, the result is
an unencrypted document that you can
read, forward, copy, and even re-encrypt to send it securely to
someone else. A TC application would let you read
the words on the screen, but would not let you produce an unencrypted
document that you could use in other ways. PGP makes security features
available to the users; they use it. TC is designed to impose
restrictions on the users;
it uses them. [4]
The counter-argument is persuasive too. People won't use it if it stops
them doing what they want to do. Why should I rent music from Sony, and put up
with all the TC restrictions, when I can have it for free, and without
restrictions, from my friend who has produced an open MP3 version? The
first bank that enforces TC will find its customers preferring to move
to another bank. If TC is a way of making the Chinese pay for software,
they won't use it. Nor will students, hobbyists and enthusiasts. If these people
don't use it, it may fail. Even if you are prepared to try to live with
the costs and restrictions that TC seeks to impose, the sheer
difficulty of coping with its constant checking, attesting, and
nannying may make you turn it off. Companies can't even tolerate the
intrusion of Microsoft's XP activation feature, so still less will they
tolerate attestations on the network and through the firewall every
time their staff wants to open a document.
Another argument why TC won't work for DRM is that there will be cracks
and workarounds. Note, however, that cracking TC is harder than (say)
cracking the DVD encryption mechanism CSS (which has been cracked).
That is because TC is designed not to be BORE: break-once, run everywhere. BORE
means that one person cracks it, and everybody benefits. Individual
hardware keys and the attestation protocol help ensure that the fact of
one person cracking it doesn't help anyone else.
What else is like TC?
TC has already arrived, in some restricted forms.
- CPRM (Content Protection for Recordable Media) is currently primarily a
memory-card copy-protection mechanism (the Secure Digital (SD) format
accounted for around 30% of the flash-memory market as of October 2003),
but there are proposals to extend it to the ATA hard-disk standard. Each
CPRM-compatible ATA hard drive is individually signed, and authenticates
the playback and movement of files on the device against a central server
using CPRM-compliant software. So it is the same idea as (and probably
part of) TC.
- Many printer
cartridges now come with chips that authenticate them to the
printer. Printers may refuse to work with third-party or refilled
cartridges, or even with genuine cartridges that have passed an expiry
date.
- Microsoft Windows XP's activation feature makes TC-like ideas
work on a platform which does not have a unique id, by taking a
fingerprint of the computer's hardware. This fingerprint can include
the MAC address of the network card, the volume id and capacity of the
disks, the quantity of memory, and so on (a rough sketch of such a
fingerprint follows this list).
- Apple's iTunes Music Store also avoids BORE problems by generating
a key from identifiers in your PC. This was recently hacked by the same
person who cracked the DVD CSS system.
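As a rough illustration of this kind of fingerprinting, the sketch below
hashes together a few machine identifiers that are easy to read from the
Python standard library. The real XP and iTunes schemes use more (and
different) components, weight them, and are not public.

    # A sketch of a hardware fingerprint: hash together whatever machine
    # identifiers are conveniently available.  The choice of components
    # here is an illustrative assumption.
    import hashlib, os, platform, uuid

    def hardware_fingerprint():
        components = [
            str(uuid.getnode()),        # MAC address of a network interface
            platform.machine(),         # processor architecture
            platform.node(),            # machine name
            str(os.cpu_count()),        # stand-in for memory / disk details
        ]
        return hashlib.sha256("|".join(components).encode()).hexdigest()

    # A licence bound to this value stops working if the fingerprint changes,
    # i.e. if the software is moved to another machine.
    print(hardware_fingerprint())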
So cracks may take longer, but workarounds are relatively easy. Even if
the pathway is encrypted all the way to the speaker, I can still record
what comes out of the speaker, and then create an MP3 out of that.
That's BORE. Some quality is lost, for sure, but only once.
If I want to forward a TC'd email with non-forward restrictions, I may
have to resort to taking a digital photograph of my screen displaying
the email. I can email that to whoever I like, together with an OCR'd
version for easier reading.
So in the end, TC won't work. And there are reports that even Microsoft
is beginning to think that. The problems it addresses will be solved
by other means. For media content, easy distribution on the web will
never go away, so we will have to find other ways of rewarding artists.
In the office, TC offers a sledgehammer to crack a nut; the problems
of privacy, confidentiality and authenticity can be solved more readily
and more fairly by open technologies like PGP.
Resources
- Microsoft's NGSCB papers, which include some technical information.
- Tal Garfinkel, Mendel Rosenblum and Dan Boneh, "Flexible OS Support and
Applications for Trusted Computing", which gives some detail on a possible
protocol (described in these notes).
- Ross Anderson's "Trusted Computing FAQ", an excellent source of
information with many links and references.
- Richard Stallman, "Can You Trust Your Computer?"
- Seth Schoen, "Trusted Computing: Promise and Risk".
End