Openness vs. secrecy in IT security

In the February 15, 2003 issue of CRYPTO-GRAM, Bruce Schneier of Counterpane makes a persuasive argument that the security properties of systems, whether IT systems or physical locks, are better off being open to public scrutiny than kept secret.

The full disclosure vs. bug secrecy debate is a lot larger than
computer security. In January, security researcher Matt Blaze
published a paper describing a new attack against door locks. The
specific locks are “master key systems,” the sorts that allow each
person to have a key to his own office and the janitor to have a single
key that opens every office. The specific attack is one where a person
with an individual office key can make himself a master key. The
specifics are interesting, and I invite you to read the paper. It
turns out that the ways we’ve learned to conceptualize security and
attacks in the computer world are directly applicable to other areas of
security — like door locks. But the most interesting part of this
entire story is that the locksmith community went ballistic after
learning about what Blaze did.

The technique was known in the locksmithing community and in the
criminal community for over a century, but was never discussed in
public and remained folklore. Customers who bought these master key
systems for over a century were completely oblivious to the security
risks. Locksmiths liked it that way, believing that the security of a
system was increased by keeping these sorts of vulnerabilities from the
general population.

The bug secrecy position is a lot easier to explain to a layman. If
there’s a vulnerability in a system, it’s better not to make that
vulnerability public. The bad guys will learn about it and use it, the
argument goes. Last month’s SQL Slammer is a case in point. If the
hacker who wrote the worm hadn’t had access to the public information
about the SQL vulnerability, maybe he wouldn’t have written the
worm. The problem, according to this position, is more the information
about the vulnerability and less the vulnerability itself.

This position ignores the fact that public scrutiny is the only
reliable way to improve security. There are several master key designs
that are immune to the 100-year-old attack that Blaze
rediscovered. They’re not common in the marketplace primarily because
customers don’t understand the risks, and because locksmiths continue
to knowingly sell a flawed security system rather than admit and then
fix the problem. This is no different from the computer world. Before
software vulnerabilities were routinely published, vendors would not
bother spending the time and money to fix vulnerabilities, believing in
the security of secrecy. And since customers didn’t know any better,
they bought these systems believing them to be secure. If we return to
a world of bug secrecy in computers, we’ll have the equivalent of
100-year-old vulnerabilities known by a few in the security community
and by the hacker underground.

That’s the other fallacy with the locksmiths’ argument. Techniques
like this are passed down as folklore in the criminal community as well
as in the locksmithing community. In 1994, a thief made his own master
key to a series of safe-deposit boxes and stole $1.5 million in
jewels. The same thing happens in the computer world. By the time a
software vulnerability is announced in the press and patched, it’s
already folklore in the hacker underground. Attackers don’t abide by
secrecy agreements.

What we’re seeing is a culture clash; it’s happening in many areas of
security. Attorney General Ashcroft is working to keep details of many
antiterrorism countermeasures secret so as not to educate the
terrorists. But at the same time, the people — to whom he is
ultimately accountable — would not be allowed to evaluate the
countermeasures, or comment on their efficacy. Security couldn’t
improve because there’d be no public debate or public
education. Whatever attacks and defenses people learn would become
folklore, never spoken about in the open but whispered from security
engineer to security engineer and from terrorist to terrorist. And
maybe in 100 years someone will publish an attack that some security
engineers knew about, that terrorists and criminals had been exploiting
for much of that time, but that the general public had been blissfully
unaware of.

Secrecy prevents people from assessing their own risk. For example, in
the master key case, even if there weren’t more secure designs
available, many customers might have decided not to use master keying
if they knew how easy it was for an attacker to make his own master key.

I’d rather have as much information as I can to make an informed
decision about security. I’d rather have the information I need to
pressure vendors to improve security. I don’t want to live in a world
where locksmiths can sell me a master key system that they know doesn’t
work or where the government can implement security measures without
accountability.
