I'm a closet economist. This means that, like most economists, I love pithy little phrases that neatly encapsulate intellectual concepts. You know: supply and demand. Efficient markets. Marginal utility.
[Image: The original Science article]
Here’s one I really like that you may not have heard of: the tragedy of the commons. This was the title of a 1968 Science article describing how individual decisions about the use of a shared resource often result in its being ruined for everyone. The classic example is farmers grazing their animals in the "commons" -- shared pasture land not owned or controlled by any one person. Because the shared land isn't subject to usage rules, farmers make individual (rather than collective) decisions about how many animals should feed there and how much they should eat. Predictably, the commons becomes overgrazed, ruining it for everyone. More modern examples include mining and oil and gas extraction. Fishing, too: overfishing has been much in the news of late.
Now say
you’re a phone phreak instead of a farmer. You’ve discovered
some new vulnerability in the phone network – maybe a new way to make a free
phone call or a hitherto undiscovered telephone conferencing bridge you can use to chat with your friends. Naturally, you’ll want to share this with your buddies, both for bragging rights and to spread the love so they can use it too. And just as naturally, they'll want to tell their friends all about it. Information
like this wants to be shared – phone phreaks are social animals and, like
enthusiasts in all walks of life, they love telling their
friends about the new new thing.
But there’s
the rub. That new little hack you’ve
discovered is about to get “overphished.” When hundreds of phone phreaks start using it, somebody in the telephone
company is going to notice. And when that happens they’ll patch the hole,
ruining it for all the other phreaks.
In other words, a tragedy of the informational commons will have occurred. Information like this may want to be free,
but it also wants to be secret, because if it’s shared too much it loses its
value.
Famous phone phreak Joybubbles frustrated some fellow phone phreaks back in the 1970s because he believed that “knowledge shared is knowledge expanded.” For this and other reasons, he usually wouldn't agree to keep secrets.
As a result, some phone phreaks avoided
telling him things they thought were particularly sensitive for fear that he would tell others, leading to overphishing.
The same problem comes up in another area related to
information security: codebreaking. Let’s
say your crypto boys and girls have been eating their Wheaties and have broken your
enemy’s codes. Now what? What do you do with this bonanza of
intelligence? For the information that
you glean from reading your adversary’s email to have any value, you need to
act on it. But if you act on it, your
enemy may figure out that you’ve broken his codes and will go and change them,
leaving you in the dark. Oops.
This is one
of the reasons that since World War II signals intelligence (“SIGINT”) – that
is, information obtained from intercepting enemy radio traffic or other communications –
has been heavily compartmented using clearances
above top secret.
By restricting who knows about it, the hope is to prevent the bad guys from finding out that we’ve broken their codes, either through the news leaking directly to them or through our exploiting the intelligence in some way that makes it obvious that they’ve been hosed.
[Image: American Cryptology During the Cold War]
It’s interesting that some of the phone phreaks in the 1970s effectively created their own compartments for phreaking information -- information that Joybubbles wasn’t cleared for!
Neal Stephenson
does a wonderful job weaving this theme throughout his magnificent book Cryptonomicon. This tension is also discussed in Thomas Johnson’s
American Cryptology during the Cold War, the NSA’s official internal history
that was declassified a year or so ago. Both are fascinating reading.
I’d love to
hear more examples of overphishing. Got
any?