Monday, March 29, 2010

SSL & Big Government. Where's Phil Zimmermann?

What an interesting year 2010 is already turning out to be in technology, politics, and life as we know it. More censorship battles are going on than ever before (e.g. Google vs. the Great Firewall of China), and governments are ramping up control over Internet traffic in their respective countries. Australia has content filters on all ISPs in the name of decency, but political dissident websites have slipped into the "indecent" categories. The UK and US are pushing harder to take control of private access to the Internet. Iran shuts down all Internet access within the country during elections. Now this: reports that governments are manipulating the hierarchical Certificate Authority model to eavesdrop on "secure" encrypted connections over the Internet, and that vendors are creating turn-key appliances to make it easy. Do "netizens" still have a Bill of Rights? Who's watching the watchers?

Enter exhibit A: "Has SSL become pointless?", an article on the plausibility of state-sponsored eavesdropping via political coercion of Certificate Authorities into producing duplicate (faked) SSL certificates for Big Brother devices.
In the draft of a research paper released today (PDF available here), Soghoian and Stamm tell the story of a recent security conference where at least one vendor touted its ability to be dropped seamlessly among a cluster of IP hosts, intercept traffic among the computers there, and echo the contents of that traffic using a tunneling protocol. The tool for this surveillance, marketed by an Arizona-based firm called Packet Forensics, purports to leverage man-in-the-middle attack strategies against SSL's underlying cryptographic protocol.
...
As the researchers report, in a conversation with the vendor's CEO, he confirmed that government agencies can compel certificate authorities (CAs) such as VeriSign to provide them with phony certificates that appear to be signed by legitimate root certificates.
...
The researchers have developed a threat model based on their discoveries, superimposing government agencies in the typical role of the malicious user. They call this model the compelled certificate creation attack. As Soghoian and Stamm write, "When compelling the assistance of a CA, the government agency can either require the CA to issue it a specific certificate for each Web site to be spoofed, or, more likely, the CA can be forced to issue an intermediate CA certificate that can then be re-used an infinite number of times by that government agency, without the knowledge or further assistance of the CA. In one hypothetical example of this attack, the US National Security Agency (NSA) can compel VeriSign to produce a valid certificate for the Commercial Bank of Dubai (whose actual certificate is issued by Etisalat, UAE), that can be used to perform an effective man-in-the-middle attack against users of all modern browsers."
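The mechanics of why this works are worth spelling out: a browser accepts any certificate chain that terminates at any root in its trust store, and it does not care which CA "should" vouch for a given site. Here is a minimal, runnable sketch of that path-validation logic in Python. It is a toy model with hypothetical names, not any real library's API, but it shows that a compelled intermediate passes exactly the same check as the legitimate chain:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Cert:
    subject: str
    issuer: str          # subject of the certificate that signed this one
    is_ca: bool = False

# The browser ships with a bundle of trusted roots the user never chose.
TRUSTED_ROOTS = {"VeriSign Root", "Etisalat Root"}

def chain_is_trusted(chain):
    """Simplified path validation: each cert must be issued by the next one,
    every signer must be a CA, and the chain may end at ANY trusted root."""
    for child, parent in zip(chain, chain[1:]):
        if child.issuer != parent.subject or not parent.is_ca:
            return False
    return chain[-1].subject in TRUSTED_ROOTS

# Legitimate chain: the bank's real certificate, issued under Etisalat.
legit = [
    Cert("bank.example.ae", issuer="Etisalat Root"),
    Cert("Etisalat Root", issuer="Etisalat Root", is_ca=True),
]

# Compelled chain: an agency obtains an intermediate CA certificate from
# VeriSign, then mints its own cert for the interception appliance.
compelled = [
    Cert("bank.example.ae", issuer="Compelled Intermediate"),
    Cert("Compelled Intermediate", issuer="VeriSign Root", is_ca=True),
    Cert("VeriSign Root", issuer="VeriSign Root", is_ca=True),
]

print(chain_is_trusted(legit))      # True
print(chain_is_trusted(compelled))  # True: same padlock icon, no warning
```

Note that nothing in the check ties a site to a particular CA, which is the gap the compelled certificate creation attack exploits.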
There's more info from Wired on the subject as well.

All of this calls for a return to our roots. Where's Phil Zimmermann when we need him now?

Phil created PGP (Pretty Good Privacy) during the political crypto-export wars, producing the first implementation of the "web of trust" model, an alternative to the hierarchical model that Certificate Authorities use today in SSL Public Key Infrastructure (PKI). Firefox 3 already introduced some Web-of-Trust-like features for self-signed SSL certs. If you've ever browsed to an HTTPS site using a self-signed certificate, you have probably seen the dialog box asking whether you would like to save an "exception" to trust that SSL certificate, which is very similar to how SSH works on the command line. Essentially, that's the basic premise behind the researchers' forthcoming "CertLock" Firefox add-on: extend the web-of-trust idea to all SSL certs encountered, and add some decision support for which certificates to trust based on attribute/key changes.
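To make the SSH analogy concrete, here is a minimal sketch of that trust-on-first-use idea applied to SSL certificates, in Python using only the standard library. This is not CertLock's actual code (the add-on is unreleased as of this writing); the pin-file name and the change policy are assumptions for illustration:

```python
import hashlib
import json
import ssl
from pathlib import Path

PIN_FILE = Path("known_certs.json")  # analogous to SSH's known_hosts file

def fingerprint(host, port=443):
    """Fetch the server's certificate and hash it for comparison."""
    pem = ssl.get_server_certificate((host, port))
    return hashlib.sha256(pem.encode()).hexdigest()

def check_host(host):
    pins = json.loads(PIN_FILE.read_text()) if PIN_FILE.exists() else {}
    seen = fingerprint(host)
    if host not in pins:
        # First contact: remember the cert, like saving an SSH host key.
        pins[host] = seen
        PIN_FILE.write_text(json.dumps(pins, indent=2))
        return "new host: certificate pinned"
    if pins[host] == seen:
        return "certificate unchanged: ok"
    # The cert changed. It could be routine reissuance, or a man in the
    # middle holding a compelled certificate. CertLock's decision support
    # would weigh attributes such as whether the issuing CA's country
    # suddenly changed.
    return "CERTIFICATE CHANGED: investigate before trusting"

print(check_host("www.example.com"))
```

Just as with SSH, the security comes from noticing change over time rather than from trusting a third party's signature on day one.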

In the hierarchical model we have today, a bunch of "authorities" tell us which SSL certificates to trust. One CA (Certificate Authority) could tell us a cert for somebank.com at IP address A.B.C.D is OK, while a second CA could assert that a completely different cert for somebank.com hosted at IP address E.F.G.H is also good. Who is the ultimate authority? You are. But your grandmother may have a hard time telling which certs to trust, which is why this problem exists and exactly why the hierarchical model was created in the first place. In the Web-of-Trust model, there are no authorities. You trust companyA.com, and if companyA.com trusts companyB.com, you can automatically trust companyB.com too (or not; it's up to you). You build up links that represent people vouching for other people, just like real life. If you put your trust in somebody who is not worthy of it, then bad things can happen, just like real life.
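For the technically inclined, the web of trust reduces to a simple data structure: a directed graph of "A vouches for B" edges, with trust decided by reachability from yourself under whatever hop limit you choose as policy. A minimal sketch (hypothetical names, not any real PGP implementation):

```python
# Each key maps a party to the set of parties it vouches for.
TRUST_EDGES = {
    "me":           {"companyA.com"},
    "companyA.com": {"companyB.com"},
    "companyB.com": {"companyC.com"},
}

def is_trusted(target, start="me", max_hops=2):
    """Breadth-first search: trust anyone reachable within max_hops vouches.
    The hop limit is the user's own policy for how far trust may extend."""
    frontier, seen = {start}, {start}
    for _ in range(max_hops):
        frontier = {v for node in frontier
                    for v in TRUST_EDGES.get(node, set())} - seen
        if target in frontier:
            return True
        seen |= frontier
    return False

print(is_trusted("companyB.com"))  # True: vouched for by someone I trust
print(is_trusted("companyC.com"))  # False: too many hops for my policy
```

The key difference from the hierarchical model is that the graph starts at you, not at a root certificate you never chose.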

In the hierarchical model, you're essentially outsourcing those trust decisions to third parties you've never met. You're asking all of them, at the same time, to guarantee your banking connection is secure. Or your connection to Facebook. Or your connection to a politically dissident web forum. I repeat: you're asking every single CA, each an organization of people you have never met, to make these decisions for you simultaneously. Does that sound crazy? You bet. What if, in the real-world analogue of this, you outsourced to a collection of "authorities" the choice of which TV shows to watch, which products to buy, and which politicians get your vote? [In the U.S. we may already have that with the Big Media corporations, but thank goodness for the Internet. Er, wait, well, that was before we knew about governments abusing SSL certificates anyway.]

It's in this hierarchical model that governments can subvert the confidentiality of communications. And if governments can do this at will by forcing Certificate Authorities within their jurisdiction to create fraudulent, duplicate certificates, what's going to stop the ISPs or snoops-for-hire that set up the intercepts from saving copies of pertinent data for themselves, outside the original intent (regardless of its legal status in your home country)? Probably not much. Maybe an audit trail. Maybe. But even that is likely up for manipulation. After all, look at how poorly the Government Accountability Office rates the IT systems of the various branches of the U.S. federal government: many of them receive failing grades, yet they are still in operation. Can you trust them with your data?

My browser has over 100 Certificate Authority certificates in it by default. Each CA certificate probably represents a dozen or more people who can have a certificate issued from that CA, but even assuming it's only a single person per certificate, there certainly aren't 100 people out there I would trust with those aspects of my life. [If 100 doesn't seem that high, just count how many Facebook friends you have that you wouldn't really want to know your credit card number, your plans for next Friday night, the way you voted in the last election, etc.]
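If you want to sanity-check that number on your own machine, here is a quick sketch using Python's standard library. The count varies by OS and browser, and on some platforms the loaded store is only a partial view, so treat it as a ballpark:

```python
import ssl

ctx = ssl.create_default_context()  # loads the platform's default trust store
cas = ctx.get_ca_certs()            # list of dicts describing loaded CA certs
print(f"{len(cas)} trusted root CAs loaded")

# Peek at a few issuer organizations; each one is a party you implicitly
# trust every time you see a padlock icon.
for ca in cas[:5]:
    fields = dict(rdn[0] for rdn in ca["subject"])
    print(fields.get("organizationName"))
```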

Perhaps we've gone soft. Perhaps we find it a hassle to use PGP to encrypt messages sent through our favorite free webmail service. Perhaps we're trusting that somebody else is securing our information for us. Whatever it is, perhaps we should read Phil Zimmermann's original words, from back when the fight for e-mail privacy was so vivid in our daily lives (before most Internet users could even spell "Internet"). Perhaps then we'll revive the fight for privacy in our web traffic as well, and look to solutions like the forthcoming CertLock, or maybe a full Web-of-Trust SSL implementation built into each of our browsers, rather than leaving all of our security decisions up to so many semi-trustworthy and unknown Certificate Authorities. Back then, the "activist" in each one of us, each security professional, told people to use PGP to encrypt ALL email. Why? Because if you don't, the messages that are encrypted automatically stand out, as if you "have something to hide." Whether you do or not is beside the point; encrypting all the time reveals nothing through traffic-pattern analysis. Perhaps we should return to that practice and be more vigilant in our CA selection.
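In that encrypt-everything spirit, here is a minimal sketch using the third-party python-gnupg wrapper around GnuPG, the open-source descendant of PGP. It assumes the gpg binary is installed and the recipient's public key is already in your keyring; the address and the delivery function are placeholders:

```python
import gnupg

gpg = gnupg.GPG()  # uses the default ~/.gnupg keyring

def send_mail(to_addr, body):
    # Encrypt unconditionally, mundane notes and sensitive ones alike,
    # so encrypted traffic carries no signal for pattern analysis.
    result = gpg.encrypt(body, recipients=[to_addr])
    if not result.ok:
        raise RuntimeError(f"encryption failed: {result.status}")
    deliver(to_addr, str(result))  # hand ASCII-armored text to your mailer

def deliver(to_addr, armored_text):
    print(f"To: {to_addr}\n\n{armored_text}")  # stand-in for real delivery

send_mail("alice@example.org", "Lunch at noon?")
```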

The following are Phil Zimmermann's own words on why he created PGP (Pretty Good Privacy):

Why I Wrote PGP

Part of the Original 1991 PGP User's Guide (updated in 1999)
"Whatever you do will be insignificant, but it is very important that you do it."
–Mahatma Gandhi.
It's personal. It's private. And it's no one's business but yours. You may be planning a political campaign, discussing your taxes, or having a secret romance. Or you may be communicating with a political dissident in a repressive country. Whatever it is, you don't want your private electronic mail (email) or confidential documents read by anyone else. There's nothing wrong with asserting your privacy. Privacy is as apple-pie as the Constitution.
The right to privacy is spread implicitly throughout the Bill of Rights. But when the United States Constitution was framed, the Founding Fathers saw no need to explicitly spell out the right to a private conversation. That would have been silly. Two hundred years ago, all conversations were private. If someone else was within earshot, you could just go out behind the barn and have your conversation there. No one could listen in without your knowledge. The right to a private conversation was a natural right, not just in a philosophical sense, but in a law-of-physics sense, given the technology of the time.
But with the coming of the information age, starting with the invention of the telephone, all that has changed. Now most of our conversations are conducted electronically. This allows our most intimate conversations to be exposed without our knowledge. Cellular phone calls may be monitored by anyone with a radio. Electronic mail, sent across the Internet, is no more secure than cellular phone calls. Email is rapidly replacing postal mail, becoming the norm for everyone, not the novelty it was in the past.
Until recently, if the government wanted to violate the privacy of ordinary citizens, they had to expend a certain amount of expense and labor to intercept and steam open and read paper mail. Or they had to listen to and possibly transcribe spoken telephone conversation, at least before automatic voice recognition technology became available. This kind of labor-intensive monitoring was not practical on a large scale. It was only done in important cases when it seemed worthwhile. This is like catching one fish at a time, with a hook and line. Today, email can be routinely and automatically scanned for interesting keywords, on a vast scale, without detection. This is like driftnet fishing. And exponential growth in computer power is making the same thing possible with voice traffic.
Perhaps you think your email is legitimate enough that encryption is unwarranted. If you really are a law-abiding citizen with nothing to hide, then why don't you always send your paper mail on postcards? Why not submit to drug testing on demand? Why require a warrant for police searches of your house? Are you trying to hide something? If you hide your mail inside envelopes, does that mean you must be a subversive or a drug dealer, or maybe a paranoid nut? Do law-abiding citizens have any need to encrypt their email?
What if everyone believed that law-abiding citizens should use postcards for their mail? If a nonconformist tried to assert his privacy by using an envelope for his mail, it would draw suspicion. Perhaps the authorities would open his mail to see what he's hiding. Fortunately, we don't live in that kind of world, because everyone protects most of their mail with envelopes. So no one draws suspicion by asserting their privacy with an envelope. There's safety in numbers. Analogously, it would be nice if everyone routinely used encryption for all their email, innocent or not, so that no one drew suspicion by asserting their email privacy with encryption. Think of it as a form of solidarity.
Senate Bill 266, a 1991 omnibus anticrime bill, had an unsettling measure buried in it. If this non-binding resolution had become real law, it would have forced manufacturers of secure communications equipment to insert special "trap doors" in their products, so that the government could read anyone's encrypted messages. It reads, "It is the sense of Congress that providers of electronic communications services and manufacturers of electronic communications service equipment shall ensure that communications systems permit the government to obtain the plain text contents of voice, data, and other communications when appropriately authorized by law." It was this bill that led me to publish PGP electronically for free that year, shortly before the measure was defeated after vigorous protest by civil libertarians and industry groups.
The 1994 Communications Assistance for Law Enforcement Act (CALEA) mandated that phone companies install remote wiretapping ports into their central office digital switches, creating a new technology infrastructure for "point-and-click" wiretapping, so that federal agents no longer have to go out and attach alligator clips to phone lines. Now they will be able to sit in their headquarters in Washington and listen in on your phone calls. Of course, the law still requires a court order for a wiretap. But while technology infrastructures can persist for generations, laws and policies can change overnight. Once a communications infrastructure optimized for surveillance becomes entrenched, a shift in political conditions may lead to abuse of this new-found power. Political conditions may shift with the election of a new government, or perhaps more abruptly from the bombing of a federal building.
A year after the CALEA passed, the FBI disclosed plans to require the phone companies to build into their infrastructure the capacity to simultaneously wiretap 1 percent of all phone calls in all major U.S. cities. This would represent more than a thousandfold increase over previous levels in the number of phones that could be wiretapped. In previous years, there were only about a thousand court-ordered wiretaps in the United States per year, at the federal, state, and local levels combined. It's hard to see how the government could even employ enough judges to sign enough wiretap orders to wiretap 1 percent of all our phone calls, much less hire enough federal agents to sit and listen to all that traffic in real time. The only plausible way of processing that amount of traffic is a massive Orwellian application of automated voice recognition technology to sift through it all, searching for interesting keywords or searching for a particular speaker's voice. If the government doesn't find the target in the first 1 percent sample, the wiretaps can be shifted over to a different 1 percent until the target is found, or until everyone's phone line has been checked for subversive traffic. The FBI said they need this capacity to plan for the future. This plan sparked such outrage that it was defeated in Congress. But the mere fact that the FBI even asked for these broad powers is revealing of their agenda.
Advances in technology will not permit the maintenance of the status quo, as far as privacy is concerned. The status quo is unstable. If we do nothing, new technologies will give the government new automatic surveillance capabilities that Stalin could never have dreamed of. The only way to hold the line on privacy in the information age is strong cryptography.
You don't have to distrust the government to want to use cryptography. Your business can be wiretapped by business rivals, organized crime, or foreign governments. Several foreign governments, for example, admit to using their signals intelligence against companies from other countries to give their own corporations a competitive edge. Ironically, the United States government's restrictions on cryptography in the 1990's have weakened U.S. corporate defenses against foreign intelligence and organized crime.
The government knows what a pivotal role cryptography is destined to play in the power relationship with its people. In April 1993, the Clinton administration unveiled a bold new encryption policy initiative, which had been under development at the National Security Agency (NSA) since the start of the Bush administration. The centerpiece of this initiative was a government-built encryption device, called the Clipper chip, containing a new classified NSA encryption algorithm. The government tried to encourage private industry to design it into all their secure communication products, such as secure phones, secure faxes, and so on. AT&T put Clipper into its secure voice products. The catch: At the time of manufacture, each Clipper chip is loaded with its own unique key, and the government gets to keep a copy, placed in escrow. Not to worry, though–the government promises that they will use these keys to read your traffic only "when duly authorized by law." Of course, to make Clipper completely effective, the next logical step would be to outlaw other forms of cryptography.
The government initially claimed that using Clipper would be voluntary, that no one would be forced to use it instead of other types of cryptography. But the public reaction against the Clipper chip was strong, stronger than the government anticipated. The computer industry monolithically proclaimed its opposition to using Clipper. FBI director Louis Freeh responded to a question in a press conference in 1994 by saying that if Clipper failed to gain public support, and FBI wiretaps were shut out by non-government-controlled cryptography, his office would have no choice but to seek legislative relief. Later, in the aftermath of the Oklahoma City tragedy, Mr. Freeh testified before the Senate Judiciary Committee that public availability of strong cryptography must be curtailed by the government (although no one had suggested that cryptography was used by the bombers).
The government has a track record that does not inspire confidence that they will never abuse our civil liberties. The FBI's COINTELPRO program targeted groups that opposed government policies. They spied on the antiwar movement and the civil rights movement. They wiretapped the phone of Martin Luther King. Nixon had his enemies list. Then there was the Watergate mess. More recently, Congress has either attempted to or succeeded in passing laws curtailing our civil liberties on the Internet. Some elements of the Clinton White House collected confidential FBI files on Republican civil servants, conceivably for political exploitation. And some overzealous prosecutors have shown a willingness to go to the ends of the Earth in pursuit of exposing sexual indiscretions of political enemies. At no time in the past century has public distrust of the government been so broadly distributed across the political spectrum, as it is today.
Throughout the 1990s, I figured that if we want to resist this unsettling trend in the government to outlaw cryptography, one measure we can apply is to use cryptography as much as we can now while it's still legal. When use of strong cryptography becomes popular, it's harder for the government to criminalize it. Therefore, using PGP is good for preserving democracy. If privacy is outlawed, only outlaws will have privacy.
It appears that the deployment of PGP must have worked, along with years of steady public outcry and industry pressure to relax the export controls. In the closing months of 1999, the Clinton administration announced a radical shift in export policy for crypto technology. They essentially threw out the whole export control regime. Now, we are finally able to export strong cryptography, with no upper limits on strength. It has been a long struggle, but we have finally won, at least on the export control front in the US. Now we must continue our efforts to deploy strong crypto, to blunt the effects of increasing surveillance efforts on the Internet by various governments. And we still need to entrench our right to use it domestically over the objections of the FBI.
PGP empowers people to take their privacy into their own hands. There has been a growing social need for it. That's why I wrote it.
Philip R. Zimmermann
Boulder, Colorado
June 1991 (updated 1999)

[In the PDF of the researchers' paper, special thanks are called out to certain reviewers, including Jon Callas of PGP Corp., with whom so much debate has transpired. To mangle Shakespeare: Oh, what a tangled web-of-trust we weave!]


UPDATED 4/12/2010: Bruce Schneier's and Matt Blaze's commentary.