A Secure Backdoor Is Feasible

2015.08.05

FOR the record, I’m fine with Apple’s policy of providing secure, end-to-end encryption with no backdoor for the government, Apple, or anyone else.

But to play devil’s advocate just a little: One of the anti-backdoor arguments being bandied about is, “Even assuming we trust our government to use a backdoor for legitimate purposes only, and only when necessary, I don’t believe that Apple can create a backdoor for our government that wouldn’t also open the door to skilled hackers, and to other governments. If anyone can get in, bad guys will get in.”

The purpose of this article is to argue that no, the above argument is incorrect, and a secure, emergency backdoor can be created. Here’s how:

  • Apple creates a private key BDpriv (backdoor private) known only to Apple, and keeps this in a secure location (like a vault).
  • Apple includes the corresponding public key, BDpub, in iOS.
  • When two iOS devices have successfully negotiated a temporary session key (SK) for a secure communication session, each device attaches SK to every packet it sends, but in encrypted form: specifically, encrypted with BDpub. So each packet contains (A) the user data encrypted with SK, and (B) SK encrypted with BDpub.
  • The receiving iOS device ignores (tosses) part B, which it can’t decrypt anyway, and uses SK (which it already has) to decrypt part A, the user data.
  • If/when the government intercepts (and records) iOS-to-iOS packet data that it has a need to decrypt, then Apple can assist either by providing BDpriv (not such a good idea), or better yet, simply by using BDpriv to decrypt the data, but without ever handing over BDpriv to anyone.
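
As a sketch of the steps above: the toy textbook RSA below stands in for BDpub/BDpriv, and a one-byte XOR stream stands in for the real session-key cipher. Neither is remotely secure — a real implementation would use something like RSA-OAEP or ECIES for the escrow and AES for the session — but the packet structure is the same:

```python
# Toy model of the BDpriv scheme. Tiny textbook-RSA primes and a one-byte
# XOR "cipher" are stand-ins only -- NOT real cryptography.
p, q, e = 61, 53, 17
n = p * q                                  # public modulus (part of BDpub)
d = pow(e, -1, (p - 1) * (q - 1))          # BDpriv: known only to Apple

def bd_encrypt(m):                         # encrypt with BDpub (public n, e)
    return pow(m, e, n)

def bd_decrypt(c):                         # decrypt with BDpriv (Apple only)
    return pow(c, d, n)

def cipher(sk, data):                      # stand-in for AES under session key SK
    return bytes(b ^ sk for b in data)

# Two devices have negotiated session key SK (a single byte in this toy).
sk = 200
packet = (cipher(sk, b"meet at noon"),     # part A: user data encrypted with SK
          bd_encrypt(sk))                  # part B: SK encrypted with BDpub

# Receiving device: already has SK, so it tosses part B and decrypts part A.
part_a, part_b = packet
assert cipher(sk, part_a) == b"meet at noon"

# Apple, served a court order: recover SK from part B, then read part A.
recovered_sk = bd_decrypt(part_b)
assert cipher(recovered_sk, part_a) == b"meet at noon"
```

Note that an eavesdropper without BDpriv gains nothing from part B; recovering SK from it is exactly the public-key problem that protects any SSL/TLS handshake.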

This is a very simple system, and if implemented correctly, it would be just as secure as SSL is when you access your bank account, and just as secure as iOS-to-iOS communications currently are (provided you don’t consider Apple’s ability to emergency-crack your encryption to be, in and of itself, an insecurity).

Under the above-described system, Apple would not store any user data that it isn’t storing now. It would be up to the government to intercept communication packets en route, then request that Apple decrypt them.

So in a nutshell: While I laud Apple’s policy of not having any such backdoor, I have to disagree with the technical claim that such a backdoor cannot be securely implemented. It can.

Jailbreak?

What if the criminals jailbreak their iOS devices and modify them to not include the BDpub-encrypted SK, or include instead fake, random data, that is not useful for backdoor decryption?

The system could be designed so that both participating criminals would have to have the modification. But, of course, they both could. Other than continuing to work toward a jailbreak-proof version of iOS, I don’t know what could be done about that. And any criminals who can physically crack open their iOS device and replace its ROM chip will always be able to jailbreak.

But at least anyone using out-of-the-box iOS would not be able to communicate criminal intent without worry that the government might read it. And in any case, jailbreaking to disable the backdoor is not a path to enabling hackers and other governments to use the backdoor.

 

Update 2015.08.10 — Witness the following conversation (Apple Context Machine #318), between Bryan Chaffin and Jeff Gamet:

BC: The reality is ... let’s say Apple and Google are forced to build-in backdoors. Let’s just make that an argument here.

JG: OK.

BC: And so now, when the government delivers a warrant, Apple can turn around and use that backdoor, and decrypt iMessages between one target and another target, hand that information over to the government, and the bad guys are stopped. ’Kay?

JG: Well, on the surface, that doesn’t sound so bad, because now the bad guy’s caught.

BC: Right. Yayyy, everything’s wonderful. But the cascading effect is this: ... if Apple has the keys to the back door, the bad guys will find the keys.

JG: Yes.

BC: Right.

JG: And that’s a huge problem.

BC: That’s a universally accepted truth in the world of people who actually know how this stuff works.

...

BC: I know we’ve talked about this before on the show. And none of that’s changed; these are the same old arguments. And the thing is that we had all these arguments in the ’90s, and the government lost. And the government should have lost. Encryption protects us from the bad guys even while it also prevents the good guys, the theoretical good guys, from accessing our stuff when they have a warrant and the right to it.

JG: Well, data protection: you can’t expect it to be a mutually exclusive thing, where it’s protected on one side, but— or from one group but not from another? It’s a binary thing; it’s protected or it’s not. And it doesn’t matter if it’s hardware-based or if it’s software-based; if you have created a backdoor, it’s no longer effectively protected. From anyone.

BC: Right. From anyone! From anyone. And yet, despite the fact that we’ve already gone through this, despite the numerous white papers, and all kinds of technical papers, and philosophical papers that’ve been written, and published, and filed, and just added to this topic, we’ve got the government doing it again ...

Yikes. Looks like this meme’s gonna be hard to dispel.

 

Update 2015.11.03 — With the UK about to require backdoors in encrypted tech products, the heat is way up on this issue. Chaffin’s pounding the hammer again on today’s TMO Daily Observations:

If the government has a backdoor into encrypted communications, then everybody has access to that same backdoor. And that will always happen; this is a known reality to security experts.

And today Glenn Fleishman wrote on Macworld:

The problem with a backdoor is that there’s no way to create a way into a secure system that only the “good guys” can use.

On an only dimly related topic in the same article, he approvingly reiterated Amnesty International’s claim that torture doesn’t work:

Sometimes, torture is involved, which Amnesty International would remind us is both against international law and doesn’t help acquire useful operational information.

Two bogus truisms in one article and I couldn’t resist posting to its comments section, as follows:

At the risk of sounding like an insensitive bastard, allow me to suggest that the idea that torture “doesn’t help acquire useful operational information” is pretty silly. If you torture someone until they reveal the password to their encrypted hard drive, and now you’re in and can see the contents of their hard drive, then guess what: it worked! The idea that torture “doesn’t work” is one of those moral-obligation memes — disagreement is silently understood to be a red-flag that you’re a bad person.

A similar moral-obligation meme seems to be forming around the idea that a hacker-proof backdoor isn’t feasible. Sorry again — it’s very feasible. Search “a secure backdoor is feasible” to find my detailed explanation.

My post went into “moderation” — and surprise, surprise, it was unceremoniously deleted. I didn’t even get a message telling me about that, much less mentioning why. Several other comments were subsequently allowed, all expressing fawning agreement with the author. I don’t know that Fleishman is responsible, but somebody at Macworld sure is.

There are true things that some people really don’t want to know, and they’ll do whatever they can to make sure no one else does either.

 

Update 2015.11.12 — Cory Doctorow, from last May:

It’s impossible to overstate how bonkers the idea of sabotaging cryptography is to people who understand information security. ... Use deliberately compromised cryptography, that has a back door that only the “good guys” are supposed to have the keys to, and you have effectively no security. You might as well skywrite it as encrypt it with pre-broken, sabotaged encryption.

I wonder if this is one of the aforementioned “security experts,” “who actually knows how this stuff works,” on whose authority we are supposed to take it that the above-described BDpriv scheme would magically collapse?

Doctorow, just in case you’d forgotten, is the guy who, when iPad launched, said that he wouldn’t buy one and thought you shouldn’t either, because:

open platforms and experimental amateurs ... eventually beat out the spendy, slick pros. ... Relying on incumbents to produce your revolutions is not a good strategy. They’re apt to take all the stuff that makes their products great and try to use technology to charge you extra for it, or prohibit it altogether.

The next year he informed us that:

A tablet without software is just an inconveniently fragile and poorly reflective mirror, so the thing I want to be sure of when I buy a device is that I don’t have to implicitly trust one corporation’s judgment about what software I should and shouldn’t be using.

And, of course, we all heeded his warning against iBooks:

Digital Distribution and the Whip Hand: Don’t Get iTunesed with your eBooks ... Any time someone puts a lock on something you own against your wishes, and doesn’t give you the key, they’re not doing it for your benefit.

 

Update 2015.11.19 — Bryan Chaffin again, in ACM #333:

A backdoor available to one is available to all. This is known. This isn’t theory. This isn’t speculation. This isn’t me crying about the government coming to get us. This is a reality of encryption. A backdoor available to one is available — to — all.

I don’t mean to pick on Chaffin here. I really like his work, particularly on The Apple Death Knell Counter. But I have to say it: The above quote is one of the best examples I’ve ever found of the idea that if you repeat something over and over again, in the most dogmatic terms possible, you can make it true — even though it actually isn’t.

 

Update 2015.12.25 — Gamet and Chaffin again, with Dave Hamilton on TMO Daily Observations:

JG: What we’re seeing right now from, in this case, the UK government, are statements — and I’m paraphrasing — this [a law-required backdoor] is not going to impact security; this is not going to impact people’s privacy; this is all just about protecting everyone. Don’t worry. It’s all fine.

BC: The whole thing is nonsense. We’ve talked about this topic.

DH: Maybe it’s not nonsense. Right? I mean, maybe there’s some thing, that they could say, hey look, you haven’t thought about this. And we’d all say, hmmm. But we need to hear that first.

JG: We do. And I —

DH: Sorry, Bryan. I didn’t mean to interrupt.

JG: Oh, yes. Bryan, you were talking; go ahead.

BC: Uh, it’s nonsense.

DH: (laughing) OK, it’s nonsense.

JG: Well said.

BC: We’ve talked about this several times. We’ve talked about this on probably every podcast that we’ve been on. We’ve written about this a lot. And of course, lots of other people have talked about this and written about this, as well. The security experts all call it nonsense. Apple is essentially calling it nonsense. I mean, the politicians have the information they need to not be idiots about this. And yet they want, apparently, to be idiots. And it’s super frustrating.

JG: I agree. It is frustrating.

And on ACM #338:

JG: Like we have said so many times, a backdoor isn’t just for one person; it’s there for everyone.

BC: It’s there for everybody to find.

JG: Yes.

BC: And don’t believe that because we say it, because we’re just repeating our betters here. Right? We’re repeating the encryption nerds, and the math wonks, and the people who have been studying this for decades. They’re the ones who say this.

JG: Right. The serious privacy and encryption advocates.

BC: Yeah. The people who actually know.

I certainly can’t claim these guys aren’t in good company: Here’s Tim Cook himself, being interviewed by Charlie Rose on 60 Minutes last Sunday:

[I]f there’s a way to get in, then somebody will find the way in. There have been people that suggest that we should have a back door. But the reality is if you put a back door in, that back door’s for everybody, for good guys and bad guys.

Hats off to Cook for refusing (thus far) to put in a backdoor. Hats off to Cook for trying to discourage governments from passing laws that would require a backdoor. And when Cook says that a backdoor won’t do much to catch the really bad guys because they’ll just find a way to communicate that’s out of reach of that backdoor, I have to agree, he’s probably right.

But when he says what I just quoted him saying to Rose — then I have to say, no, that isn’t true.

What is Cook going to do if/when some major nation (e.g. the USA, the UK, or China) passes a law requiring a backdoor? My guess is, he’ll implement something pretty much exactly like I describe at the top of this article. And it will not give backdoor access to hackers, foreign governments, unspecified bad guys, or “everybody.” It will be usable only by Apple, and Apple will use it only when they get a court order telling them to do so.

 

Update 2016.01.15 — New York is working on a bill that would require all smartphones sold in the state to be unlockable/decryptable by their manufacturers. How could Apple comply with such a bill? Simple: Make available (to any connected device that requests it) a concatenation of the iPhone’s 256-bit data encryption key and the user’s unlock code, encrypted with BDpub. Only Apple (the holder of BDpriv) will be able to do anything with that value.
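
A sketch of what such an escrow blob might look like. The byte-wise textbook RSA below is a placeholder for a real public-key scheme under BDpub (never encrypt byte-by-byte like this in practice), and the unlock code shown is a hypothetical value; the 256-bit data key is the one described above:

```python
import os

# Toy escrow blob. Byte-wise textbook RSA stands in for real encryption
# under BDpub -- a sketch of the structure, not a secure construction.
p, q, e = 61, 53, 17
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))          # BDpriv, held only by Apple

def bd_pub_encrypt(blob):
    return [pow(b, e, n) for b in blob]

def bd_priv_decrypt(nums):
    return bytes(pow(c, d, n) for c in nums)

data_key = os.urandom(32)                  # the iPhone's 256-bit data key
unlock_code = b"482916"                    # hypothetical user unlock code

# The phone serves this blob to any connected device that requests it.
escrow_blob = bd_pub_encrypt(data_key + unlock_code)

# Only Apple, holding BDpriv, can recover anything from the blob.
recovered = bd_priv_decrypt(escrow_blob)
assert recovered[:32] == data_key and recovered[32:] == unlock_code
```

To everyone else — including the phone's owner and any forensic tool — the blob is opaque ciphertext; handing it out reveals nothing without BDpriv.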

Again — I’ll be happy if New York doesn’t pass this bill. And if they do pass it, I’ll be fine if Apple decides not to sell its products in New York until the law is repealed. But if Apple wants to securely, safely comply with such a law? It can. Easily.

 

Update 2016.01.21 — Now California is working on a similar bill. Can’t see Apple stopping sales of its products in its home state.

(Update: Defeated.)

 

Update 2016.02.20 — A bad analogy is making the rounds. It goes something like this: “Requiring Apple to put in a backdoor is like if door lock manufacturers had to give a master key to the government, so the police could enter your house any time they thought it was the right thing to do. If there’s no master key to the door locks on your house, why should there be a backdoor to your phone?”

Ram

The flaw in that analogy is that the government does have such a master key — it’s called a battering ram. And the nice thing about the ram is that it leaves obvious evidence of its use, so they can’t come in when you’re not there and snoop without you knowing something happened. Nor can a rogue officer use the ram to snoop without leaving ample evidence for other officers that it was done.

The analogy of the battering ram fits the BDpriv scheme pretty well. There would be no way for the government to use BDpriv to quietly snoop; they would have to serve Apple with an order to decrypt specific data, and such a request presumably would be a matter of public record. (If it wasn’t, Apple could simply refuse to do it, and/or publicize the request themselves.)

A police master key to your house’s door locks would be analogous to Apple handing over BDpriv to the government, saying, “Do with this whatever you think best.”

Manufacturer-Held Master Key?

Now — is BDpriv analogous to a door-lock master key that is held by the lock manufacturer, and used only by that manufacturer when issued a court order to do so? Kind-of. But the analogy is strained. First, the police have battering rams, and don’t need the lock manufacturer’s cooperation to use them. And second, a door-lock master key inevitably would be duplicated (or reverse-engineered from a lock), and fall into the wrong hands — including the hands of the police. But BDpriv — kept in an Apple vault, and used only by Apple, at Apple, under Apple’s in-house procedures designed to prevent both undocumented use by a rogue employee and copies of BDpriv leaving the vault — would not inevitably fall into the wrong hands.

The main weakness of BDpriv (as compared to battering rams) is that there would be no way to know if it had somehow gotten out of Apple’s control, and was being quietly used by the parties that possessed it.

Pro-Backdoor?

Am I starting to become pro-backdoor? I’m aware that it must sound like I am. Let’s just say this: I would be OK with the BDpriv scheme I’ve outlined here. But I would also be very much OK with Tim Cook fighting this successfully, and never implementing any backdoor at all.

 

Update 2016.02.22 — Richard A. Epstein of Stanford University’s Hoover Institution, writes about “Apple’s iPhone Blunder”:

I participated in hearings that then-Senator John Ashcroft held in March 1998, and spoke in opposition to [a government-mandated backdoor], along with Kathleen Sullivan, then a professor of law at Stanford Law School. The greatest risk of the built-in back door is that the government will not be the only party that will enter through it. Back doors necessarily compromise the integrity of a security system.

Nope.

 

Update 2016.02.24 — Cory Doctorow in The Guardian quadruples down on secure-backdoors-aren’t-possible-they’re-not-they’re-not-they’re-not:

“The FBI wants a backdoor only it can use — but wanting it doesn’t make it possible”

“The thing about this controversy is that it isn’t one. Independent cryptographers are virtually unanimous in their view that you can’t properly secure a system while simultaneously ensuring that it ships with a pre-broken mode that police can exploit. The fact that this would be useful doesn’t make it possible: as security experts Meredith Whittaker and Ben Laurie recently wrote: ‘Wanting it badly isn’t enough.’”

That’s funny — I don’t want it, and I didn’t have any trouble at all figuring out how to do it. It’s pretty simple, actually.

And aren’t these the same people who say that if hackers, foreign governments, etc., want it badly enough, they will find a way to exploit a backdoor? They can find a way in if they just want it badly?

“Law enforcement would also be assisted by anti-gravity devices, time machines, psychic powers, and the ability to selectively reverse entropy, but that doesn’t make them possible. Likewise uncontroversial is the gravity of the cybersecurity question.”

Sure wish I could give you psychic reverse gravity and all that, Cory — but great news! You can have a hacker-proof backdoor on all smartphones just as soon as you (and/or your elected legislators) want to! Time to celebrate.

And this must be your lucky day, because I’ve got even better news: If hackers, foreign governments, and random bad guys have anything to gain by discovering psychic reverse gravity? Then they’ll just find a way! They always do; they can just find a way to do it. So very soon now, humanity will have those marvelous powers! I can hardly wait.

“There’s precedent for this kind of contradiction, where something urgent is considered a settled matter in expert circles, but is still a political football in policy circles: climate change. Denialism is a deadly feature of 21st-century life.”

So now I’m not just “denialist” for knowing how to make a secure backdoor — I’m also deadly. OK. I can live with that.

“One form of math denial is the belief in the ability to make computers that prevent copyright infringement. Computers only ever work by making copies: restricting copying on the internet is like restricting wetness in water.”

Somebody, quick: Tell Apple that its controlled App Store can’t be done, so they need to shut it down right away, and just give away all the apps for free. And take back all that money that the authors made; that wasn’t really possible. (If you think it was, you’re in denial!)

“It’s tempting to play along with them here, offer them more magic beans in the form of backdoors that we pretend only the good guys can fit through ...”

What’s tempting about it? Just call them deadly denialists, and you’re done.

 

Update 2016.02.26 — John McAfee, famed creator of Windows antivirus software that borders on being its own kind of malware, speaks about the possibility of an iPhone backdoor:

“Here’s the problem. The way that the FBI wants Apple to work is to provide a new version of their operating system, which will give a backdoor into Apple’s encryption. Let me tell you what that actually means. A backdoor is not something that can ever be kept secret. So within a month, every bad hacker in the world will have that backdoor. The Chinese will have it. The Russians will have it. And everyone who uses an iPhone: You’re gonna lose your bank account, your credit cards are gonna be maxed out, your social security number and your identity is gonna be stolen, and we’re gonna have chaos. These are the facts of life.”

The values of Apple’s OS/app signing keys have stayed secret since the iPhone came out in 2007, but a backdoor key — stored in the same manner and secure facility as those signing keys — would leak within a month. And we know this, because it’s just the facts of life.

Thanks, John. That was very informative.

 

Update 2016.03.08 — Something just like my BDpriv is apparently called a “golden key” (meaning a master key, a key that opens all doors). On Exponent #68, Ben Thompson describes a system where there is a golden key in (say) Apple’s possession, that can be used to unlock the encryption — to which James Allworth responds:

JA: What you’re describing is functionally the same as not having encryption at all.

BT: ... Having a golden key is, yes, in the long run the same as not having encryption at all. ... Encryption, the actual scrambling of data on a disk, is a binary thing. It’s either truly encrypted, or it’s not.

JA: Right.

So having the user-data encryption key discoverable by the holder of BDpriv would be the equivalent of having no encryption at all? It wouldn’t even fall, say, in-between no encryption and Apple’s currently impervious-even-to-Apple encryption? Thompson answers that question a bit later in the show:

BT: There is a spectrum. There is not a spectrum when it comes to encryption; there is a spectrum when it comes to security.

Well, I guess that settles that. And lest I had any further doubts, here’s Jason Snell in Six Colors:

Piers Morgan: “Let Apple open the damn back door, let the FBI know what’s on the phone, then close it again. This isn’t difficult.”

Congratulations, Piers Morgan — you can join the tech geniuses on the Washington Post editorial board in the Golden Key society. They’re the ones who wrote: “With all their wizardry, perhaps Apple and Google could invent a kind of secure golden key they would retain and use only when a court has approved a search warrant.” (That’s not how cryptography works.) But nothing is difficult when you believe in wizardry.

So if I think BDpriv would actually work — to the point that I could be dissuaded only by someone explaining very specifically why it wouldn’t — then I literally believe in wizardry. OK, Jason. You must be right. Somehow. I certainly don’t want people to think I believe in wizardry.

But for the time being, the only weakness of BDpriv of which I’m aware is that it might leak out of Apple’s control. Somehow. Even though Apple currently signs its own OS with key(s) that it has to keep secret — and somehow, they don’t leak out. Wizardry, perhaps?

 

Update 2016.03.19 — Fleishman was the guest on John “Daring Fireball” Gruber’s “The Talk Show With John Gruber” (#149). As a long-time fan, I’ve actually gone out of my way to keep Gruber’s name off of this page so far. But in this latest episode, he takes the bluster a full notch higher than anyone I’ve quoted here yet; therefore it becomes unavoidable.

JG: You know, I’ll just admit it, and Hillary Clinton has espoused the same opinion, is a belief in this magical thinking that if we just put smart enough people into a room together, that they can come up with a way that this backdoor solution would only be available to law enforcement.

Not saying I’m the smartest person in the room, but I described it all by myself (see the top of this page), and no magic was — nor is — required.

That, we’re not asking you to make a backdoor that anybody could get into, we just want a backdoor that law enforcement can get into, when we have a warrant. Which sounds reasonable, and in some fictional, other universe, where that’s mathematically possible, that might be great.

Bad news: If you’re reading this article right now, you’ve just warped off to an alternate universe where secure backdoors are eminently possible (if not advisable). I sure hope you can find a way back to your own universe, where secure backdoors just don’t work.

I actually, I think that there’s good reasons why a civil libertarian would be opposed even to that. Let me just put this out there. And I tend to lean that way. I would listen to the argument, but I tend to lean towards, even if that were possible, I don’t think it’s a good idea, and I think it’s contrary to the values that are already in our Bill Of Rights. But it is an idea.

I have to agree: If, if it could be done (somehow, who the hell knows how, like in an alternate universe with different math??) — maybe it still shouldn’t be.

But the simple truth is that it’s math— all experts agree, and everybody who understands encryption. I mean, this is, it, I don’t think that you— it’s, it’s more than even, like, uh, I mean, it’s like provably incorrect.

The proof is out there? A link would be very helpful.

You know, like, as opposed to, let’s say, climate change, where you can say, you can argue that only 98 or 99% of expert climate scientists agree that what we’re seeing is man-made. I mean, with cryptography and backdoors it’s 100% agreement.

GF: You’re totally right. I just realized I haven’t seen any crypto deniers out there saying this is possible. I’ve only seen politicians and law enforcement.

JG: Right. Right.

GF: That’s fascinating.

Let me make sure I understand this correctly: Describing my BDpriv scheme (scroll to top) makes me a crypto denier, in the same league with anti-science nutballs in general. (In this case, anti-math nutballs!) Well, that’s certainly one way to win an argument.

But in this case, we nutballs don’t even exist, because there is “100% agreement” on this issue, not even “98 or 99%.” How can that be? Oh yeah, I forgot — I’m from another universe. A fictional one.

I want Cook to win this battle, but not by any means necessary. The civil libertarian argument, I think, is strong. The “it can’t be done; everyone will have access to it” argument isn’t just weak — it’s completely false.

 

Update 2016.03.20 — Later in the same episode:

JG: One of the things that depresses me about the current state of decades-long discourse in the United States is the polarization of politics, and that so many issues are so clearly polarized, and that we’ve self-sorted on these various lines into the two parties, and that there’s no interchange between them. It warms my heart that on this particular issue, it doesn’t fall on one line or the other ...

Yes, the polarization of politics is quite depressing. Would you like to know why it happens? Because people on both sides of an argument think they can win by dismissing all in-between positions, by embracing a false dichotomy. It happens because both sides think, if the only choices are my way or Hitler’s way, then people will choose my way!

Here’s how it’s done:

Either there’s no backdoor for anyone, or there’s a backdoor everyone will be able to use! Either it’s encrypted with no backdoor, or it’s not encrypted at all, and may as well be sky-written! If you disagree, you’re a crypto-denier, you literally believe in wizardry, you’re engaging in magical thinking, you think you live in another, fictional universe, and you don’t understand how encryption works. Among those who do understand it, there is nothing less than 100% agreement on this!

That’s how things get polarized. Fast.

 

Update 2016.03.21 — added “a concatenation of the iPhone’s 256-bit data encryption key and”

 

Update 2016.03.31 — Still later in the same episode:

JG: It’s easy for a layperson to believe in the magic solution of a way for the government to get in, but nobody else. You kind-of have to be gently informed of the basic way that encryption works to understand just how dangerous and impossible it is to say, the only people who can get in are the U.S. federal government, with a warrant. It just doesn’t work that way, once you create a backdoor.

GF: I know; that’s right.

 

Update 2016.06.07 — UK backs down on requiring a backdoor.

 

Update 2016.06.20 — Russia considering requiring a backdoor in all messaging apps.

 

Update 2016.08.12 — Microsoft just leaked a private key, and the usual suspects are immediately buzzing with the idea that this somehow demonstrates the fundamental insecurity of encryption backdoors. On AppleInsider Podcast #81, Victor Marks and Mikey Campbell opine:

VM: You named the words, “golden key,” right? And once such a key exists, it is impossible to keep it secured.

MC: Yeah.

VM: You can keep things quiet or secret for a time. But you can’t keep them secret indefinitely. Because Microsoft has a golden key that governs the secure boot of Windows, their core operating system, their core product, and the thing upon which they built their whole empire, right?

MC: Yep.

VM: The golden key escaped into the wild, didn’t it?

MC: It did.

VM: How apocalyptically bad is that for Microsoft?

MC: The ramifications are— they’re pretty extreme. ... [Apple] specifically designed theirs not to have this said golden key.

Microsoft Windows already has the encryption backdoor that Apple refuses to implement? Hmmm.

Jeff Gamet and Bryan Chaffin weigh in on TMO Daily Observations:

JG: Microsoft gave us a perfect example of why it’s so important not to have backdoors into our operating systems, by accidentally letting the master keys — the golden keys, so to speak — for their encryption, out into the wild. Bryan, you wrote about this. So, fill us in please?

BC: Well, what Microsoft did, apparently earlier in the year, was to accidentally release a key that allows — a backdoor, basically — that allows someone to install whatever they want, even on a device that has what is called secure-boot. ... [this leak] does serve as an expert example of why backdoors existing are a bad idea. Not only does the existence of a backdoor make a target for our government, foreign governments, authoritarian regimes, criminal organizations, terrorist organizations, curious hackers, not only does the existence become an automatic target, but sometimes even the legitimate holders of a key can mishandle it.

Immediately after the above, Dave Hamilton tries to inject a dose of sanity into the discussion by explaining the necessity of having these keys at all, and he comes tantalizingly close to saying that this Microsoft leak event isn’t an example of backdoors being insecure.

It isn’t. Just like Apple, Microsoft has private/public-key pairs that it uses to secure its products’ bootup processes and OS update processes, with the private key never needing to exist anywhere outside of headquarters. Unlike Apple, Microsoft somehow managed to leak one of those private keys to the public. That key was not cracked, found, or otherwise extracted by any government, regime, criminal, terrorist, or hacker. It was accidentally leaked by Microsoft.

Could a backdoor key be accidentally leaked? Of course. But the security of your data already depends on avoiding such leaks. Nobody knows how to make a secure product that doesn’t depend on private keys that are held (and not leaked) by the maker of that product.
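
The signing-key point can be sketched the same way. Below, a toy textbook-RSA “signature” over a truncated hash stands in for the real thing (vendors actually use schemes like RSA-PSS or ECDSA over the full hash); the firmware name is made up. The private exponent never needs to exist outside headquarters, yet every device can verify:

```python
import hashlib

# Toy code-signing sketch: the private exponent d stays "in the vault";
# devices ship with only the public half (n, e).
p, q, e = 61, 53, 17
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))

def toy_hash(blob):                        # truncated hash so it fits mod n
    return hashlib.sha256(blob).digest()[0]

def sign(firmware):                        # runs only at headquarters
    return pow(toy_hash(firmware), d, n)

def verify(firmware, sig):                 # runs on every device, using n, e only
    return pow(sig, e, n) == toy_hash(firmware)

fw = b"os-update-9.3"                      # hypothetical update image
sig = sign(fw)
assert verify(fw, sig)                     # devices accept the genuine update
```

A backdoor decryption key would sit in exactly this position: held and used only at headquarters, with nothing secret ever shipped on the device.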

Later in the AppleInsider episode:

VM: The FBI ... they wanna restart the encryption debate.

MC: Shocking.

VM: I know. Shocking, if true. People need to know how to back off of something when it’s not gonna work for them. Of course, people in government never do. But it’s frustrating that we’re still having this, after it’s been demonstrated repeatedly that math doesn’t work that way. Math doesn’t work the way that [FBI director James Comey] wishes it did.

Microsoft accidentally leaked a private key because math doesn’t work the way the FBI wishes it did? Hmmm.

 

Update 2016.12.21 — The U.S. Congress’s Encryption Working Group just issued a report in which it discouraged legislation to require backdoors. I’m glad that the U.S. government seems to be backing down from Apple on this issue. However, the report includes the following:

Any measure that weakens encryption works against the national interest.
...
Congress should not weaken this vital technology because doing so works against the national interest.
...
How would consumers’ privacy and data security suffer if encryption were weakened?

The direct implication is that a backdoor involves weakening the encryption. That is simply false. In the BDpriv scheme (watch the below-linked “Method...” video), the encryption of user data would not be weakened in the slightest. The only difference would be that the holder of BDpriv (Apple) would uniquely possess the ability to acquire the session (decryption) key.

Only by redefining “weakened” to include the existence of any backdoor, can backdoors be argued (circularly) to weaken encryption. But that’s like saying that your device’s encryption is “weakened” because you wrote down your device-unlock code on a piece of paper and put it in your safety deposit box at your bank. The encryption has not become weaker in any way — the hypothetical possibility that legal authorities might force their way into your deposit box, and find your unlock code there, is not an encryption “weakness.”

There seems to be an unstated assumption floating around that the way a backdoor works is that you introduce some hidden weakness to the encryption technique, so that a government agency with strong computers (and knowledge of the weakness and how to exploit it) can brute-force the encryption via this weakness. This idea is also vaguely implied when people say that hackers will “find” the backdoor — i.e. they’ll discover the weakness and figure out how to exploit it.

This idea is completely wrong. A backdoor does not (or certainly need not) involve any such hidden weakness. The only thing that needs to be hidden is the actual value of BDpriv; all other features of the backdoor scheme can be loudly publicized to the world without ill effect. And the user-data encryption is not weakened — or even changed, for that matter — at all. It’s just straight-up AES.
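The packet scheme described above can be sketched in a few lines. This is a toy model — tiny textbook-RSA numbers stand in for BDpub/BDpriv, and a seeded keystream stands in for AES — intended only to show where the pieces go:

```python
import random
import secrets

# Toy backdoor pair: BDpub = (n, e) ships in iOS; BDpriv = (n, d) stays in the vault.
p, q = 61, 53
n, e = p * q, 17
d = 2753

def stream_cipher(key: int, data: bytes) -> bytes:
    # Stand-in for AES under the session key (NOT secure; demo only).
    ks = random.Random(key)
    return bytes(b ^ ks.randrange(256) for b in data)

# Sender: the two devices have already negotiated a session key SK.
sk = secrets.randbelow(n - 2) + 2
part_a = stream_cipher(sk, b"hello, world")   # (A) user data encrypted with SK
part_b = pow(sk, e, n)                        # (B) SK encrypted with BDpub

# Receiver: already has SK, so it tosses part B and decrypts part A.
assert stream_cipher(sk, part_a) == b"hello, world"

# Apple (sole holder of BDpriv): recovers SK from part B, then the data.
recovered = pow(part_b, d, n)
assert stream_cipher(recovered, part_a) == b"hello, world"
```

An eavesdropper sees only `part_a` and `part_b`; without `d`, recovering SK from `part_b` is the same hard problem that protects any public-key traffic.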

 

Update 2016.12.25 — That Congressional report has Chaffin and Gamet back on the subject (ACM #390):

BC: If we want to have security from all the bad guys out there — we gotta have security! And the good guys, we can’t give just the good guys access to this. Now we’ve talked a lot about this on this show, and on TDO. I believe that Dave Hamilton and John F. Braun have talked a lot about it on Mac Geek Gab. We’ve written thousands of words on the subject.

JG: And John Martellaro has had security experts on Background Mode.

BC: Yup. And, the long and the short of it is, there’s really no compromise.

JG: No. It’s either, we have true encryption and security for our da­ta, or we do not.

BC: Or we do not. We do not! And so the Encryption Working Group was tasked with studying this, because— listen, to Jeff and I [sic], and again, to anyone with a basic understanding of reality, when it comes to encryption, it’s very obvious how this works. As we just described it.

JG: Mmm-hmm.

BC: To, but— reasonably speaking, and yeah, I’m being a jerk when I put it that way, but reasonably speaking, there are lots of folks who don’t have a basic understanding of how encryption works. And to them, it’s just very obvious that law enforcement should be able to get into the device owned or used by a bad guy. It’s just obvious. You wanna stop the bad guys, and so of course they should have access to it. Because they don’t understand the repercussions; they don’t understand how this stuff actually works.

JG: Right. And, you know, that’s understandable that someone, or groups, could jump to a wrong conclusion based on their lack of knowledge and understanding about a topic like this. But for the people that are in a position where they will be setting policy for how our data is protected, like the government, if they’re going to mandate backdoors into our data, they should be really diving into this in a serious way, so they understand what the implications are, and can make an informed policy decision.

I’m no politician, but after hearing this, boy do I feel informed.

Chaffin quotes from the Encryption Working Group:

BC: “Technology companies, civil society advocates,” — this is a quote — “a number of federal agencies, and some members of the academic community argue that encryption protects hundreds of millions of people against theft, fraud, and other criminal acts. Cryptography experts and information security professionals believe that it is exceedingly difficult and impractical, if not impossible, to devise and implement a system that gives law enforcement exceptional access to encrypted data without also compromising security against hackers, industrial spies, and other malicious actors.”

JG: Ah, that’s awesome.

What’s really awesome is that I devised just such a system about a year-and-a-half ago. And it wasn’t even difficult, let alone impossible. As far as implementation is concerned, the only hard part is keeping BDpriv secure within Apple’s HQ, and Apple apparently already nailed that problem about a decade ago when it figured out how to keep its iOS signing keys from leaking.

BC: “Further, requiring exceptional access to encrypted data would, by definition, prohibit encryption design best practices, such as,” quote-unquote, “forward secrecy, from being implemented.”

Nope. The session key (“SK” in my “Method...” video below) is different every session. So is the user’s unlock code. So is the phone’s data encryption key (DEK). If Apple revealed any of those things to the FBI for a mass-shooter’s iPhone, it wouldn’t compromise any other past, or future, values of SK, DEK, or unlock codes. Only the security of the specific iPhone (or the specific transmissions) under warrant would be compromised. Not even future FaceTime calls (e.g.) made with the same phone would be compromised, at all. They use an entirely new, random session key.*

JG: I’m totally loving this. This is exactly what they should be saying.

BC: Right, because it happens to be reality.

JG: Yes.

When people I respect, and with whom I almost always concur, insist that such-and-such is “reality” — I really would like to agree with them. But should I say that an encryption backdoor can’t work, even if I know it can? Somewhere, somehow — somebody has to explain, specifically, how my backdoor plan would fail, before I can say, with a straight face, that I believe it would.

 

*Update 2016.12.27 — I am assuming that Apple does not leak BDpriv. The only danger to “forward secrecy” is that, if Apple did leak it, recorded data from years ago could be cracked. Beyond simply not leaking BDpriv (just as it currently must not leak SIGpriv), a “forward secrecy” solution might go like this: Use a different BDpriv every year, with iPhones pre-loaded with fifty years of BDpub. When any year’s BDpriv has been three years out of use, permanently destroy it, so it can never be leaked (nor even obtained by any level of force or intimidation).
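The yearly rotation amounts to a simple destruction schedule. A hypothetical sketch (all names and values are mine, for illustration):

```python
DESTROY_AFTER_YEARS = 3  # a year's BDpriv dies three years after it leaves use

def years_to_destroy(current_year: int, vault: dict) -> list:
    # Pick the vintages whose private keys should be permanently destroyed.
    return [y for y in vault if current_year - y > DESTROY_AFTER_YEARS]

# The vault holds one BDpriv per year (values are placeholders).
vault = {2013: "BDpriv-2013", 2014: "BDpriv-2014",
         2015: "BDpriv-2015", 2016: "BDpriv-2016"}

for year in years_to_destroy(2017, vault):
    del vault[year]  # destroyed: can no longer be leaked, stolen, or compelled

assert sorted(vault) == [2014, 2015, 2016]
```

Data recorded under a destroyed year’s BDpub stays sealed forever, which is exactly the forward-secrecy property being asked for.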

 

Update 2017.03.09 — Gamet again on TDO:

What [FBI Director James Comey] wants is backdoors. And that’s where the problem comes in. Because, as we’ve said so many times, a backdoor is not an exclusive thing. If you create a backdoor for the FBI, and the CIA, and whatever intelligence agency, you’ve created a backdoor that other governments, hackers, other criminals, can use. All they have to do is figure out how to get through the hoops to do it, and that’s it.

All they have to do is break the strong encryption in which the backdoor data is wrapped. Just jump through the hoops, and that’s it!

Come to think of it, the bad guys don’t even need a backdoor. They can break the existing system right now: Just figure out how to do it, jump through the hoops, and that’s it — encryption cracked! What is Apple going to do about that? We’re all totally vulnerable, right now.

 

Update 2017.05.01 — More wisdom from Doctorow: DRM will be all gone by 2025! So if you have a profitable app on the App Store, be sure to save some of the money: The party’s over just eight years from now.

 

Update 2017.06.19 — EU now considering requiring secure encryption, and outlawing backdoors.

 

Update 2017.07.14 — Australia working on law that would require government access to encrypted messages.

 

Update 2017.08.18 — Help Net Security has the definitive word:

How security pros look at encryption backdoors

The majority of IT security professionals believe encryption backdoors are ineffective and potentially dangerous, with 91 percent saying cybercriminals could take advantage of government-mandated encryption backdoors.

What do the other 9% say — that the Dead Sea Scrolls told them a secure backdoor is possible?

Encryption backdoors create vulnerabilities that can be exploited by a wide range of malicious actors, including hostile or abusive government agencies.

Oh, well, that settles it.

 

Update 2017.11.01 — Russia’s ban on VPNs goes into effect.

 

Update 2017.11.08 — It’s been over two years since he first started talking about this, but Chaffin’s tune has just gotten more entrenched than even I thought possible (ACM #436):

A backdoor created for one is available to all. This is a hard-and-fast rule. There is no way around the laws of reality in this particular regard.

The specific, exact way that my BDpriv plan would fail is — drumroll please — that it just would because that’s a hard-and-fast law of reality. Ahhh. I’m so glad I understand that now. Why did I ever think my idea could work?

The FBI guy ... may have evidence right there in his hands, and he just can’t get to it. I get how frustrating that is. But you can’t divorce that from the reality that encryption is binary: you’ve got it or you don’t!

Say... suppose, hypothetically, that Apple had the power to silently push a mandatory update to your iPhone that would cause all your private messages to be quietly echoed to the FBI, sans encryption — but Apple hadn’t ever used that power — would that be a case of you, the user, having encryption, or not having it? Because, you know, it’s a binary thing. You’ve either got it or you don’t.

And while you’re contemplating that, also contemplate this: It’s actually not hypothetical. Apple has that exact power, right now.

JG: What about the argument ... that Apple could just keep the code, and that way it would be safe. How do you feel about that argument?

BC: That argument doesn’t hold any intellectual water whatsoever. For one thing, if it exists, it’ll eventually be exploited.

JG: Mmm-hmm.

BC: And if it exists and is sitting on a phone that is within the FBI’s holding, not Apple’s holding? I don’t see how that could possibly be within Apple’s care.

JG: Mmm-hmm.

Because Apple couldn’t possibly have the backdoor key in their private possession? Got it; I stand corrected.

BC: [In this scheme] someone within Apple has access to it. And if someone within Apple has access to it, they are subject to extortion. Threats. Bribes.

JG: A mistake.

Someone who has Chaffin’s (or Gamet’s) ear, please ask him to re-watch Apple’s Ivan Krstić speaking at Black Hat (starting at 32:25). Somehow, this doesn’t sound very vulnerable to threats, bribes, or mistakes. And if it was, then we’d all be vulnerable, right now, because Krstić isn’t talking about hypothetical security keys that don’t yet exist.

 

Update 2017.11.09 — U.S. DOJ’s Rod Rosenstein (as quoted by Cyrus Farivar in Ars Technica):

[T]ech companies are moving in the opposite direction [of cooperation with government]. They’re moving in favor of more and more warrant-proof encryption.

Law 101, Rod: There’s no such thing as a legal order to do the impossible. No one can comply — or refuse to comply — with any such order. There would not even be a way to determine whether or not the subject of such an order was in compliance.

There is such a thing as crack-proof encryption. (It’s been around for many decades.) But no encryption system, or anything else in this world, is “warrant-proof.” That descriptor lies somewhere between oxymoronic and meaningless.

That said... here’s the opposite sophistry, from Farivar:

The DOJ’s position runs counter to the consensus of information security experts, who say that it is impossible to build the strongest encryption system possible that would also allow the government access under certain conditions.

The feasibility of a secure encryption backdoor, like the feasibility of an airplane or an atomic bomb, isn’t a matter of consensus. Either it can be done, or it can’t.

 

Update 2017.12.05 — Germany working on law to require backdoors in all modern tech devices.

 

Update 2018.02.26 — Israeli security firm Cellebrite, the company that cracked the San Bernardino iPhone 5C for the FBI, now says it can force its way into any locked iPhone, up-to-and-including the most current models. If it’s true, I’d sure like to know how.

 

Update 2018.03.07 — Cyrus Farivar of Ars Technica:

FBI again calls for magical solution to break into encrypted phones

Nothing magical about it; just watch the “Method” video linked at the bottom of this page!

[FBI Director Christopher] Wray again did not outline any specific ... technical solution that would provide both strong encryption and allow the government to access encrypted devices when it has a warrant.

I did. Two years ago.

A key escrow system, with which the FBI or another entity would be able to unlock a device given a certain set of circumstances, is by definition weaker than what cryptographers would traditionally call “strong encryption.”

By definition? So if I write my iPhone’s unlock code on a slip of paper and drop it into my safe deposit box at my bank, the encryption on my iPhone becomes weaker by definition? Um, OK. I guess that’s one way to win an argument.

 

Update 2018.03.15 — Starting at $15,000, GrayKey device claims to be able to unlock any recent iPhone. Sure would like to know how this works, too. Looks like Apple has some work to do on its Secure Enclave, and some explaining to do as to how it was possible to thwart the existing version.

 

Update 2018.03.25 — Charlie Savage reporting for The New York Times:

Based on [security researchers’] research, Justice Department officials are convinced that mechanisms allowing access to the data can be engineered without intolerably weakening the devices’ security against hacking.

Not intolerably weakening? How about not weakening at all? It’s right here, guys.

“Building an exceptional access system is a complicated engineering problem with many parts that all have to work perfectly in order for it to be secure, and no one has a solution to it,” said Susan Landau, a Tufts University computer security professor.

No one except yours truly, and it’s not complicated at all.

A National Academy of Sciences committee completed an 18-month study of the encryption debate, publishing a report last month. While it largely described challenges to solving the problem, one section cited presentations by several technologists who are developing potential approaches.

Uh, and how much did that cost?

But one alternative being worked on by [Ray] Ozzie and others is receiving particular attention inside the government. The idea is that when devices encrypt themselves, they would generate a special access key that could unlock their data without the owner’s passcode. This electronic key would be stored on the device itself, inside part of its hard drive that would be separately encrypted — so that only the manufacturer, in response to a court order, could open it.

Bingo! Give that man a cigar. And please send a copy of that plan to Tufts right away. Might save them a lot of head-scratching.

 

Update 2018.04.25 — ACM #459:

JG: Do you remember when the San Bernardino brouhaha was happening? And we had DOJ officials, we had James Comey from the FBI, and they’re all saying, look, you know when we have these backdoors, they would be kept safe, they would be used only for [purposes] intended.

BC: Yeah, we’ll leave ’em in the hands of the company. That was one of the things that James Comey was arguing. Only Apple would have the “FBI OS” [backdoor].

JG: Right. And people were saying, so there, that proves that this will be safe. And Apple was saying, no, look, if we make it, eventually it’s going to leak. When people make these sorts of tools, they leak, period.

BC: Right.

JG: And I remember the DOJ just shooting that down, saying no-no-no, that’s not a problem. And yet, we have this [Grayshift data breach], and we have the Cellebrite leak, and Cellebrite being the big company out of Israel that hacks into devices for government agencies. ... We already have demonstrable evidence that this stuff leaks.

Except Apple’s signing keys. Somehow (magically?), they don’t leak. And it’s a damn good thing they don’t, because if they did, the security of your iPhone — the one you’re using right now, with no FBI backdoor at all — would be massively compromised.

 

Update 2018.05.02 — Reform Government Surveillance (RGS), a coalition of about nine companies (including Apple), has a list of five principles, which center around the idea that government should limit its data-access demands, and should be publicly open about what it demands, from whom, etc. This all seems pretty reasonable to me.

But today, they added a sixth principle, about “Strong Encryption,” which includes the following:

Requiring technology companies to engineer vulnerabilities into their products and services would undermine the security and privacy of our users, as well as the world’s information technology infrastructure.

By referring to backdoors as “vulnerabilities,” the RGS slickly implies that a backdoor would make strong encryption no longer strong, i.e. vulnerable to hacking and cracking.

 

Update 2018.05.02 — David Ruiz of the Electronic Frontier Foundation, in “There is No Middle Ground on Encryption”:

Opponents of encryption imagine that there is a “middle ground” approach that allows for strong encryption but with “exceptional access” for law enforcement. ... Despite this renewed rhetoric, most experts continue to agree that exceptional access, no matter how you implement it, weakens security.

Most experts? What do the rest of them say?

 

Update 2018.05.08 — Apple soon to update iOS with something called “USB restricted mode,” which should defeat GrayKey and anything like it. That’s great — but I’d still like to know how it was ever possible to do this, and why a USB restricted mode is needed to defeat it. The Secure Enclave should make anything like GrayKey completely impossible.

 

Update 2018.08.15 — David Barnes in The Hill:

“There is no such thing as a ‘safe backdoor’ in encryption”

“Want to send your text messages and emails privately? ... Need to transfer money to a relative through online banking? Encryption. ... But some federal officials are now pressuring tech companies to create so-called ‘backdoors’ that allow law enforcement to work around encrypted devices. These backdoors would grant them access to Americans’ personal data through a supposedly secure channel. ... technology experts warn that tech companies cannot build a backdoor that would guarantee only law-abiding officials have access. If you create a way in, somebody you don’t want to get in will find it.”

“Consider this analogy. You are the parent of a seven-year-old mischief-maker. You tell the child, ‘I’m leaving the house for an hour. There is a bag of candy hidden somewhere, but don’t look for it because it’s so well hidden you cannot ever find it.’ ... When bad actors are told there’s a government-mandated backdoor, they’re going to search for it if they know it exists.”

They’ll search for the value of the backdoor key for only a few quadrillion centuries, and then, ta-da, they’ll find it! Just like they’re currently searching for the values of the signing keys that secure iOS updates and iOS apps.

Consider this analogy: If you write down your decryption password and put it in your safety deposit box, your kids will find it. Your enemies will find it. Somebody other than you will find it, because they just will. They always, always do.
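As a back-of-envelope check on that “quadrillion centuries” quip — assuming a 128-bit key and an attacker making a trillion guesses per second (both figures are my assumptions, for scale only):

```python
# Expected brute-force time for a 128-bit key at 10^12 guesses/second.
keys = 2 ** 128
guesses_per_second = 10 ** 12
seconds = keys // (2 * guesses_per_second)   # on average, half the keyspace
centuries = seconds // (3600 * 24 * 365 * 100)
assert centuries > 10 ** 15   # comfortably beyond a quadrillion centuries
```

And that is for a 128-bit key; a 256-bit key multiplies the figure by another factor of 2^128.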

 

Update 2018.08.16 — Australia working on law that would incarcerate for ten years any suspect who refuses to unlock their mobile device. (Existing law already allows for a two-year incarceration.) The proposed law would also require all mobile service providers, mobile device makers, and app authors “to hand over any decryption keys they hold.” Further, the law would compel those parties “to build new capabilities that would help the government access a target’s information where possible.”

 


 

Update 2018.08.17 — U.S. government asks federal court to hold Facebook in contempt of court for not breaking the encryption of its users’ private messages when asked to do so.

 

Update 2018.09.03 — Five-country coalition (Australia, Canada, New Zealand, UK, USA) calls for encryption backdoors in statement issued after August 28 Gold Coast conference.

 

Update 2018.11.27 — The most current version of iOS reportedly completely defeats GrayKey, but a new company, DriveSavers, now says that they can break into any iPhone (but they aren’t saying how).

 

Update 2018.12.11 — Jeffrey Goldberg of 1Password blogs about Australia’s “Assistance and Access Act.” He commendably draws attention to a disturbing possibility: This law might enable the government to privately approach any employee of a company, and order that individual to secretly install backdoors in the company’s products, under threat of prosecution for refusing to do so, or even for telling any other person about it! That sounds pretty bad to me, and realistically, quite possible.

However, he can’t resist mixing it up with the ever-popular “weakness” schlock:

A back door is a deliberate and hidden weakness in a system that is designed to allow certain people to bypass the security of the system. We have argued on multiple occasions that not only do back doors weaken security for everyone, but that a system in which a back door can (easily) be inserted is inherently weaker than a system in which a back door cannot (easily) be inserted.

I guess you could make a backdoor that relies on “hidden weakness,” like, if you were really stupid. But the backdoor I detailed in about ten minutes certainly would not rely on any such thing — it would be neither hidden, nor a weakness.

And since Apple (if it wanted to) could very easily use a push-update to insert my style of backdoor into all iOS devices, at any time (it could use its existing signing keys as the backdoor’s keys), then is iOS “inherently weaker” than if Apple wasn’t easily able to do that? (And exactly how would that work, anyway!?)

 

Update 2019.06.01 — Cellebrite now claims it can get into any iOS device ever made.

 

Update 2019.06.28 — U.S. govt. considering national ban on unbreakable, end-to-end encryption.

 

Update 2019.07.23 — Kate Cox in Ars Technica:

FBI Director Christopher Wray said last year that developing a process for allowing government officials lawful entry into encrypted communications would “entail varying degrees of innovation by the industry,” but he said he didn’t “buy the claim that it’s impossible.” But no matter how many times government officials try to will such an option into existence, what they claim to want isn’t actually possible. Security experts and product makers have said time and time again that introducing a backdoor — an access portal for a specific entity to gain access through — into an encryption scheme weakens the whole thing.

When the experts say it can’t be done, it’s best not to even try.

 

Update 2019.10.04 — Sean Gallagher in Ars Technica:

Here we go again.

Yup.

[E]ncryption is available in enough forms already that blocking its [non-backdoored] use by major service providers won’t stop criminals from encrypting their messages.

Yup.

[B]ackdoored encryption is fragile at best and likely to be quickly broken.

Nope.

Laws don’t change mathematics

Yup.

[A] government-mandated backdoor would be risky at best.

Nope.

 

Update 2020.01.14 — Well, it’s all back in the news again, with a new DOJ-Apple legal fight in the wings. Gruber:

There is no magic way to allow law enforcement to access encrypted contents without allowing everyone else the same path.

Nothing magical about it, that’s for sure.

Mathematics doesn’t discern between “good guys” and “bad guys”.

If “good guys” means the holder(s) of the decryption key, and “bad guys” means everybody else, then yes, the mathematics of encryption absolutely does differentiate between the good guys and the bad guys. If it didn’t, encryption would be useless, and Apple’s current iOS security would be completely impossible.

Malcolm Owen in AppleInsider:

[U.S. Attorney General William] Barr has previously waded into the debate, calling for the creation of backdoors that somehow do not weaken encryption, yet still provide access for law enforcement officials.

Somehow indeed.

 

Update 2020.05.19 — Gruber rightly (IMO) calls out FBI director Chris Wray’s false statements about Apple. But, as usual, his phraseology is chosen in such a way as to create implied statements-of-fact with which I must specifically disagree:

[The FBI’s] badmouthing of Apple’s intentions in this case is just another example of their trying to scare people into supporting legislation to make secure encryption illegal.

A backdoor, however undesirable, would not render encryption any less secure than it is now.

[Barr:] “We are confident that technology companies are capable of building secure products that protect user information and, at the same time, allow for law enforcement access when permitted by a judge.”

This is not mathematically possible, and newsrooms should stop publishing these claims from law enforcement officials without comment from encryption experts.

A secure backdoor is trivially easy: Just store the user’s passcode in iCloud. Retrieve it if/when a judge says you must. That’s all there is to it — nothing mathematically impossible about it. And if any “encryption experts” have an explanation of how that plan would fail (how it would give access to bad guys, etc.), they should put that explanation on the web where anti-backdoor bloggers can link to it, not just repeatedly assure us that experts know it can’t be done.
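A minimal sketch of that escrow idea — a toy XOR cipher stands in for real encryption under an Apple-held key, and every name here is hypothetical:

```python
def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy stand-in for encrypting under an Apple-held escrow key (demo only).
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

ESCROW_KEY = b"apple-escrow-key"  # held only by Apple, never on the device

# Device side: the passcode goes to iCloud only in escrowed (encrypted) form.
icloud_record = xor_cipher(ESCROW_KEY, b"123456")
assert icloud_record != b"123456"  # iCloud operators see only ciphertext

# Apple, and only Apple, can open the record if a judge orders it.
assert xor_cipher(ESCROW_KEY, icloud_record) == b"123456"
```

The user’s on-device encryption is untouched; the only new fact in the world is one more secret Apple must keep — exactly like its signing keys.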

Saying you want technology companies to make a backdoor that only “good guys” can use is like saying you want guns that only “good guys” can fire. It’s not possible, and no credible cryptographer would say that it is. You might as well say that you want Apple to come up with a way for 1 + 1 to equal 3.

When I want to know how storing the user’s passcode in iCloud would not be a safe backdoor for legal emergencies, I am not enlightened by being told that 1 + 1 does not equal 3 — any more than I would be enlightened by, “You’re wrong, you’re ignorant, you’re crazy, go away.”

A gun stored in a vault at police headquarters cannot be fired by everyone. The house key in your pocket does distinguish between who has it (you) and who doesn’t (everyone else). The passcode you currently use to unlock your iPhone does distinguish between who knows it (you) and who doesn’t (everyone else). This is the basis of all data security currently in use, and would also be the basis of the security of a backdoor, if Apple were ever legally required to create one.

If such a law passes, and if all legal appeals are exhausted without success — will Apple’s compliance with that law come in the form of switching away from strong AES to a weakened, vulnerable form of encryption, a system that is discoverable-then-usable by hackers, bad guys, and everyone? Or will Apple just store the user’s passcode in iCloud, leaving all their encryption as-is, and giving backdoor access to no one but themselves?

I think we all know the answer to that question.

 

Update 2021.11.23 — From Gruber’s comments on Apple’s lawsuit against NSO Group:

Lastly, the phrase “the immense resources and capabilities of nation-states”. This is Apple hammering home the fact that deliberate backdoors would be exploited. They’re up against countries with, effectively, infinite money and resources to find and exploit accidental vulnerabilities. If there were deliberate backdoors, the game would be over before it started.

An intentional backdoor would be more vulnerable than an unintentional security flaw? I’m not sure I could say that with a straight face if my life depended on it.

And what does “over before it started” mean, that a backdoor would give access to bad guys, like right away? So not only would the backdoor key leak out of Apple — despite Apple being able to keep its signing keys from leaking for, oh, the entire life of iPhone so far — it wouldn’t leak eventually, but rather immediately?

That strikes me as the sort of say-anything-to-win logic that we would expect from the “stop the steal” crowd.

 

Update 2022.01.17 — Gruber’s latest:

“The digital equivalent of alchemy” — what a great turn of phrase to describe the persistent notion that it’s possible to create strong encryption with a backdoor that only “good guys” can use.

Sure do wish I could turn lead into gold as easily as I can detail a secure encryption backdoor. Because then I would be many, many times as wealthy as John Gruber.

 

Update 2022.12.08 — William Gallagher of AppleInsider closes his latest article with the following blanket statement-of-fact, devoid of detail or justification:

It is not possible to add a backdoor that only law enforcement can use. Any backdoor, any circumvention of encryption, effectively cancels all user privacy protection because bad actors will exploit it.

 

Update 2023.02.23 — Meredith Whittaker, president of Signal (encrypted messaging app), says it is “magical thinking” to think we could create exceptional access “only for the good guys,” and that “[e]ncryption is either protecting everyone or it is broken for everyone.”

 

Update 2023.03.09 — Gruber again, on WhatsApp’s plan (which I wholeheartedly approve) to exit the U.K. if a backdoor is mandated:

Cryptographers and privacy experts agree that end-to-end encryption is the only way to guarantee privacy. Dum-dum elected officials around the globe have a persistent “it must be possible” fantasy that it’s possible to create an encryption system with backdoor keys that would only be available to “the good guys”.

Last night I had a dum-dum fantasy that this video actually exists.

It’s not even a matter of willingness. It’s not technically possible for WhatsApp or Signal or iMessage or any platform that’s end-to-end encrypted to use some weaker backdoor-able encryption on a country-by-country basis. The platform is either truly end-to-end encrypted or it’s not. They can’t just flip a switch and let U.K. WhatsApp users use an entirely different non-E2E protocol.

Except by including a public-key-encrypted session key with every data packet. (See above-linked video — if it actually exists!)
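The scheme is small enough to sketch. Here is a toy Python model of the packet layout: part A is the user data encrypted under the session key SK, part B is SK encrypted to an escrow key. To keep this standard-library-only, a SHA-256 keystream stands in for a real cipher, and a symmetric escrow key stands in for the BDpub/BDpriv key pair; an actual implementation would use something like AES-GCM for part A and RSA-OAEP or ECIES against BDpub for part B. The function names are mine, purely illustrative.

```python
import hashlib, os

def keystream_cipher(key: bytes, data: bytes) -> bytes:
    # Toy XOR stream cipher (SHA-256 in counter mode). A stand-in only;
    # a real implementation would use AES-GCM or similar.
    out = bytearray()
    for block in range(0, len(data), 32):
        pad = hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        out.extend(b ^ p for b, p in zip(data[block:block + 32], pad))
    return bytes(out)

# In the real scheme, SK would be encrypted to Apple's public key BDpub;
# here a symmetric escrow key stands in for the BDpub/BDpriv pair so the
# sketch runs with only the standard library.
ESCROW_KEY = os.urandom(32)

def make_packet(sk: bytes, plaintext: bytes) -> tuple[bytes, bytes]:
    part_a = keystream_cipher(sk, plaintext)    # user data under SK
    part_b = keystream_cipher(ESCROW_KEY, sk)   # SK "encrypted to BDpub"
    return part_a, part_b

def receiver_decrypt(sk: bytes, packet: tuple[bytes, bytes]) -> bytes:
    part_a, _part_b = packet    # receiving device tosses part B
    return keystream_cipher(sk, part_a)

def escrow_decrypt(packet: tuple[bytes, bytes]) -> bytes:
    part_a, part_b = packet     # escrow holder recovers SK from part B
    sk = keystream_cipher(ESCROW_KEY, part_b)
    return keystream_cipher(sk, part_a)

sk = os.urandom(32)
pkt = make_packet(sk, b"meet at noon")
print(receiver_decrypt(sk, pkt))   # b'meet at noon'
print(escrow_decrypt(pkt))         # b'meet at noon'
```

The point the sketch makes: the two devices’ protocol is unchanged except for carrying part B, and only the holder of the escrow (private) key can ever use it.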

 

Update 2023.04.19 — Signal and WhatsApp have both publicly announced that they will exit the U.K. if required to put in a backdoor, a decision I enthusiastically endorse.

But in their latest missive, they describe a backdoor as “weakening encryption,” and Gruber, while discussing this development, once again has to throw in a few of these:

Some reports are portraying this as though these services would begrudgingly comply if the law passes, but they can’t. ... It’s tough, messaging-wise, because coming right out and saying that sounds like these companies won’t comply, by choice. Laypeople seemingly can’t be made to understand that a “good-guys-only back door” is cryptographically impossible. But that’s the truth.

Here it is, folks, easy as pie, and it’s been there for, like, seven years now? In all that time, Gruber either hasn’t watched it, or is pretending he hasn’t. 😕

 

Update 2023.07.20 — Gruber still insisting that “add[ing] ‘good guys only’ back doors” is “like trying to comply with a law that declares 1 + 1 = 3”.

 

Update 2023.08.23 — Gruber’s latest on this topic asks an important question (which I’m sure he meant rhetorically, but I’ll answer it straight up):

If it were possible for, say, Signal, to silently disable E2EE but still have messages go through, how could users ever trust the service?

It is possible for Signal to do that. They control their service, completely. They choose not to do that. The only way you can trust them to continue making that choice is if you just trust them. Sorry, but that’s all there is to it.

Not to impugn Signal, but I think maybe Apple is the only company I currently trust in this manner. But let’s be clear: The reason there is no backdoor in Apple’s system is because Apple continuously chooses not to install one. Because I like Apple, and believe in its products and its honesty, I trust it to continue making that choice.

Now, the usual thing:

Removing E2EE wouldn’t require some mere tweak to the protocols, it would require replacing the protocols entirely (with inherently insecure ones).

Here’s the tweak. You’re welcome.

 

Update 2024.04.17 — Gruber posts a delightful article suggesting that Meta’s best option might be to exit the EU over its newly planned requirement that Meta provide its services without targeted advertising and without a user fee.

But mixed in with it, another swipe at secure backdoors:

Invent some novel way to generate as much revenue per non-targeted ad as targeted ones. This is the “nerd harder” fantasy solution, a la demanding that secure end-to-end encryption provide back doors available only to “the good guys”.

Harder? I didn’t even nerd hard. I nerded super-easy, and there it was. That was eight years ago, and today I know exactly what the counterargument to that video is: Just ignore it, and pretend it doesn’t exist.

 

See also:
iOS Jailbreaking — A Perhaps-Biased Assessment
&
A Secure Backdoor Is Feasible
&
Method of Implementing A Secure Backdoor In Mobile Devices
&
When Starting A Game of Chicken With Apple, Expect To Lose
&
Make Your Own FBI iPhone Backdoor, Right Now
&
Tim Sweeney Plays Dumb
&
Apple Wants User/Developer Choice; Its Enemies Want Apple Ruin
