Darel Rex Finley

A Secure Backdoor Is Feasible

2015.08.05

For the record, I’m fine with Apple’s policy of providing secure, end-to-end encryption with no backdoor for the government, Apple, or anyone else.

But to play devil’s advocate just a little: One of the anti-backdoor arguments being bandied about is, “Even assuming we trust our government to use a backdoor for legitimate purposes only, and only when necessary, I don’t believe that Apple can create a backdoor for our government that wouldn’t also open the door to skilled hackers, and to other governments. If anyone can get in, bad guys will get in.”

The purpose of this article is to argue that no, the above argument is incorrect, and a secure, emergency backdoor can be created. Here’s how:

  • Apple creates a private key BDpriv (backdoor private) known only to Apple, and keeps this in a secure location (like a vault).

  • Apple includes the corresponding public key, BDpub, in iOS.

  • When two iOS devices have successfully negotiated a temporary session key (SK) for a secure communication session, then they include SK along with each packet of communication, but in an encrypted form: specifically, encrypted with BDpub. So each packet contains (A) user data encrypted with SK, and (B) SK encrypted with BDpub.

  • The receiving iOS device ignores (tosses) part B, which it can’t decrypt anyway, and uses SK (which it already has) to decrypt part A, the user data.

  • If/when the government intercepts (and records) iOS-to-iOS packet data that it has a need to decrypt, then Apple can assist either by providing BDpriv (not such a good idea), or better yet, simply by using BDpriv to decrypt the data, but without ever handing over BDpriv to anyone.
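The scheme above can be sketched in a few lines of Python. This is strictly a toy for readability: it uses textbook RSA with tiny primes for the BDpub/BDpriv wrapping and a hash-based XOR stream as a stand-in for the real symmetric cipher. A real implementation would use full-size RSA-OAEP or ECIES for part B and AES for part A; all names here are illustrative.

```python
import hashlib

# Toy BDpriv/BDpub: textbook RSA with tiny primes (insecure, readable).
p, q = 61, 53
n = p * q                            # modulus, shared by both keys
e = 17                               # BDpub  = (n, e), baked into iOS
d = pow(e, -1, (p - 1) * (q - 1))    # BDpriv = (n, d), kept in Apple's vault

def wrap_sk(sk: bytes) -> list:
    """Sender: encrypt the session key byte-by-byte with BDpub (part B)."""
    return [pow(b, e, n) for b in sk]

def unwrap_sk(wrapped: list) -> bytes:
    """Apple: recover the session key with BDpriv."""
    return bytes(pow(c, d, n) for c in wrapped)

def toy_cipher(key: bytes, data: bytes) -> bytes:
    """Stand-in for the real symmetric cipher: XOR with a hash keystream."""
    stream, ctr = b"", 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# One packet under the scheme:
sk = b"temporary-key"                  # negotiated session key (SK)
part_a = toy_cipher(sk, b"user data")  # (A) user data encrypted with SK
part_b = wrap_sk(sk)                   # (B) SK encrypted with BDpub
packet = (part_a, part_b)

# Receiving device: already has SK, so it ignores part B entirely.
assert toy_cipher(sk, packet[0]) == b"user data"

# Apple, given an intercepted packet and BDpriv, recovers SK, then the data.
recovered_sk = unwrap_sk(packet[1])
assert recovered_sk == sk
assert toy_cipher(recovered_sk, packet[0]) == b"user data"
```

Note that nothing about part A changes when part B is added; the symmetric encryption of the user data is exactly what it would be without the backdoor.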

This is a very simple system and, if implemented correctly, would be just as secure as SSL is when you access your bank account, and just as secure as iOS-to-iOS communications currently are (provided you don’t consider Apple’s ability to emergency-crack your encryption to be, in and of itself, an insecurity).

Under the above-described system, Apple would not store any user data that it isn’t storing now. It would be up to the government to intercept communication packets en route, then request that Apple decrypt them.

So in a nutshell: While I laud Apple’s policy of not having any such backdoor, I have to disagree with the technical claim that such a backdoor cannot be securely implemented. It can.


What if the criminals jailbreak their iOS devices and modify them to omit the BDpub-encrypted SK, or to include fake, random data in its place that is useless for backdoor decryption?

The system could be designed so that both participating criminals would have to have the modification. But, of course, they both could. Other than continuing to work toward a jailbreak-proof version of iOS, I don’t know what could be done about that. And any criminals who can physically crack open their iOS device and replace its ROM chip will always be able to jailbreak.

But at least anyone using out-of-the-box iOS would not be able to communicate criminal intent without worry that the government might read it. And in any case, jailbreaking to disable the backdoor is not a path to enabling hackers and other governments to use the backdoor.


Update 2015.08.10 — Witness the following conversation (Apple Context Machine #318), between Bryan Chaffin and Jeff Gamet:

BC: The reality is ... let’s say Apple and Google are forced to build in backdoors. Let’s just make that an argument here.


BC: And so now, when the government delivers a warrant, Apple can turn around and use that backdoor, and decrypt iMessages between one target and another target, hand that information over to the government, and the bad guys are stopped. ’Kay?

JG: Well, on the surface, that doesn’t sound so bad, because now the bad guy’s caught.

BC: Right. Yayyy, everything’s wonderful. But the cascading effect is this: ... if Apple has the keys to the back door, the bad guys will find the keys.

JG: Yes.

BC: Right.

JG: And that’s a huge problem.

BC: That’s a universally accepted truth in the world of people who actually know how this stuff works.


BC: I know we’ve talked about this before on the show. And none of that’s changed; these are the same old arguments. And the thing is that we had all these arguments in the ’90s, and the government lost. And the government should have lost. Encryption protects us from the bad guys even while it also prevents the good guys, the theoretical good guys, from accessing our stuff when they have a warrant and the right to it.

JG: Well, data protection: you can’t expect it to be a mutually exclusive thing, where it’s protected on one side, but— or from one group but not from another? It’s a binary thing; it’s protected or it’s not. And it doesn’t matter if it’s hardware-based or if it’s software-based; if you have created a backdoor, it’s no longer effectively protected. From anyone.

BC: Right. From anyone! From anyone. And yet, despite the fact that we’ve already gone through this, despite the numerous white papers, and all kinds of technical papers, and philosophical papers that’ve been written, and published, and filed, and just added to this topic, we’ve got the government doing it again ...

Yikes. Looks like this meme’s gonna be hard to dispel.


Update 2015.11.03 — With the UK about to require backdoors in encrypted tech products, the heat is way up on this issue. Chaffin’s pounding the hammer again on today’s TMO Daily Observations:

If the government has a backdoor into encrypted communications, then everybody has access to that same backdoor. And that will always happen; this is a known reality to security experts.

And today Glenn Fleishman wrote on Macworld:

The problem with a backdoor is that there’s no way to create a way into a secure system that only the “good guys” can use.

On an only dimly related topic in the same article, he approvingly reiterated Amnesty International’s claim that torture doesn’t work:

Sometimes, torture is involved, which Amnesty International would remind us is both against international law and doesn’t help acquire useful operational information.

Two bogus truisms in one article and I couldn’t resist posting to its comments section, as follows:

At the risk of sounding like an insensitive bastard, allow me to suggest that the idea that torture “doesn’t help acquire useful operational information” is pretty silly. If you torture someone until they reveal the password to their encrypted hard drive, and now you’re in and can see the contents of their hard drive, then guess what: it worked! The idea that torture “doesn’t work” is one of those moral-obligation memes — disagreement is silently understood to be a red-flag that you’re a bad person.

A similar moral-obligation meme seems to be forming around the idea that a hacker-proof backdoor isn’t feasible. Sorry again — it’s very feasible. Search “a secure backdoor is feasible” to find my detailed explanation.

My post went into “moderation” — and surprise, surprise, it was unceremoniously deleted. I didn’t even get a message telling me about that, much less mentioning why. Several other comments were subsequently allowed, all expressing fawning agreement with the author. I don’t know that Fleishman is responsible, but somebody at Macworld sure is.

There are true things that some people really don’t want to know, and they’ll do whatever they can to make sure no one else does either.


Update 2015.11.12 — Cory Doctorow, from last May:

It’s impossible to overstate how bonkers the idea of sabotaging cryptography is to people who understand information security. ... Use deliberately compromised cryptography, that has a back door that only the “good guys” are supposed to have the keys to, and you have effectively no security. You might as well skywrite it as encrypt it with pre-broken, sabotaged encryption.

I wonder if this is one of the aforementioned “security experts,” “who actually knows how this stuff works,” on whose authority we are supposed to take it that the above-described BDpriv scheme would magically collapse?

Doctorow, just in case you’d forgotten, is the guy who, when iPad launched, said that he wouldn’t buy one and thought you shouldn’t either because “open platforms and experimental amateurs ... eventually beat out the spendy, slick pros. ... Relying on incumbents to produce your revolutions is not a good strategy. They’re apt to take all the stuff that makes their products great and try to use technology to charge you extra for it, or prohibit it altogether.” The next year he informed us that, “A tablet without software is just an inconveniently fragile and poorly reflective mirror, so the thing I want to be sure of when I buy a device is that I don’t have to implicitly trust one corporation’s judgment about what software I should and shouldn’t be using.” And, of course, we all heeded his warning against iBooks, “Digital Distribution and the Whip Hand: Don’t Get iTunesed with your eBooks ... Any time someone puts a lock on something you own against your wishes, and doesn’t give you the key, they’re not doing it for your benefit.”


Update 2015.11.19 — Bryan Chaffin again, in ACM #333:

A backdoor available to one is available to all. This is known. This isn’t theory. This isn’t speculation. This isn’t me crying about the government coming to get us. This is a reality of encryption. A backdoor available to one is available — to — all.

I don’t mean to pick on Chaffin here. I really like his work, particularly on The Apple Death Knell Counter. But I have to say it: The above quote is one of the best examples I’ve ever found of the idea that if you repeat something over and over again, in the most dogmatic terms possible, you can make it true — even though it actually isn’t.


Update 2015.12.25 — Gamet and Chaffin again, with Dave Hamilton on TMO Daily Observations:

JG: What we’re seeing right now from, in this case, the UK government, are statements — and I’m paraphrasing — this [a law-required backdoor] is not going to impact security; this is not going to impact people’s privacy; this is all just about protecting everyone. Don’t worry. It’s all fine.

BC: The whole thing is nonsense. We’ve talked about this topic.

DH: Maybe it’s not nonsense. Right? I mean, maybe there’s some thing, that they could say, hey look, you haven’t thought about this. And we’d all say, hmmm. But we need to hear that first.

JG: We do. And I —

DH: Sorry, Bryan. I didn’t mean to interrupt.

JG: Oh, yes. Bryan, you were talking; go ahead.

BC: Uh, it’s nonsense.

DH: (laughing) OK, it’s nonsense.

JG: Well said.

BC: We’ve talked about this several times. We’ve talked about this on probably every podcast that we’ve been on. We’ve written about this a lot. And of course, lots of other people have talked about this and written about this, as well. The security experts all call it nonsense. Apple is essentially calling it nonsense. I mean, the politicians have the information they need to not be idiots about this. And yet they want, apparently, to be idiots. And it’s super frustrating.

JG: I agree. It is frustrating.

And on ACM #338:

JG: Like we have said so many times, a backdoor isn’t just for one person; it’s there for everyone.

BC: It’s there for everybody to find.

JG: Yes.

BC: And don’t believe that because we say it, because we’re just repeating our betters here. Right? We’re repeating the encryption nerds, and the math wonks, and the people who have been studying this for decades. They’re the ones who say this.

JG: Right. The serious privacy and encryption advocates.

BC: Yeah. The people who actually know.

I certainly can’t claim these guys aren’t in good company: Here’s Tim Cook himself, being interviewed by Charlie Rose on 60 Minutes last Sunday:

[I]f there’s a way to get in, then somebody will find the way in. There have been people that suggest that we should have a back door. But the reality is if you put a back door in, that back door’s for everybody, for good guys and bad guys.

Hats off to Cook for refusing (thus far) to put in a backdoor. Hats off to Cook for trying to discourage governments from passing laws that would require a backdoor. And when Cook says that a backdoor won’t do much to catch the really bad guys because they’ll just find a way to communicate that’s out of reach of that backdoor, I have to agree, he’s probably right.

But when he says what I just quoted him saying to Rose — then I have to say, no, that isn’t true.

What is Cook going to do if/when some major nation (e.g. the USA, the UK, or China) passes a law requiring a backdoor? My guess is, he’ll implement something pretty much exactly like I describe at the top of this article. And it will not give backdoor access to hackers, foreign governments, unspecified bad guys, or “everybody.” It will be usable only by Apple, and Apple will use it only when they get a court order telling them to do so.


Update 2016.01.15 — New York is working on a bill that would require all smartphones sold in the state to be unlockable/decryptable by their manufacturers. How could Apple comply with such a bill? Simple: Make available (to any connected device that requests it) a concatenation of the iPhone’s 256-bit data encryption key and the user’s unlock code encrypted with BDpub. Only Apple (the holder of BDpriv) will be able to do anything with that value.
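A minimal sketch of the escrow value the phone would expose, with all names hypothetical (the article specifies only “concatenate the 256-bit data key with the unlock code, then encrypt with BDpub”):

```python
import os

# Hypothetical escrow value for device unlock (illustrative names).
device_key = os.urandom(32)      # stand-in for the iPhone's 256-bit data key
unlock_code = b"482916"          # stand-in for the user's unlock code
escrow_value = device_key + unlock_code

# The phone would hand out only this value encrypted with BDpub; holding
# BDpriv, Apple decrypts it and splits it back apart at the fixed boundary.
recovered_key, recovered_code = escrow_value[:32], escrow_value[32:]
assert recovered_key == device_key
assert recovered_code == unlock_code
```

Because the key has a fixed 32-byte length, the concatenation is unambiguous to split; anyone without BDpriv sees only the encrypted blob.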

Again — I’ll be happy if New York doesn’t pass this bill. And if they do pass it, I’ll be fine if Apple decides not to sell its products in New York until the law is repealed. But if Apple wants to securely, safely comply with such a law? It can. Easily.


Update 2016.01.21 — Now California is working on a similar bill. Can’t see Apple stopping sales of its products in its home state.

(Update: Defeated.)


Update 2016.02.20 — A bad analogy is making the rounds. It goes something like this: “Requiring Apple to put in a backdoor is like if door lock manufacturers had to give a master key to the government, so the police could enter your house any time they thought it was the right thing to do. If there’s no master key to the door locks on your house, why should there be a backdoor to your phone?”


The flaw in that analogy is that the government does have such a master key — it’s called a battering ram. And the nice thing about the ram is that it leaves obvious evidence of its use, so they can’t come in when you’re not there and snoop without you knowing something happened. Nor can a rogue officer use the ram to snoop without leaving ample evidence for other officers that it was done.

The analogy of the battering ram fits the BDpriv scheme pretty well. There would be no way for the government to use BDpriv to quietly snoop; they would have to serve Apple with an order to decrypt specific data, and such a request presumably would be a matter of public record. (If it wasn’t, Apple could simply refuse to do it, and/or publicize the request themselves.)

A police master key to your house’s door locks would be analogous to Apple handing over BDpriv to the government, saying, “Do with this whatever you think best.”

Manufacturer-Held Master Key?

Now — is BDpriv analogous to a door-lock master key that is held by the lock manufacturer, and used only by that manufacturer when issued a court order to do so? Kind-of. But the analogy is strained. First, the police have battering rams, and don’t need the lock manufacturer’s cooperation to use them. And second, a door-lock master key inevitably would be duplicated (or reverse-engineered from a lock), and fall into the wrong hands — including the hands of the police. But BDpriv — kept in an Apple vault, and used only by Apple, at Apple, under Apple’s in-house procedures designed to prevent both undocumented use by a rogue employee and copies of BDpriv leaving the vault — would not inevitably fall into the wrong hands.

The main weakness of BDpriv (as compared to battering rams) is that there would be no way to know if it had somehow gotten out of Apple’s control, and was being quietly used by the parties that possessed it.


Am I starting to become pro-backdoor? I’m aware that it must sound like I am. Let’s just say this: I would be OK with the BDpriv scheme I’ve outlined here. But I would also be very much OK with Tim Cook fighting this successfully, and never implementing any backdoor at all.


Update 2016.02.22 — Richard A. Epstein of Stanford University’s Hoover Institution, writes about “Apple’s iPhone Blunder”:

I participated in hearings that then-Senator John Ashcroft held in March 1998, and spoke in opposition to [a government-mandated backdoor], along with Kathleen Sullivan, then a professor of law at Stanford Law School. The greatest risk of the built-in back door is that the government will not be the only party that will enter through it. Back doors necessarily compromise the integrity of a security system.



Update 2016.02.24 — Cory Doctorow in The Guardian quadruples down on secure-backdoors-aren’t-possible-they’re-not-they’re-not-they’re-not:

“The FBI wants a backdoor only it can use — but wanting it doesn’t make it possible”

“The thing about this controversy is that it isn’t one. Independent cryptographers are virtually unanimous in their view that you can’t properly secure a system while simultaneously ensuring that it ships with a pre-broken mode that police can exploit. The fact that this would be useful doesn’t make it possible: as security experts Meredith Whittaker and Ben Laurie recently wrote: ‘Wanting it badly isn’t enough.’”

That’s funny — I don’t want it, and I didn’t have any trouble at all figuring out how to do it. It’s pretty simple, actually.

And aren’t these the same people who say that if hackers, foreign governments, etc., want it badly enough, they will find a way to exploit a backdoor? They can find a way in if they just want it badly?

“Law enforcement would also be assisted by anti-gravity devices, time machines, psychic powers, and the ability to selectively reverse entropy, but that doesn’t make them possible. Likewise uncontroversial is the gravity of the cybersecurity question.”

Sure wish I could give you psychic reverse gravity and all that, Cory — but great news! You can have a hacker-proof backdoor on all smartphones just as soon as you (and/or your elected legislators) want to! Time to celebrate.

And this must be your lucky day, because I’ve got even better news: If hackers, foreign governments, and random bad guys have anything to gain by discovering psychic reverse gravity? Then they’ll just find a way! They always do; they can just find a way to do it. So very soon now, humanity will have those marvelous powers! I can hardly wait.

“There’s precedent for this kind of contradiction, where something urgent is considered a settled matter in expert circles, but is still a political football in policy circles: climate change. Denialism is a deadly feature of 21st-century life.”

So now I’m not just “denialist” for knowing how to make a secure backdoor — I’m also “deadly.” OK. I can live with that.

“One form of math denial is the belief in the ability to make computers that prevent copyright infringement. Computers only ever work by making copies: restricting copying on the internet is like restricting wetness in water.”

Somebody, quick: Tell Apple that its controlled App Store can’t be done, so they need to shut it down right away, and just give away all the apps for free. And take back all that money that the authors made; that wasn’t really possible. (If you think it was, you’re in denial!)

“It’s tempting to play along with them here, offer them more magic beans in the form of backdoors that we pretend only the good guys can fit through ...”

What’s tempting about it? Just call them deadly denialists, and you’re done.


Update 2016.03.08 — Something just like my BDpriv is apparently called a “golden key” (meaning a master key, a key that opens all doors). On Exponent #68, Ben Thompson describes a system where there is a golden key in (say) Apple’s possession, that can be used to unlock the encryption — to which James Allworth responds:

JA: What you’re describing is functionally the same as not having encryption at all.

BT: ... Having a golden key is, yes, in the long run the same as not having encryption at all. ... Encryption, the actual scrambling of data on a disk, is a binary thing. It’s either truly encrypted, or it’s not.

JA: Right.

So having the user-data encryption key discoverable by the holder of BDpriv would be the equivalent of having no encryption at all? It wouldn’t even fall, say, in-between no encryption and Apple’s currently impervious-even-to-Apple encryption? Thompson answers that question a bit later in the show:

BT: There is a spectrum. There is not a spectrum when it comes to encryption; there is a spectrum when it comes to security.

Well, I guess that settles that. And lest I had any further doubts, here’s Jason Snell in Six Colors:

Piers Morgan: “Let Apple open the damn back door, let the FBI know what’s on the phone, then close it again. This isn’t difficult.”

Congratulations, Piers Morgan — you can join the tech geniuses on the Washington Post editorial board in the Golden Key society. They’re the ones who wrote: “With all their wizardry, perhaps Apple and Google could invent a kind of secure golden key they would retain and use only when a court has approved a search warrant.” (That’s not how cryptography works.) But nothing is difficult when you believe in wizardry.

So if I think BDpriv would actually work — to the point that I could be dissuaded only by someone explaining very specifically why it wouldn’t — then I literally believe in wizardry. OK, Jason. You must be right. Somehow. I certainly don’t want people to think I believe in wizardry.

But for the time being, the only weakness of BDpriv of which I’m aware is that it might leak out of Apple’s control. Somehow. Even though Apple currently signs its own OS with key(s) that it has to keep secret — and somehow, they don’t leak out.


Update 2016.03.19 — Fleishman was the guest on John “Daring Fireball” Gruber’s “The Talk Show With John Gruber” (#149). As a long-time fan, I’ve actually gone out of my way to keep Gruber’s name off of this page so far. But in this latest episode, he takes the bluster a full notch higher than anyone I’ve quoted here yet; therefore it becomes unavoidable.

JG: You know, I’ll just admit it, and Hillary Clinton has espoused the same opinion, is a belief in this magical thinking that if we just put smart enough people into a room together, that they can come up with a way that this backdoor solution would only be available to law enforcement.

Not saying I’m the smartest person in the room, but I described it all by myself (see the top of this page), and no magic was — nor is — required.

That, we’re not asking you to make a backdoor that anybody could get into, we just want a backdoor that law enforcement can get into, when we have a warrant. Which sounds reasonable, and in some fictional, other universe, where that’s mathematically possible, that might be great.

Bad news: If you’re reading this article right now, you’ve just warped off to an alternate universe where secure backdoors are eminently possible (if not advisable). I sure hope you can find a way back to your own universe, where secure backdoors just don’t work.

I actually, I think that there’s good reasons why a civil libertarian would be opposed even to that. Let me just put this out there. And I tend to lean that way. I would listen to the argument, but I tend to lean towards, even if that were possible, I don’t think it’s a good idea, and I think it’s contrary to the values that are already in our Bill Of Rights. But it is an idea.

I have to agree: If, if it could be done (somehow, who the hell knows how, like in an alternate universe with different math??) — maybe it still shouldn’t be.

But the simple truth is that it’s math— all experts agree, and everybody who understands encryption. I mean, this is, it, I don’t think that you— it’s, it’s more than even, like, uh, I mean, it’s like provably incorrect.

The proof is out there? A link would be very helpful.

You know, like, as opposed to, let’s say, climate change, where you can say, you can argue that only 98 or 99% of expert climate scientists agree that what we’re seeing is man-made. I mean, with cryptography and backdoors it’s 100% agreement.

GF: You’re totally right. I just realized I haven’t seen any crypto deniers out there saying this is possible. I’ve only seen politicians and law enforcement.

JG: Right. Right.

GF: That’s fascinating.

Let me make sure I understand this correctly: Describing my BDpriv scheme (scroll to top) makes me a crypto denier, in the same league with anti-science nutballs in general. (In this case, anti-math nutballs!) Well, that’s certainly one way to win an argument.

But in this case, we nutballs don’t even exist, because there is “100% agreement” on this issue, not even “98 or 99%.” How can that be? Oh yeah, I forgot — I’m from another universe. A fictional one.

I want Cook to win this battle, but not by any means necessary. The civil libertarian argument, I think, is strong. The “it can’t be done; everyone will have access to it” argument isn’t just weak — it’s completely false.


Update 2016.03.20 — Later in the same episode:

JG: One of the things that depresses me about the current state of decades-long discourse in the United States is the polarization of politics, and that so many issues are so clearly polarized, and that we’ve self-sorted on these various lines into the two parties, and that there’s no interchange between them. It warms my heart that on this particular issue, it doesn’t fall on one line or the other ...

Yes, the polarization of politics is quite depressing. Would you like to know why it happens? Because people on both sides of an argument think they can win by dismissing all in-between positions, by embracing a false dichotomy. It happens because both sides think, if the only choices are my way or Hitler’s way, then people will choose my way!

Here’s how it’s done:

Either there’s no backdoor for anyone, or there’s a backdoor everyone will be able to use! Either it’s encrypted with no backdoor, or it’s not encrypted at all, and may as well be sky-written! If you disagree, you’re a crypto-denier, you literally believe in wizardry, you’re engaging in magical thinking, you think you live in another, fictional universe, and you don’t understand how encryption works. Among those who do understand it, there is nothing less than 100% agreement on this!

That’s how things get polarized. Fast.


Update 2016.03.21 — added “a concatenation of the iPhone’s 256-bit data encryption key and”


Update 2016.03.31 — Still later in the same episode:

JG: It’s easy for a layperson to believe in the magic solution of a way for the government to get in, but nobody else. You kind-of have to be gently informed of the basic way that encryption works to understand just how dangerous and impossible it is to say, the only people who can get in are the U.S. federal government, with a warrant. It just doesn’t work that way, once you create a backdoor.

GF: I know; that’s right.


Update 2016.06.07 — UK backs down on requiring a backdoor.


Update 2016.06.20 — Russia considering requiring a backdoor in all messaging apps.


Update 2016.08.12 — Microsoft just leaked a private key, and the usual suspects are immediately buzzing with the idea that this somehow demonstrates the fundamental insecurity of encryption backdoors. On AppleInsider Podcast #81, Victor Marks and Mikey Campbell opine:

VM: You named the words, “golden key,” right? And once such a key exists, it is impossible to keep it secured.

MC: Yeah.

VM: You can keep things quiet or secret for a time. But you can’t keep them secret indefinitely. Because Microsoft has a golden key that governs the secure boot of Windows, their core operating system, their core product, and the thing upon which they built their whole empire, right?

MC: Yep.

VM: The golden key escaped into the wild, didn’t it?

MC: It did.

VM: How apocalyptically bad is that for Microsoft?

MC: The ramifications are— they’re pretty extreme. ... [Apple] specifically designed theirs not to have this said golden key.

Microsoft Windows already has the encryption backdoor that Apple refuses to implement? Hmmm.

Jeff Gamet and Bryan Chaffin weigh in on TMO Daily Observations:

JG: Microsoft gave us a perfect example of why it’s so important not to have backdoors into our operating systems, by accidentally letting the master keys — the golden keys, so to speak — for their encryption, out into the wild. Bryan, you wrote about this. So, fill us in please?

BC: Well, what Microsoft did, apparently earlier in the year, was to accidentally release a key that allows — a backdoor, basically — that allows someone to install whatever they want, even on a device that has what is called secure-boot. ... [this leak] does serve as an expert example of why backdoors existing are a bad idea. Not only does the existence of a backdoor make a target for our government, foreign governments, authoritarian regimes, criminal organizations, terrorist organizations, curious hackers, not only does the existence become an automatic target, but sometimes even the legitimate holders of a key can mishandle it.

Immediately after the above, Dave Hamilton tries to inject a dose of sanity into the discussion by explaining the necessity of having these keys at all, and he comes tantalizingly close to saying that this Microsoft leak event isn’t an example of backdoors being insecure.

It isn’t. Just like Apple, Microsoft has private/public-key pairs that it uses to secure its products’ bootup processes and OS update processes, with the private key never needing to exist anywhere outside of headquarters. Unlike Apple, Microsoft somehow managed to leak one of those private keys to the public. That key was not cracked, found, or otherwise extracted by any government, regime, criminal, terrorist, or hacker. It was accidentally leaked by Microsoft.

Could a backdoor key be accidentally leaked? Of course. But the security of your data already depends on avoiding such leaks. Nobody knows how to make a secure product that doesn’t depend on private keys that are held (and not leaked) by the maker of that product.

Later in the AppleInsider episode:

VM: The FBI ... they wanna restart the encryption debate.

MC: Shocking.

VM: I know. Shocking, if true. People need to know how to back off of something when it’s not gonna work for them. Of course, people in government never do. But it’s frustrating that we’re still having this, after it’s been demonstrated repeatedly that math doesn’t work that way. Math doesn’t work the way that [FBI director James Comey] wishes it did.

Microsoft accidentally leaked a private key because math doesn’t work the way the FBI wishes it did? Hmmm.


Update 2016.12.21 — The U.S. Congress’s Encryption Working Group just issued a report in which it discouraged legislation to require backdoors. I’m glad that the U.S. government seems to be backing down from Apple on this issue. However, the report includes the following:

  • Any measure that weakens encryption works against the national interest.

  • Congress should not weaken this vital technology because doing so works against the national interest.

  • How would consumers’ privacy and data security suffer if encryption were weakened?

The direct implication is that a backdoor involves weakening the encryption. That is simply false. In the BDpriv scheme (watch the below-linked “Method...” video), the encryption of user data would not be weakened in the slightest. The only difference would be that the holder of BDpriv (Apple) would uniquely possess the ability to acquire the session (decryption) key.

Only by redefining “weakened” to include the existence of any backdoor, can backdoors be argued (circularly) to weaken encryption. But that’s like saying that your device’s encryption is “weakened” because you wrote down your device-unlock code on a piece of paper and put it in your safety deposit box at your bank. The encryption has not become weaker in any way — the hypothetical possibility that legal authorities might force their way into your deposit box, and find your unlock code there, is not an encryption “weak­ness.”

There seems to be an unstated assumption floating around that the way a backdoor works is that you introduce some hidden weakness to the encryption technique, so that a government agency with strong computers (and knowledge of the weakness and how to exploit it) can brute-force the encryption via this weakness. This idea is also vaguely implied when people say that hackers will “find” the backdoor — i.e. they’ll discover the weakness and figure out how to exploit it.

This idea is completely wrong. A backdoor does not (or certainly need not) involve any such hidden weakness. The only thing that needs to be hidden is the actual value of BDpriv; all other features of the backdoor scheme can be loudly publicized to the world without ill effect. And the user-data encryption is not weakened — or even changed, for that matter — at all. It’s just straight-up AES.
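The whole scheme can be sketched end-to-end in a few lines. This is a toy illustration only, with names of my own invention: textbook RSA with tiny primes stands in for a real (BDpub, BDpriv) pair, and a hash-based stream cipher stands in for AES. Real keys would be 2048+ bits, with proper padding.

```python
import hashlib
import secrets

# Toy RSA pair standing in for (BDpub, BDpriv).
# Real keys would be 2048+ bits; these primes are for illustration only.
P, Q = 61, 53
N, E = P * Q, 17                    # public key BDpub = (N, E), baked into iOS
D = pow(E, -1, (P - 1) * (Q - 1))   # private key BDpriv, in Apple's vault

def keystream(key: bytes, n: int) -> bytes:
    """Hash-based stream cipher standing in for AES under SK."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:n]

def sym_crypt(sk: int, data: bytes) -> bytes:
    ks = keystream(sk.to_bytes(4, "big"), len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

def make_packet(sk: int, user_data: bytes):
    part_a = sym_crypt(sk, user_data)  # (A) user data encrypted with SK
    part_b = pow(sk, E, N)             # (B) SK encrypted with BDpub
    return part_a, part_b

# Both devices already share SK from normal session negotiation.
sk = secrets.randbelow(N - 2) + 2
part_a, part_b = make_packet(sk, b"hello")

# Receiving device: tosses part B, decrypts part A with its copy of SK.
assert sym_crypt(sk, part_a) == b"hello"

# Apple, under warrant: recovers SK from part B using BDpriv, then reads A.
recovered_sk = pow(part_b, D, N)
assert sym_crypt(recovered_sk, part_a) == b"hello"
```

Note that the user-data encryption (part A) is untouched by the backdoor; the only addition is part B, which is readable solely with BDpriv.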


Update 2016.12.25 — That Congressional working group’s report has Chaffin and Gamet back on the subject (ACM #390):

BC: If we want to have security from all the bad guys out there — we gotta have security! And the good guys, we can’t give just the good guys access to this. Now we’ve talked a lot about this on this show, and on TDO. I believe that Dave Hamilton and John F. Braun have talked a lot about it on Mac Geek Gab. We’ve written thousands of words on the subject.

JG: And John Martellaro has had security experts on Background Mode.

BC: Yup. And, the long and the short of it is, there’s really no compromise.

JG: No. It’s either, we have true encryption and security for our da­ta, or we do not.

BC: Or we do not. We do not! And so the Encryption Working Group was tasked with studying this, because— listen, to Jeff and I [sic], and again, to anyone with a basic understanding of reality, when it comes to encryption, it’s very obvious how this works. As we just described it.

JG: Mmm-hmm.

BC: To, but— reasonably speaking, and yeah, I’m being a jerk when I put it that way, but reasonably speaking, there are lots of folks who don’t have a basic understanding of how encryption works. And to them, it’s just very obvious that law enforcement should be able to get into the device owned or used by a bad guy. It’s just obvious. You wanna stop the bad guys, and so of course they should have access to it. Because they don’t understand the repercussions; they don’t understand how this stuff actually works.

JG: Right. And, you know, that’s understandable that someone, or groups, could jump to a wrong conclusion based on their lack of knowledge and understanding a­bout a topic like this. But for the people that are in a position where they will be setting policy for how our data is protected, like the government, if they’re going to mandate backdoors into our data, they should be really diving into this in a serious way, so they understand what the implications are, and can make an informed policy decision.

I’m no politician, but after hearing this, boy do I feel informed.

Chaffin quotes from the Encryption Working Group:

BC: “Technology companies, civil society advocates,” — this is a quote — “a number of federal agencies, and some members of the academic community argue that encryption protects hundreds of millions of people against theft, fraud, and other criminal acts. Cryptography experts and information security professionals believe that it is exceedingly difficult and impractical, if not impossible, to devise and implement a system that gives law enforcement exceptional access to encrypted data without also compromising security against hackers, industrial spies, and other malicious actors.”

JG: Ah, that’s awesome.

What’s really awesome is that I devised just such a system about a year-and-a-half ago. And it wasn’t even difficult, let alone impossible. As far as implementation is concerned, the only hard part is keeping BDpriv secure within Ap­ple’s HQ, and Apple apparently already nailed that problem about a decade ago when they figured out how to keep their iOS signing keys from leaking.

BC: “Further, requiring exceptional access to encrypted data would, by definition, prohibit encryption design best practices, such as,” quote-unquote, “forward secrecy, from being implemented.”

Nope. The session key (“SK” in my “Method...” video below) is different every session. So is the user’s unlock code. So is the phone’s data encryption key (DEK). If Apple revealed any of those things to the FBI for a mass shooter’s iPhone, it wouldn’t compromise any other past, or future, values of SK, DEK, or unlock codes. Only the security of the specific iPhone (or the specific transmissions) under warrant would be compromised. Not even future FaceTime calls made with the same phone would be compromised, at all. They use an entirely new, random session key.*
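That per-session-key property can be demonstrated in miniature. A toy sketch, with hypothetical names of my own: each session draws an independent random SK, and a hash-based stream cipher stands in for AES.

```python
import hashlib
import secrets

def new_session_key() -> bytes:
    """Each session draws a fresh, independent random SK."""
    return secrets.token_bytes(16)

def crypt(sk: bytes, data: bytes) -> bytes:
    """Symmetric XOR cipher over a hash keystream (stand-in for AES)."""
    ks, ctr = b"", 0
    while len(ks) < len(data):
        ks += hashlib.sha256(sk + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return bytes(a ^ b for a, b in zip(data, ks))

sk1, sk2 = new_session_key(), new_session_key()
c1 = crypt(sk1, b"call one")
c2 = crypt(sk2, b"call two")

# A warrant discloses session one's key: that session opens...
assert crypt(sk1, c1) == b"call one"
# ...but the disclosed key is useless against the next session's traffic.
assert crypt(sk1, c2) != b"call two"
```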

JG: I’m totally loving this. This is exactly what they should be saying.

BC: Right, because it happens to be reality.

JG: Yes.

When people I respect, and with whom I almost always concur, insist that such-and-such is “reality” — I really would like to agree with them. But should I say that an encryption backdoor can’t work, even if I know it can? Somewhere, somehow — somebody has to explain, specifically, how my backdoor plan would fail, before I can say, with a straight face, that I believe it would.


*Update 2016.12.27 — I am assuming that Apple does not leak BDpriv. The only danger to “forward secrecy” is that if they did leak it, recorded data from years ago could be cracked. Beyond simply not leaking BDpriv (just as they currently must not leak SIGpriv), a “forward secrecy” solution might go like this: Use a different BDpriv every year, with iPhones pre-loaded with fifty years of BDpub. When any year’s BDpriv becomes three years out-of-use, permanently destroy it, so it can never be leaked (nor even obtained by any level of force or intimidation).
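The rotation idea could be sketched like this. Everything here is hypothetical and of my own naming; the keys are opaque tokens standing in for real RSA pairs.

```python
import secrets

YEARS = range(2016, 2066)   # fifty years of pre-generated pairs

vault = {}     # BDpriv by year -- held only inside Apple HQ
preload = {}   # BDpub by year -- shipped on every iPhone
for y in YEARS:
    vault[y] = secrets.token_hex(16)   # opaque stand-in for a real private key
    preload[y] = f"BDpub-{y}"          # opaque stand-in for its public half

def destroy_expired(vault: dict, current_year: int, grace: int = 3) -> None:
    """Permanently delete any BDpriv more than `grace` years out of use."""
    for y in list(vault):
        if y + grace < current_year:
            del vault[y]

destroy_expired(vault, 2026)
assert 2016 not in vault    # old traffic can never again be unlocked, by anyone
assert 2024 in vault        # recent years remain available under warrant
assert len(preload) == 50   # phones keep every public half; no harm in that
```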


Update 2017.03.09 — Gamet again on TDO:

What [FBI Director James Comey] wants is backdoors. And that’s where the problem comes in. Because, as we’ve said so many times, a backdoor [is] not an exclusive thing. If you create a backdoor for the FBI, and the CIA, and whatever intelligence agency, you’ve created a backdoor that other governments, hackers, other criminals, can use. All they have to do is figure out how to get through the hoops to do it, and that’s it.

All they have to do is break the strong encryption in which the backdoor data is wrapped. Just jump through the hoops, and that’s it!

Come to think of it, the bad guys don’t even need a backdoor. They can break the existing system right now: Just figure out how to do it, jump through the hoops, and that’s it — encryption cracked! What is Apple going to do about that? We’re all totally vulnerable, right now.


Update 2017.05.01 — More wisdom from Doctorow: DRM will be all gone by 2025! So if you have a profitable app on the App Store, be sure to save some of the money: The party’s over just eight years from now.


Update 2017.06.19 — EU now considering requiring secure encryption, and outlawing backdoors.


Update 2017.07.14 — Australia working on law that would require government access to encrypted messages.


Update 2017.08.18 — Help Net Security has the definitive word:

How security pros look at encryption backdoors

The majority of IT security professionals believe encryption backdoors are ineffective and potentially dangerous, with 91 percent saying cybercriminals could take advantage of government-mandated encryption backdoors.

What do the other 9% say — that the Dead Sea Scrolls told them a secure backdoor is possible?

Encryption backdoors create vulnerabilities that can be exploited by a wide range of malicious actors, including hostile or abusive government agencies.

Oh, well, that settles it.


Update 2017.11.01 — Russia’s ban on VPNs goes into effect.


Update 2017.11.08 — It’s been over two years since he first started talking about this, but Chaffin’s tune has just gotten more entrenched than even I thought possible (ACM #436):

A backdoor created for one is available to all. This is a hard-and-fast rule. There is no way around the laws of reality in this particular regard.

The specific, exact way that my BDpriv plan would fail is — drumroll please — that it just would because that’s a hard-and-fast law of reality. Ahhh. I’m so glad I understand that now. Why did I ever think my idea could work?

The FBI guy ... may have evidence right there in his hands, and he just can’t get to it. I get how frustrating that is. But you can’t divorce that from the reality that encryption is binary: you’ve got it or you don’t!

Say...suppose, hypothetically, that Apple had the power to silently push a mandatory update to your iPhone that would cause all your private messages to be quietly echoed to the FBI, sans encryption — but Apple hadn’t ever used that power — would that be a case of you, the user, having encryption, or not having it? Because, you know, it’s a binary thing. You’ve either got it or you don’t.

And while you’re contemplating that, also contemplate this: It’s actually not hypothetical. Apple has that exact power, right now.

JG: What about the argument ... that Apple could just keep the code, and that way it would be safe. How do you feel about that argument?

BC: That argument doesn’t hold any intellectual water whatsoever. For one thing, if it exists, it’ll eventually be exploited.

JG: Mmm-hmm.

BC: And if it exists and is sitting on a phone that is within the FBI’s holding, not Apple’s holding? I don’t see how that could possibly be within Apple’s care.

JG: Mmm-hmm.

Because Apple couldn’t possibly have the backdoor key in their private possession? Got it; I stand corrected.

BC: [In this scheme] someone within Apple has access to it. And if someone within Apple has access to it, they are subject to extortion. Threats. Bribes.

JG: A mistake.

Someone who has Chaffin’s (or Gamet’s) ear, please ask him to re-watch Apple’s Ivan Krstić speaking at Black Hat (starting at 32:25). Somehow, this doesn’t sound very vulnerable to threats, bribes, or mistakes. And if it were, then we’d all be vulnerable, right now, because he’s not talking about hypothetical security keys that don’t yet exist.


Update 2017.11.09 — U.S. DOJ’s Rod Rosenstein (as quoted by Cyrus Farivar in Ars Technica):

[T]ech companies are moving in the opposite direction [of cooperation with government]. They’re moving in favor of more and more warrant-proof encryption.

Law 101, Rod: There’s no such thing as a legal order to do the impossible. No one can comply — or refuse to comply — with any such order. There would not even be a way to determine whether or not the subject of such an order was in compliance.

There is such a thing as crack-proof encryption. (It’s been around for many decades.) But no encryption system, or anything else in this world, is “warrant-proof.” That descriptor lies somewhere between oxymoronic and meaningless.

That said... here’s the opposite sophistry, from Farivar:

The DOJ’s position runs counter to the consensus of information security experts, who say that it is impossible to build the strongest encryption system possible that would also allow the government access under certain conditions.

The feasibility (or infeasibility) of a secure encryption backdoor, like that of an airplane or an atomic bomb, isn’t decided by consensus. Either it can be done, or it can’t.


Update 2017.12.05 — Germany working on law to require backdoors in all modern tech devices.


See also:
iOS Jailbreaking — A Perhaps-Biased Assessment
A Secure Backdoor Is Feasible
Method of Implementing A Secure Backdoor In Mobile Devices

