Darel Rex Finley

The Innovator’s Victory

2014.11.17

Homesteaders came next, swarming onto land once considered unsuitable for crops because it averaged less than twenty inches of rain a year. Unscrupulous promoters promised that the very act of farming would increase the precipitation. “Rain follows the plow,” they said. ...

Special excursion trains brought prospective buyers to the region by the thousands. Salesmen assured them that none other than the former Chancellor of The University of Kansas had determined that the climate was undergoing a permanent shift: Precipitation was increasing while the winds were slowing down. Another expert declared that removing the cover of prairie grasses allowed more rainfall to penetrate the soil. ...

[Donald Worster:] “The Great Plow-Up had going for it ample rainfall for a period of ten or fifteen years. And it just kept encouraging more and more— people thought, as they think again and again, that we’ve turned the corner on climate here. We can see far into the future, and it’s all wet. They knew that there’d been severe droughts in the 1890s, but people forgot about those. They thought, that was the past, this is now, and this’ll be the future.”

—Ken Burns: The Dust Bowl

ANALYSTS and CEOs who believe that the fifteen-year dominance of tech by Microsoft and its OEMs was the normal state of affairs — and thus likely to repeat ad infinitum — are like wheat farmers in the Dust Bowl. One failed crop of products after another, one choking storm of losses and layoffs after another, can’t convince them that the golden age of Wintel won’t reappear next year. For people who are in one way or another invested in the success of Microsoft, its dominance of computing throughout the ’80s and ’90s cannot be perceived as a bizarre, fortuitous anomaly, a stroke of incredibly good luck for Gates and company, but instead must be portrayed as in some way the natural order of things.

It’s not. The success of Windows was an anomaly, and a pretty freaky one at that.

Home Computing

Personal computing (then “home computing”) first blossomed in the late ’70s. Several platforms quickly emerged, but they were all tinkerer/hobbyist affairs. None of them had mainstream appeal as something that the average, non-technical non-nerd would want to use. Most buyers of home computers expected to program them. And so it didn’t matter much if there wasn’t a big software ecosystem for any particular platform. Buyers shopped price or technical features, and many owned more than one platform. Prices were low, in the consumer range, and the hardware was correspondingly cheap (but still revolutionary for its time).

Business

So at the beginning of the ’80s, there was no substantial market for home computer software, and what market there was, was not dominated by any market-majority platform. Apple was perhaps closest to having such a position, but still wasn’t there yet.

Into this majority-player vacuum stepped the then-colossus of corporate computing, IBM. Throughout the ’60s and ’70s, IBM had developed a reputation for brutally crushing competitors via underhanded tactics (that got it in a lot of trouble with the government but never to an extent that made the company regret having used them). By the dawn of the ’80s, most businesspeople were acutely attuned to the FUD (fear, uncertainty, and doubt) factor, which whispered, “you’ll never get fired for buying IBM.” The question of who was making the better product became secondary compared to the question of who would still exist five years down the road.

IBM took full (and, in hindsight, their last) advantage of this FUD by marching into the microcomputer market with their IBM PC. Smartly, they designed the device to appeal to business users: a pricey, tank-like, metal box with a high-persistence green-screen that delivered a text-only display, but in a very crisp, 80-column, upper-and-lowercase font, with which the user ran the computer using the mainframe-like PC DOS (IBM’s branding of Microsoft’s MS-DOS). The keyboard was heavy and very clicky, almost reminiscent of a typewriter.

Business users loved it, and looked on all other microcomputing platforms as cheap toys for home hobbyists. The appeal to business users won the day — their high-volume use couldn’t help but spill over into the home market, relentlessly leaving Apple and the other home computer makers to fend for the low end of that market. (Apple was perhaps worst-positioned for this storm, as its computer was aimed higher in the market than those of Commodore, Tandy, etc., and thus was most harmed by IBM’s complete takeover of the high end.) Swiftly, the IBM PC created a market-majority software platform, which also was the first really big third-party-app ecosystem in computing.

Oops

But then, a funny thing happened. Almost as soon as they had created it, IBM abruptly lost the PC market to the cloners. In what might be the biggest blunder in the history of business, IBM somehow failed to adequately protect its rights to the essential specs of its own product, and found itself in a situation where any company could make cheap PC clones that would run all the same software that ran on the PC. The cloners didn’t have to pay IBM a nickel, or even get permission to do it.

IBM tried valiantly to posture its PC as superior to the clones, but in vain: the same businesses (and especially home users) that had handed IBM control of personal computing now took it away and handed it to the cloners. The IBM PC quickly faded to an irrelevant shell of its former self.

Manna For Gates

And who was the big winner of the cloners’ victory? Microsoft. The cloners didn’t have to get approval from IBM, but to be able to run all that PC software, they did have to license MS-DOS, the operating system of the IBM PC, from Microsoft. And Microsoft did not accidentally let go of its rights to that OS. So Microsoft suddenly found itself in de facto control of the whole PC market, able to dictate terms and specs to a host of cloners (thereafter known as “Microsoft OEMs”). And over the next several years, Microsoft ballooned into a money-machine powerhouse like nothing else in tech, making CEO Bill Gates the richest person in the world.

Vicious

In 1984, after all of the above had come to pass, Apple tried to re-enter the market with the brand-new Macintosh. It had some distinct advantages (graphical user interface, mouse, high-resolution bitmapped graphics) and some disadvantages (black-and-white, pricey, tiny screen). But the overwhelming problem for the Mac wasn’t anything as technical as that — the MS-DOS PC simply had a huge market of apps, and the Mac was starting basically from zero. The “virtuous circle” was working for MS-DOS, while the “vicious circle” was working against the Mac — i.e. most people want to buy the platform for which apps are being written, and developers want to write apps for the platform that most people are buying. The only chance the Mac had to buck this circle was its distinct technical advantages, but they weren’t enough. Even before Microsoft came out with Windows, the Mac’s peak market share in personal computers was sub-10%.
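The circle described above is a positive-feedback loop, and a toy simulation makes the dynamic concrete. Everything below is invented purely for illustration — the function name, the `pull` and `gamma` parameters, and the superlinear developer-preference rule are assumptions, not anything from the article — but it shows how a modest early lead in users can compound into near-total dominance, while a modest deficit spirals toward zero:

```python
def simulate(share=0.6, steps=40, pull=0.3, gamma=2.0):
    """User share of platform A under a toy virtuous/vicious-circle loop.

    Developers favor the bigger platform superlinearly (gamma > 1), and
    users drift toward the platform with the larger app library.
    """
    for _ in range(steps):
        # App-library share: superlinear in user share, so the leader
        # gets a disproportionate slice of developer attention.
        apps = share**gamma / (share**gamma + (1 - share)**gamma)
        # Users drift part of the way toward the app leader each step.
        share += pull * (apps - share)
    return share

print(round(simulate(0.6), 3))  # a modest early lead compounds toward 1.0
print(round(simulate(0.4), 3))  # the trailing platform spirals toward 0.0
```

Start the loop at a 60/40 split and the leader’s share climbs toward 100%; start it at 40% and the same feedback grinds it toward zero — roughly the Mac’s position against the entrenched MS-DOS app ecosystem. A dead-even 50/50 split is an unstable equilibrium: any perturbation tips the loop one way or the other, which is the essay’s point about freak events deciding unstable new markets.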

And, of course, Microsoft did come out with Windows, an obvious, cheesy rip-off of the Mac. Apple sued, unsurprisingly, and for a while looked like it might win big. But in the end, federal judge Vaughn Walker threw the whole case out the window, ruling that nearly all of the disputed interface elements were either licensed to Microsoft or unprotectable, sinking the Mac’s hopes and cementing the market dominance of Windows for many years to come.

Random

The take-home lesson of this story seems to be that new markets are chaotic and random, and until a solid victor is established, unpredictable events can have profound effects on how things shake out. How would things have turned out if IBM had stuck to mainframes and never entered the microcomputer market (or only much later)? What would have been the end result if Microsoft hadn’t dared to copy the Mac, or if judge Walker had ruled in Apple’s favor? What if IBM had properly retained the rights to its own product, and demanded exclusivity for its OS? There’s no reason why things had to wind up the way they did; they could have been very different.

Pundits

But many tech pundits preferred not to see this, and instead tried to convince themselves (and anyone within earshot) that the victory of Microsoft and its OEMs was somehow inevitable. All through Apple’s 2000-to-present ascendancy, these pundits repeatedly mispredicted that Microsoft and its OEM gang (or a similar OEM alignment under Google’s Android) would supplant every Apple creation. And although some pundits learned their lesson after one or two comically incorrect forecasts, most of them reacted to Apple’s repeated successes with ever-shriller cries that the company had peaked, or that its heyday was over, or that it was about to fall away into has-been irrelevance. The common theme among them seemed to be that Apple’s post-2000 success was the freak anomaly, soon to be supplanted once more by the normal control of tech by a gang of un-innovative OEMs, led by an equally un-innovative app-platform manager.

Most of these pundits had no advanced intellectual theory to back their beliefs. They just stated them as if they were obvious, and offered no explanation why. “We’ve seen this movie before,” was about as analytical as they got, never offering anything like a sophisticated reason to believe in Apple’s impending downfall.

Ivory Tower

But one did: Harvard’s Clay Christensen.

Although it can only ever be my unverifiable suspicion that Christensen shaped his theory to satisfy the desire to cast Microsoft’s pre-2000 dominance as a perpetual, natural phenomenon, we can definitely say that he has taken every opportunity to portray that dominance as bolstering his theory, and has used his theory as justification to believe that Apple’s post-2000 dominance is not long for this world.

Famous as the creator of “disruption theory,” Christensen posits that an innovator like Apple can enjoy only temporary success with a new-product-category breakthrough, before being ravaged by “modularity” — commodity component manufacturers banding together to make a good-enough, copycat product that beats out the innovator’s original. To stave off doom, the innovator must be perpetually on the run, skipping from one new-product-innovation to the next.

To get a taste of the thinking, here’s James Allworth, Christensen’s prime protégé and co-author of his 2012 book, How Will You Measure Your Life?, describing Apple’s current level of success as being virtually a violation of physics (Exponent #24):

What caused [the ’90s Mac] to be crappy? If it was so far in front, how were modular players able to catch it up, and surpass it? So, these theories are never perfectly predictive. We’ve talked about this in terms of it being a very long, multivariable equation, and each one explains some of it. And I think what the theory would assert is that what you [Ben Thompson] just described is the standard thing that happens in an integrated-vs.-modular world. At the start, when the complexity and solving the problem is actually figuring out how all the components best work together, that the integrated player wins. But once those pieces have been figured out, it’s actually the modular players, who are just best able to focus on their own little piece of the puzzle, that will end up winning. And that’s why Microsoft, Intel, like Wintel ended up beating— the theory’s explanation would be, Wintel beat Apple on the basis of that. The reason that the Mac sucked was because they were trying to do too many things. And an organization, a typical organization that bites off that many things is not gonna be able to do it.

Now, what gets interesting is the question around, it appears that if you believe that, and you believe that the theories that we’re referring to, integration-vs.-modularity and disruption, hold for a large explanatory power in that multivariable equation, that somehow Apple right now is defying gravity. It’s defying the theory. And I think that’s what makes it interesting. And I think your explanation of [Apple customers’] great experience is very compelling in explaining why they’ve managed to do as well as they have. What’s interesting— I mean, there’s also another interesting question to me, which is, like, why is it the way it is right now, versus the way it played out previously? Because I think the way it played out previously is the way that it typically does play out. Like, it makes sense to my mind that that’s what happens; the integrated player can’t develop all these different pieces, and sustain scale, and figure it out once all the interdependencies have been worked out. And yet, somehow Apple’s managing to do it right now.

Somehow indeed. The essential flaw in Allworth’s logic comes right at the start: The ’90s Mac was not “far in front”; it was always a sub-10% niche player. Crucial facts like this get brushed under the rug in the Apple naysayers’ history books, and that omission is essential to their case. Here’s Christensen himself explaining how Apple’s pre-2000 history fits into his thesis (as quoted by Horace Dediu):

[T]o be fast and flexible and responsive, the architecture of the product has to evolve towards a modular architecture, because modularity enables you to upgrade one piece of the system without having to redesign everything, and you can mix and match and plug and play best of breed components to give every customer exactly what they need. ...

So here the rough stages of value added in the computer industry, and during the first two decades, it was essentially dominated by vertically integrated companies because they had to be integrated given the way you had to compete at the time. We could actually insert right in here “Apple Computer.” ... Do you remember in the early years of the PC industry Apple with its proprietary architecture? Those Macs were so much better than the IBM’s. They were so much more convenient to use, they rarely crashed, and the IBM’s were kludgy machines that crashed a lot, because in a sense, that open architecture was prematurely modular.

But then as the functionality got more than good enough, then there was scope, and you could back off of the frontier of what was technologically possible, and the PC industry flipped to a modular architecture. And the vendor of the proprietary system, Apple continues probably to make the neatest computers in the world, but they become a niche player because as the industry disintegrates like this, it’s kind of like you ran the whole industry through a baloney slicer, and it became dominated by a horizontally stratified population of independent companies who could work together at arm’s length interfacing by industry standards.

There’s really no nice way to say it: Christensen’s history of Apple is fiction. There is no mention of the fact that the Microsoft-plus-OEMs arrangement was in full force before the Mac even hit the market. There is no mention of the fact that the Mac never broke 10% market share. And his bit about crashiness is utterly false; early Macs crashed plenty.

Things turned out the way they did not because they “had to,” but because of a fortuitous sequence of freak events. No business theory predicts such events, nor can any theory be validated by them; they illustrate only the unpredictability of what happens in an unstable, new market. Christensen is forcing history to fit his theory, apparently because he’s simply in love with it (and/or has driven his reputation so deeply into it that it’s too late to back out now).

Probably Christensen’s most famous misprediction occurred in mid-2007, just as the iPhone was being released:

[T]he prediction of [my disruption] theory would be that Apple won’t succeed with the iPhone. They’ve launched an innovation that the existing players in the industry are heavily motivated to beat: It’s not [truly] disruptive. History speaks pretty loudly on that, that the probability of success is going to be limited.

Note that Christensen here is quite specific, not just that the iPhone won’t be a success, but that it is the prediction of his theory that it won’t be a success. And of course, we all know what happened: The iPhone is among the most successful products ever made. What does that say about Christensen and/or his theory?

Suppose that, sometime in the past couple years, Christensen had written a chapter-length article, explaining, very specifically, his misprediction of iPhone failure. (Christensen has written several books and many articles, and he teaches classes and gives speeches all the time — he writes for a living, and an article of the size I am about to describe would be no big effort for him.) Now suppose that in part one of this article, he explained in full detail, step-by-step, how he applied his theory in mid-2007 and arrived at this result. What data did he use, how did the theory interpret that data, etc.

Suppose that part two of the article identified — again in high detail — exactly what went wrong in part one, that would cause it to predict such an extreme opposite of actual events. Then suppose that in part three, he re-applied his theory — this time correctly, without the mistakes identified in part two — to the data available in mid-2007, and this time he arrived at the retrodiction that the iPhone would be a phenomenal success.

The paper could conclude with a section explaining how users of Christensen’s disruption theory (including Christensen himself) could avoid making similar mistakes, and thus hopefully be able to apply the theory in a way that is unlikely to predict failure for highly successful products, or vice-versa.

Now, such a paper could still be very wrong. The analytical mistakes identified in part two might not be the actual mistakes that led the theory astray. The re-analysis in part three might arrive at the correct conclusion due more to 20/20 hindsight than any actual abilities of the theory to accurately assess a product’s prospects. But — at least the existence of such a paper (written in good faith) would indicate that Christensen takes his own theory seriously! It would show that he really believes his theory has useful value, and is attempting to keep it that way.

So do we actually have a paper like this from Christensen? I haven’t exhaustively researched his writings, so I can’t say definitively that no such document exists. Nevertheless, I think if there were such an article, I would have heard about it by now. Which means we can reasonably conclude that Christensen isn’t even trying to maintain his theory as an item of genuine academic value. Or put more bluntly, he doesn’t really believe his own theory — it’s just a position, a pretentious posture, an intelligent-sounding re-description of events that have already occurred.

Disrupted

Last June, another Harvard professor, Jill Lepore, wrote possibly the first high-profile takedown of Christensen’s theory, “The Disruption Machine — What the gospel of innovation gets wrong” (The New Yorker), and Christensen et al. didn’t take it very well. With zero pushback from co-host Thompson, Allworth (Exponent #8) described Lepore’s criticism as “strangely personal” even though the whole article was about nothing but Christensen’s theory and how his examples of his theory in action don’t really fit the theory. Allworth apparently did not think it strangely personal when Christensen immediately responded to Lepore’s article with a flustered, defensive Businessweek interview, in which he called her criticisms “egregious ... truly egregious,” as well as “a criminal act of dishonesty,” and said her article was an attempt “to discredit Clay Christensen, in a really mean way.” He also referred to himself in the third person while repeatedly referring to Lepore as “Jill.” The interviewer called him on it in the final question:

You keep referring to Lepore by her first name. Do you know her?

I’ve never met her in my life.

For all his flabbergasted paroxysms, in the Businessweek interview Christensen did manage a tiny attempt to explain (spin?) his 2007 misprediction of iPhone failure (quoted here in its entirety):

So the iPhone: There’s a piece of the puzzle that I did not understand. What I missed is that the smartphone was competing against the laptop disruptively. I framed it not as Apple is disrupting the laptop, but rather [the iPhone] is a sustaining innovation against Nokia. I just missed that. And it really helped me with the theory, because I had to figure out: Who are you disrupting?

Even taking this apparent mea culpa at face value, one may wonder: If the master of disruption theory can see a mistake like this only in several-year hindsight, then how is anyone supposed to make use of the theory?

But never mind that; there’s a much bigger, factual problem here. In the iPhone’s first five years, all four mid-2007 makers of successful smartphones — Nokia, Palm, RIM, and Motorola — fell into abysmal malaise, at best. And yet... the iPhone succeeded by disrupting laptops?? No. It didn’t. The iPhone succeeded by blowing away the prominent smartphones of the time. Christensen didn’t “miss” anything, or fail to “understand” some sideways “disruption” — he was just wrong. Flat-out wrong. It’s not mean to say so, it’s not egregious, and it’s definitely not criminal. Christensen thought the iPhone wouldn’t stand up to other smartphones, and it did. He was simply, purely, 180° incorrect.

Four months later, Christensen collected himself for a more intellectual-sounding defense, in Business Insider’s “Harvard Management Legend Clay Christensen Defends His ‘Disruption’ Theory, Explains The Only Way Apple Can Win,” in which he was interviewed by Business Insider founder Henry Blodget. In the interview’s introduction, Blodget tells us that “Harvard’s Clay Christensen is today’s most influential modern management thinker,” and that he (Blodget) is “a huge devotee” of Christensen’s fame-founding book, The Innovator’s Dilemma. Blodget also calls Lepore’s article “a very personal attack.” None of this should be surprising coming from Blodget, who helpfully informed his readers in the spring of 2011:

“Android Is Destroying Everyone ... iPhone Dead In Water”

“Apple is fighting a very similar war to the one it fought — and lost — in the 1990s.”

And who in the fall of 2013 warned:

“What Apple does not seem to understand, however, is the fate that almost all niche platform providers eventually succumb to — gradual loss of influence, power, and profitability, followed by irrelevance. If you don’t understand what happens to platform providers that lose momentum and become niche players, just look at BlackBerry. ... What Apple zealots who crow about the company’s current profitability should recognize is that BlackBerry was highly profitable only a few years ago.”

“[E]ven if Apple does not get completely marginalized, [its] strategy will likely hurt the company and its shareholders (and its customers!) over the long haul. And it could be disastrous.”

So it would appear that Christensen found a friendlier forum this time.

Though Christensen’s composure is notably recovered from his Businessweek embarrassment, he still manages to tell us that Lepore’s article was “personal,” and that “all of the points that she raised were not just wrong, but they were lies. Ours is the only theory in business that actually has been tested in the marketplace over and over again. ... for her to take that on, to take me on and the theory on — I don’t know where the meanness came from.” He also refers to himself in the third person again, and it’s unclear if he is willing to use Lepore’s surname; the only place in the transcript where he refers to Lepore by any name at all reads “[Lepore]” (i.e. in brackets), leaving the reader to wonder what he actually may have said.

He does little to convince his audience that the essential meat of his theory actually survives Lepore’s points unscathed, but he does manage to vomit up more unsupported insistence that Apple is running out of road:

CC: ... Apple won’t succeed, because in the end modularity always wins.

HB: And what the Apple believers will say is, “... they’re going to sell 65 million iPhone 6’s in the first quarter, so could you be more wrong?” That’s what they will say.

CC: That’s right. And what Clay will say in response is that you can never predict where the technology will come from, but you can predict with perfect certainty that if Apple is having that extraordinary experience, the people with modularity are striving. You can predict that they are motivated to figure out how to emulate what [Apple is] offering, but with modularity. And so ultimately, unless there is no ceiling, at some point Apple hits the ceiling. So their options are hopefully they can come up with another product category or something that is proprietary because they really are good at developing products that are proprietary. Most companies have that insight into closed operating systems once, they hit the ceiling, and then they crash.

Allworth also pounds this perspective on a regular basis. Here he is on Exponent #11:

“The way I like to think about this is whether it makes sense to be integrated or modular. That varies over time based on where you are in the product category life cycle. And at the beginning of that, when performance is absolutely not good enough, it makes most sense to be integrated, because you have most control over all the different components. As time goes by, the modular players can see how the integrated player has put everything together, and they can start to copy it, and pull it all apart, and focus on their own little [piece]. And, at least the theory would suggest, the modular players start to catch up. And that was the reason why I asked the question, because it might be the case that the difference between iOS and Android is starting to narrow. I would actually say that’s true.”

“When they [Apple] run out of things to improve, when the things to improve become less obvious, the ability to increase performance, to improve the experience, is relatively limited. I’m just going to put it in provocative language: It becomes easier for the modular players to copy the integrated player, because the integrated player runs out of places to go.”

Ask yourself if their position improved even slightly in the well-over-two years since Christensen was Dediu’s guest on The Critical Path, here quoted in Forbes by Steve Denning:

Christensen’s hope of salvation for Apple is not so much that they can defeat the innovator’s dilemma or the threat of modularity but rather that they can defer it indefinitely:

“The salvation for Apple may be that they can find a sequence of exciting new products whose proprietary architecture is demanded by the marketplace, and they can keep going from one product to another so that they will not have to confront this dilemma.”

Like a scratched vinyl record, Christensen, Allworth, and (of late) Dediu hammer home the received wisdom that Apple must forever flee before an advancing army of modular commoditizers that greedily devours anything Apple designs, and it is only Apple’s deft skill at this never-ending flight that has kept it alive and financially healthy for the past fifteen years. If they say it enough times, will it become true? Non-rebootable careers at Harvard notwithstanding — no. It won’t.

Innovation

The normal state of affairs is that the innovator is very, very successful. The innovator makes not just a quick killing in the early market, but continues to dominate that same market as it matures. The innovator re-invests its earnings into the product, improving it, and staying ahead of would-be competitors.

Copycat products carve out little niches in whatever areas the original innovator didn’t care to go, such as the ultra-cheap, zero-profit low end, or exotic, special-feature versions of the product.

Apple’s re-investment strategy goes beyond simply improving the product or making the experience better. Apple has shown a clear pattern of seeking to control more about their products, which not only protects them from backstabbing partners (of which there have been many), but actually improves their product in ways that can’t be matched by competitors. Apple’s use of ARM chips in the iPod gave economy of scale to everyone who used ARM chips, but Apple’s use of its own A8X chips in its latest iPads helps no tablet maker but Apple.

How does anything that’s been happening with Apple in the past fifteen years fit into Christensen’s disruption theory? It doesn’t. How does it fit into the more general punditry’s cries that Apple is on the verge of being run over? It doesn’t.

Apple’s market valuation is now about fifty percent greater than that of any other company on the planet in any business, at a time when many market analysts think its stock is undervalued. Apple’s only serious competitor, Samsung, is going into a financial tailspin. Apple is living, breathing proof that the innovator typically succeeds. The innovator normally wins — despite a fifteen-year anomaly when Microsoft got to run the show.

 

Update 2015.03.25 — added “With zero pushback from co-host Thompson”

Update 2015.05.27 — added “Ask yourself if ... confront this dilemma.”

 

See also:
The Old-Fashioned Way
&
Apple Paves the Way For Apple
&
iPhone 2013 Score Card
&
Disremembering Microsoft
&
What Was Christensen Thinking?
&
Four Analysts
&
Remember the iPod Killers?
&
The Innovator’s Victory
&
Answering the Toughest Question About Disruption Theory
&
Predictive Value
&
It’s Not A Criticism, It’s A Fact

 

See also:
The Self-Fulfilling Prophecy That Wasn’t

 

prev     next

 

 

Hear, hear

prev     next

Best Recent Articles

Method of Implementing A Secure Backdoor In Mobile Devices

When Starting A Game of Chicken With Apple, Expect To Lose

How I Clip My Cat’s Nails

Seasons By Temperature, Not Solstice

It’s Not A Criticism, It’s A Fact

Features (Regularly Updated)

A Memory of Gateway — news chronology of Apple’s ascendancy to the top of the technology mountain.

iPhone Party-Poopers Redux and Silly iPad Spoilsports — amusing litanies of industry pundits desperately hoping the iPhone and iPad will go away and die.

Embittered Anti-Apple Belligerents — general anger at Apple’s gi-normous success.

RSS FEED

My books

Now available on the iBookstore!

   

Links

Daring Fireball

The Loop

RoughlyDrafted

Macalope

Red Meat

Despair, Inc.

Real Solution #9 (Mambo Mania Mix) over stock nuke tests. (OK, somebody made them rip out the music — try this instead.)

Ernie & Bert In Casino

Great Explanation of Star Wars

Best commercials (IMO) from Superbowl 41, 43, 45, 46, and 47

Kirk & Spock get Closer

American football explained.

Sonos and Opalum — awesome sound stuff I saw at CEDIA.

TV: Better Call Saul; Homeland; Survivor; The Jinx; Breaking Bad; House of Cards; Inside Amy Schumer

God’s kitchen

Celebrity Death Beeper — news you can use.

Making things for the web.

My vote for best commercial ever. (But this one’s a close second, and I love this one too.)

Recent commercials I admire: KFC, Audi

Best reggae song I’ve discovered in quite a while: Virgin Islands Nice

Pinball Arcade: Unbelievably accurate simulation of classic pinball machines from the late ’70s through the ’90s, with new ones added periodically. Like MAME for pinball — maybe better.

d120 dice: You too (like me) can be the ultimate dice nerd.

WiFi problems? I didn’t know just how bad my WiFi was until I got eero.

Favorite local pad thai: Pho Asian Noodle on Lane Ave. Yes, that place; blame Taco Bell for the amenities. Use the lime, chopsticks, and sriracha. Yummm.

Um, could there something wrong with me if I like this? Or this?

This entire site as a zip file — last updated 2017.11.02

Previous articles

The Ultimate, Simple, Fair Tax

Compassion and Vision

When Starting A Game of Chicken With Apple, Expect To Lose

The Caveat

Superb Owl

NavStar

Basic Reproduction Number

iBook Price-Fixing Lawsuit Redux — Apple Won

Delusion Made By Google

Religion Is A Wall

It’s Not A Criticism, It’s A Fact

Michigan Wolverines 2014 Football Season In Review

Why There’s No MagSafe On the New MacBook

Sundar Pichai Says Devices Will Fade Away

The Question Every Apple Naysayer Must Answer

Apple’s Move To TSMC Is Fine For Apple, Bad For Samsung

Method of Implementing A Secure Backdoor In Mobile Devices

How I Clip My Cat’s Nails

Die Trying

Merger Hindsight

Human Life Decades

Fire and the Wheel — Not Good Examples of A Broken Patent System

Nobody Wants Public Transportation

Seasons By Temperature, Not Solstice

Ode To Coffee

Starting Over

FaceBook Messenger — Why I Don’t Use It

Happy Birthday, Anton Leeuwenhoek

Standard Deviation Defined

Not Hypocrisy

Simple Guide To Progress Bar Correctness

A Secure Backdoor Is Feasible

Don’t Blink

Predictive Value

Answering the Toughest Question About Disruption Theory

SSD TRIM Command In A Nutshell

The Enderle Grope

Aha! A New Way To Screw Apple

Champagne, By Any Other Maker

iOS Jailbreaking — A Perhaps-Biased Assessment

Embittered Anti-Apple Belligerents

Before 2001, After 2001

What A Difference Six Years Doesn’t Make

Stupefying New Year’s Stupidity

The Innovator’s Victory

The Cult of Free

Fitness — The Ultimate Transparency

Millions of Strange Devotees and Fanatics

Remember the iPod Killers?

Theory As Simulation

Four Analysts

What Was Christensen Thinking?

The Grass Is Always Greener — Viewing Angle

Is Using Your Own Patent Still Allowed?

The Upside-Down Tech Future

Motive of the Anti-Apple Pundit

Cheating Like A Human

Disremembering Microsoft

Security-Through-Obscurity Redux — The Best of Both Worlds

iPhone 2013 Score Card

Dominant and Recessive Traits, Demystified

Yes, You Do Have To Be the Best

The United States of Texas

Vertical Disintegration

He’s No Jobs — Fire Him

A Players

McEnroe, Not Borg, Had Class

Conflict Fades Away

Four-Color Theorem Analysis — Rules To Limit the Problem

The Unusual Monopolist

Reasonable Projection

Five Times What They Paid For It

Bypassable Security Certificates Are Useless

I’d Give My Right Arm To Go To Mars

Free Advice About Apple’s iOS App Store Guidelines

Inciting Violence

One Platform

Understanding IDC’s Tablet Market Share Graph

I Vote Socialist Because...

That Person

Product Naming — Google Is the Other Microsoft

Antecessor Hypotheticum

Apple Paves the Way For Apple

Why — A Poem

App Anger — the Supersized-Mastodon-In-the-Room That Marco Arment Doesn’t See

Apple’s Graphic Failure

Why Microsoft Copies Apple (and Google)

Coders Code, Bosses Boss

Droidfood For Thought

Investment Is Not A Sure Thing

Exercise is Two Thirds of Everything

Dan “Real Enderle” Lyons

Fairness

Ignoring the iPod touch

Manual Intervention Should Never Make A Computer Faster

Predictions ’13

Paperless

Zeroth — Why the Century Number Is One More Than the Year Number

Longer Than It Seems

Partners: Believe In Apple

Gun Control: Best Arguments

John C. Dvorak — Translation To English

Destructive Youth

Wiens’s Whine

Free Will — The Grand Equivocation

What Windows-vs.-Mac Actually Proved

A Tale of Two Logos

Microsoft’s Three Paths

Amazon Won’t Be A Big Winner In the DOJ’s Price-Fixing Suit

Infinite Sets, Infinite Authority

Strategy Analytics and Long Term Accountability

The Third Stage of Computing

Why 1 Isn’t Prime, 2 Is Prime, and 2 Is the Only Even Prime

Readability BS

Lie Detection and Psychos

Liking

Steps

Microsoft’s Dim Prospects

Humanity — Just Barely

Hanke-Henry Calendar Won’t Be Adopted

Collatz Conjecture Analysis (But No Proof; Sorry)

Rock-Solid iOS App Stability

Microsoft’s Uncreative Character

Microsoft’s Alternate Reality Bubble

Microsoft’s Three Ruts

Society’s Fascination With Mass Murder

PlaysForSure and Wikipedia — Revisionism At Its Finest

Procrastination

Patent Reform?

How Many Licks

Microsoft’s Incredible Run

Voting Socialist

Darwin Saves

The Size of Things In the Universe

The Self-Fulfilling Prophecy That Wasn’t

Fun

Nobody Was In Love With Windows

Apples To Apples — How Anti-Apple Pundits Shoot Themselves In the Foot

No Holds Barred

Betting Against Humanity

Apple’s Premium Features Are Free

Why So Many Computer Guys Hate Apple

3D TV With No Glasses and No Parallax/Focus Issues

Waves With Particle-Like Properties

Gridlock Is Just Fine

Sex Is A Fantasy

Major Player

Why the iPad Wannabes Will Definitely Flop

Predators and Parasites

Prison Is For Lotto Losers

The False Dichotomy

Wait and See — Windows-vs-Mac Will Repeat Itself

Dishonesty For the Greater Good

Barr Part 2

Enough Information

Zune Is For Apple Haters

Good Open, Bad Open

Beach Bodies — Who’s Really Shallow?

Upgrade? Maybe Not

Eliminating the Impossible

Selfish Desires

Farewell, Pirate Cachet

The Two Risk-Takers

Number of Companies — the Idiocy That Never Dies

Holding On To the Solution

Apple Religion

Long-Term Planning

What You Have To Give Up

The End of Elitism

Good and Evil

Life

How Religion Distorts Science

Laziness and Creativity

Sideloading and the Supersized-Mastodon-In-the-Room That Snell Doesn’t See

Long-Term Self-Delusion

App Store Success Won’t Translate To Books, Movies, and Shows

Silly iPad Spoilsports

I Disagree

Five Rational Counterarguments

Majority Report

Simply Unjust

Zooman Science

Reaganomics — Like A Diet — Works

Free R&D?

Apple’s On the Right Track

Mountains of Evidence

What We Do

Hope Conquers All

Humans Are Special — Just Not That Special

Life = Survival of the Fittest

Excuse Me, We’re Going To Build On Your Property

No Trademark iWorries

Knowing

Twisted Excuses

The Fall of Google

Real Painters

The Meaning of Kicking Ass

How To Really Stop Casual Movie Disc Ripping

The Solitary Path of the High-Talent Programmer

Fixing, Not Preaching

Why Blackmail Is Still Illegal

Designers Cannot Do Anything Imaginable

Wise Dr. Drew

Rats In A Too-Small Cage

Coming To Reason

Everything Isn’t Moving To the Web

Pragmatics, Not Rights

Grey Zone

Methodologically Dogmatic

The Purpose of Language

The Punishment Defines the Crime

Two Many Cooks

Pragmatism

One Last Splurge

Making Money

What Heaven and Hell Are Really About

America — The Last Suburb

Hoarding

What the Cloud Isn’t For

Diminishing Returns

What You’re Seeing

What My Life Needs To Be

Taking An Early Retirement

Office Buildings

A, B, C, D, Pointless Relativity

Stephen Meyer and Michael Medved — Where Is ID Going?

If You Didn’t Vote — Complain Away

iPhone Party-Poopers Redux

What Free Will Is Really About

Spectacularly Well

Pointless Wrappers

PTED — The P Is Silent

Out of Sync

Stupid Stickers

Security Through Normalcy

The Case For Corporate Bonuses

Movie Copyrights Are Forever

Permitted By Whom?

Quantum Cognition and Other Hogwash

The Problem With Message Theory

Bell’s Boring Inequality and the Insanity of the Gaps

Paying the Rent At the 6 Park Avenue Apartments

Primary + Reviewer — An Alternative IT Plan For Corporations

Yes Yes Yes

Feelings

Hey Hey Whine Whine

Microsoft About Microsoft Visual Microsoft Studio Microsoft

Hidden Purple Tiger

Forest Fair Mall and the Second Lamborghini

Intelligent Design — The Straight Dope

Maxwell’s Demon — Three Real-World Examples

Zealots

Entitlement BS

Agenderle

Mutations

Einstein’s Error — The Confusion of Laws With Their Effects

The Museum Is the Art

Polly Sooth the Air Rage

The Truth

The Darkness

Morality = STDs?

Fulfilling the Moral Duty To Disdain

MustWinForSure

Choice

Real Design

The Two Rules of Great Programming

Cynicism

The End of the Nerds

Poverty — Humanity’s Damage Control

Berners-Lee’s Rating System = Google

The Secret Anti-MP3 Trick In “Independent Women” and “You Sang To Me”

ID and the Large Hadron Collider Scare

Not A Bluff

The Fall of Microsoft

Life Sucks When You’re Not Winning

Aware

The Old-Fashioned Way

The Old People Who Pop Into Existence

Theodicy — A Big Stack of Papers

The Designed, Cause-and-Effect Brain

Mosaics

IC Counterarguments

The Capitalist’s Imaginary Line

Education Isn’t Eve­ry­thing

I Don’t Know

Funny iPhone Party-Poopers

Avoiding Conflict At All Costs

Behavior and Free Will, Unconfused

“Reduced To” Absurdum

Suzie and Bubba Redneck — the Carriers of Intelligence

Everything You Need To Know About Haldane’s Dilemma

Darwin + Hitler = Baloney

Meta-ware

Designed For Combat

Speed Racer R Us

Bold — Uh-huh

Conscious of Consciousness

Future Perfect

Where Real and Yahoo Went Wrong

The Purpose of Surface

Eradicating Religion Won’t Eradicate War

Documentation Overkill

A Tale of Two Movies

The Changing Face of Sam Adams

Dinesh D’Souza On ID

Why Quintic (and Higher) Polynomials Have No Algebraic Solution

Translation of Paul Graham’s Footnote To Plain English

What Happened To Moore’s Law?

Goldston On ID

The End of Martial Law

The Two Faces of Evolution

A Fine Recommendation

Free Will and Population Statistics

Dennett/D’Souza Debate — D’Souza

Dennett/D’Souza Debate — Dennett

The Non-Euclidean Geometry That Wasn’t There

Defective Attitude Towards Suburbia

The Twin Deficit Phantoms

Sleep Sync and Vertical Hold

More FUD In Your Eye

The Myth of Rubbernecking

Keeping Intelligent Design Honest

Failure of the Amiga — Not Just Mismanagement

Maxwell’s Silver Hammer = Be My Honey Do?

End Unsecured Debt

The Digits of Pi Cannot Be Sequentially Generated By A Computer Program

Faster Is Better

Goals Can’t Be Avoided

Propped-Up Products

Ignoring ID Won’t Work

The Crabs and the Bucket

Communism As A Side Effect of the Transition To Capitalism

Google and Wikipedia, Revisited

National Geographic’s Obesity BS

Cavemen

Theodicy Is For Losers

Seattle Redux

Quitting

Living Well

A Memory of Gateway

Is Apple’s Font Rendering Really Non-Pixel-Aware?

Humans Are Complexity, Not Choice

A Subtle Shift

Moralism — The Emperor’s New Success

Code Is Our Friend

The Edge of Religion

The Dark Side of Pixel-Aware Font Rendering

The Futility of DVD Encryption

ID Isn’t About Size or Speed

Blood-Curdling Screams

ID Venn Diagram

Rich and Good-Looking? Why Libertarianism Goes Nowhere

FUV — Fear, Uncertainty, and Vista

Malware Isn’t About Total Control

Howard = Second Coming?

Doomsday? Or Just Another Sunday

The Real Function of Wikipedia In A Google World

Objective-C Philosophy

Clarity From Cisco

2007 Macworld Keynote Prediction

FUZ — Fear, Uncertainty, and Zune

No Fear — The Most Important Thing About Intelligent Design

How About A Rational Theodicy

Napster and the Subscription Model

Intelligent Design — Introduction

The One Feature I Want To See In Apple’s Safari.