Bri Manning

Response to “The Coming Software Apocalypse”

October 10, 2017

Recently, I read an article titled “The Coming Software Apocalypse.” It starts simply enough: today’s software is complex, hard to understand, pervasive, and, worst of all, used in highly critical systems.

These things are all true.

There are things in the article that are not, however.

I won’t really go into the Toyota unintended-acceleration recall, but suffice it to say, the root cause is still open for debate. Toyota may have been held civilly liable, but not criminally responsible, and there’s plenty of reasonable doubt. For another take, Malcolm Gladwell’s Revisionist History podcast has a good episode about the Toyota controversy.

One quote really got to me though:

“The problem is that programmers are having a hard time keeping up with their own creations. Since the 1980s, the way programmers work and the tools they use have changed remarkably little.”

At first, I had a hard time comprehending just how false this is. The best analogy I can come up with: it’s like saying that making cars in the 80s was the same as making them now because assembly lines haven’t changed.

For one, Visual Studio, the IDE mentioned in the article as being used by a third of programmers, was released in February 1997. It just hit its 20th birthday, and in that time it has changed drastically, something I can easily attest to, having used it for about four years of my career.

In my just-over-a-decade career alone, things are drastically different. The tools are different: good luck creating a modern iPhone app on a computer from 2007, when the iPhone was first released, let alone on a computer from the 80s. Large-scale web applications weren’t a thing back then either. In the early 2000s, when websites were exploding in size, new technologies were being invented just to deal with the traffic. AJAX, which nearly all websites use now, wasn’t even a named technique until 2005.

If you really want to go all the way back to the 80s, you can easily point to the Internet, websites, mobile devices, and touch screens. Did some version of each exist in some cases? Yes, but nothing like what exists now, let alone anything as pervasive.

Now, to the crux of the issue.

The author seems to think computer-generated code is new. What he apparently doesn’t realize is that it’s built into many existing frameworks. You get it from every data-access and ORM layer out there. Want to know why people eventually move away from those? Because the queries the computer generates aren’t efficient; they end up slower than something hand-written. Do they work for most cases? Yes, and that’s when you have a well-defined structure. The classic failure mode is sketched below.
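To make that concrete, here’s a minimal sketch of the “N+1 query” pattern that ORM-generated SQL tends to fall into, using SQLAlchemy 2.0-style models. The Author/Post schema and all the names here are hypothetical, mine rather than anything from the article:

```python
from sqlalchemy import ForeignKey, create_engine, text
from sqlalchemy.orm import (DeclarativeBase, Mapped, Session,
                            mapped_column, relationship)

class Base(DeclarativeBase):
    pass

class Author(Base):
    __tablename__ = "authors"
    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str]
    posts: Mapped[list["Post"]] = relationship(back_populates="author")

class Post(Base):
    __tablename__ = "posts"
    id: Mapped[int] = mapped_column(primary_key=True)
    title: Mapped[str]
    author_id: Mapped[int] = mapped_column(ForeignKey("authors.id"))
    author: Mapped["Author"] = relationship(back_populates="posts")

engine = create_engine("sqlite://", echo=True)  # echo=True logs every generated query
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Author(name="Ada", posts=[Post(title="On Engines")]))
    session.commit()

    # ORM default (lazy loading): one query for the authors, then one
    # additional query per author to fetch that author's posts -- "N+1".
    for author in session.query(Author).all():
        print(author.name, [post.title for post in author.posts])

    # Hand-written equivalent: a single JOIN, one round trip.
    for name, title in session.execute(text(
        "SELECT a.name, p.title "
        "FROM authors a JOIN posts p ON p.author_id = a.id"
    )):
        print(name, title)
```

With echo=True you can watch the ORM issue one SELECT for the authors and another SELECT for each author’s posts, while the hand-written version does the same work in a single round trip.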

He mentions TLA+ and how it helps, but most programmers can’t read it because they were never taught the underlying mathematical concepts. I was a math major, and I can tell you that most problems I’ve needed to solve as a developer haven’t required higher math.

I’m not positive this is the case, but I’d venture a guess that code is tested more now than ever before. That hypothesis comes from the plethora of testing frameworks out there and from the fact that every new framework advertises how testing is easy and/or built-in. That’s not even mentioning that error reporting is a mainstay of all mobile and web apps. A sketch of just how low the barrier to testing is appears below.
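As an illustration of how little ceremony modern frameworks demand, here’s a complete test file for pytest, one of many such frameworks. The slugify function is a hypothetical function under test, not anything from the article:

```python
# test_slug.py -- run with: pytest test_slug.py

def slugify(title: str) -> str:
    """Lowercase a title and join its words with hyphens."""
    return "-".join(title.lower().split())

def test_slugify_collapses_whitespace():
    assert slugify("The  Coming   Software Apocalypse") == (
        "the-coming-software-apocalypse"
    )

def test_slugify_handles_a_single_word():
    assert slugify("Apocalypse") == "apocalypse"
```

No boilerplate and no runner configuration: pytest discovers any test_-prefixed function on its own, and a plain assert is the whole assertion API.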

Finally, the apocalypse part. Will there be crashes and losses of service? Yes; it happened to me recently with MBTA payments. It’s obviously worse when critical systems are involved. Those systems should be tested more rigorously, and they are.

Failure is very human. “Everyone makes mistakes” is one of the oldest clichés. And even when someone is making software to make it easier for someone else to make software, there are still humans making software. Human nature and failure go hand in hand, but so does our response to that failure: learning from our mistakes and improving.