The Immorality of Bad Software Design

You surely saw the news: At 8:07 on January 13, a quiet Saturday morning in Honolulu, Hawaii’s Emergency Management Agency sent out to a million cell phones a text that read, “BALLISTIC MISSILE THREAT INBOUND TO HAWAII. SEEK IMMEDIATE SHELTER. THIS IS NOT A DRILL.”

As it happens, no ballistic missiles were falling on the Hawaiian islands, and 38 minutes after the initial alert (which cell-phone carriers were legally obligated to pass along to local users) the Emergency Management Agency issued a correction. An employee’s error caused the miscommunication, the agency’s spokesman, Richard Rapoza, would tell the Washington Post the next day.

Rapoza was not exactly wrong. In his version of the story, a still-unnamed worker at the end of a shift, doing a scheduled test of the system, clicked the wrong option in a list of blue-underlined links on a computer screen in the emergency system’s office. Further official statements have noted that no one has yet been fired for the error, but Rapoza was adamant that a simple human mistake caused the false alert. And in the screenshot the agency released, the mistake looked not just possible but easy to make, with the link for “Test Missile Alert” close to, and nearly indistinguishable from, the link for “Missile Alert.”

The Hawaiian agency later modified its story, probably in response to complaints from the software manufacturer, which the Hawaiian television station KHON2 has identified as an Idaho company named AlertSense. The Emergency Management Agency replaced its first screenshot with a second, showing a Windows-style drop-down menu instead of a 1990s-era set of hotlinks. And, the agency confessed, the employee also had to make a second bad choice, clicking “Yes” on a confirmation pop-up. While not admitting to being the manufacturer, AlertSense has given demonstrations of its software to KHON2 and The Verge in recent days, by way of suggesting that the Hawaiian employee did, in fact, have to go through multiple menus and clicks to send out the false alert, beginning with choosing what the company called a “wrong template option.”


* * *

Think about that phrase “template option” for a moment. What does it mean? What could it mean? We can puzzle it out, of course. Computer jargon has contained the word “template” long enough that users of Microsoft Word or Excel should have an idea of what it might mean in the context of an emergency alert program. For that matter, the Idaho company has been certified by the Federal Emergency Management Agency, and the state of Hawaii, following federal guidelines, requires a course of training for all employees who will use the program.

But why should users have to take an extended course for what is, after all, a fairly simple application? Why are alerts structured as templates in the first place? The screenshots available thus far show a system filled with obscure jargon and inconsistent phrasings. Worse, the program seems to have been organized into categories that reflect how it was constructed rather than how it is meant to be used.

From the programmer’s perspective, a test alert and an actual alert look a lot alike. After all, they share the same format as messages. They vary only in their target audience: agency staff for the test, and both agency staff and external cell-phone users for the actual alert. It makes sense for a programmer to construct identically structured messages according to shared templates.

Unfortunately, the user who considers them identical risks mistaking one for the other—and the consequences of that mistake are so bad that the test alerts and the actual alerts should never have been options on the same menu.
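To make that concrete, here is a minimal, hypothetical sketch in Python. It is not AlertSense’s code, and every name in it is invented; it simply illustrates the principle that a drill and a live alert can be kept as separate kinds of things, so that the code path that reaches the public cannot even accept a test message, and the live path demands extra, deliberate effort.

```python
from dataclasses import dataclass

# Hypothetical sketch: drills and live alerts get distinct types, so a
# drill can never be handed to the code path that reaches the public.

@dataclass(frozen=True)
class TestAlert:
    message: str  # seen only by agency staff

@dataclass(frozen=True)
class LiveAlert:
    message: str  # seen by agency staff AND every cell phone in the state

def send_to_staff(alert) -> None:
    """Drills and live alerts alike are shown to agency staff."""
    print(f"[STAFF] {alert.message}")

def broadcast_to_public(alert: LiveAlert, confirmation_phrase: str) -> None:
    """Only a LiveAlert can be broadcast, and only after the operator
    retypes an explicit confirmation phrase: friction in proportion to
    the consequences of a mistake."""
    if not isinstance(alert, LiveAlert):
        raise TypeError("Test alerts can never reach the public.")
    if confirmation_phrase != "SEND LIVE ALERT":
        raise ValueError("Operator must type the confirmation phrase exactly.")
    print(f"[PUBLIC] {alert.message}")

# A drill exercises only the staff path; the public path refuses it.
send_to_staff(TestAlert("THIS IS A DRILL."))
# broadcast_to_public(TestAlert("THIS IS A DRILL."), "SEND LIVE ALERT")  # raises TypeError
```

The particular mechanism matters less than the principle: the categories the user faces should reflect the consequences of acting on them, not the convenience of sharing a template.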

Somewhere along the line, the software developer and the Hawaiian Emergency Management Agency lost track of the fact that the users’ experience of a program is not the programmers’ organization of the program. A few clicks from a tired employee, and a million cell phones told the citizens of Hawaii that they should be in terror for their lives. State police and ambulance officials have reported car accidents and heart attacks during the nearly 40 minutes it took to correct the error, along with an enormous spike in panicked calls to 911.

The design of a computer program’s interface is not simply a matter of aesthetics. It is a matter of ethics. Overly complex arrangements are not just ugly to stare at all day on a computer screen. They are also immoral. They steal time and energy from daily work. They require more training than should be necessary, raising the cost of an employee entering the workforce. And when they fail, they can fail catastrophically, as they did for the Hawaiian Emergency Management Agency.

The employee who made the mistake should be fired, yes. But that mistake was made possible, perhaps even inevitable, by the programmers who wrote the code and the officials who managed the system. And they should be sued into the ground. What price a million panicked people?

It’s common in programming these days to recite the mantra that the user interface is not the user experience. But the more consequential a possible error is, the more we need to remember that users do not experience a program the way its developers organized the drop-down menus and pop-up options.

The security protocols demanded by the Federal Emergency Management Agency originate in the government’s fear of hackers, which is why both AlertSense and Hawaiian officials have been so mealy-mouthed about explaining what happened on January 13. But what we’ve seen is enough to know that the system used in Hawaii for alerts failed the first requirement of such software: It wasn’t idiot-proof, and the minor effort necessary to make a mistake was dramatically out of proportion to the major consequences that followed.
