

We are serving the wrong masters

This 2018 essay by Don Norman, published at Fast Company, strikes a chord with me:

We have unwittingly accepted the paradigm that technology comes first, with people relegated to doing the actions that the machines cannot do. This requires people to act like machines, ever ready to take over when things go wrong.

As a result, we require people to do tedious, repetitive tasks, to be alert for long periods, ready to respond at a moment’s notice: all things people are bad at doing. When the inevitable errors and accidents occur, people are blamed for “human error.” The view is so prevalent that many times the people involved blame themselves, saying things like “I knew better” or “I should have paid more attention,” not recognizing that the demands of the technology made these errors inevitable.

Over 90% of industrial and automobile accidents are blamed on human error with distraction listed as a major cause. Can this be true? Look, if 5% of accidents were caused by human error, I would believe it. But when it is 90%, there must be some other reason, namely, that people are asked to do tasks that people should not be doing. Tasks that violate fundamental human abilities.

Consider the words we use to describe the result: human error, distraction, lack of attention, sloppiness–all negative terms, all implying the inferiority of people. Distraction, in particular, is the byword of the day–responsible for everything from poor interpersonal relationships to car accidents. But what does the term really mean?

It’s not just fatalities that result from this over-complexity and lack of consideration for users in design. I’d argue that a lot of fraud and theft is facilitated by the same over-complexity, and the ‘solution’ is invariably to add more complexity and make users jump through extra hoops. Disturbingly, whether it’s aircraft makers, financial institutions or social networks, the common factor is that developers, manufacturers and business owners put the onus on the user to do better next time rather than fix the design.

Just think about your life today, obeying the dictates of technology–waking up to alarm clocks (even if disguised as music or news); spending hours every day fixing, patching, rebooting, inventing work-arounds; answering the constant barrage of emails, tweets, text messages, and instant this and that; being fearful of falling for some new scam or phishing attack; constantly upgrading everything; and having to remember an unwieldy number of passwords and personal inane questions for security, such as the name of your least-liked friend in fourth grade. We are serving the wrong masters.

We need to switch from a technology-centric view of the world to a people-centric one. We should start with people’s abilities and create technology that enhances people’s capabilities: Why are we doing it backwards?

Unfortunately, as long as ‘solving’ this complexity is seen as an opportunity to get people to buy and use more stuff in the hope of making their lives easier, change is going to be slow. (That’s not a knock against the makers of password managers and GTD apps, to name two examples, but their products exist to solve problems that stem from other people’s bad design decisions.)


If you'd like to comment, send me an email.