
Why So Complicated?

Why do we keep changing things and, at the same time, making them more complicated?

We should be making things simpler.

Firstly, I need to provide some context. My PC died. An intermittent hardware fault that started some time ago finally killed the entire system. I had back-ups and even had access to a hard disk reader. Unfortunately, for about a week, I had no machine on which to load the backup.

Now I have a new machine running (and I will skip over the days of setting up, reloading familiar software, and fighting subscription systems that began to sap the will to live). The new machine is running Windows 10, and the two most important applications are Office 2016 and Adobe Creative Suite. All three exhibit something that is increasingly irritating – they have made many changes, in particular to their user interfaces, and the default settings are visually messy. Both Microsoft and Adobe seem to have fallen in love with white letters on a black background for the default menus, etc. Those of us who were using computers in the early 1980s will remember the pleasure that the move to white screens with black letters gave us. Black on white has been proven in much research to provide optimal legibility, yet two companies that have carried out user workshops, and, in Adobe’s case, have a massive typographical heritage, have chosen white on black instead. Yes, I now know that, in both cases, you can delve down through the menus and change the defaults, but why should you have to?

It’s not just the Microsoft programmes. Other applications display the same trends: my phone has just updated itself, and a number of its applications have changed completely. In a different field, my car is 15 years old, and aspects of its human-machine interface (HMI) are poor. Take climate control: in earlier cars you just reached out and turned a knob up and down to adjust the temperature; the next knob along controlled fan speed; and another clicked between different distribution modes (feet only, windscreen only, etc.). My mature car has a row of buttons: temperature is controlled by one button for warmer and another for cooler. Next to them are on/off buttons for cabin or footwell, then more up and down buttons for fan speed, and then on/off for the windscreen demister. In each case, you need to take your eyes off the road to identify the button and then look again to see what has been set.

I have been thinking of getting a newer car, and many now have a single touch screen to carry out multiple functions, echoing the iPad with its range of swipes, pinches, and so on. I’ve come across these in hire cars and hate them: there is no way that you can use them without looking at them – a concentrated look, not just a quick glance. In part, my hatred is intensified because the lessons we have learned about typography, graphic design, and HMIs over many years are being thrown out of the window, just as with the white-on-black obsession I mentioned earlier.

Currently there are at least three groups of people designing graphical interfaces and HMIs: software guys, graphic designers, and HMI specialists. HMI specialists normally started life in an engineering role and then focused on the issues of how people interact with machines, often starting from a health and safety perspective. Graphic designers have spent years learning, both in training and then through experience, how type and images work and how colours can be combined. When they work on a project, they don’t have to invent things; they use their experience.

Now let us look at a system that is being developed. It requires human input and needs to display information to allow that input to be informed. Today, the team would normally turn to a graphical touch panel. Usually, the panel supplier provides a set of tools to accept input and to display output, so one member of the software team gets the job of “pulling together the interface.” After all, this is “free,” while getting a graphic designer or an HMI specialist involved costs extra money – for which no one budgeted. The software guy gets to work and has to think through from scratch all the different elements that are needed (assuming that a specification for the interface even exists). Most of the time, you get a result that works – even though it may be unattractive. Occasionally, you get an interface that is dangerous: there was a drug delivery system (admittedly with a keyboard-and-mouse interface to a screen) where, if the dose setting was cancelled and then re-entered, the decimal point could appear in the wrong place – giving a potential massive overdose.
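The drug-pump failure above is, at root, a stale-state bug: "cancel" clears what the user sees but not everything the program remembers. A minimal Python sketch of how that kind of bug can arise – the class, the method names, and the mechanism are my illustrative assumptions, not the actual device's code:

```python
# Hypothetical dose-entry model: digits are keyed one at a time, and the
# position of the decimal point is stored separately from the digit string.

def parse_dose(digits, decimal_pos):
    """Combine keyed digits with a stored decimal-point position."""
    if decimal_pos is None:
        return float(digits)
    return float(digits[:decimal_pos] + "." + digits[decimal_pos:])

class DoseEntry:
    def __init__(self):
        self.digits = ""
        self.decimal_pos = None  # index at which the user pressed "."

    def key(self, ch):
        """Handle one keypress: a digit or the decimal point."""
        if ch == ".":
            self.decimal_pos = len(self.digits)
        else:
            self.digits += ch

    def cancel_buggy(self):
        # Bug: clears the visible digits but forgets the stored decimal
        # position, so the next entry inherits a stale decimal point.
        self.digits = ""

    def cancel_fixed(self):
        # Fix: reset *all* entry state together, not just the display.
        self.digits = ""
        self.decimal_pos = None

    def value(self):
        return parse_dose(self.digits, self.decimal_pos)
```

With the buggy cancel, entering “2.5”, cancelling, and then keying “125” yields 1.25 rather than 125 – the stale decimal position shifts the point and produces a hundred-fold dosing error. The fix is trivial once seen; the danger is that nothing in the interface hints that hidden state survived the cancel.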

A grossly overused word is “intuitive,” as in “This interface is intuitive – red signals a problem while green is good.” But in Chinese tradition red is auspicious (just as white is associated with death and mourning); those of us brought up in the West learn to associate red with danger at a very early age. Without wanting to start a flame war, I’ll point out that people writing software are not typical of the average user, and what they regard as intuitive is not always what the average user will.

Good design (and good HMI design) is not just a pretty skin – it can be a source of commercial success. You need look only at Apple, where design has always been an important thread and where the company regularly sets the standard by which others are measured. This is no accident: the Chief Design Officer, Jony Ive (Sir Jonathan Ive), reports to the CEO, and his page on the Apple web site says,

“Jony is responsible for all design at Apple, including the look and feel of Apple hardware, user interface, packaging, major architectural projects such as Apple Campus 2 and Apple’s retail stores, as well as new ideas and future initiatives.”

This is very different to giving one of the software team the task of “pulling together the interface.”

With PC software, another issue is the assumption that the user is always connected. I overuse Microsoft Solitaire, and previously I just clicked on an icon and was immediately wasting time with the same solitaire that has been around since Windows 3.1. Now the first click takes me to a revoltingly garish screen of multiple windows, many of which connect to the internet; two more clicks take me through to the actual game – which has again been changed, and again for the worse. There was also a message that seemed to suggest that my score was being stored in the cloud, while the screen displaying the score carries two advertising links and a suggestion that I upgrade (for a charge) to get rid of them. And, at different times, Windows keeps popping up messages suggesting that I use Microsoft’s OneDrive cloud storage.

Adobe no longer has a one-time licence fee for its software; instead, you pay a monthly fee, and, when you use the software, you are connected to the web. From Adobe’s viewpoint, this approach works well: they can keep tweaking the software without rolling out formal releases, and they have a nice steady flow of cash, which they can presumably increase at short notice. I have a fast internet connection (over 80 Mbps), and it is available fairly consistently, but I am allergic to keeping so much in the cloud.

Is there an argument for freezing certain things? I fear it is an untenable approach: what would Microsoft and Adobe do with their vast armies of developers? I am reminded of Hutber’s Law. Patrick Hutber was a British business journalist who, in the late 1960s, coined the phrase “improvement means deterioration”. It was based on his observation that when a company announced some great new improvement in a product or service, there was often a corresponding downgrading of another aspect.

Now you could dismiss this as the rants of an old man who is out of touch with the real needs of technology. But pop over to http://www.commitstrip.com/en/2017/08/03/its-an-improvement/ and see how, in six frames, it has summed up the argument beautifully.

Now I must get back to trying to find what has happened to my paragraph styles.

4 thoughts on “Why So Complicated?”

  1. Interesting timing. I am typing this on my newly configured Windows 10 machine (as of about an hour ago), having suffered a hard drive crash a couple weeks ago. My experience is like yours, except that I inserted an attempt to reset the old computer with a recovery disk sent from the manufacturer for that specific machine. Oh, the horror of it all. Internet connection came and went randomly. Windows Explorer hung many times a day. And slow slow slow. Hence punting for a new machine.

    As I wrote in a rant a few years back, when I encountered the outrage that was Windows 8, I’m fine with a learning curve for doing new stuff. But there is always a learning curve with new versions just to get back to zero, to do the stuff I have been doing for years. It’s a huge productivity killer.

    One other improvement would be the ability to save the settings in use for apps. In the early Macintosh days, there was actually a settings file that you could copy over and have your settings restored. Firmament forfend that one should try such a trick with the Windows registry… As it is, it will probably be several weeks before I find all the things that I need to redo and reset to get back to where I was, with no new capability other than a main drive that works.

  2. Agree completely regarding the overuse of “intuitive.” It’s rarely true, or even close to accurate. I’ve seen some truly horrible UIs that the developers proudly described as intuitive, when they were anything but. My hunch is that the programmers spent so much time developing the code that they knew it inside and out. It was intuitive to them, but only because it was so familiar. They could operate it in their sleep. To a newcomer, it was (and remains) a complete and ugly mess.

  3. Regarding your aging car’s dashboard controls, it’s fun to pick up an automotive magazine (e.g., Car and Driver, Road & Track, etc.) and read the horrible reviews of some of the newer electronic instrument panels. Evidently Cadillac, Jaguar, and Land Rover are the worst offenders. Some automakers got the message and have gradually reintroduced real buttons and switches. The flashy LCD screens may have looked cool in the showroom and helped to sell a few cars, but they’re awful to operate safely, as you described.

  4. To be fair, about 30 years ago I hired a car (I can’t remember the make or model) that had about a dozen switches in two banks, identical except for a hieroglyph on each one. Not what you need after a transatlantic flight.

