Why do we keep changing things and, at the same time, making them more complicated?
We should be making things simpler.
Firstly, I need to provide some context. My PC died: an intermittent hardware fault that started some time ago finally killed the entire system. I had backups and even had access to a hard-disk reader. Unfortunately, for about a week, I had no machine on which to load them.
Now I have a new machine running (and I will skip over the days of setting up, reloading familiar software, and fighting subscription systems that began to sap the will to live). The new machine is running Windows 10, and the two most important applications are Office 2016 and Adobe Creative Suite. These three exhibit something that is increasingly irritating – they have made many changes, in particular to their user interfaces, and the default settings are visually messy. Both Microsoft and Adobe seem to have fallen in love with white letters on a black background for the default menus, etc. Those of us who were using computers in the early 1980s will remember the pleasure that the move to white screens with black letters gave us. Black on white has been proven in much research to provide optimal legibility, yet two companies that have carried out user workshops, and, in Adobe’s case, have a massive typographical heritage, have chosen to abandon it. Yes, I now know that, in both cases, you can delve down through the menus and change the defaults, but why should you have to do that?
It’s not just the Microsoft programs. Other applications display the same trends: my phone has just updated itself, and a number of its applications have changed completely. In a different field, my car is 15 years old, and aspects of its human machine interface (HMI) are poor. Take climate control: in earlier cars you just reached out and turned a knob to adjust the temperature; the next knob along controlled fan speed, and another clicked between different distribution modes (feet only, windscreen only, etc.). My mature car has a row of buttons: temperature is controlled by one button for warmer and another for cooler. Next to them are on-off buttons for cabin or footwell, then more up and down buttons for fan speed, and then on/off for the windscreen demister. In each case, you need to take your eyes off the road to identify the button and then look again to see what has been set.
I have been thinking of getting a newer car, and many have a single touch screen to carry out multiple functions, echoing the iPad with its range of swipes, pinches, etc. I’ve come across these in hire cars and hate them: there is no way that you can use them without looking at them – a concentrated look, not just a quick glance. In part, my hatred is intensified because the lessons we have learned about typography, graphic design, and HMIs over many years are being thrown out of the window, just as with the white-on-black obsession I mentioned earlier.
Currently there are at least three groups of people designing graphical interfaces and HMIs: software guys, graphic designers, and HMI specialists. HMI specialists normally started life in an engineering role and then focused on the issues of how people interact with machines, often starting from a health and safety perspective. Graphic designers have spent years learning, both in training and then through experience, how type and images work and how colours can be combined. When they work on a project, they don’t have to invent things; they use their experience.
Now let us look at a system that is being developed. It requires human input and needs to display information to allow that input to be informed. Today, the team would normally turn to a graphical touch panel. Usually, the panel supplier provides a set of tools to accept input and to display output, so one member of the software team gets the job of “pulling together the interface.” After all, this is “free,” while getting a graphic designer or an HMI specialist involved costs extra money – for which no one budgeted. The software guy gets to work and has to think through from scratch all the different elements that are needed (assuming that a specification for the interface even exists). Most of the time, you get a result that works – even though it may be unattractive. Occasionally, you get an interface that is dangerous: there was a drug delivery system (admittedly using a keyboard and mouse type interface to a screen) where, if the dose setting was cancelled and then re-entered, the decimal point could appear in the wrong place – giving a potentially massive overdose.
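To see how easily such a fault can creep in, here is a hypothetical sketch – not the actual device’s code, just an illustration of the failure mode – of a dose-entry keypad where cancelling an entry clears the digits but forgets to clear a “decimal point already entered” flag, so the next decimal point press is silently ignored:

```python
class DoseEntry:
    """Toy model of a numeric dose-entry field on a medical device."""

    def __init__(self):
        self.digits = []          # digit keys pressed so far
        self.decimal_pos = None   # index where '.' was pressed
        self.decimal_entered = False

    def press(self, key):
        if key == '.':
            # Accept only the first decimal point in an entry
            if not self.decimal_entered:
                self.decimal_entered = True
                self.decimal_pos = len(self.digits)
        else:
            self.digits.append(key)

    def cancel(self):
        self.digits.clear()
        self.decimal_pos = None
        # BUG: decimal_entered is not reset here, so after a cancel
        # the next '.' keypress is silently discarded.

    def value(self):
        text = ''.join(self.digits) or '0'
        if self.decimal_pos is not None:
            text = text[:self.decimal_pos] + '.' + text[self.decimal_pos:]
        return float(text)


entry = DoseEntry()
for key in "2.5":
    entry.press(key)
print(entry.value())   # 2.5 – correct first entry

entry.cancel()
for key in "2.5":      # operator re-enters the same dose
    entry.press(key)
print(entry.value())   # 25.0 – the '.' was ignored: a tenfold overdose
```

The fix is trivial (reset all three pieces of state in `cancel()`), which is exactly the point: the danger lies not in the difficulty of the code but in nobody with HMI experience asking what happens on the cancel-and-re-enter path.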
A grossly overused word is “intuitive,” as in “This interface is intuitive – red signals a problem while green is good.” But, in Chinese traditions, red is good (just as white is associated with death and mourning). Those of us brought up in the West learn to associate red with danger at a very early age. Without wanting to start a flame war, I’ll point out that people writing software are not typical of the average user, and what they regard as intuitive is not always what the average user will regard as such.
Good design (and good HMI design) is not just a pretty skin – it can be a source of commercial success. You only have to look at Apple, where design has always been an important thread and which regularly sets the standard by which other companies are measured. This is not an accident: the Chief Design Officer, Jony Ive (Sir Jonathan Ive), reports to the CEO, and his page on the Apple website says,
“Jony is responsible for all design at Apple, including the look and feel of Apple hardware, user interface, packaging, major architectural projects such as Apple Campus 2 and Apple’s retail stores, as well as new ideas and future initiatives.”
This is very different to giving one of the software team the task of “pulling together the interface.”
With PC software, another issue is the assumption that the user is always connected. I overuse Microsoft Solitaire: previously I just clicked on an icon and was immediately wasting time with the same Solitaire that has been around since Windows 3.1. Now the first click takes me to a revoltingly garish screen of multiple windows, many of which connect to the internet; two more clicks take me through to the actual game – which again has been changed, and again for the worse. There was also a message that seemed to suggest that my score was being stored in the cloud, while the screen displaying the score has two advertising links and a suggestion that I upgrade (for a charge) to get rid of them. Additionally, at different times, Windows keeps popping up messages suggesting that I use Microsoft’s OneDrive cloud storage.
Adobe no longer has a one-time licence fee for its software; instead, you pay a monthly fee, and, when you use the software, you are connected to the web. From Adobe’s viewpoint, this approach is attractive: they can keep tweaking the software without rolling out formal releases, and they have a nice steady flow of cash, which they can presumably increase at short notice. I have a good internet connection (over 80 Mbps), and it is available fairly consistently, but I am allergic to keeping so much in the cloud.
Is there an argument for saying that we should look at freezing certain things? I fear that it is an untenable approach: what would Microsoft and Adobe do with their vast armies of developers? I am reminded of Hutber’s Law. Patrick Hutber was a British business journalist who, in the late 1960s, coined the phrase “improvement means deterioration”, based on his observation that when a company announced some great new improvement in a product or service, there was often a corresponding downgrading of another aspect.
Now you could dismiss this as the rants of an old man who is out of touch with the real needs of technology. But pop over to http://www.commitstrip.com/en/2017/08/03/its-an-improvement/ and see how, in six frames, it sums up the argument beautifully.
Now I must get back to trying to find what has happened to my paragraph styles.