I’m going to start possibly a little earlier than you imagined… in 1880.

1880 – 1920

There was a young man in his twenties by the name of Frederick Taylor who started work as a clerk at a steel manufacturer in Philadelphia, Pennsylvania. Within a couple of years he was promoted to foreman. But like any good capitalist of his day (or any day for that matter), he wasn’t known to be particularly compassionate when it came to his workforce.

Workers in a factory.
[I am] constantly impressed by the failure of my [team] to produce more than about one-third of a good day's work.
Frederick Taylor (1881)

The general opinion of the day was that certain trades, including steelwork, could only be performed by craft production methods immune to any sort of workflow analysis or standardisation of process. It was something unique to each individual, and embodied in the inherent skills they passed on from master to apprentice.

Taylor believed that was nonsense. He felt that unproductivity, disguised as craftsmanship, was a deliberately selfish attempt by workers to keep employers ignorant of how fast work could actually be carried out. Because industry at that time was still largely dependent on human energy, this had a huge impact on economic output and profitability.

Taylor wanted to transform management into a set of calculated and written techniques. At the time, management techniques were largely undocumented. So he set himself a task: to determine how long it should take a man to perform a given piece of work. He looked at things like unloading railroad cars of ore by shovel, lifting and carrying iron bars, and so on.

And he did actually stumble on some unconventional thinking for his day. He observed that mental and physical fatigue negatively affected productivity (who would have thought it?). He concluded it was more productive to include rest breaks so that workers had time to recover. He also introduced a primitive version of performance-related pay: the more output each worker produced, the more they would be paid. But it's worth remembering that the key aim of all this was not the physical or mental safety of his workers, but simply to maximise the profit he could extract from them. Despite his management innovations, he was fairly old-school when it came to human relations.

He called the approach Scientific Management, and it really took off. This was one of the earliest known instances of applying science to the engineering of processes.

Scientific Management was based on a simple theory: work could be divided into constituent tasks, with each task timed and then reordered or reconstructed into the most efficient way of working. He called this the “One Best Way” of performing any job or task, with the aim of increasing the productivity and efficiency of his labour force.

His work was superseded in the years that followed by Lillian Gilbreth and her husband Frank, experts in Scientific Management who specialised in recording what are known as ‘time-and-motion’ studies. These filmed observations enabled the Gilbreths to redesign machinery to reduce fatigue and provide more natural, intuitive and less strenuous movements for workers.

The Gilbreths went further than Taylor by introducing a more human approach to Scientific Management. They helped establish the importance of the psychological dimension of work and wellbeing: improving lighting, providing regular breaks, and introducing suggestion boxes and free books.

The Scientific Management method remained popular until World War One. By the time it was over, the Russians had been through a revolution of their own and were forming radically different ideas about Man’s relationship to work.

1920 – 1960

In the early 20s, Lenin organised a conference on the Scientific Organization of Labour, where Scientific Management came under some serious scrutiny. The Soviet communists rejected the idea of turning a man into a machine.

Dull monotonous work was merely a temporary stage for human civilisation to go through until a machine could be developed to replace the worker.
Vladimir Bekhterev (1921)

The Soviets were interested in protecting and even improving the life of the worker.

As the twenties turned into the thirties, mass-produced machines became more commonplace.

Any customer can have a car painted any colour that he wants so long as it is black.
Henry Ford (1922)

Henry Ford was revolutionising the transport industry (even if customer choice still had some way to go).

When World War Two got into full swing in the 1940s, it provided the first concrete example of technology challenging the limits of what a human was capable of. By this time, Taylor’s “One Best Way” felt hopelessly out of date. Rapid decision-making, situational awareness and hand-eye coordination became critical factors in the success or failure of executing a task, with life-or-death consequences.

Even the best-trained pilots were crashing expensive aircraft. Military research showed that “pilot error”, as it was called, could be reduced by designing clearer, more logical controls and displays that operators found easier to use.

The many controls in the cockpit of a World War Two fighter plane.

After the war, this military research found its way into the public domain as increasingly complex technology became more widespread.

The “fit” between people, their equipment or machinery, and the environment became ever more important. So around this time the first professional body devoted to this almost symbiotic relationship between man and technology was formed, here in the UK. It would ultimately become the Chartered Institute of Ergonomics and Human Factors.

Around the same time, the atomic bomb was dropped on Japan, bringing World War Two to a close. One of the leading scientists behind the Manhattan Project that produced the bomb was Vannevar Bush.

Concerned about science’s wartime turn toward destruction, Bush published an essay in The Atlantic called As We May Think, which anticipated many aspects of our contemporary world, including the world wide web.

Not much of any great interest seemed to happen in response to Bush’s speculative idea until around 20 years later, in the late 60s…

1960 – 1980

A screenshot from The Mother Of All Demos showing Douglas Engelbart wearing a headset.
The mother of all demos

In 1968 Douglas Engelbart delivered what is known in technology circles as the mother of all demos. The computer he was demoing included some incredibly pioneering technology: a mouse-driven cursor and multiple windows used to display hypertext and hyperlinks, the precursor to the world wide web. Engelbart acknowledged he had been inspired, in part, by the Memex machine suggested by Vannevar Bush back in 1945.

It was still to take another ten to fifteen years or so before the computer became a household consumer product, but by the time it did, the concept of Human-Computer Interaction had also emerged.

Human-Computer Interaction (or HCI) is a field traceable back to its parent, Ergonomics and Human Factors. But with computers becoming the prevalent technology people interacted with, the focus turned specifically to the relationship between a user and their computer interface.

Also around the turn of the 80s, something interesting was happening back in Pennsylvania, where the journey all started with Frederick Taylor 100 years before.

1980 – 2000

Harrisburg International Airport in Pennsylvania. In the background is Three Mile Island.

In 1979 a nuclear reactor at Three Mile Island failed. It remains the most serious accident in US nuclear power plant history. One of the key reasons behind the scale of the failure was a problem with the control room user interface, which operators believed was indicating something it wasn’t. By the time they realised what was happening, it was too late.

A man called Donald Norman was part of a special team flown in to investigate the cause of the Three Mile Island incident, and one of those who identified the failure of Human-Computer Interaction as a contributing factor.

Donald Norman went on to become a pioneer of the emerging field that became known as user-centered design. The phrase originated in his research laboratory at the University of California, San Diego, and became a well-known term in the Human-Computer Interaction field by the mid eighties.

UCD, or Human-Centred Design (HCD), is an approach and methodology for designing products and services that better meet the needs of their intended audience, thereby making them more likely to succeed.

When Don Norman travelled to London in the mid eighties, he found a whole host of frustrating conventions for interacting with everyday objects that were different from those in his native California. Doors that looked like they should be pushed instead of pulled, or pulled instead of pushed. Taps that only emitted freezing cold or boiling hot water. Light switches and toilet flushes that seemed counterintuitive. The trip inspired his seminal book The Design of Everyday Things, in which he talks about how to design for people and their experiences.

And to be honest, haven’t we all pulled a door that was meant to be pushed, because the handle was misleading?

A door handle with a pushable affordance labelled with the word pull.

User experiences are everywhere, not just for people interacting with computers or technology. And unfortunately that means badly designed user experiences are also everywhere.

Without a doubt, many of these things have been designed by designers. But these are cautionary tales about designing things without a deep, observed understanding of how people interact with their environment.

Ever heard of desire paths?

A green area with a pathway trampled across it.

Public spaces aren’t always designed efficiently to get us from A to B. The world doesn’t work as intuitively as it could. But, as you can see, humanity finds a way!

An aerial view of the grounds of Michigan State University.

A good user experience is sometimes as obvious as simply observing what people do intuitively. Smart designers, such as those who planned Michigan State University, treat the users of a system as equal partners to those who create it. When they had to decide on their campus footpaths, they simply made none to begin with. They waited for the students to arrive and find their own way from building to building, then paved over the trails.

It's worth remembering that in the mid-eighties computer technology was still a relatively niche affair. A computer was not readily available to the masses as an affordable consumer device.

In 1984 only 19 million US citizens had access to a computer at home. That’s about 8% of the population at that time.

Ten years later, by the mid nineties, that number had trebled to 60 million.

Around this time Donald Norman joined Apple as a User Experience Architect, the first known use of the phrase "User Experience" in a job title and the reason why Don Norman is considered the godfather of User Experience.

At the turn of the century the number of people with access to the internet was 740 million.

Ten years later it more than doubled to 2 billion.

Today, that number is approaching half of the world's population of around 7.5 billion. Human-computer interaction through devices such as our mobile phones is at a scale like never before.

UX Today

Why do we need to create good user experiences? Hopefully some of what I’ve already shared naturally answers that question.

When humans interact with computers and technology, a good user experience can, in extreme situations, save lives; in more mundane situations, it can minimise or avoid unnecessary physical or mental stress.

As you can also see, UX is not really a new thing. It might seem new to your organisation and its design process, but in fact it’s been emerging since before the dawn of the internet, back in the 80s, and people have been looking to solve similar problems for almost 140 years.

At Clearleft, our designers believe that a good user experience is synonymous with user-centered design. And user-centered design, at its core, is a toolkit of methods and techniques to help improve our ability to design products and services that meet the needs of those who use them.