HUMAN ERROR IS RISKY BUSINESS

By Mark Miller, Business Unit Director, Learning Solutions


But does human error really exist?

It’s a big question. But maybe there’s a small answer: not as much as you might think.

Putting this answer in context, though, needs a bigger explanation.

Firstly, what do we mean by ‘human’, and secondly, what do we mean by ‘error’? ‘Human’ probably doesn’t need much definition: if you’re reading this, you’ll fit the description. ‘Error’, for our purposes, is the failure of a system to deliver an expected outcome.

This article looks at widening our perspective on human error - and at adopting a different way of looking at people in organisations.
 

Human error or bad design?

One of the most famous case studies associated with alleged human error comes from the US Army Air Corps in the 1940s. Certain aircraft types were crashing because pilots were retracting the undercarriage just prior to landing.

This was attributed to pilot (i.e. human) error, and the investigation examined the pilots’ training, profiles and other factors. The actual cause was ergonomics.

The problem was that the undercarriage and flap levers sat side by side and looked identical. When coming in to land, a pilot activates the flaps to dump lift. The pilots were selecting the wrong lever during a high-pressure, complex point in the flight.

The solution was to redesign the undercarriage lever to look and feel like a wheel. That principle has been carried forward into cockpit control design ever since.
 

"So, is the human error a symptom of a system failure, rather than the cause itself?"


Is it better to redesign the lever, or to expend aircraft, time, training and probably lives trying to force pilots to select the correct one of two identical levers?
 


The motivation, conscious or otherwise, for blaming human operator error is not always clear. For instance, when a massive investment has been made in a technological system that turns out to be riddled with errors, it becomes difficult to justify large changes and adjustments to that system.


Abandoning or replacing the system is expensive in terms of financial, political and reputational capital. Even admitting it is not perfect might be unacceptable.


Human error - a symptom or the cause?

The consequences can become more serious, and easier to ignore, when there is no single major failure. Instead, the increased frequency of lesser modes of failure and error degrades performance, leading to cumulative losses and creeping failure.

Whatever the motivation, the proposed solution too often focuses on treating a “human operator error” - in fact a symptom rather than the cause. This is often just a pragmatic response, all other options being deemed unacceptable regardless of their effectiveness.

At this point we could take a whole new perspective. Maybe it’s not human error, but the human solution that is the issue?

For more in-depth analysis, read the full article, which takes a closer look at Human Systems in Action (HSIA) and considers how we could apply learning and development within an organisation when building resilient systems.