How do we get even safer?

13 February 2014

Delve into aviation's ultra-safe world through the lens of Human Factors and Crew Resource Management, and discover the shift from a 'who's at fault' to a 'what's at fault' mindset.

Human Factors concerns in an ultra-safe environment such as aviation

In January 2014 the first South African Symposium for Human Factors and Aviation was held at the Birchwood Hotel in Johannesburg. Whilst there were many interesting talks, the first keynote presentation, by Professor Sidney Dekker, stood out. His talk, entitled “‘I wish I could get rid of people who make mistakes’ – Dealing with dreams of safety”, powerfully set the tone for the rest of the conference. Professor Dekker’s lecture has important implications for anyone working in an environment where safety is a concern: mining, medicine and construction, to name but a few. In this article I will attempt to summarise the main points of his lecture and what I’ve learnt from it.

First, some context: what are HF and CRM?

Human Factors and Ergonomics (HF&E) is a multidisciplinary field incorporating contributions from psychology, engineering, biomechanics, mechanobiology, industrial design, graphic design, statistics, operations research and anthropometry. In essence it is the study of designing equipment and devices that fit the human body and its cognitive abilities. The two terms "human factors" and "ergonomics" are essentially synonymous (Wikipedia). Crew Resource Management (CRM) is a set of training procedures for use in environments where human error can have devastating effects. Used primarily for improving air safety, CRM focuses on interpersonal communication, leadership, and decision making in the cockpit (Wikipedia). Now let’s get down to business...

There are two fundamental viewpoints when thinking about safety.

You might think your work environment is safe, except for those few ‘bad apples’ who keep making mistakes. You have told them over and over, tried issuing rules, putting up posters and signs - you have even had to reprimand or let go of some individuals because they are safety risks. Or perhaps you acknowledge that your work environment is amazingly complex, and that the people working in it somehow get things right more often than they get things wrong. People make mistakes, but each mistake is an opportunity to modify the environment and get rid of a potential booby trap. Professor Dekker argues that the first viewpoint quickly shows diminishing returns, and that the second is imperative for realising safety improvements in an already ultra-safe environment such as aviation. You shouldn’t be asking “Who is the problem?”; instead ask “What is the problem?”

The epic success of aviation safety, and how we got here

Many industries, from medicine to oil, are looking to learn from the aviation industry, but how did aviation get this safe? The late 1800s and early 1900s saw the rise of Taylorism and Scientific Management, the goal of which was to improve efficiency and productivity. Safety was a concern, but the prevailing mindset was that safety was a ‘people problem’ and that the human being needed to adapt to the environment in order to be safer and more productive (not the other way around). Then came the Second World War, and with it a mindshift in Human Factors thinking. It was brought about by one of the iconic aircraft of the United States Army Air Forces, known as the Flying Fortress. The Boeing B-17 bomber kept on flying even with significant battle damage, yet it had more than a hundred crashes on the runway. The B-17’s runway issues continued despite all efforts to eliminate the problem: problem pilots were identified and removed, and more attention was paid to the selection of pilots. Nothing worked. All the while, the pilots themselves began worrying about having a bad landing. This was a problem in itself, because when people are afraid of the consequences of making mistakes, their attention is no longer on the work but on those consequences - which tends to exacerbate the problem.

Then along came Alphonse Chapanis, who solved the problem very elegantly.

The Boeing B-17 had two switches with very different functions sitting right next to each other in the cockpit: one controlled the landing gear, the other the flaps. They looked the same, actuated the same and sounded the same, and confusing them could have catastrophic consequences. Chapanis fashioned two shapes out of rubber, a wheel and a triangle, and fitted these onto the switches so that they could be distinguished by touch alone - which solved the B-17’s landing woes completely. Suddenly we realised that technology is not something we simply had to adapt to: we could change our environment and make the booby trap go away. The fundamental insight of human factors is this: forget about who is responsible and ask “What is responsible?” The more you fire and prosecute people, the less you learn about the real issues, and the more repetitions of critical events you will get.

Now we're in 2014, asking ourselves: "How do we get even safer?"

Civil aviation’s accident rate is less than one incident per million departures; however, this safety level was reached two decades ago and has since remained stable in a long-term plateau. This is what Professor Dekker refers to as life at the thin edge of the wedge - where the asymptote is almost horizontal. Here it is quite clear that our old safety strategies are either dead or dying: putting up more signs, introducing more checklists, standardising more procedures and publishing a bigger manual will have little to no effect.

What causes accidents at the thin edge of the wedge?

Accidents at the “thin edge of the wedge” are caused by normalisation of deviance and a slow drift into failure. Everyone comes to work with the intention of getting the job done. If a problem is encountered, those with expertise quickly find a solution and move on. This solution might not be 100% by the book, yet it quickly becomes the normal way of solving that particular problem. This is called normalisation of deviance. Over time the effects of these new ‘normal’ ways of doing things can compound, eventually leading to an incident. This happens slowly and is very hard to spot. Professor Dekker calls this ‘a slow drift into failure’, and it happens precisely because everything is working well. In an ultra-safe environment such as aviation, the fact that you are successful becomes your biggest safety risk.

Out with the old, in with the new? Four questions to ask:

So if the old strategies won’t improve safety, what will? According to Professor Dekker, the solution lies in fostering an open, safety-conscious culture in which constant and candid discussions about safety are encouraged, despite the appearance of everything running smoothly. To do this, Professor Dekker suggests you ask yourself four questions:

  • Are you taking past successes as an indication of future safety?

  • Are the people in your organisation keeping a discussion alive about risk, even when everything looks safe?

  • Do you have the capabilities in your organisation for peer reviews (across divisions / departments) in terms of safety practices? These ‘fresh’ viewpoints counteract the normalisation of deviance, yet must not put people’s careers in jeopardy – safe discussions facilitate honest feedback.

  • Does your organisation have the capability to stop production over chronic safety concerns? Your manual might say yes, but do your people actually feel empowered (safe enough) to enforce this?
