Mode confusion

At 19:29 on 31st May 2009, Air France flight 447 departed Rio de Janeiro bound for Paris. The 228 passengers, aircrew and cabin crew on board would never make it to their destination.

The crew were piloting an Airbus A330 – at the time one of the most advanced aircraft in service. Like all modern passenger aircraft, it has an autopilot. But the A330 also has an additional layer of computerised sophistication called fly-by-wire. When out of autopilot, traditional flight controls give the pilot direct mechanical control of the rudder, elevators and ailerons, but fly-by-wire systems act like a skilled interpreter, turning what might be input as ungainly instructions into graceful manoeuvres.
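One way to picture this interpreter role is flight-envelope protection: the computer accepts the pilot's raw input but declines to command the aircraft beyond its aerodynamic limits. The sketch below is a deliberately crude Python illustration of that idea; the function, the limits and the scaling are all invented for this example and bear no relation to Airbus's actual control laws.

```python
# Toy illustration only: a crude "interpreter" between pilot input and
# aircraft response, in the spirit of fly-by-wire envelope protection.
# All names and limits here are invented for this sketch.

def interpret_pitch_command(stick_input: float,
                            current_aoa: float,
                            max_aoa: float = 15.0) -> float:
    """Translate a raw sidestick input (-1..1) into a commanded angle of
    attack, refusing to go beyond the safe envelope."""
    # Clamp an ungainly input into the valid range.
    stick_input = max(-1.0, min(1.0, stick_input))
    # Scale the input to a requested change in angle of attack (degrees).
    requested_aoa = current_aoa + stick_input * 5.0
    # Envelope protection: never command past the stall margin.
    return min(requested_aoa, max_aoa)

# Full back-stick near the limit is softened, not obeyed verbatim:
print(interpret_pitch_command(1.0, current_aoa=13.0))  # 15.0, not 18.0
```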

This interpreting layer makes it very hard to push an A330 beyond its capabilities, and the type built an exemplary safety record, with not a single accident in its first 15 years of service. So, to many observers, the loss of Air France 447 was the most prominent and disturbing airline accident of modern times, raising questions as to whether the very technology designed to protect pilots was artificially cocooning them, preventing them from developing the decision-making skills necessary in a crisis.

Senegalese air traffic controllers were the first to realise that flight AF447 was missing when it failed to radio in as scheduled. Five days later, the Brazilian Navy sighted the first major wreckage. It took a further two years to locate and recover the aircraft’s flight recorders, almost 4,000m below the sea’s surface, and it was the black-box cockpit voice recordings that revealed the catastrophic and sustained errors of judgment made by all three pilots.

The final investigation report, released in 2012, concluded that the aircraft crashed after temporary inconsistencies between its airspeed measurements – almost certainly caused by ice crystals obstructing the pitot tubes, the aircraft’s airspeed sensors. This caused the autopilot to disconnect, after which the pilot and co-pilot repeatedly reacted incorrectly, ultimately putting the aircraft into an aerodynamic stall from which it stood no chance of recovery.

Transcripts of the AF447 pilots’ last eight minutes of cockpit exchanges are freely available online and make for grim reading, with blame for the deaths of the other 225 passengers and crew laid almost entirely at the door of the three pilots. Investigators concluded the pilot flying had suffered ‘mode confusion’: he stalled the aircraft because, in his gut, he believed the aircraft could not be stalled. Thankfully such incidents are rare, at least in part because the aviation industry has a global knowledge-dissemination network that shares best practice the minute it is identified.

This is most visible in the operational checklist used by almost every civilian and military pilot. Whenever there is an accident, regardless of scale, severity or outcome, the black box is opened, its data is analysed and experts establish what went wrong. The facts are then published and procedures are changed, so the likelihood of the same mistake happening again is lessened. Because even the smallest aircraft are complex, these written guides walk pilots through the key steps in both routine and unscheduled situations. By applying this simple knowledge-transfer protocol, the industry has created an astonishingly good safety record: in 2014 there was one crash for every 8.3 million flights flown by a major airline. Aviation’s ‘growth mindset’ mentality sees every accident as a learning opportunity.

This is in stark contrast to the healthcare industry, where there is a culture of blame. Atul Gawande is an American surgeon and public health researcher who is deeply disturbed by the large margin for human error that exists in medicine. He distinguishes between errors of ignorance (mistakes we make because we don’t know enough) and errors of ineptitude (mistakes we make because we don’t make proper use of what we know). With vast amounts of knowledge at our fingertips, it is errors of ineptitude that are most often to blame for tragic medical mistakes. “Our great struggle in medicine these days is not just with ignorance and uncertainty,” Gawande told NPR. “It’s also with complexity: how much you have to make sure you have in your head and think about. There are a thousand ways things can go wrong.”

Gawande compared the protocols used by surgeons, pilots and architects – all professions where error can lead to loss of life – and landed on a simple solution: checklists. Experts need written guides that walk them through the key steps in any complex procedure. Gawande’s research team took this idea and developed a safe surgery checklist, now used around the world with amazing results. For example, his team discovered that doctors and nurses often didn’t know each other’s names in the operating room. When introductions are made before a surgery, complications and deaths dip by 35 percent. “Making sure everyone knows each other’s names is an activation phenomenon,” Gawande explains. “The person, having gotten a chance to speak in the room, was much more likely to speak up later if they saw a problem.”

Checklists are a first step towards eliminating harmful oversights, but the thing that will really revolutionise healthcare, and any industry or business, is shedding the fear of failure and treating every mistake as a learning opportunity. This will create a dynamic process of change that will minimise avoidable error.

It is the anomalies in existing systems – the places where they fail – that set the stage for change. The transcript taken from AF447’s black box yields information that may ensure no airline pilot ever makes the same mistakes. In response, airlines around the world have changed their training programmes to instil habits that could have been lifesaving for those aboard AF447.

The Upside of Failure

All industries can benefit from disseminating knowledge and learning from mistakes, and yet so few do it.

Our own industry, real estate, is often guilty of hoarding knowledge for competitive advantage, ultimately slowing the innovation and progress that could be made if learnings were shared. Leesman has taken the opposite approach. We know first-hand the power of data to drive change, and our ambition is to see the whole industry elevated, rather than just one or two players. That’s why we publish all of our data throughout the year and explain its impact to the industry.

For some organisations, investigating how employees feel about the space they’re working in is a daunting prospect. But the example from the aviation sector proves that every failure is an opportunity to be exploited. Growth-mindset organisations can’t wait to dive into the data and see where they can improve, because they know that acknowledging, and changing, their deficiencies is what will set them apart.

Our vision is to give employees a frictionless day at work. One of the ways we are attempting to do that is by creating our own workplace experience (EwX) checklist. This list identifies the factors our research has shown to be most critical to enabling employees to do their best work – things like the individual work environment, learning and relaxing. Often, a Leesman survey will reveal that an organisation is not in fact providing the best environment for its employees. It is our hope that the EwX checklist will make it easy for organisations to see where they are not delivering and how they can improve.
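As a purely illustrative sketch of how such a checklist might flag gaps, the Python below compares hypothetical survey scores against a threshold. The factor names, scores and the 6.0 cut-off are invented for this example; they are not Leesman’s actual EwX methodology.

```python
# Hypothetical sketch: flag checklist factors where survey scores fall short.
# Factor names and the threshold are invented for illustration only.

EWX_FACTORS = ["individual work environment", "learning", "relaxing"]

def flag_gaps(survey_scores: dict, threshold: float = 6.0) -> list:
    """Return the checklist factors where an organisation under-delivers."""
    return [factor for factor in EWX_FACTORS
            if survey_scores.get(factor, 0.0) < threshold]

scores = {"individual work environment": 7.2, "learning": 5.1, "relaxing": 4.8}
print(flag_gaps(scores))  # ['learning', 'relaxing']
```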
