I recently read "Black Box Thinking" by Matthew Syed, which looks at how we learn (or don't) from our mistakes. I was really struck by the comparison of how the aviation industry and the health sector react to failure. The difference couldn't be more profound.
Syed points out that the culture in aviation is very much about encouraging those who make mistakes (even small ones) to report and discuss them so the "data" can be absorbed and compared with other similar experiences. The result is a truly learning industry that quickly acts to correct small failings that might eventually have led to more serious ones.
On the flip side, health services across the world (including our own NHS) have a very different approach. The amazing commitment of the individuals providing medical care is a lesson to us all, but in many ways they are let down by a system that does not effectively learn from mistakes. Of course there will be serious case reviews if something goes badly wrong - but the lesson from aviation is that you reduce the incidence of serious mistakes by being open about small ones. Syed points out that sometimes this is as simple as labelling things more clearly.
The Government is trying to do something about this within the NHS, following the learning from the Francis Inquiry into the scandal at Mid Staffordshire Foundation Trust. The "Learning from mistakes" league table was recently published. It splits trusts into categories, from those with "outstanding levels" of openness and transparency to those with a "poor reporting culture". In a discussion facilitated by the HSJ, Health Secretary Jeremy Hunt was given a roasting by a Trust Chief Executive who had not fared well on the league table. The criticism focused on the rating methodology and highlighted that this all "risked undermining the government's focus on patient safety". Whilst there will undoubtedly be flaws in the methodology, it would have been more reassuring (as a potential hospital patient) to see more engagement with the principle of transparency and openness to learning from mistakes.
Another initiative is the "Healthcare Safety Investigation Branch". The name will not help, as it immediately conjures up images of surgery doors being kicked in and doctors and nurses being dragged off for questioning. That's not the intention though - it is primarily a learning service. Evidence provided will be protected and will require a court order to be made public. This is a good first step in a sector that has not been good at sharing the real experience of failure.
However, what these initiatives really expose is how far behind some public services have fallen in terms of how great organisations improve. The crux of it is that someone has to feel confident that they will be supported and treated fairly if they openly admit a mistake. In a brilliant TED Talk, Amy Edmondson coined the term "psychological safety". This is a shared belief within an organisation that improvement comes from controlled risk-taking and learning lessons. At a more basic level, psychological safety is also about people feeling free (encouraged, even) to raise questions and challenge decisions if they believe a course of action is incorrect. Unsurprisingly, some of the most innovative and successful organisations in the world, like Google, are very psychologically safe. Conversely, it's clear a lot of public services are in pretty serious psychological danger.
In a recent article for Pioneer's Post, I discussed the need for "experimentation" in public services. However, it's pointless talking about experimentation until organisations are in a position where they can admit and discuss mistakes. There is no silver bullet; it's about setting the culture of an organisation, and that has to come from the top. The league table and the presence of learning-focused investigations will force hospitals to behave differently - but the danger is that these initiatives will actually cause more reticence about transparency when they rub up against the underlying organisational culture.