Balancing accountability and learning in a patient safety culture

At the point at which something has gone wrong, the way the organisation and wider systems respond is crucial


The vast majority of healthcare professionals go to work each day to do the best they can for their patients. When patient safety problems occur, almost always it is the systems, procedures, conditions, environment and constraints staff face that are the main contributory factors in what happened.

At the point at which something has gone wrong, the way the organisation and wider systems respond is crucial. An organisational response that fails to support staff, or that attributes inappropriate individual blame without proper consideration and understanding of the system factors involved, can reinforce a culture of blame.

Where staff perceive blame from their peers, their managers or even themselves, this has been associated with a reluctance to report harm and potential problems, ultimately stifling improvement.1

This stifling effect on learning and improvement can result in an abdication of a fundamental accountability that all healthcare organisations owe to future patients and staff: to take reasonable actions to minimise the chance of the same risks resulting in future harm.

From ‘no blame’ to a ‘just culture’

In the 1990s the term ‘no-blame culture’ emerged, and it is still used today. However, the ‘no-blame’ concept fails to make the crucial distinction between non-culpable human error and culpable unsafe acts. A culture of safety depends critically on first negotiating where the line between these should be drawn.

In healthcare, the concept of ‘no-blame’ has now been largely replaced by the principles of a ‘just culture’.

In May 2016, the Expert Advisory Group (EAG) was established to provide advice on how the new Healthcare Safety Investigation Branch (HSIB) should operate in England. It recommended that the promotion of a ‘just culture’ should be a central principle in the operation of the new organisation.

“The Branch must promote the creation of a just safety culture, a shared set of values in which healthcare professionals trust the process of safety investigation; and are assured that any actions, omissions or decisions that reflect the conduct of a reasonable person under the same circumstances will not be subject to inappropriate or punitive sanctions.”

This neatly describes the culture that we need to promote, foster and support in healthcare in order to avoid creating conditions that detract from openness and learning. This must be a priority for all healthcare organisations and crucially, the same principles must be followed by the wider system, including system and professional regulation.

Defining the boundary between non-culpable and culpable unsafe acts

In the book ‘Whack-a-Mole: The Price We Pay For Expecting Perfection’2 David Marx argues that a ‘just culture’ distinguishes between different types of ‘unsafe’ acts as follows:

• Human error

• At-risk behaviour

• Reckless behaviour

In a ‘just culture’, Marx argues, the response to human error should be to console, to at-risk behaviour to coach, and to reckless behaviour to punish.

This framework for distinguishing between culpable and non-culpable unsafe acts is widely accepted in high-risk industries. In England, NHS Improvement have recently published a just culture guide, which provides useful tools to help investigators make fair and consistent judgements about the actions of individuals involved in patient safety events.

Psychological Safety

The concept of a ‘just culture’ closely aligns with the concept of ‘psychological safety’, as described by Professor Amy C. Edmondson of Harvard Business School.3

“In psychologically safe environments, people believe that if they make a mistake others will not penalise or think less of them for it. They also believe that others will not resent or penalise them for asking for help, information or feedback.”

However, many healthcare staff still do not perceive the culture where they work in the way Edmondson describes. For example, the 2017 NHS staff survey found that only 70% of staff felt secure in raising concerns and just 58% felt confident that any concern raised would be properly addressed.

How can we change this?

The high-profile case of Dr Bawa-Garba has prompted considerable debate amongst the medical community, patients and patient safety groups in the UK. The case centred on the tragic death of a young child, Jack Adcock, who died in 2011 from septic shock. Dr Bawa-Garba was prosecuted for gross negligence manslaughter for errors made during Jack’s care and was subsequently struck off the medical register by the General Medical Council (GMC), the professional regulator of doctors in the UK.

The case triggered an urgent review of the application of gross negligence manslaughter in the criminal courts, chaired by Sir Norman Williams. The review, published in June 2018, made a number of recommendations centred on the importance of creating the right culture in healthcare. These include steps to improve the standard of training and experience of expert witnesses, and an emphasis on the importance of high-quality local investigations. Sir Norman Williams said:

“We hope our recommendations will change the environment by establishing a just culture and providing reassurance to healthcare professionals, patients and their families that gross negligence manslaughter cases will be dealt with in a fair and compassionate manner.”

Since the publication of the Williams review, the GMC have announced a number of new measures, including a commitment to partner with a human factors training organisation so that its own investigations are carried out with the expertise needed to ensure the individual actions of healthcare professionals are always considered in the wider context of the systems and working environment at the time.

The Chief Executive of the GMC, Charlie Massey said:

“This collaboration will make sure that Human Factors are hardwired into our investigations so that the role systems and workplaces play in events is fully and evenly evaluated in assessing context following serious failings.”

The role of local organisations

This change in direction from national organisations is welcome, but changes in local systems and processes are also essential.

All healthcare organisations should have the following in place:

1) A clear commitment relating to how healthcare staff will be supported in the aftermath of an adverse event.

2) A pledge to treat staff fairly, with assurances that honest human error will not result in punitive sanctions or consequences.

3) An obligation to ensure that adverse events are investigated by people with suitable training and the required expertise to ensure relevant system, human, environmental, and cultural factors are properly considered.

4) A commitment to be transparent and share the outcomes of investigations, demonstrating the learning and changes made and the fair treatment of staff involved.

5) Clear processes for identifying and escalating the rare examples where the actions of healthcare professionals fall outside acceptable boundaries, ensuring such processes are transparent and trusted, and making it clear that dishonesty will not be tolerated.

Mersey Care NHS Foundation Trust have done award-winning work to move towards a just, learning culture, and this video about their journey is highly recommended.


In 2009, Lucian Leape said that ‘…the single greatest impediment to error prevention in the medical industry is that we punish people for making mistakes.’

If our healthcare systems are able to change this, ensuring the right balance is struck between accountability and fostering an environment that supports a culture of learning, we will have succeeded in taking a significant step forward in creating the conditions needed to minimise patient safety risks and protect patients from avoidable harm.

References
  1. Waring JJ. Beyond blame: cultural barriers to medical incident reporting. Soc Sci Med. 2005 May;60(9):1927–35.
  2. Marx D. Whack-a-Mole: The Price We Pay for Expecting Perfection. Plano, TX: By Your Side Studios; 2009.
  3. Edmondson AC. Managing the risk of learning: psychological safety in work teams. In: International Handbook of Organizational Teamwork and Cooperative Working. Wiley-Blackwell; 2008. p. 255–275.