Patient safety

Understanding models of error and how they apply in clinical practice

Models of human error can be helpful in determining why errors have occurred in the past, where future vulnerabilities may lie, and how healthcare professionals might take action to make clinical practice safer.

Illustration showing possible errors in hospital and community scenarios

Source: MAG / Shutterstock.com

As humans, we all make errors, both professionally and in our personal lives. We sometimes forget to do things, get distracted and do the wrong thing, and make what turn out to be wrong decisions. Within the UK pharmacy setting, dispensing errors occur in 0.04–9.8% of dispensed items in community pharmacy, and 0.008–0.02% within hospital pharmacy[1],[2],[3],[4],[5]. The wide range is at least partly a result of different definitions of error and different methods used to detect them. The most common types of error identified in such studies include the wrong drug, wrong strength, wrong dosage form, wrong quantity and incorrect labelling[6]. Pharmacists are also involved in identifying and rectifying the prescribing errors that occur in around 5% of prescribed medicines in UK general practice[7] and 7% of prescribed medicines in hospitals[8]. Psychological theories of human error can help us understand why these errors occur and identify strategies to prevent them. This article aims to provide pharmacy staff and other healthcare professionals with the background knowledge to understand theoretical models of human error and how they apply in clinical practice.

‘Person-centred’ versus ‘system-focused’ approaches to error

Historically, a person-centred approach to error has been taken, both in healthcare and elsewhere[9],[10]. This approach has the underlying philosophy that errors are caused by human weaknesses and that some humans are more prone to error than others. According to this model, error reduction involves identifying variation between individuals and then targeting those who make the most errors. This is, therefore, a blame-oriented approach, one that is likely to create fear and involve disciplinary sanctions, while also holding individuals responsible and liable for errors. It has been suggested that blaming individuals may be more emotionally satisfying than blaming institutions and is potentially more convenient than trying to address the wider system[11].

Since the 1990s, a system-focused model has been advocated instead[12]. In contrast to the person-centred approach, this model’s philosophy is that errors are caused by systems of which humans form only one part. For example, the people who apparently make the most errors may be those carrying out the most high-risk tasks or who work in the most difficult environments. According to the systems model of error, addressing these task and environmental factors is more likely to be effective than disciplinary action aimed at the individuals concerned. Organisations using the systems approach seek to reduce errors by looking at a range of factors, including the organisation itself and its policies. In clinical settings, this would include the use of incident reporting, or other methods, to identify errors and share lessons learnt, make changes to the system, and reduce the chances of errors occurring in the future. For example, if two strengths of the same type of tablet are stored together in similar packaging, dispensing errors are likely to occur[11]. Identifying this problem could lead to solutions, such as changing the packaging or changing how these products are stored to better aid their differentiation.

The accident causation model

An example of a model of human error based on a systems approach is James Reason’s accident causation model, which was published in 1990[12] and has been applied in many contexts, including aviation and nuclear power[10]. It has also been specifically adapted to healthcare[13] and has been used to understand medication errors[5],[11],[14],[15],[16] and patient non-adherence[17].

According to the accident causation model, a system has both a ‘sharp end’ and a ‘blunt end’ (see ‘Figure 1: The accident causation model’). At the sharp end, ‘active failures’ can occur on the part of front-line workers. These active failures are unsafe acts that can be classified into slips, lapses, mistakes and violations[12]. Slips and lapses occur when the correct plan is made but then executed incorrectly. Specifically, a ‘slip’ occurs when a step of the plan is carried out incorrectly, whereas a ‘lapse’ occurs when a step of the plan is omitted or forgotten. Examples of slips that have been identified in clinical practice are selecting the wrong drug from a drop-down menu or intending to dispense one quantity of tablets but dispensing another[11]. Lapses include a prescriber forgetting to cross a drug off a drug chart[14] or a patient forgetting to take a dose of their medicine[17].

Diagram of the accident causation model, also known as the ‘Swiss cheese’ model

Figure 1: The accident causation model

Source: Adapted from Reason J. Human Error. Cambridge: Cambridge University Press; 1990.

According to James Reason’s accident causation model, a system has a ‘sharp end’ and a ‘blunt end’. At the sharp end, active failures or unsafe acts (e.g. slips, lapses, mistakes and violations) can occur on the part of frontline workers. In addition, ‘mistakes’ and ‘violations’ can occur when an incorrect plan is formulated and then followed. Active failures do not occur in isolation, but result from ‘error-producing conditions’ that arise at different levels within the system. A system has a series of defences to prevent an adverse outcome. Rather than being completely intact, each of these defences is seen as having holes in it (giving rise to the name ‘the Swiss cheese model’). These holes can arise as a result of active failures and latent conditions within the system. Even if one of these holes is penetrated, a potentially dangerous situation may be blocked by the next defence. However, a trajectory of ‘accident opportunity’ arises if the whole series of defences is penetrated.

In contrast to slips and lapses, ‘mistakes’ and ‘violations’ occur when an incorrect plan is formulated and then followed. Mistakes occur because of lack of, or misapplication of, the relevant rules or knowledge. Examples of mistakes are prescribing an incorrect dose because of a lack of knowledge of how a particular medicine is dosed[18], or a patient not taking their medicine because of an incorrect belief that it is contraindicated when drinking alcohol[17]. Violations occur when a person knows the rule but makes a decision not to follow it – not with the intention to cause harm, but to save time or achieve a competing priority. An example is leaving a medical student to insert the dose on a paper prescription after it has been signed[14]. For examples of the different types of active failure, see ‘Table 1: Examples of active failures resulting in failure to achieve the desired outcome’.

Table 1: Examples of active failures resulting in failure to achieve the desired outcome

Active failure | Example based on driving to the airport for an early flight and missing the plane | Example based on dispensing medicine
Slip | Started driving to the airport, but fell into mental ‘autopilot’ and took the route to work instead | Intended to dispense flucloxacillin but selected and dispensed amoxicillin instead
Lapse | Intended to set an alarm, but forgot to do so and overslept | Intended to dispense the second item on a prescription but forgot about it and failed to give it to the patient
Violation | Drove at 80mph and was stopped by the police, resulting in a delay | Dispensed a controlled drug from a prescription that did not meet handwriting requirements before it had been corrected, to avoid inconveniencing the patient
Mistake | Interpreted the flight time of 8am as the check-in time and arrived at the airport two hours late | A lack of knowledge about the differences between sodium valproate ‘chrono’ and ‘enteric-coated’ preparations, leading to the dispensing of the wrong one
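The taxonomy above turns on two questions: was the plan itself correct, and, if not, was a known rule knowingly broken? The sketch below (illustrative only, not from Reason's work; the function name and boolean parameters are invented for this example) expresses that classification as a small decision procedure:

```python
# Illustrative sketch of the active-failure taxonomy in Reason's model.
# All names and parameters here are hypothetical, chosen for this example.

def classify_active_failure(plan_was_correct: bool,
                            rule_knowingly_broken: bool = False,
                            step_omitted: bool = False) -> str:
    """Classify an unsafe act as a slip, lapse, mistake or violation.

    plan_was_correct: was the intended plan the right one?
    rule_knowingly_broken: was a known rule deliberately not followed?
    step_omitted: was a step of the plan forgotten (vs. done incorrectly)?
    """
    if not plan_was_correct:
        # Incorrect plan, followed: a violation if deliberate, else a mistake.
        return "violation" if rule_knowingly_broken else "mistake"
    # Correct plan, executed incorrectly: a lapse if a step was forgotten,
    # a slip if a step was carried out wrongly.
    return "lapse" if step_omitted else "slip"

# Dispensing amoxicillin when flucloxacillin was intended:
print(classify_active_failure(plan_was_correct=True))              # slip
# Forgetting the second item on a prescription:
print(classify_active_failure(True, step_omitted=True))            # lapse
# Prescribing a wrong dose through lack of knowledge:
print(classify_active_failure(False))                              # mistake
# Knowingly dispensing from a non-compliant controlled-drug prescription:
print(classify_active_failure(False, rule_knowingly_broken=True))  # violation
```

The point of the sketch is that slips and lapses are execution failures of a correct plan, whereas mistakes and violations are failures of planning; the remedies differ accordingly.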

According to the accident causation model, these active failures do not occur in isolation, but result from ‘error-producing conditions’ that arise at different levels within the system[10]. Within healthcare, these may relate to the patient, the task, the individual healthcare professional, the team, or the environment[13]. Examples include lack of individual knowledge, lack of team communication and no access to a computer[19]. Some analyses include technology as a separate group of error-producing conditions[15]. In turn, error-producing conditions result from latent failures at the ‘blunt end’ of the system (i.e. those caused by the organisation or the surrounding culture). Examples include no one taking responsibility for the whole medication system in care homes[5], a tight budget to deliver a safe service[5], a lack of formal prescribing training in medical schools[19], and a lack of a culture of open communication between patients and prescribers[17].

The final part of the model is a system’s series of defences to prevent an adverse outcome (see figure 1). Within the medication-use process, these defences may include pharmacists, other healthcare professionals, double-checking systems, and the patient or their carers. However, rather than being completely intact, each of these defences is seen as having holes in it (giving rise to the name ‘the Swiss cheese model’). These holes can arise as a result of active failures and latent conditions within the system. Even if one of these holes is penetrated, a potentially dangerous situation may be blocked by the next defence. However, a trajectory of ‘accident opportunity’ arises if the whole series of defences is penetrated. Defences against prescribing errors have been identified as self-checking and checking by pharmacists and nurses[14]. However, if an erroneous prescription gets through those defences, an error will reach the patient and may result in patient harm. Dean et al. also found that over-reliance on the pharmacist as a defence may result in compensatory behaviours, such as doctors not looking up doses when prescribing[14]. Beso et al. identified a series of defences in place in a hospital pharmacy, including the dispenser noticing mistakes on a label in cases where the dispenser and labeller are different people, self-checking, and a stage where a final check on all dispensed items is carried out[11]. However, pharmacy staff also identified that these defences have weaknesses: the dispenser may not notice an error, the drug may be dispensed from the label rather than the prescription, or dispensers could rely too much on the final checker to identify errors[11].
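The layered-defences mechanism can be illustrated with a minimal simulation. The sketch below is purely illustrative and assumes independent defence layers, each with a hypothetical probability of missing an error (its ‘holes’); the miss rates are invented for this example and are not drawn from the studies cited above. An error reaches the patient only if every layer in the series fails:

```python
import random

# Minimal sketch of the 'Swiss cheese' idea: each defence (e.g. self-check,
# pharmacist check, nursing check) is given an illustrative probability of
# failing to catch an error. These numbers are hypothetical, not evidence-based.

def error_reaches_patient(failure_probs, rng=random.random):
    """Return True if an error penetrates every defence layer in the series."""
    return all(rng() < p for p in failure_probs)

def simulate(failure_probs, n_errors=100_000, seed=42):
    """Estimate the fraction of errors that penetrate all defences."""
    rng = random.Random(seed)
    reached = sum(
        error_reaches_patient(failure_probs, rng.random) for _ in range(n_errors)
    )
    return reached / n_errors

# Hypothetical hole sizes: self-check misses 30% of errors, the pharmacist
# check misses 5%, the nursing check misses 10%.
defences = [0.30, 0.05, 0.10]
print(f"~{simulate(defences):.4f} of errors penetrate all three defences")
# With independent layers the miss rates multiply: 0.30 * 0.05 * 0.10 = 0.0015.
```

The multiplication only holds if the layers are independent; the compensatory behaviours described above (e.g. prescribers relying on the pharmacist, or dispensers relying on the final checker) are precisely the cases where the holes in successive defences line up.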

Other models

While Reason’s accident causation model has been the dominant model of human error used in healthcare to date, a more specific model — the Yorkshire contributory factors framework — was published in 2012 (see ‘Figure 2: The Yorkshire contributory factors framework’ and ‘Table 3: Definitions for factors in the Yorkshire contributory factors framework’)[20]. This is based on a framework of factors contributing to patient safety incidents in hospital settings. In the centre of the framework are active failures, around which are a series of circles representing situational factors, local working conditions, and two layers of latent factors: those relating to the organisation itself and those relating to wider external policies. Advantages of this model are that it is more specific to healthcare and it is also evidence-based, having been developed following a systematic review of relevant literature. Disadvantages are that it is more complex than the accident causation model[20] and relates to hospital settings only, with no current equivalent for primary care[21].

Diagram showing the Yorkshire Contributory Factors Framework

Figure 2: The Yorkshire contributory factors framework

Source: BMJ Qual Saf 2012;21:369–380.

This model is based on a framework of factors contributing to patient safety incidents in hospital settings. Active failures are included in the centre of the framework, around which are a series of circles representing situational factors, local working conditions, and two layers of latent factors: those relating to the organisation itself and those relating to wider external policies.

Table 3: Definitions for factors in the Yorkshire contributory factors framework

Factor | Definition
Active failures | Any failure in performance or behaviour (e.g. error, mistake, violation) of the person at the ‘sharp end’ (e.g. the health professional)
Communication systems | Effectiveness of the processes and systems in place for the exchange and sharing of information between staff, patients, groups, departments and services. This includes both written (e.g. documentation) and verbal (e.g. handover) communication systems
Equipment and supplies | Availability and functioning of equipment and supplies
External policy context | Nationally driven policies/directives that impact on the level and quality of resources available to hospitals
Design of equipment and supplies | The design of equipment and supplies to overcome physical and performance limitations
Individual factors | Characteristics of the person delivering care that may contribute in some way to active failures, such as inexperience, stress, personality and attitudes
Lines of responsibility | Existence of clear lines of responsibility clarifying accountability of staff members and delineating the job role
Management of staff and staffing levels | The appropriate management and allocation of staff to ensure adequate skill mix and staffing levels for the volume of work
Patient factors | The features of the patient that make caring for them more difficult and therefore more prone to error, such as abnormal physiology, language difficulties or personality characteristics (e.g. an aggressive attitude)
Physical environment | Features of the physical environment that help or hinder safe practice: the layout of the unit, the fixtures and fittings, and the level of noise, lighting, temperature, etc.
Policy and procedures | The existence of formal or written guidance for the appropriate conduct of work tasks and processes. This can also include situations where procedures are available but contradictory, incomprehensible or of otherwise poor quality
Safety culture | Organisational values, beliefs and practices surrounding the management of safety and learning from error
Scheduling and bed management | Adequate scheduling to manage patient throughput, minimising delays and excessive workload
Staff workload | Level of activity and pressures on time during a shift
Supervision and leadership | The availability and quality of direct and local supervision and leadership
Support from central functions | Availability and adequacy of central services supporting the functioning of wards/units. This may include support from information technology and human resources, portering services, estates, or clinically related services such as radiology, phlebotomy and pharmacy
Task characteristics | Factors related to specific patient-related tasks that may make individuals vulnerable to error
Team factors | Any factor related to the work of different professionals within a group that they may be able to change to improve patient safety
Training and education | Access to correct, timely and appropriate training, both specific (e.g. task-related) and general (e.g. organisation-related)

Preventing error

According to these models of human error, we can prevent patient harm by stopping active failures from occurring — generally by targeting latent and error-producing conditions — and by creating more effective defences to block errors that do occur from actually reaching the patient. For example, latent failures existing in prescribing knowledge require suitable education and training strategies to resolve. These strategies may range from formal one-to-one sessions and group training, to looking at error monitoring and reporting systems as part of wider improvement programmes[22],[23],[24]. Extra layers of defence can be introduced by information technology, such as clinical decision support[25]. However, there are also holes in these defences, including alert fatigue, and new types of error-producing conditions can be introduced[26],[27]. As noted above, pharmacists often provide an extra layer of defence by reviewing and checking prescriptions and intervening to correct erroneous prescriptions before they can result in patient harm[22]. The role that patients can play in preventing error is being increasingly recognised and several research groups are currently exploring how best to facilitate this[28],[29]. Models of human error illustrate how even the most experienced healthcare professionals can introduce errors, and so all healthcare professionals are potentially defences in the system and have an obligation to speak up if they are concerned. Effective communication between different healthcare professionals, and between healthcare professionals and patients, is often needed to activate these defences.

A series of tools, including flowcharts, fishbone diagrams, chronological mapping diagrams and action grids, have been developed based on Reason’s accident causation model to facilitate the investigation of incidents in clinical practice and to identify educational opportunities at a systems level (see Resources).

Conclusion

In summary, in many areas, including healthcare, there has been some movement from a person-centred approach to a system-based approach to understanding error production and reduction. A systems approach focuses on improving the system rather than blaming individuals. Models of human error can help explain why errors have occurred in the past, where future vulnerabilities may lie, and how healthcare professionals might take action to make clinical practice safer.

Resources

Imperial College London. Systems analysis of clinical incidents: the London protocol. 1999. Available at: http://www.imperial.ac.uk/patient-safety-translational-research-centre/education/training-materials-for-use-in-research-and-clinical-practice/the-london-protocol/ 

Sara Garfield is a research pharmacist and Bryony Dean Franklin is director and professor of medication safety, the Centre for Medication Safety and Service Quality, Imperial College Pharmacy Department, Imperial College Healthcare NHS Trust, London & UCL School of Pharmacy, London.

Citation: The Pharmaceutical Journal DOI: 10.1211/PJ.2016.20201110

