Understanding models of error and how they apply in clinical practice
Models of human error can be helpful in determining why errors have occurred in the past, where future vulnerabilities may lie, and how healthcare professionals might take action to make clinical practice safer.
As humans, we all make errors, both professionally and in our personal lives. We sometimes forget to do things, get distracted and do the wrong thing, and make what turn out to be wrong decisions. Within the UK pharmacy setting, dispensing errors occur in 0.04–9.8% of dispensed items in community pharmacy, and in 0.008–0.02% of items within hospital pharmacy. The wide range is at least partly a result of different definitions of error and different methods used to detect them. The most common types of error identified in such studies include the wrong drug, wrong strength, wrong dosage form, wrong quantity and incorrect labelling. Pharmacists are also involved in identifying and rectifying the prescribing errors that occur in around 5% of prescribed medicines in UK general practice and 7% of prescribed medicines in hospitals. Psychological theories of human error can help us understand why these errors occur and identify strategies to prevent them. This article aims to provide pharmacy staff and other healthcare professionals with the background knowledge to understand theoretical models of human error and how they apply in clinical practice.
‘Person-centred’ versus ‘system-focused’ approaches to error
Historically, a person-centred approach to error has been taken, both in healthcare and elsewhere. This approach has the underlying philosophy that errors are caused by human weaknesses and that some humans are more prone to error than others. According to this model, error reduction involves identifying variation between individuals and then targeting those who make the most errors. This is, therefore, a blame-oriented approach, one that is likely to create fear and involve disciplinary sanctions, while also holding individuals responsible and liable for errors. It has been suggested that blaming individuals may be more emotionally satisfying than blaming institutions and is potentially more convenient than trying to address the wider system.
Since the 1990s, a system-focused model has been advocated instead. In contrast to the person-centred approach, this model’s philosophy is that errors are caused by systems of which humans form only one part. For example, the people who apparently make the most errors may be those carrying out the most high-risk tasks or who work in the most difficult environments. According to the systems model of error, addressing these task and environmental factors is more likely to be effective than disciplinary action aimed at the individuals concerned. Organisations using the systems approach seek to reduce errors by looking at a range of factors, including the organisation itself and its policies. In clinical settings, this would include the use of incident reporting, or other methods, to identify errors and share lessons learnt, make changes to the system, and reduce the chances of errors occurring in the future. For example, if two strengths of the same type of tablet are stored together in similar packaging, dispensing errors are likely to occur. Identifying this problem could lead to solutions, such as changing the packaging or changing how these products are stored to better aid their differentiation.
The accident causation model
An example of a model of human error based on a systems approach is James Reason’s accident causation model, which was published in 1990 and has been applied in many contexts, including aviation and nuclear power. It has also been specifically adapted to healthcare and has been used to understand medication errors and patient non-adherence.
According to the accident causation model, a system has both a ‘sharp end’ and a ‘blunt end’ (see ‘Figure 1: The accident causation model’). At the sharp end, ‘active failures’ can occur on the part of front-line workers. These active failures are unsafe acts that can be classified into slips, lapses, mistakes and violations. Slips and lapses occur when the correct plan is made but then executed incorrectly. Specifically, a ‘slip’ occurs when a step of the plan is carried out incorrectly, whereas a ‘lapse’ occurs when a step of the plan is omitted or forgotten. Examples of slips that have been identified in clinical practice are selecting the wrong drug from a drop-down menu or intending to dispense one quantity of tablets but dispensing another. Lapses include a prescriber forgetting to cross a drug off a drug chart or a patient forgetting to take a dose of their medicine.
Figure 1: The accident causation model
Source: Adapted from Reason J. Human Error. Cambridge: Cambridge University Press; 1990.
In contrast to slips and lapses, ‘mistakes’ and ‘violations’ occur when an incorrect plan is formulated and then followed. Mistakes occur because of lack of, or misapplication of, the relevant rules or knowledge. Examples of mistakes are prescribing an incorrect dose because of a lack of knowledge of how a particular medicine is dosed, or a patient not taking their medicine because of an incorrect belief that it is contraindicated when drinking alcohol. Violations occur when a person knows the rule but makes a decision not to follow it – not with the intention to cause harm, but to save time or achieve a competing priority. An example is leaving a medical student to insert the dose on a paper prescription after it has been signed. For examples of the different types of active failure, see ‘Table 1: Examples of active failures resulting in failure to achieve the desired outcome’.
| Active failure | Example: driving to the airport for an early flight and missing the plane | Example: dispensing a medicine |
| --- | --- | --- |
| Slip | Started driving to the airport, but fell into mental ‘autopilot’ and took the route to work instead | Intended to dispense flucloxacillin but selected and dispensed amoxicillin instead |
| Lapse | Intended to set an alarm, but forgot to do so and overslept | Intended to dispense the second item on the prescription but forgot about it and failed to give it to the patient |
| Violation | Drove at 80mph, was stopped by the police and delayed as a result | Dispensed a controlled drug from a prescription that did not meet handwriting requirements before it had been corrected, to avoid inconveniencing the patient |
| Mistake | Interpreted the flight time of 8am as the check-in time and arrived at the airport two hours late | Lacked knowledge of the differences between sodium valproate ‘chrono’ and ‘enteric-coated’ preparations, leading to dispensing of the wrong preparation |
According to the accident causation model, these active failures do not occur in isolation, but result from ‘error-producing conditions’ that arise at different levels within the system. Within healthcare, these may relate to the patient, the task, the individual healthcare professional, the team, or the environment. Examples include lack of individual knowledge, lack of team communication and no access to a computer. Some analyses include technology as a separate group of error-producing conditions. In turn, error-producing conditions result from latent failures at the ‘blunt end’ of the system (i.e. those caused by the organisation or the surrounding culture). Examples include no one taking responsibility for the whole medication system in care homes, a tight budget to deliver a safe service, a lack of formal prescribing training in medical schools, and a lack of a culture of open communication between patients and prescribers.
The final part of the model is a system’s series of defences to prevent an adverse outcome (see figure 1). Within the medication-use process, these defences may include pharmacists, other healthcare professionals, double-checking systems, and the patient or their carers. However, rather than being completely intact, each of these defences is seen as having holes in it (giving rise to the name ‘the Swiss cheese model’). These holes can arise as a result of active failures and latent conditions within the system. Even if one of these holes is penetrated, a potentially dangerous situation may be blocked by the next defence. However, a trajectory of ‘accident opportunity’ arises if the whole series of defences is penetrated. Defences against prescribing errors have been identified as self-checking and checking by pharmacists and nurses. However, if an erroneous prescription gets through those defences, an error will reach the patient and may result in patient harm. Dean et al. also found that over-reliance on the pharmacist as a defence may result in compensatory behaviours, such as doctors not looking up doses when prescribing. Beso et al. identified a series of defences in place in a hospital pharmacy, including the dispenser noticing mistakes on a label in cases where the dispenser and labeller are different people, self-checking, and a stage where a final check on all dispensed items is carried out. However, pharmacy staff also identified that these defences have weaknesses: the dispenser may not notice an error, the drug may be dispensed from the label rather than the prescription, or dispensers could rely too much on the final checker to identify errors.
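The layered-defence logic of the Swiss cheese model can also be sketched numerically: if each defence independently misses some proportion of errors (its ‘holes’), an error reaches the patient only when it slips through every layer, so the overall risk is the product of the individual miss rates. The following Python sketch is purely illustrative; the miss rates are hypothetical figures, not values from the studies cited above.

```python
# Illustrative sketch of the 'Swiss cheese' model: an error reaches the
# patient only if it passes through every defence. Assumes defences fail
# independently, which real systems may not satisfy.

def p_error_reaches_patient(miss_rates):
    """Probability an error passes every defence, given each defence's
    probability of missing it (its 'hole' size)."""
    p = 1.0
    for p_miss in miss_rates:
        p *= p_miss
    return p

# Hypothetical hole sizes: prescriber self-check misses 30% of errors,
# pharmacist check misses 5%, nurse check misses 10%.
defences = [0.30, 0.05, 0.10]
print(p_error_reaches_patient(defences))  # ~0.0015, i.e. ~0.15% get through

# If over-reliance on the pharmacist means self-checking stops entirely
# (a miss rate of 1.0), the risk rises noticeably.
print(p_error_reaches_patient([1.0, 0.05, 0.10]))  # ~0.005, i.e. ~0.5%
```

Even with these made-up numbers, the sketch shows why removing or weakening any single layer (for example, through the compensatory behaviours Dean et al. describe) multiplies the chance of an error reaching the patient.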
While Reason’s accident causation model has been the dominant model of human error used in healthcare to date, a more specific model — the Yorkshire contributory factors framework — was published in 2012 (see ‘Figure 2: The Yorkshire contributory factors framework’ and ‘Table 3: Definitions for factors in the Yorkshire contributory factors framework’). This is based on a framework of factors contributing to patient safety incidents in hospital settings. In the centre of the framework are active failures, around which are a series of circles representing situational factors, local working conditions, and two layers of latent factors: those relating to the organisation itself and those relating to wider external policies. Advantages of this model are that it is more specific to healthcare and it is also evidence-based, having been developed following a systematic review of relevant literature. Disadvantages are that it is more complex than the accident causation model and relates to hospital settings only, with no current equivalent for primary care.
Figure 2: The Yorkshire contributory factors framework
Source: BMJ Qual Saf 2012;21:369–380.
| Factor | Definition |
| --- | --- |
| Active failures | Any failure in performance or behaviour (e.g. error, mistake, violation) of the person at the ‘sharp end’ (e.g. the health professional) |
| Communication systems | Effectiveness of the processes and systems in place for the exchange and sharing of information between staff, patients, groups, departments and services. This includes both written (e.g. documentation) and verbal (e.g. handover) communication systems |
| Equipment and supplies | Availability and functioning of equipment and supplies |
| External policy context | Nationally driven policies/directives that impact on the level and quality of resources available to hospitals |
| Design of equipment and supplies | The design of equipment and supplies to overcome physical and performance limitations |
| Individual factors | Characteristics of the person delivering care that may contribute in some way to active failures. Examples of such factors include inexperience, stress, personality and attitudes |
| Lines of responsibility | Existence of clear lines of responsibility clarifying accountability of staff members and delineating the job role |
| Management of staff and staffing levels | The appropriate management and allocation of staff to ensure adequate skill mix and staffing levels for the volume of work |
| Patient factors | The features of the patient that make caring for them more difficult and therefore more prone to error. These might include abnormal physiology, language difficulties and personality characteristics (e.g. aggressive attitude) |
| Physical environment | Features of the physical environment that help or hinder safe practice. This refers to the layout of the unit, the fixtures and fittings, and the level of noise, lighting, temperature, etc. |
| Policy and procedures | The existence of formal or written guidance for the appropriate conduct of work tasks and processes. This can also include situations where procedures are available but contradictory, incomprehensible or of otherwise poor quality |
| Safety culture | Organisational values, beliefs and practices surrounding the management of safety and learning from error |
| Scheduling and bed management | Adequate scheduling to manage patient throughput, minimising delays and excessive workload |
| Staff workload | Level of activity and pressures on time during a shift |
| Supervision and leadership | The availability and quality of direct and local supervision and leadership |
| Support from central functions | Availability and adequacy of central services in supporting the functioning of wards/units. This may include support from information technology and human resources, portering services, estates, or clinically related services such as radiology, phlebotomy and pharmacy |
| Task characteristics | Factors related to specific patient-related tasks that may make individuals vulnerable to error |
| Team factors | Any factor related to the work of different professionals within a group that they may be able to change to improve patient safety |
| Training and education | Access to correct, timely and appropriate training, both specific (e.g. task related) and general (e.g. organisation related) |
According to these models of human error, we can prevent patient harm by stopping active failures from occurring — generally by targeting latent failures and error-producing conditions — and by creating more effective defences to block errors that do occur from actually reaching the patient. For example, latent failures such as gaps in prescribing knowledge require suitable education and training strategies to resolve. These strategies may range from formal one-to-one sessions and group training, to looking at error monitoring and reporting systems as part of wider improvement programmes. Extra layers of defence can be introduced by information technology, such as clinical decision support. However, there are also holes in these defences, including alert fatigue, and new types of error-producing conditions can be introduced. As noted above, pharmacists often provide an extra layer of defence by reviewing and checking prescriptions and intervening to correct erroneous prescriptions before they can result in patient harm. The role that patients can play in preventing error is increasingly recognised, and several research groups are currently exploring how best to facilitate this. Models of human error illustrate how even the most experienced healthcare professionals can introduce errors, and so all healthcare professionals are potentially defences in the system and have an obligation to speak up if they are concerned. Effective communication between different healthcare professionals, and between healthcare professionals and patients, is often needed to activate these defences.
A series of tools, including flowcharts, fishbone diagrams, chronological mapping diagrams and action grids, have been developed based on Reason’s accident causation model to facilitate the investigation of incidents in clinical practice and to identify educational opportunities at a systems level (see Resources).
In summary, there has been some movement from a person-centred approach to a system-based approach to understanding error production and reduction in many areas, including healthcare. A systems approach focuses on improving the system rather than blaming individuals. Understanding models of human error can be helpful in understanding why errors have occurred in the past, where future vulnerabilities may lie, and how healthcare professionals might take action to make clinical practice safer.
Resources
Imperial College London. Systems analysis of clinical incidents: the London protocol. 1999. Available at: http://www.imperial.ac.uk/patient-safety-translational-research-centre/education/training-materials-for-use-in-research-and-clinical-practice/the-london-protocol/
Sara Garfield is a research pharmacist and Bryony Dean Franklin is director and professor of medication safety, the Centre for Medication Safety and Service Quality, Imperial College Pharmacy Department, Imperial College Healthcare NHS Trust, London & UCL School of Pharmacy, London.
Citation: The Pharmaceutical Journal DOI: 10.1211/PJ.2016.20201110