In the past two decades, a gradual shift has taken place from the ‘person approach’ to patient safety (in which the individual clinician at the sharp end is blamed for any accident) to a ‘systems approach’ (in which causation of accidents is attributed to holes in the organisational defences). Increasingly, however, concern has been expressed that the systems approach risks absolving individuals of responsibility for patient safety, and a balance between the systems and person approaches has been sought. In this paper, resolution of the tension between the person and the systems approaches is advocated through the use of a paradigm that places more emphasis on the relationships between the individual at the sharp end and other components of the system. This paradigm, which is adapted from ecology, has been labelled the ‘bionomic approach’. A bionomic approach to patient safety incorporates principles and concepts of human ecology and applies them to the healthcare system, situating the individual as an intrinsic component of the system rather than an adjunct. It builds on the notion that ‘people create safety’ and on the recognition that, in some clinical areas, particularly surgery, the individual is the primary defence against patient safety incidents. Skills required for ‘error wisdom’ are described, and the principles of the bionomic approach are applied to gynaecological surgery, using an illustrative case study.
Introduction
Both the understanding of, and investment in, patient safety have improved substantially since the landmark publications of To err is human and An organisation with a memory, in the USA and UK respectively, at the turn of the millennium. The field is still evolving, and the underlying concepts and metrics are continually being tuned. One concept that has been dominant in the field is systems thinking. Like the volume knob of a radio that has been turned up, found to be too loud, then turned down for optimal effect, the concept of systems thinking has been tuned in recent times to achieve a balance between the individual healthcare provider (the subject of the title To err is human) and the system in which he or she operates (the subject of the title An organisation with a memory). Systems thinking has brought great advances in research and practice to the field of patient safety, but further advance, particularly in surgical practice, calls for as much attention to be paid to individuals’ skills as to systemic defences.
Protecting patient safety: systems and person approaches
Traditionally, patient safety incidents were regarded simply as resulting from a lack of knowledge, skill or diligence on the part of the individual practitioner. In surgical practice, it was assumed that good surgeons were flawless, and that any surgery-related incidents were the fault of the surgeon; if the surgeon conducted the operation with due skill and care, no errors or adverse events would occur. This is the person approach. Twenty years ago, this approach was also the norm in other sectors.
Towards the end of the last millennium, a gradual shift took place from the person approach to an alternative approach that was based on systems thinking. Systems thinking focuses on the physical, psychological, and organisational environment (i.e. the system) in which the individual operates, and emphasises the interactions between the various components of the system. The system commonly comprises a complex interplay of structures and processes, with varying degrees of error-inducing components (‘latent conditions’ or ‘resident pathogens’) as well as defences against accidents. The systems approach aims to trap errors and pre-empt harm (caused by ‘active failure’ at the sharp end) by strengthening defences. When an accident occurs, the systems approach aims to identify the holes in the defences that allowed this to happen, and to strengthen systemic defences to prevent this type of accident occurring again. This is in contrast to the person approach, in which the individual at the sharp end carries the blame. The person approach flies the flag of a ‘blame culture’ whereas the systems approach is sometimes (albeit misleadingly) tagged a ‘no-blame culture’. It is more appropriate to use the terms ‘just culture’ or ‘open and fair culture’ instead of no-blame culture. Some concern has been expressed about the prospects of a complete shift away from a blame culture, but it is also arguable that the pendulum has swung from one extreme to the other: that while the person approach disregarded the latent pathogens and blamed the individual, the systems approach, as often conceptualised on the shop-floor, seems to absolve the individual of responsibility for accidents and to shift responsibility for patient safety incidents to managers. A balance between the systems and the person approaches has been called for. 
In practice, the individual should bear responsibility for safe practice, and it will sometimes be justifiable to hold an individual to account rather than attribute accident causation to the system.
The systems approach has had a significant positive influence on how patient safety is conceptualised and how patient safety incidents are managed, but the apparent dualisation (‘system’ and ‘person’), whereby the person at the sharp end is viewed as somewhat remote from the system, is a weakness of this approach to patient safety.
Systems thinking and systems theory have their roots in systems engineering, and so, unsurprisingly, a bias exists in the systems approach to safety towards interventions that reduce reliance on people, the fundamental aim being to design tools and processes that pre-empt human fallibility. This often entails high technology, but may also include ‘low-tech’ interventions, such as protocols and guidelines that promote standardisation. These interventions deliver first-order (or transactional) change: changes in structure but not necessarily in culture. In the health sector, the achievement of optimal safety requires second-order (or transformational) change: change in mission, culture, leadership and strategy. To deliver second-order change in the field of patient safety, more emphasis should be placed on people, their thinking, their cognitive and safety skills, and their ‘error wisdom’.
The bionomic approach
In the context of operational safety, the individual cannot be meaningfully separated from the system in which he or she operates; the individual is intrinsic to the system. A division between the person approach and the systems approach is therefore artificial. The tension between the person and the systems approaches could be resolved by using a different paradigm that places more emphasis on the relationships between the individual at the sharp end and other components of the system, including people along the more proximal segments of the healthcare chain or error trajectory ( Fig. 1 ).
An alternative paradigm can be adapted from ecology. The core attribute of an ecosystem is that of interdependence between living organisms and their environment, these constituent elements working together as a system; the individual organisms are in dynamic relationships with each other and with their surroundings. Interacting levels of organisation range from the cellular level to the biosphere. Regulation and feedback mechanisms are intrinsic to the ecosystem. Variety also exists within the ecosystem, but this biodiversity is strength rather than weakness.
Bionomic means ‘pertaining to ecology’. A bionomic approach to patient safety is based on principles and concepts of human ecology, and applies them to the healthcare system, situating the individual as an intrinsic component of the system rather than an adjunct. The individual is encouraged to see him or herself as an intrinsic part of the system, as one link in a broad framework of patient safety, and as holding a stake in the integrity of the various defences against potential accidents. Individuals are made aware that they owe responsibilities to colleagues operating at more distal ends of the chain or trajectory. The defences in Fig. 1 are not always organisational decisions or procedures prescribed by managers; they can also be decisions and practices of frontline clinicians. In other words, although many errors or ‘holes in the layers of the Swiss cheese’ result from latent conditions, some are due to active failures. A close look at one classification of surgical errors ( Table 1 ) shows that, even at the proximal end of the trajectory, the safety skills of the individual practitioner are implicated.
| Classification | Description |
|---|---|
| Proximal | Imposed by the system operated by the organisation and the process used by the practitioners, resulting in defects relating to: coherence and goal conflicts between the organisation and individual departments; poor leadership; inadequate team work; inadequate training and continuing professional development of staff; inadequate resource allocation; unclear protocols, briefings and procedures; lack of evidence-based practice and inadequate information technology for staff; non-transparent culture; overwork; lack of quality assurance measures; inadequate systems for detection of poor performance. |
| Distal (coal face, front line, sharp end) | Enacted by surgeons working within the system throughout the perioperative process. Input error (knowledge and perception): the input data are incorrectly perceived, leading to incorrect or inappropriate action. Intention (mind-set) error: although the input data are correctly perceived, an incorrect intention is formed or is not changed as the situation evolves, resulting in incorrect action. Execution (psychomotor) error: both perception of the input data and intention are correct, but the incorrect action is carried out; subtyped into: Omission: omitting an essential component step in the process. Commission (consequential and inconsequential). |
In gynaecological surgery, for example, the surgeon is the individual at the sharp end, and a dedicated pre-operative clinic is one of the barriers (latent control) introduced by the organisation to prevent intra-operative accidents. The nurses running the pre-operative assessment clinic are themselves ‘sharp-enders’. A failure of pre-operative assessment may therefore be as much a failure of a person at the sharp end (active failure) as of the managers or the system (latent condition). It is essential that the staff of the pre-operative assessment clinic possess the ‘safety skills’ that enable entrapment of error, thus reducing holes in the Swiss cheese.
The hallmark of the bionomic approach is its focus on people, relationships and interactions. It has been said that ‘people create safety’ and that ‘it is time to talk about people’ in patient safety. Also, ‘if you want to understand what went on in the mind, look in the world in which the mind found itself’.
A key concept in bionomics that has been transferred to patient safety is resilience, the capacity of an ecosystem to respond to a disturbance by resisting damage and recovering quickly. In patient safety, resilience reflects the degree to which individuals at all levels of an organisation proactively anticipate accidents, devise strategies to prevent them, and take prompt remedial action when they occur.
Related to this are the concepts of adaptability (the capacity of individuals in a system to influence resilience) and transformability (the capacity to change when external conditions make the current system untenable). Adaptive governance is a process of creating adaptability and transformability in social–ecological systems. The equivalent in patient safety is clinical governance.
The bionomic approach is not antithetical to the systems and person approaches. Rather, it brings together the key features of both approaches (such as systems thinking and individual accountability). Its fundamental difference from the systems approach, however, is the use of an ecological rather than an engineering metaphor. The justification for this is that the complexities of healthcare delivery are more akin to ecosystems than to manufacturing production lines or aviation. As Reason put it: ‘[the healthcare domain] contrasts with those domains in which the performance of the human operator is extensively moderated by automated safety features. It would, for example, require some ingenuity on the part of an individual pilot to engineer the crash of a modern airliner. But in many healthcare activities serious harm is but a few unguarded moments away’.
The advantage of the bionomic approach is that individuals are required to be more acutely aware of their role in patient safety, each clinician being a manager of risk in his or her clinical area and practice. With the systems approach, we risk staying stuck in first order change; a bionomic approach gives a better chance of achieving and sustaining second-order change.
This is not to say that the systems approach fails to take account of individual clinicians’ contributions to safety; however, even when this approach places the individual ‘at the centre of the work system’, the emphasis is on system redesign to minimise fallibility. The bionomic approach places more emphasis on the importance of individuals as creators of safety. This emphasis has significant implications, and the difference between ‘preventing’ accidents and ‘creating’ safety could be more than semantic.
The individual as primary defence against patient safety incidents: cognitive skills and error management
Anaesthesiology is the medical specialty that has made the most progress in entrenching systems thinking within clinical practice. This is not particularly surprising, as anaesthetists rely substantially on technology and automation, whereas surgeons rely more on personal skills. The surgical specialties need to catch up with their anaesthetic colleagues in implementing systemic defences, but also need to promote safety wisdom among individual practitioners and incorporate safety competencies in the training of surgeons and surgical teams. Error wisdom does not preclude the occurrence of errors; safety-wise surgeons make errors but they proactively hunt for hazards and errors, identify them and contain them.
A major step to meeting this need has been taken with the recognition of the value of non-technical skills: cognitive, social and personal skills that, combined with technical proficiency, facilitate clinical safety. These skills, which include leadership, decision making, situation awareness, team work and communication, have been described as ‘what the best practitioners do in order to achieve consistently high performance, and what the rest of us do on a good day’. A further step is being taken with the realisation that safety skills go beyond these non-technical skills, encompassing a broader range of generic skills. Long et al. identified a preliminary set of 73 safety skills, in 18 categories ( Table 2 ), that a safe practitioner should have.
| Category | Individual skills |
|---|---|
| Anticipation and preparedness | Anticipation of organisational problems. Being able to anticipate the deteriorating patient. Contingency planning with clearly defined levels of care. Developing risk-averse methods of working. Thinking ‘what could go wrong today?’ and trying to prepare for it. |
| Awareness of the patient (including empathy) | Caring about patients. Empathy. Not hating patients you can’t solve. Not thinking of re-attenders as a nuisance. Thinking physiologically. |
| Awareness of oneself | Not letting your emotions interfere with patient care. Continuous questioning of self and others. Having up-to-date knowledge and training. Learning from previous mistakes. Reflective thinking. Self-awareness — recognising one’s own limitations. Thinking ‘how am I today?’ (tiredness etc.). Knowing who, when and how to call for help appropriately. |
| Awareness of the situation | Being able to minimise distractions. Recognising error-prone situations. Information gathering. |
| Awareness of one’s team | Awareness of others around you. Being aware of unsafe members in the team. Being receptive to others in the team. Team awareness and monitoring. |
| Common sense | Having a common-sense approach. Being able to follow instructions. |
| Confidence | Being able to speak up. Being confident in decision making. Having an appropriate level of confidence and assertiveness. |
| Conscientiousness | Being thorough and paying attention to detail. Checking and re-checking. Conducting a thorough history and examination. Going out of your way to help. Hunting for answers. |
| Crisis management | The ability to think clearly in a crisis situation. Acting decisively in a crisis. |
| Honesty | Honesty. Proactive and open communication. Being open about error. |
| Humility | Taking criticism constructively. Humility. Being courteous and considerate. Willingness to listen and take advice. Allowing others to take over. |
| Leadership | Having good leadership skills. |
| Open-mindedness | Being open minded. |
| Organisational skills and efficiency | Organisational skills and efficiency. Co-ordination. Prioritisation/multi-tasking. |
| Responsiveness | Acting decisively if hazards are noticed. Changing one’s behaviour in response to tiredness. Responding to changes in circumstances. Thinking and problem solving. |
| Team working and communication | Asking the team for reminders. Being available and perceived as available. Delegating appropriately. Encouraging frequent and regular team meetings. Giving constructive feedback. Having a sense of togetherness within the team. Team working. |
| Technical skills | Having good technical skills. |
| Vigilance | Alertness and being ‘on the ball’. Pattern recognition and vigilance for deviation from patterns. Regularly re-reviewing the situation. |