Sidney Dekker Quotes

We've searched our database for all the quotes and captions related to Sidney Dekker. Here they are! All 44 of them:

Underneath every simple, obvious story about ‘human error,’ there is a deeper, more complex story about the organization.
Sidney Dekker (The Field Guide to Understanding 'Human Error')
Accidents are no longer accidents at all. They are failures of risk management.
Sidney Dekker (Just Culture: Balancing Safety and Accountability)
Not being able to find a cause is profoundly distressing; it creates anxiety because it implies a loss of control. The desire to find a cause is driven by fear.
Sidney Dekker (Just Culture: Balancing Safety and Accountability)
If professionals consider one thing “unjust,” it is often this: Split-second operational decisions that get evaluated, turned over, examined, picked apart, and analyzed for months—by people who were not there when the decision was taken, and whose daily work does not even involve such decisions.
Sidney Dekker (Just Culture: Balancing Safety and Accountability)
There is almost no human action or decision that cannot be made to look flawed and less sensible in the misleading light of hindsight.
Sidney Dekker (The Field Guide to Understanding 'Human Error')
There is almost no human action or decision that cannot be made to look flawed and less sensible in the misleading light of hindsight. It is essential that the critic should keep himself constantly aware of that fact.
Sidney Dekker (The Field Guide to Understanding 'Human Error')
If we adjudicate an operator’s understanding of an unfolding situation against our own truth, which includes knowledge of hindsight, we may learn little of value about why people saw what they did, and why taking or not taking action made sense to them.
Sidney Dekker (Drift into Failure: From Hunting Broken Components to Understanding Complex Systems)
Unjust responses to failure are almost never the result of bad performance. They are the result of bad relationships.
Sidney Dekker (Just Culture: Balancing Safety and Accountability)
The question that drives safety work in a just culture is not who is responsible for failure, rather, it asks what is responsible for things going wrong. What is the set of engineered and organized circumstances that is responsible for putting people in a position where they end up doing things that go wrong?
Sidney Dekker (Just Culture: Balancing Safety and Accountability)
A just culture accepts nobody’s account as “true” or “right” and others wrong.
Sidney Dekker (Just Culture: Balancing Safety and Accountability)
It has to do with being open, with a willingness to share information about safety problems without the fear of being nailed for them.
Sidney Dekker (Just Culture: Balancing Safety and Accountability)
Creating a climate in which disclosure is possible and acceptable is the organization’s responsibility.
Sidney Dekker (Just Culture: Balancing Safety and Accountability)
Forward-looking accountability. Accountability that is backward-looking (often the kind in trials or lawsuits) tries to find a scapegoat, to blame and shame an individual for messing up. But accountability is about looking ahead. Not only should accountability acknowledge the mistake and the harm resulting from it, it should lay out the opportunities (and responsibilities!) for making changes so that the probability of such harm happening again goes down.
Sidney Dekker (Just Culture: Balancing Safety and Accountability)
There is almost no human action or decision that cannot be made to look flawed and less sensible in the misleading light of hindsight. It is essential that the critic should keep himself constantly aware of that fact.
Sidney Dekker (Just Culture: Balancing Safety and Accountability)
The main question for a just culture is not about matching consequences with outcome. It is this: Did the assessments and actions of the professionals at the time make sense, given their knowledge, their goals, their attentional demands, their organizational context?
Sidney Dekker (Just Culture: Balancing Safety and Accountability)
Safety improvements come from organizations monitoring and understanding the gap between procedures and practice.
Sidney Dekker (The Field Guide to Understanding Human Error)
Saying what people failed to do has no role in understanding ‘human error.’
Sidney Dekker (The Field Guide to Understanding 'Human Error')
Safety and risk are made and broken the whole time, throughout your organization. You are not the custodian of an otherwise safe system that you need to protect from erratic human beings.
Sidney Dekker (The Field Guide to Understanding 'Human Error')
In shipping, for example, injury counts were halved over a recent decade, but the number of shipping accidents tripled.
Sidney Dekker (The Safety Anarchist: Relying on human expertise and innovation, reducing bureaucracy and compliance)
Tomorrow’s accident, which will be rare but no doubt even more disastrous, will be an accident where the regulations were in place to prevent the problem, or perhaps where no-one actually made an identifiable error and no system truly broke down but all the components had been weakened by erosion: the degree of variation within the operating conditions will one day prove enough to exceed the tolerable linkage thresholds.
Sidney Dekker (The Safety Anarchist: Relying on human expertise and innovation, reducing bureaucracy and compliance)
No matter how significant the numbers are according to their own logic, statistics fail to convey the true, lived meaning of the suffering they contain and can thus leave us numbly indifferent.
Sidney Dekker (The Safety Anarchist: Relying on human expertise and innovation, reducing bureaucracy and compliance)
When these accidents affect our customers, we seek to understand why it happened. The root cause is often deemed to be human error, and the all too common management response is to “name, blame, and shame” the person who caused the problem. And, either subtly or explicitly, management hints that the person guilty of committing the error will be punished. They then create more processes and approvals to prevent the error from happening again. Dr. Sidney Dekker, who codified some of the key elements of safety culture and coined the term just culture, wrote, “Responses to incidents and accidents that are seen as unjust can impede safety investigations, promote fear rather than mindfulness in people who do safety-critical work, make organizations more bureaucratic rather than more careful, and cultivate professional secrecy, evasion, and self-protection.” These issues are especially problematic in the technology value stream—our work is almost always performed within a complex system, and how management chooses to react to failures and accidents leads to a culture of fear, which then makes it unlikely that problems and failure signals are ever reported. The result is that problems remain hidden until a catastrophe occurs.
Gene Kim (The DevOps Handbook: How to Create World-Class Agility, Reliability, and Security in Technology Organizations)
But the point of a ‘human error’ investigation is to understand why people’s assessments and actions made sense at the time, given their context, and without knowledge of outcome, not to point out what they should have done instead.
Sidney Dekker (The Field Guide to Understanding Human Error)
“This is at the heart of the professional pilot’s eternal conflict,” writes Wilkinson in a comment to the November Oscar case. “Into one ear the airlines lecture, ‘Never break regulations. Never take a chance. Never ignore written procedures. Never compromise safety.’ Yet in the other they whisper, ‘Don’t cost us time. Don’t waste our money. Get your passengers to their destination—don’t find reasons why you can’t.’”
Sidney Dekker (The Field Guide to Understanding Human Error)
Valujet flight 592 crashed after takeoff from Miami airport because oxygen generators in its cargo hold caught fire. The generators had been loaded onto the airplane by employees of a maintenance contractor, who were subsequently prosecuted. The editor of Aviation Week and Space Technology “strongly believed the failure of SabreTech employees to put caps on oxygen generators constituted willful negligence that led to the killing of 110 passengers and crew. Prosecutors were right to bring charges. There has to be some fear that not doing one’s job correctly could lead to prosecution.” But holding individuals accountable by prosecuting them misses the point. It shortcuts the need to learn fundamental lessons, if it acknowledges that fundamental lessons are there to be learned in the first place. In the SabreTech case, maintenance employees inhabited a world of boss-men and sudden firings, and that did not supply safety caps for expired oxygen generators. The airline may have been as inexperienced and under as much financial pressure as people in the maintenance organization supporting it. It was also a world of language difficulties—not only because many were Spanish speakers in an environment of English engineering language: “Here is what really happened. Nearly 600 people logged work time against the three Valujet airplanes in SabreTech’s Miami hangar; of them 72 workers logged 910 hours across several weeks against the job of replacing the ‘expired’ oxygen generators—those at the end of their approved lives. According to the supplied Valujet work card 0069, the second step of the seven-step process was: ‘If the generator has not been expended install shipping cap on the firing pin.’ This required a gang of hard-pressed mechanics to draw a distinction between canisters that were ‘expired’, meaning the ones they were removing, and canisters that were not ‘expended’, meaning the same ones, loaded and ready to fire, on which they were now expected to put nonexistent caps. Also involved were canisters which were expired and expended, and others which were not expired but were expended. And then, of course, there was the simpler thing—a set of new replacement canisters, which were both unexpended and unexpired.” These were conditions that existed long before the Valujet accident, and that exist in many places today. Fear of prosecution stifles the flow of information about such conditions. And information is the prime asset that makes a safety culture work. A flow of information earlier could in fact have told the bad news. It could have revealed these features of people’s tasks and tools; these longstanding vulnerabilities that form the stuff that accidents are made of. It would have shown how ‘human error’ is inextricably connected to how the work is done, with what resources, and under what circumstances and pressures.
Sidney Dekker (The Field Guide to Understanding Human Error)
A whole complex system cannot be inspected, only parts or sub-systems can be inspected.
Sidney Dekker (Drift into Failure: From Hunting Broken Components to Understanding Complex Systems)
People have long been willing to trade freedoms for economic prosperity, which is where a compelling link to the promises (though not generally delivered realities) of neoliberalism becomes visible.
Sidney Dekker (Compliance Capitalism: How Free Markets Have Led to Unfree, Overregulated Workers (The Business, Management and Safety Effects of Neoliberalism))
We are amazed at how little organizations recognize and value the opinion of their workforce about operational and production issues.
Sidney Dekker (Do Safety Differently)
Studying and enhancing the “information environment” for decision-making, as Rasmussen and Svedung put it, can be a good place to start. This information environment, after all, is where assessments are made, decisions are shaped, in which local rationality is created. It is the place where the social and the technical meet; where risk itself is constructed.
Sidney Dekker (Drift into Failure: From Hunting Broken Components to Understanding Complex Systems)
Our technologies have got ahead of our theories. Our theories are still fundamentally reductionist, componential and linear. Our technologies, however, are increasingly complex, emergent and non-linear.
Sidney Dekker (Drift into Failure: From Hunting Broken Components to Understanding Complex Systems)
with a shift of natural sciences into systems thinking and complexity, and social sciences and humanities doing the same, it seems that they once again seem to come closer to each other. Not because the social sciences and humanities are becoming “harder” and more quantifiable, but because natural sciences are becoming “softer” with an emphasis on unpredictability, irreducibility, non-linearity, time-irreversibility, adaptivity, self-organization, emergence – the sort of things that may always have been better suited to capture the social order.
Sidney Dekker (Drift into Failure: From Hunting Broken Components to Understanding Complex Systems)
Oh, and what about the nurse? She was fired and charged as a criminal. That’s Newton, too. If there are really bad effects, there must have been really bad causes. A dead patient means a really bad nurse. Much worse than if the patient had survived. So much worse, she’s got to be a criminal. Must be. We can’t escape Newton even in our thinking about one of the most difficult areas of safety: accountability for the consequences of failure.
Sidney Dekker (Drift into Failure: From Hunting Broken Components to Understanding Complex Systems)
Accountability can mean letting people tell their account, their story.
Sidney Dekker (The Field Guide to Understanding 'Human Error')
What is the cause of the accident? This question is just as bizarre as asking what the cause is of not having an accident.
Sidney Dekker (The Field Guide to Understanding 'Human Error')
In complex systems, after all, it is very hard to foresee or predict the consequences of presumed causes. So it is not the consequences that we should be afraid of (we might not even foresee them or believe them if we could). Rather, we should be wary of renaming things that negotiate their perceived risk down from what it was before.
Sidney Dekker (Drift into Failure: From Hunting Broken Components to Understanding Complex Systems)
Arriving at the edge of chaos is a logical endpoint for drift. At the edge of chaos, systems have tuned themselves to the point of maximum capability.
Sidney Dekker (Drift into Failure: From Hunting Broken Components to Understanding Complex Systems)
The question is not how pilots can be so silly to rely on this, but how they have been led to believe (built up a mental model) that this is actually effective.
Sidney Dekker (The Field Guide to Understanding Human Error)
Indeed, automation does not fail often, which limits people’s ability to practice the kinds of breakdown scenarios that still justify their marginal presence in the system. Here, the human is painted as a passive monitor, whose greatest safety risks would lie in deskilling, complacency, vigilance decrements and the inability to intervene in deteriorating circumstances.
Sidney Dekker (The Field Guide to Understanding Human Error)
But errors are consequences: the leakage that occurs around the edges when you put pressure on a system without taking other factors into account.
Sidney Dekker (The Field Guide to Understanding Human Error)
People do not come to work to do a bad job. Safety in complex systems is not a result of getting rid of people, of reducing their degrees of freedom. Safety in complex systems is created by people through practice—at all levels of an organization.
Sidney Dekker (The Field Guide to Understanding Human Error)
Asking what is the cause is just as bizarre as asking what is the cause of not having an accident. Accidents have their basis in the real complexity of the system, not their apparent simplicity.
Sidney Dekker (The Field Guide to Understanding Human Error)
Challenges to existing views are generally uncomfortable. Indeed, for most people and organizations, coming face to face with a mismatch between what they believed and what they have just experienced is difficult. These people and organizations will do anything to reduce the nature of the surprise.
Sidney Dekker (The Field Guide to Understanding Human Error)
“Holding people accountable and [unfairly] blaming people are two quite different things,” Sidney Dekker, one of the world’s leading thinkers on complex systems, has said. “Blaming people may in fact make them less accountable: They will tell fewer accounts, they may feel less compelled to have their voice heard, to participate in improvement efforts.”
Matthew Syed (Black Box Thinking: Why Most People Never Learn from Their Mistakes--But Some Do)
But the crucial point here is that justifiable blame does not undermine openness. Why? Because management has taken the time to find out what really happened rather than blaming preemptively, giving professionals the confidence that they can speak up without being penalized for honest mistakes. This is what is sometimes called a “just culture.” The question, according to Sidney Dekker, is not Who is to blame? It is not even Where, precisely, is the line between justifiable blame and an honest mistake? because this can never be determined in the abstract. Rather, the question is, Do those within the organization trust the people who are tasked with drawing that line? It is only when people trust those sitting in judgment that they will be open and diligent.
Matthew Syed (Black Box Thinking: Why Most People Never Learn from Their Mistakes--But Some Do)