How to Avoid Catastrophe
By Catherine H. Tinsley, Robin L. Dillon, and Peter M. Madsen
Presented by Akhmad H Gumasjaya & Herryanto Prasetyo
2
Authors
• Catherine H. Tinsley
• Robin L. Dillon
• Peter M. Madsen
3
Near Miss
Most people think of “near misses” as harrowing close calls that could have been a lot worse.
4
Near Miss
• But there is another class of near misses: unremarked small failures that permeate day-to-day business but cause no immediate harm.
• If conditions shift slightly, or if luck does not intervene, a crisis erupts.
5
Cognitive Biases
• Normalization of Deviance: the tendency over time to accept anomalies—particularly risky ones—as normal.
• Outcome Bias: when people observe successful outcomes, they tend to focus on the results more than on the (often unseen) complex processes that led to them.
6
Roots of Crises
• Organizational disasters, studies show, rarely have a single cause.
• Rather, they are initiated by the unexpected interaction of multiple small, often seemingly unimportant, human errors, technological failures, or bad business decisions.
• These latent errors combine with enabling conditions to produce a significant failure.
7
BP Gulf of Mexico Oil Rig Disaster
• The well had been plagued by technical problems all along.
• Stakeholders were lulled by a catalog of previous near misses.
8
Bad Apple
• Immediately after the iPhone 4 launch, customers began complaining about dropped calls and poor signal strength.
• Apple blamed users and stated that dropped calls were a non-issue.
• Several customers filed class-action lawsuits.
• Consumer Reports declined to recommend the iPhone 4.
9
Speed Warning
• Complaints of acceleration problems in Toyotas increased sharply after the company introduced a new accelerator design.
• Normalization of deviance and outcome bias, along with other factors, conspired to obscure the grave implications of the near misses.
10
Jet Black and Blue
• JetBlue Airways had a practice of canceling proportionately fewer flights than other airlines and of directing its pilots to pull away from the gates as soon as possible in severe weather.
• When a ferocious ice storm hit, the airline reported canceling more than 250 of its 505 flights that day. It lost millions of dollars and squandered priceless consumer loyalty.
11
Recognizing & Preventing Near Misses
1. Heed High Pressure
2. Learn from Deviations
3. Uncover Root Causes
4. Demand Accountability
5. Consider Worst-Case Scenarios
6. Evaluate Projects at Every Stage
7. Reward Owning Up
12
1. Heed High Pressure
• The greater the pressure to meet performance goals, the more likely managers are to discount near-miss signals or to misread them as signs of sound decision making.
Real-Life Examples:
• BP oil spill
• Columbia space shuttle disaster
13
1. Heed High Pressure
• BP Gulf oil rig disaster
o Incurring overrun costs of $1 million a day in rig lease and contractor fees
• Columbia space shuttle disaster
o The pressure of maintaining the flight schedule
14
1. Heed High Pressure
• Organizations should encourage, or even require, employees to examine their decisions during pressure-filled periods and ask, “If I had more time and resources, would I make the same decision?”
15
2. Learn from Deviations
• Decision makers may clearly understand the statistical risk represented by the deviation, but grow increasingly less concerned about it (see the illustrative sketch below).
16
2. Learn from Deviations
• Managers should seek out operational deviations from the norm and examine whether their reasons for tolerating the associated risk have merit.
o Have we always been comfortable with this level of risk?
o Has our policy toward this risk changed over time?
17
3. Uncover Root Causes
• When managers identify deviations, their reflex is often to correct the symptom rather than its cause.
18
3. Uncover Root Causes
• iPhone 4 (Bad Apple)
• 1998 Mars Climate Orbiter mission
19
4. Demand Accountability
• Even when people are aware of near misses, they tend to downgrade their importance.
• One way to limit this potentially dangerous effect is to require managers to justify their assessments of near misses.
20
5. Consider Worst-Case Scenarios
• Unless expressly advised to do so, people tend not to think through the possible negative consequences of near misses.
21
6. Evaluate Projects at Every Stage
• NASA uses a “pause and learn” process in which teams discuss at each project milestone what they have learned.
22
6. Evaluate Projects at Every Stage
23
7. Reward Owning Up
• There are reasons for employees to keep quiet about failures or to be discouraged from exposing near misses.
• Leaders in any organization should publicly reward staff for uncovering near misses—including their own.
24
Conclusion
• Two forces conspire to make learning from near misses difficult:
o Cognitive biases make them hard to see, and
o Even when they are visible, leaders tend to ignore them.
• Surfacing near misses and correcting root causes is one of the soundest investments an organization can make.

Editor's Notes

  • #3 Catherine H. Tinsley is a Professor of Management and Academic Director of the Executive Master's in Leadership Program at the McDonough School of Business at Georgetown University. She is also Executive Director of the Georgetown University Women's Leadership Institute and a Zaeslin fellow at the College of Law and Economics, University of Basel. Robin L. Dillon-Merrill is a Professor and Area Coordinator for the Operations and Information Management Group in the McDonough School of Business at Georgetown University. Professor Dillon-Merrill seeks to understand and explain how and why people make the decisions that they do under conditions of uncertainty and risk. This research specifically examines critical decisions that people have made following near-miss events in situations with severe outcomes, including hurricane evacuation, terrorism, cybersecurity, and NASA mission management. She has received research funding from the National Science Foundation, NASA, the Department of Defense, and the Department of Homeland Security through USC's National Center for Risk and Economic Analysis of Terrorism Events. She has served as a risk analysis and project management expert on several National Academies committees, including the review of the New Orleans regional hurricane protection projects and the application of risk analysis techniques to securing the Department of Energy's special nuclear materials. She has a B.S./M.S. in Systems Engineering from the University of Virginia and a Ph.D. from Stanford University. From 1993 to 1995 she worked as a systems engineer for the Fluor Daniel Corporation. She can be reached via e-mail at [email protected]. Peter M. Madsen (Ph.D., UC Berkeley) is an associate professor of OBHR. His research focuses on organizational learning—particularly on how organizations and their members learn from failure and attempt to manage uncertainty and risk. Madsen also studies the interplay between safety, environmental, social, and financial goals in organizations. He has examined these issues in the aerospace, airline, automobile, healthcare, mining, and chemicals industries. This work has been published in top management and safety journals, including Academy of Management Journal, Organization Science, Journal of Management, Harvard Business Review, and Quality and Safety in Health Care. Madsen's research has been awarded the Western Academy of Management Ascendant Scholar Award, Western Academy of Management Best Paper, Academy of Management Best Paper Finalist, and Emerald Management Reviews Citation of Excellence.
  • #4 In HSSE terms, these are what are called unsafe conditions.
  • #5 Multiple near misses preceded (and foreshadowed) every disaster and business crisis; most of the misses were ignored or misread, because cognitive biases conspire to blind managers to the near misses.
  • #6 Cognitive biases conspire to blind managers to the near misses; they are the next step after a near miss. Normalization of deviance: because people become accustomed to the anomaly, the near miss is accepted. Outcome bias: people look only at the outcome, not the process, so the success is illusory because the process was flawed.
  • #8 A gas blowout occurred during the cementing of the Deepwater Horizon well, killing 11 people, sinking the rig, and triggering a massive underwater spill that sent about 200 million gallons of oil into the Gulf of Mexico. This happened even at BP.
  • #9 What was done after this?
  • #11 Rather than perceiving that a dramatic increase in delays represented a dramatic increase in risk, JetBlue managers saw only successfully launched flights. It took an enabling condition—the ferocious ice storm—to turn the latent error into a crisis.
  • #12 The standard risk responses: avoid, transfer, mitigate, accept.
  • #13 Stay alert: if you reach a conclusion while under high pressure, ask whether you would reach the same conclusion without that pressure. The takeaway: when under high pressure, you have to be conscious of it.
  • #16 When a deviation occurs, it must be studied and learned from, not ignored.
  • #20 Demanding accountability.
  • #21 If they had considered a worst-case scenario, they might have headed off the crisis, our research suggests. Examining events closely helps people distinguish between near misses and successes, and they'll often adjust their decision making accordingly.
  • #22 Sit down together, setting aside day-to-day work, position, and title, to discuss the results of the research, work, or project.
  • #24 Reward people who own up to their mistakes.
  • #25 Biases influence judgment. Leaders tend to ignore near misses because of high pressure and because no one is held accountable for them. "Soundest investment" means a highly valuable investment; left unaddressed, near misses accumulate into disaster.