Reflecting on Jens Rasmussen’s legacy. A strong program for a hard problem

Jean-Christophe Le Coze, Safety Science, Vol. 71 (January 2015), pp. 123-141. doi:10.1016/j.ssci.2014.03.015

Abstract

Jens Rasmussen has been a very influential thinker for the last quarter of the 20th century in the safety science field and especially in major hazard prevention. He shaped many of the basic assumptions regarding safety and accidents which are still held today. One can see that many of his ideas underlie more recent advances in this field. Indeed, in the first decade of the 21st century, many have been inspired by his propositions and have pursued their own research agendas by using, extending or criticising his ideas. The author of numerous articles, chapters of books and books, Rasmussen had an inspiring scientific research record spreading over 30 years, expanding across the boundaries of many scientific disciplines. This article introduces selected elements of Rasmussen’s legacy, including the SRK model, his theoretical approach of errors, the issue of investigating accidents, his model of migration and the sociotechnical view. It will be demonstrated that Jens Rasmussen provided key concepts for understanding safety and accidents, many of which are still relevant today. In particular, this article introduces how some principles such as degree of freedom, self organisation and adaptation, defence in depth fallacy but also the notion of error as ‘unsuccessful experiment with unacceptable consequences’ still offer powerful insights into the challenge of predicting and preventing major accidents. It is also argued that they combine into a specific interpretation of the ‘normal accident’ debate, anticipating current trends based on complexity lenses. Overall, Jens Rasmussen defines the contours of what is called ‘a strong program for a hard problem’.

Outline

  1. Introduction
    • Two articles on Jens Rasmussen's legacy
    • Preliminary remarks
  2. Methodology
    • Article sections
  3. Modelling process plant operator in relation to engineering issues
    • Genesis of the model
    • Hollnagel's critique of SRK
  4. The conceptualisation of 'human error'
    • Two independent fields, 'psychology' and 'cognitive engineering'
    • An early 'naturalistic' perspective on 'human error'
  5. The difference between technical and human reliability/safety analysis
  6. Intermediate comments
  7. A new vision for accident and safety
    • Degree of freedom, self-organisation and defence-in-depth fallacy
    • A 'normal accident' perspective, as the product of organisational migration towards the boundary of acceptable performance
    • Rasmussen's Ashbyan version of normal accidents
  8. Investigating accidents
    • Causality, stop rules and goals
    • Accimap
  9. The whole is more than the sum of its parts
    • A socio-technical perspective based on feedback loops
    • Concerning the managerial issue
    • Safety science as cross-disciplinary problem driven research and a convergence of human science paradigms
    • 'A strong program for a hard problem'
  10. Conclusion

Notes

  • Emphasised the importance of studying real-life situations instead of experimental data
  • Importance of context in shaping cognitive strategies
  • quote: to optimize performance, to develop smooth and efficient skills, it is very important to have opportunities to perform trial and error experiments, and human errors can in a way be considered as unsuccessful experiments with unacceptable consequences
  • Rasmussen was critical of human reliability assessment (HRA)
  • fallacy of defence-in-depth: one basic problem is that in such a system having functionally redundant protective defences, a local violation of one of the defences has no immediate, visible effect and then may not be observed in action. In this situation the boundary of safe behaviour of one particular actor depends on the possible violation of defences by other actors.
  • Boundaries (see the simulation sketch after this list):
    • boundary of functionally acceptable performance
    • boundary to economic failure
    • boundary to unacceptable work load
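
A minimal simulation sketch of the migration model follows; the margin, gradients and noise level are invented for illustration and are not taken from the paper. The operating point starts at a perceived safe distance from the boundary of functionally acceptable performance and drifts down the cost and effort gradients until everyday trial-and-error variability carries it across.

```python
import random

# Toy, one-dimensional version of Rasmussen's migration model.
# 'position' is the distance between the current operating point and the
# boundary of functionally acceptable performance; the gradients model the
# management pressure towards cost efficiency and the worker pressure
# towards least effort. All parameter values are assumptions.

SAFETY_BOUNDARY = 0.0  # crossing this line releases an accident
INITIAL_MARGIN = 1.0   # perceived safe operating margin at the start

def migrate(steps=500, seed=1):
    rng = random.Random(seed)
    position = INITIAL_MARGIN
    for t in range(steps):
        cost_gradient = 0.002         # steady push towards economic efficiency
        effort_gradient = 0.001       # steady push towards least effort
        noise = rng.gauss(0.0, 0.02)  # local adaptations and experiments
        position -= cost_gradient + effort_gradient + noise
        if position <= SAFETY_BOUNDARY:
            return t  # an 'unsuccessful experiment with unacceptable consequences'
    return None  # no crossing within the simulated horizon

print(migrate())  # step at which the boundary is crossed, or None
```

In this caricature, weakening the gradients or restoring the margin once drift is noticed (making the boundary visible, in Rasmussen's terms) delays or prevents the crossing.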

Skill-rule-knowledge (SRK) model

  • Three-level model (sketched in code after this list):
    1. Skill-based behaviour
      • automatic or unconscious processes
      • internalised through experience
      • triggered by sensory inputs
    2. Rule-based behaviour
      • explicit, known rules
      • operators are consciously active in performing specific tasks
    3. Knowledge-based behaviour
      • the ability, in new circumstances, to find responses that:
        • are not directly available in the operator's repertoire
        • require considerable attention and concentration
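
As a reading aid only, the hierarchy can be caricatured as a dispatch rule: performance settles at the lowest level whose repertoire covers the current situation. The names (SRKLevel, classify) and the boolean inputs are invented for this sketch; the SRK model itself describes human performance, it is not an algorithm.

```python
from enum import Enum

class SRKLevel(Enum):
    SKILL = "skill-based"          # automatic patterns triggered by sensory input
    RULE = "rule-based"            # stored, explicit rules consciously applied
    KNOWLEDGE = "knowledge-based"  # explicit reasoning in novel situations

def classify(automatic_pattern_exists: bool, stored_rule_exists: bool) -> SRKLevel:
    """Caricature of the SRK hierarchy as a dispatch rule."""
    if automatic_pattern_exists:
        return SRKLevel.SKILL      # internalised through experience
    if stored_rule_exists:
        return SRKLevel.RULE       # known procedure, consciously executed
    return SRKLevel.KNOWLEDGE      # nothing in the repertoire: improvise

print(classify(True, True))    # routine manual task   -> SRKLevel.SKILL
print(classify(False, True))   # procedural task       -> SRKLevel.RULE
print(classify(False, False))  # genuinely novel fault -> SRKLevel.KNOWLEDGE
```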

Difference in approach: Reason & Mycielska

  1. Reason and Mycielska consider error from a psychological angle.
  2. Reason and Mycielska focused on types of errors related to automatic or unconscious cognitive processes, which can be categorised as 'slips' or 'lapses'.
  3. Unlike Rasmussen, Reason and Mycielska's theoretical approach lacked detailed empirical study of the real-life situations of operators of technological systems.

Fallacy of defence-in-depth

From http://www.macroresilience.com/2011/12/29/people-make-poor-monitors-for-computers/ quoting James Reason:

the system very often does not respond actively to single faults. Consequently, many errors and faults made by the staff and maintenance personnel do not directly reveal themselves by functional response from the system. Humans can operate with an extremely high level of reliability in a dynamic environment when slips and mistakes have immediately visible effects and can be corrected… Violation of safety preconditions during work on the system will probably not result in an immediate functional response, and latent effects of erroneous acts can therefore be left in the system. When such errors are allowed to be present in a system over a longer period of time, the probability of coincidence of the multiple faults necessary for release of an accident is drastically increased. Analyses of major accidents typically show that the basic safety of the system has eroded due to latent errors.
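
The mechanism in this quote, where undetected latent faults persist until the multi-fault coincidence needed to release an accident becomes likely, can be illustrated with a small Monte Carlo sketch. All rates and horizons below are assumed toy values, not data from the paper or the blog post.

```python
import random

def accident_probability(n_defences, p_fault, p_detect, steps,
                         trials=2000, seed=42):
    """Fraction of simulated histories in which every redundant defence
    is breached at the same time (the coincidence the quote describes).

    Each step, an intact defence acquires a latent fault with probability
    p_fault; an existing fault is noticed and repaired with probability
    p_detect.
    """
    rng = random.Random(seed)
    accidents = 0
    for _ in range(trials):
        breached = [False] * n_defences
        for _ in range(steps):
            for i in range(n_defences):
                if not breached[i]:
                    breached[i] = rng.random() < p_fault
                elif rng.random() < p_detect:
                    breached[i] = False  # fault revealed and corrected
            if all(breached):  # all defences violated at once: accident
                accidents += 1
                break
    return accidents / trials

# Visible faults (quickly detected) vs. latent faults (rarely detected):
print(accident_probability(3, p_fault=0.02, p_detect=0.90, steps=500))
print(accident_probability(3, p_fault=0.02, p_detect=0.01, steps=500))
```

With these toy numbers, cutting the detection probability from 0.90 to 0.01 moves the accident frequency from well under one percent of runs to near certainty over the same horizon: the erosion of 'basic safety' the quote describes.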

Normal accident threads

  • Ellulian
    • Technological determinism
    • Technology out of control
    • Technology is subversive
    • J. Ellul: autonomy of technology
    • Ch. Perrow: normal accidents
  • Kuhnian
    • Fallible constructs
    • epistemic
    • constructivist
    • failure of foresight
    • institutional views are unable to consider the possibility of events or to integrate anomalies early enough to be modified accordingly before an accident
    • T. Kuhn: Paradigm shift
    • B. Turner: incubation periods (anomalies do not challenge worldview)
    • K. Weick: collapse of sensemaking
    • D. Vaughan: normalisation of deviance
  • Ashbyan
    • Self organised, emergent systems
    • cybernetics
    • W. R. Ashby: self-organising systems
    • J. Rasmussen: self-organised migration
    • S. Snook: Practical drift
    • E. Hollnagel: resonance

Role of managers

  1. Information - the boundaries of acceptable performance should be visible.
  2. Competency - decision makers should be competent.
  3. Awareness - decision makers should be aware of safety implications of their actions.
  4. Commitment - adequate resources should be present to maintain defences.