
From Reason and Rasmussen to Kahneman and Thaler: Styles of Thinking and Human Reliability in High Hazard Industries

  • Conference paper
Advances in Human Error, Reliability, Resilience, and Performance (AHFE 2017)

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 589)


Abstract

Much of the practical work conducted over the past 30 or so years to minimise the risk of human error and loss of human reliability, at least in the oil and gas, chemicals and other process industries, has been based on the model of basic error types known as the Generic Error Modelling System (GEMS). Over roughly the same period, psychologists and behavioural economists have developed a rich understanding of the nature and characteristics of what, in simplified terms, are widely considered to be two styles of thinking, often referred to as “System 1” and “System 2”. This paper explores the relationship between the GEMS model and what is known of the functioning of these two styles of thinking, in particular the characteristics and biases associated with System 1 thinking. While many of the ideas behind the two styles of thinking are embedded in the GEMS model, there are some important omissions.


References

  1. Kahneman, D.: Thinking, Fast and Slow. Allen Lane, London (2011)

  2. Reason, J.: Human Error. Cambridge University Press, Cambridge (1990)

  3. Kahneman, D., Klein, G.: Conditions for intuitive expertise: a failure to disagree. Am. Psychol. 64(6), 515–526 (2009)

  4. CSB: Investigation Report Volume 3: Drilling rig explosion and fire at the Macondo well. Report no. 2010-10-I-OS, Chemical Safety Board (2016)

  5. Swain, A.D., Guttmann, H.E.: Handbook of human reliability analysis with emphasis on nuclear power plant applications. Final report, NUREG/CR-1278, USNRC (1983)

  6. McLeod, R.W.: Implications of styles of thinking for risk awareness and decision-making in safety critical operations. Cognitia 22(3), Human Factors and Ergonomics Society (2016)

  7. McLeod, R.W.: Human factors in barrier management: hard truths and challenges. Process Saf. Environ. Perform., IChemE (in press)

  8. RAIB: Fatal accident involving a track worker near Newark North Gate station, 22 January 2014. Report 01/2015, Rail Accident Investigation Branch (2015)

  9. Thaler, R.: Misbehaving: The Making of Behavioural Economics. Allen Lane, London (2015)

  10. McLeod, R.W.: Designing for Human Reliability: Human Factors Engineering for the Oil, Gas and Process Industries. Gulf Professional Publishing, Houston (2015)

  11. McLeod, R.W.: The impact of styles of thinking and cognitive bias on how people assess risk and make real world decisions in oil and gas companies. Oil and Gas Facilities, Society of Petroleum Engineers (2016)

  12. Office of Nuclear Regulatory Research: Building a Psychological Foundation for Human Reliability Analysis. NUREG-2114, INL/EXT-11-23898 (2012)

  13. IFE: The PetroHRA Guideline. IFE/HR/E-2017/001, Institute for Energy Technology (2017)

  14. Weick, K.E., Sutcliffe, K.M.: Managing the Unexpected: Resilient Performance in an Age of Uncertainty, 2nd edn. Jossey-Bass, San Francisco (2007)

  15. Dekker, S.: The Field Guide to Understanding Human Error. Ashgate, Farnham (2006)


Author information


Corresponding author

Correspondence to Ronald W. McLeod.


Copyright information

© 2018 Springer International Publishing AG

About this paper

Cite this paper

McLeod, R.W. (2018). From Reason and Rasmussen to Kahneman and Thaler: Styles of Thinking and Human Reliability in High Hazard Industries. In: Boring, R. (eds) Advances in Human Error, Reliability, Resilience, and Performance. AHFE 2017. Advances in Intelligent Systems and Computing, vol 589. Springer, Cham. https://doi.org/10.1007/978-3-319-60645-3_11


  • DOI: https://doi.org/10.1007/978-3-319-60645-3_11

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-60644-6

  • Online ISBN: 978-3-319-60645-3

  • eBook Packages: Engineering (R0)
