
Parts and Wholes: Scenarios and Simulators for Human Performance Studies

  • Conference paper
Advances in Human Error, Reliability, Resilience, and Performance (AHFE 2018)

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 778)


Abstract

As tools like full-scale simulators and microworlds become more readily available to researchers, a fundamental question remains: to what extent are full scenarios and simulators necessary for valid and generalizable results? In this paper, we explore the continuum of scenarios and simulators and evaluate the advantages and disadvantages of each for human performance studies. The scenarios presented to participants may range from microtasks to complex, multi-step scenarios. Microtasks usually involve only brief exposure to the human-system interface but may thereby facilitate ready data collection through repeated trials. In contrast, full scenarios present a sequence of actions that may require an extended period of time. The tradeoffs center on the fidelity of the situations and the requirements for the type of human performance data to be collected. The simulators presented to participants may range from a part-task simulator to a simplified microworld to a full-scope, high-fidelity simulator. The simplified simulators offer greater opportunity for experimental control but lose much of the real-world context found in full-scope simulators. We frame scenarios and simulators in the context of micro- vs. macro-cognition and provide examples of how the different experimental design choices lend themselves to different types of studies.
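
As an illustrative sketch only, not drawn from the paper, the following Python snippet contrasts the data yield of a microtask design (many brief, repeated trials) with a full-scenario design (one extended, multi-step run). The function names, trial counts, and timing distributions are hypothetical placeholders, not values reported by the authors.

```python
# Illustrative sketch only -- hypothetical values, not from the paper.
# Contrasts a microtask block (many short repeated trials) with a
# full scenario (one extended multi-step sequence).
import random
import statistics

def run_microtask_block(n_trials: int = 40) -> list[float]:
    """Simulate repeated brief exposures to a single interface element.

    Each trial yields one response-time observation, so a short session
    produces many data points per participant.
    """
    return [random.gauss(mu=1.2, sigma=0.3) for _ in range(n_trials)]

def run_full_scenario(steps: int = 12) -> float:
    """Simulate one extended, multi-step operating scenario.

    The whole scenario yields a single end-to-end completion time, so each
    participant contributes far fewer observations per session.
    """
    return sum(random.gauss(mu=90.0, sigma=20.0) for _ in range(steps))

if __name__ == "__main__":
    micro = run_microtask_block()
    scenario_time = run_full_scenario()
    print(f"Microtask block: {len(micro)} observations, "
          f"mean RT {statistics.mean(micro):.2f} s")
    print(f"Full scenario: 1 observation, completion time {scenario_time:.0f} s")
```

The contrast is purely structural: a microtask block accumulates many observations per participant session, while a full scenario trades that statistical volume for the context of an extended, realistic sequence of actions.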



Disclaimer

The opinions expressed in this paper are entirely those of the authors and do not represent official position. This work of authorship was prepared as an account of work sponsored by Idaho National Laboratory, an agency of the United States Government. Neither the United States Government, nor any agency thereof, nor any of their employees makes any warranty, express or implied, or assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately-owned rights. Idaho National Laboratory is a multi-program laboratory operated by Battelle Energy Alliance LLC, for the United States Department of Energy under Contract DE-AC07-05ID14517. This research was funded through the Laboratory Directed Research and Development program at Idaho National Laboratory.

Author information

Corresponding author

Correspondence to Ronald L. Boring.



Copyright information

© 2019 Springer International Publishing AG, part of Springer Nature (outside the USA)

About this paper


Cite this paper

Boring, R.L., Ulrich, T.A., Lew, R., Rasmussen, M. (2019). Parts and Wholes: Scenarios and Simulators for Human Performance Studies. In: Boring, R. (eds) Advances in Human Error, Reliability, Resilience, and Performance. AHFE 2018. Advances in Intelligent Systems and Computing, vol 778. Springer, Cham. https://doi.org/10.1007/978-3-319-94391-6_12


  • DOI: https://doi.org/10.1007/978-3-319-94391-6_12

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-94390-9

  • Online ISBN: 978-3-319-94391-6

  • eBook Packages: Engineering, Engineering (R0)
