Abstract
As tools such as full-scale simulators and microworlds become more readily available to researchers, a fundamental question remains: to what extent are full scenarios and simulators necessary for valid and generalizable results? In this paper, we explore the continuum of scenarios and simulators and evaluate the advantages and disadvantages of each for human performance studies. The scenarios presented to participants may range from microtasks to complex multi-step scenarios. Microtasks usually involve only brief exposure to the human-system interface but thereby facilitate ready data collection through repeated trials. In contrast, full scenarios present a sequence of actions that may require an extended period of time. The tradeoffs center on the fidelity of the situations and the requirements for the type of human performance data to be collected. The simulator presented to participants may range from a part-task simulator to a simplified microworld to a full-scope high-fidelity simulator. Simplified simulators offer greater experimental control but lose much of the real-world context found in full-scope simulators. We frame scenarios and simulators in the context of micro- vs. macro-cognition and provide examples of how the different experimental design choices lend themselves to different types of studies.
Disclaimer
The opinions expressed in this paper are entirely those of the authors and do not represent official position. This work of authorship was prepared as an account of work sponsored by Idaho National Laboratory, an agency of the United States Government. Neither the United States Government, nor any agency thereof, nor any of their employees makes any warranty, express or implied, or assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately-owned rights. Idaho National Laboratory is a multi-program laboratory operated by Battelle Energy Alliance LLC, for the United States Department of Energy under Contract DE-AC07-05ID14517. This research was funded through the Laboratory Directed Research and Development program at Idaho National Laboratory.
Copyright information
© 2019 Springer International Publishing AG, part of Springer Nature (outside the USA)
About this paper
Cite this paper
Boring, R.L., Ulrich, T.A., Lew, R., Rasmussen, M. (2019). Parts and Wholes: Scenarios and Simulators for Human Performance Studies. In: Boring, R. (eds) Advances in Human Error, Reliability, Resilience, and Performance. AHFE 2018. Advances in Intelligent Systems and Computing, vol 778. Springer, Cham. https://doi.org/10.1007/978-3-319-94391-6_12
DOI: https://doi.org/10.1007/978-3-319-94391-6_12
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-94390-9
Online ISBN: 978-3-319-94391-6
eBook Packages: Engineering (R0)