
Measuring Human Trust Behavior in Human-Machine Teams

  • Conference paper

Advances in Human Factors in Simulation and Modeling (AHFE 2017)

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 591)

Abstract

This paper presents a paradigm for distilling trust behaviors in human-machine teams. The paradigm moves beyond diagnostic, alarm-based definitions of compliance and reliance toward a view of trust behavior that includes automations in which the machine has authority to act on behalf of the human-machine team in the environment. The paradigm first determines the purpose of the automation and then relies on three types of authority within the human-machine team to identify what trust behaviors will look like in specific instances. An example using the Space Navigator environment demonstrates how trust behaviors can be measured.
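The abstract's idea of classifying observed interactions as trust behaviors conditioned on who holds authority to act can be sketched as follows. This is a minimal, hypothetical illustration: the authority categories (`human`, `machine`, `shared`) and the classification rules are placeholders, not the paper's actual taxonomy, which is not given in this excerpt.

```python
# Hypothetical sketch of authority-conditioned trust-behavior classification.
# Authority labels and rules below are illustrative placeholders only.
from dataclasses import dataclass

AUTHORITY_TYPES = ("human", "machine", "shared")  # placeholder taxonomy


@dataclass
class Interaction:
    authority: str           # which party may act on behalf of the team
    machine_acted: bool      # did the machine propose or take an action?
    human_override: bool     # did the human intervene or override?


def classify_trust_behavior(event: Interaction) -> str:
    """Label one interaction as a trust, distrust, or neutral behavior.

    Illustrative rule: when the machine holds (or shares) authority,
    letting its action stand counts as trust and overriding it as
    distrust; when the human retains authority, delegating an action
    to the machine counts as trust.
    """
    if event.authority not in AUTHORITY_TYPES:
        raise ValueError(f"unknown authority type: {event.authority}")
    if event.authority == "human":
        if event.machine_acted and not event.human_override:
            return "trust"
        return "neutral"
    # machine or shared authority: an override signals distrust
    return "distrust" if event.human_override else "trust"
```

Under this kind of scheme, the same observable action (e.g., accepting a machine-generated trajectory in an environment like Space Navigator) maps to a different trust interpretation depending on which party held authority at the time.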



Acknowledgements

The views expressed in this document are those of the author and do not reflect the official policy or position of the United States Air Force, the United States Department of Defense, or the United States Government.

Author information

Correspondence to Jason M. Bindewald.


Copyright information

© 2018 Springer International Publishing AG (outside the USA)

About this paper

Cite this paper

Bindewald, J.M., Rusnock, C.F., Miller, M.E. (2018). Measuring Human Trust Behavior in Human-Machine Teams. In: Cassenti, D. (ed.) Advances in Human Factors in Simulation and Modeling. AHFE 2017. Advances in Intelligent Systems and Computing, vol 591. Springer, Cham. https://doi.org/10.1007/978-3-319-60591-3_5


  • DOI: https://doi.org/10.1007/978-3-319-60591-3_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-60590-6

  • Online ISBN: 978-3-319-60591-3

  • eBook Packages: Engineering, Engineering (R0)
