Exploring Trust Barriers to Future Autonomy: A Qualitative Look

  • Conference paper
  • Published in: Advances in Human Factors in Simulation and Modeling (AHFE 2017)

Abstract

Autonomous systems dominate future Department of Defense (DoD) strategic perspectives, yet little is known about the trust barriers to these future systems because few exemplars exist from which to appropriately baseline reactions. Most extant DoD systems are “automated” rather than “autonomous,” which adds complexity to our understanding of user acceptance of autonomy. The trust literature posits several key antecedents of trust in automated systems, but there have been few field applications of these factors in the context of DoD systems. The current paper: (1) reviews the trust literature relevant to acceptance of future autonomy, (2) presents the results of a qualitative analysis of trust barriers for two future DoD technologies (the Automatic Air Collision Avoidance System [AACAS] and the Autonomous Wingman [AW]), and (3) discusses knowledge gaps for implementing future autonomous systems within the DoD. The study team interviewed over 160 fighter pilots from 4th Generation (e.g., F-16) and 5th Generation (e.g., F-22) fighter platforms to gauge their trust barriers to AACAS and AW. Results show that the trust barriers discussed by the pilots corresponded fairly well to the trust challenges identified in the existing literature, though some nuances emerged that may be unique to DoD technologies and operations. Key trust barriers included: concern about interference with operational requirements; the need for transparency of intent, function, status, and capabilities/limitations; concern regarding the flexibility and adaptability of the technology; cybersecurity/hacking potential; concern regarding the added workload associated with the technology; concern about the lack of human oversight and decision-making capacity; and doubts regarding the systems’ operational effectiveness. Additionally, the pilots noted several positive aspects of the proposed technologies, including: added protection during last-ditch evasive maneuvers; positive views of existing fielded technologies such as the Automatic Ground Collision Avoidance System; the potential for added operational capabilities; the potential to transfer risk to the robotic asset and reduce risk to pilots; and the potential for AI to participate in the entire mission process (planning-execution-debriefing). The paper discusses the results for each technology and offers suggestions for implementing future autonomy within the DoD.



Author information

Corresponding author: Joseph B. Lyons

Copyright information

© 2018 Springer International Publishing AG (outside the USA)

About this paper

Cite this paper

Lyons, J.B. et al. (2018). Exploring Trust Barriers to Future Autonomy: A Qualitative Look. In: Cassenti, D. (ed.) Advances in Human Factors in Simulation and Modeling. AHFE 2017. Advances in Intelligent Systems and Computing, vol 591. Springer, Cham. https://doi.org/10.1007/978-3-319-60591-3_1

  • DOI: https://doi.org/10.1007/978-3-319-60591-3_1

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-60590-6

  • Online ISBN: 978-3-319-60591-3

  • eBook Packages: Engineering (R0)
