Abstract
Rigor in the products of information analysis is essential for decision makers to rely on the assessments contained within them. Zelik, Patterson, and Woods [1] defined an eight-attribute metric for communicating the rigor of analytic products. This paper describes two iterations of the process of designing, implementing, and evaluating a context-aware web application that uses this analytic rigor metric to recommend augmentations to analysts’ workflows that improve the quality of the resulting products. We used multiple methods to evaluate this tool with subject matter experts, including brainstorming, collaborative card sorting, semi-structured interviews, cognitive walkthroughs, and heuristic evaluations. This research found that: (1) it is critical for recommendations to be flexible, adapting both to movements between the foraging and sensemaking components of the workflow and to the changing structure of the analysis; and (2) persistent visualizations of analytic rigor assessments are distracting and invite interpretation as a performance metric rather than a process aid.
References
Zelik, D.J., Patterson, E.S., Woods, D.D.: Understanding rigor in information analysis. In: 8th International Conference on Naturalistic Decision Making, Pacific Grove, CA (2007)
U.S. Department of the Army: Open-Source Intelligence. Army Techniques Publication 2-22.9. U.S. Department of the Army, Washington, DC (2012)
Robb, C.S., Silberman, L.H., Levin, R.C., McCain, J., Rowen, H.S., Slocombe, W.B., Cutler, L.: The Commission on the Intelligence Capabilities of the United States Regarding Weapons of Mass Destruction: Report to the President of the United States. Commission on Intelligence Capabilities Regarding WMD, Washington, DC (2005)
Clark, R.M.: Intelligence Analysis. American Literary Press Inc, Alexandria (1996)
Toulmin, S.: The Philosophy of Science: An Introduction. The Hutchinson University Library, London (1953)
Kent, S.: Strategic Intelligence for American World Policy. Princeton University Press, Princeton (1965)
Heuer, R.J.: Psychology of Intelligence Analysis. Center for the Study of Intelligence, Central Intelligence Agency, Washington, DC (1999)
United States Government: A Tradecraft Primer: Structured Analytic Techniques for Improving Intelligence Analysis, Washington, DC (2009)
Klein, G.: Intuition at Work: Why Developing Your Gut Instincts Will Make You Better at What You Do. Currency Books, New York (2002)
Cohen, M.S., Freeman, J.T., Thompson, B.T.: Critical thinking skills in tactical decision making: a model and a training method. In: Canon-Bowers, J., Salas, E. (eds.) Decision-Making Under Stress: Implications for Training and Simulation. American Psychological Association Publications, Washington, DC (1998)
Pirolli, P., Card, S.: The sensemaking process and leverage points for analyst technology as identified through cognitive task analysis. In: Proceedings of the International Conference on Intelligence Analysis (2005)
Mellers, B., Stone, E., Murray, T., Minster, A., Rohrbaugh, N., Bishop, M., Ungar, L.: Identifying and cultivating superforecasters as a method of improving probabilistic predictions. Perspect. Psychol. Sci. 10, 267–281 (2015)
Miller, J.E., Patterson, E.S., Woods, D.D.: Elicitation by critiquing as a cognitive task analysis methodology. Cogn. Technol. Work 8, 90–102 (2006)
Zelik, D.J., Patterson, E.S., Woods, D.D.: Measuring attributes of rigor in information analysis. In: Macrocognition Metrics and Scenarios: Design and Evaluation for Real-World Teams (2010)
Pfautz, S.L., Ganberg, G., Fouse, A., Schurr, N.: A general context-aware framework for improved human-system interactions. AI Mag. 36, 42–49 (2015)
Cooke, N.J.: Varieties of knowledge elicitation techniques. Int. J. Hum. Comput. Stud. 41, 801–849 (1994)
Gerhardt-Powals, J.: Cognitive engineering principles for enhancing human-computer performance. Int. J. Hum. Comput. Interact. 8, 189–211 (1996)
Wharton, C., Bradford, J., Jeffries, R., Franzke, M.: Applying cognitive walkthroughs to more complex user interfaces: experiences, issues, and recommendations. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM (1992)
Acknowledgments
The research reported in this document was performed in connection with contract FA8750-14-C-0124 with the U.S. Air Force Research Laboratory. We would like to acknowledge Dr. Caroline Ziemkiewicz, Ms. Stacy Lovell Pfautz, Dr. David Woods, Ms. Valerie Champagne, Ms. Amy Rider, Mr. David Johnson, and Mr. Roger Dziegiel for their contributions to this research as thought partners and subject matter experts.
Copyright information
© 2018 Springer International Publishing AG
Cite this paper
Fouse, A., Mullins, R.S., Ganberg, G., Weiss, C. (2018). The Evolution of User Experiences and Interfaces for Delivering Context-Aware Recommendations to Information Analysts. In: Ahram, T., FalcĂŁo, C. (eds) Advances in Usability and User Experience. AHFE 2017. Advances in Intelligent Systems and Computing, vol 607. Springer, Cham. https://doi.org/10.1007/978-3-319-60492-3_2
Print ISBN: 978-3-319-60491-6
Online ISBN: 978-3-319-60492-3