Abstract
This study compared the results of a usability inspection conducted under two conditions: an explicit concurrent think-aloud that required explanations, and silent working. Twelve student analysts inspected two travel websites, thinking aloud and working in silence, to produce a set of problem predictions. Overall, the silent working condition produced more initial predictions, but the think-aloud condition yielded a greater proportion of accurate predictions, as revealed by falsification testing. The analysts used a range of problem discovery methods: system searching was favoured in the silent working condition, and the more active goal-playing discovery method in the think-aloud condition. Thinking aloud was also associated with a broader spread of knowledge resources.
The Impact of Thinking-Aloud on Usability Inspection