Usability Problem Reports for Comparative Studies: Consistency and Inspectability


Vermeeren A. P. O. S., Attema J., Akar E., de Ridder H., van Doorn A. J., Erbug C., et al.

HUMAN-COMPUTER INTERACTION, vol.23, no.4, pp.329-380, 2008 (Peer-Reviewed Journal)

  • Publication Type: Article
  • Volume: 23 Issue: 4
  • Publication Date: 2008
  • DOI Number: 10.1080/07370020802536396
  • Journal Indexes: Science Citation Index Expanded, Scopus
  • Page Numbers: pp.329-380


This study explores issues of consistency and inspectability in the analysis and reporting of usability test data. Problem reports from usability tests performed by three professional usability labs in three different countries are compared. Each lab tested the same product, applying an agreed test protocol that the labs developed collaboratively. Each lab first analyzed its findings as it would in its regular professional practice. A few weeks later, the labs re-analyzed their findings, this time all applying the same method (SlimDEVAN, a simplified version of DEVAN, a method developed to facilitate comparison of findings from usability tests in an academic setting). Levels of agreement between labs did not improve when they all used SlimDEVAN, suggesting inherent subjectivity in their analyses. The consistency of individual analyst teams varied considerably, and a method such as SlimDEVAN can help make the analysis process and its findings more inspectable. Inspectability is valuable in comparative studies based on identified usability problems because it allows findings to be traced back to the original observations and lays bare the subjective parts of the data analysis.