research-article

Benefits and barriers of user evaluation in software engineering research

Published: 22 October 2011

Abstract

In this paper, we identify trends about, benefits from, and barriers to performing user evaluations in software engineering research. From a corpus of over 3,000 papers spanning ten years, we report on various subtypes of user evaluations (e.g., coding tasks vs. questionnaires) and relate user evaluations to paper topics (e.g., debugging vs. technology transfer). We identify the external measures of impact, such as best paper awards and citation counts, that are correlated with the presence of user evaluations. We complement this with a survey of over 100 researchers from over 40 different universities and labs in which we identify a set of perceived barriers to performing user evaluations.



Published in

ACM SIGPLAN Notices, Volume 46, Issue 10 (OOPSLA '11), October 2011, 1063 pages
ISSN: 0362-1340
EISSN: 1558-1160
DOI: 10.1145/2076021

OOPSLA '11: Proceedings of the 2011 ACM International Conference on Object Oriented Programming Systems Languages and Applications, October 2011, 1104 pages
ISBN: 9781450309400
DOI: 10.1145/2048066

Copyright © 2011 ACM

Publisher: Association for Computing Machinery, New York, NY, United States