DOI: 10.1145/1189215.1189180 — ITiCSE Conference Proceedings

Research methods in computing: what are they, and how should we teach them?

ABSTRACT

Despite a lack of consensus on the nature of Computing Research Methods (CRM), a growing number of programs are exploring models and content for CRM courses. This report is one step in a participatory design process to develop a general framework for thinking about and teaching CRM.

We introduce a novel sense-making structure for teaching CRM. That structure consists of a road map to the CRM literature, a framework grounded in questions rather than answers, and two CRM skill sets: core skills and specific skills. We integrate our structure with a model of the process a learner goes through on the way to becoming an expert computing researcher, and we offer example learning activities drawn from a growing repository of course materials meant to aid those wishing to teach research skills to computing students.

Our model is designed to ground discussion of teaching CRM and to serve as a road map for institutions, faculty, students, and research communities addressing the transition from student to fully enfranchised member of a computing research community of practice. To that end, we offer several possible scenarios for using our model.

In computing, research methods have traditionally been passed from advisor to student via apprenticeship. Establishing a richer pedagogy for training researchers in computing will benefit all (see Figure 1).


Index Terms

  1. Research methods in computing
