DOI: 10.1145/2745802.2745818
research-article

Experiences from using snowballing and database searches in systematic literature studies

Published: 27 April 2015

ABSTRACT

Background: Systematic literature studies are commonly used in software engineering. There are two main ways of conducting the searches for these types of studies: snowballing and database searches. In snowballing, the reference lists (backward snowballing, BSB) and citations (forward snowballing, FSB) of relevant papers are reviewed to identify new papers, whereas in a database search, different databases are queried using predefined search strings.

Objective: Snowballing has not been used as extensively as database searches. It is therefore important to evaluate its efficiency and reliability as a search strategy in literature studies, and to compare it to database searches.

Method: We applied snowballing in a literature study and reflected on the outcome. We also compared database search with backward and forward snowballing. The database search and the snowballing were conducted independently by different researchers, and the searches were compared with respect to the efficiency and reliability of their findings.

Results: Of the total number of papers found, snowballing identified 83%, compared to 46% for the database search. Snowballing failed to identify a few relevant papers, which could potentially have been addressed by creating a more comprehensive start set.

Conclusion: The efficiency of snowballing is comparable to that of database search. It can potentially be more reliable than a database search; however, its reliability depends strongly on the creation of a suitable start set.
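The snowballing procedure described in the abstract can be sketched as a simple fixed-point iteration: starting from a start set, each round collects the references (BSB) and citations (FSB) of the current papers, keeps the relevant ones not yet included, and repeats until no new papers appear. A minimal illustrative sketch, where the citation graph and the relevance check are hypothetical stand-ins for a real literature database:

```python
def snowball(start_set, references_of, citations_of, is_relevant):
    """Iteratively grow the set of included papers from a start set.

    references_of(p) -> papers cited by p   (backward snowballing, BSB)
    citations_of(p)  -> papers citing p     (forward snowballing, FSB)
    is_relevant(p)   -> inclusion decision, e.g. via title/abstract review
    """
    included = set(start_set)
    frontier = set(start_set)
    while frontier:  # stop when an iteration adds no new relevant papers
        candidates = set()
        for paper in frontier:
            candidates |= set(references_of(paper))  # backward step (BSB)
            candidates |= set(citations_of(paper))   # forward step (FSB)
        frontier = {p for p in candidates - included if is_relevant(p)}
        included |= frontier
    return included


# Tiny hypothetical citation graph: A cites B, C cites A.
refs = {"A": ["B"], "B": [], "C": ["A"]}
cites = {"A": ["C"], "B": ["A"], "C": []}
result = snowball({"A"}, lambda p: refs[p], lambda p: cites[p],
                  lambda p: True)
# -> {"A", "B", "C"}
```

The sketch makes the paper's point about the start set concrete: any paper unreachable from the start set through reference or citation links can never be found, which is why reliability hinges on choosing a comprehensive start set.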


Published in:
EASE '15: Proceedings of the 19th International Conference on Evaluation and Assessment in Software Engineering
April 2015, 305 pages
ISBN: 9781450333504
DOI: 10.1145/2745802

            Copyright © 2015 ACM


Publisher: Association for Computing Machinery, New York, NY, United States



Acceptance Rates:
EASE '15 paper acceptance rate: 20 of 65 submissions (31%)
Overall acceptance rate: 71 of 232 submissions (31%)
