Research Article

Put That Needle There: Customized Flexible On-Body Thin-Film Displays for Medical Navigation

Published: 30 May 2020

Abstract

Informed by modern imaging techniques, current medical navigation systems support physicians during a variety of interventions, such as needle-based operations. During these procedures, an abundance of information is often displayed on monitors placed in positions that are uncomfortable for the operator to view. In this article, we address these issues with the concept and prototype of a customized flexible display that is placed on the patient’s body to provide the essential information at just the right location. We present an empirical evaluation comparing the flexible display against a control condition using a standard interventional monitor setup and an additional condition that combines both. Our results show that the flexible display significantly reduces task load while improving overall usability. Furthermore, we found indications that the flexible display reduces task completion time, but we also observed a negative effect on accuracy, which needs to be balanced carefully.

