DOI: 10.1145/1978942.1979140 · CHI Conference Proceedings · Research article

WYSIWYF: exploring and annotating volume data with a tangible handheld device

ABSTRACT

Visual exploration of volume data often requires the user to manipulate the orientation and position of a slicing plane in order to observe, annotate, or measure internal structures. Such operations, with their many degrees of freedom in 3D space, map poorly onto the interaction modalities afforded by mouse-keyboard interfaces or flat multi-touch displays alone. We address this problem with a what-you-see-is-what-you-feel (WYSIWYF) approach, which integrates the natural user interface of a multi-touch wall display with the untethered physical dexterity of a handheld device that has multi-touch and 3D-tilt sensing capabilities. A slicing plane can be directly and intuitively manipulated at any desired position within the displayed volume data using a commonly available mobile device such as the iPod touch. 2D image slices can be transferred wirelessly to this small touch-screen device, where a novel fast fat finger annotation technique (F3AT) is proposed to perform accurate and speedy contour drawings. Our user studies support the efficacy of the proposed visual exploration and annotation interaction designs.
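To make the tilt-to-plane mapping concrete, here is a minimal sketch (not the paper's implementation) of how a handheld's pitch and roll readings could orient a slicing plane inside a volume. The function name and the convention that the plane starts axis-aligned with a +z normal are assumptions for illustration.

```python
import math

def tilt_to_plane(pitch_deg, roll_deg, center=(0.0, 0.0, 0.0)):
    """Map device tilt angles to a slicing-plane pose (illustrative only).

    The plane starts axis-aligned (normal = +z) and is rotated by the
    device's pitch (about the x-axis), then roll (about the y-axis),
    mimicking how 3D-tilt sensing could steer a cut plane in the volume.
    Returns a unit normal vector and a point on the plane.
    """
    p = math.radians(pitch_deg)
    r = math.radians(roll_deg)
    # Rotate the +z normal: Rx(pitch) first, then Ry(roll).
    nx = math.cos(p) * math.sin(r)
    ny = -math.sin(p)
    nz = math.cos(p) * math.cos(r)
    return (nx, ny, nz), center

# Device held flat: the slicing plane is the untilted axial slice.
normal, point = tilt_to_plane(0.0, 0.0)   # normal -> (0.0, 0.0, 1.0)
```

A renderer would then pass this normal/point pair to its cut-plane primitive (e.g. a `vtkPlane` fed to a slice filter in VTK-style pipelines) each time new tilt data arrives from the device.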


Supplemental Material

paper225.mp4
