Final Project Proposal

Title: Dynamic slicing of volumetric datasets via a multi-touch interface.

Abstract

In this project, we will develop multi-touch gesture techniques for specifying which regions within a volume are to be discarded from rendering. The gestures will be implemented on a multi-touch table-top interface.

Introduction

The visualization of volumetric datasets is a common task in several fields, such as medical imaging, weather analysis, and oil prospecting. Despite the evolution of volume rendering techniques, 2D slice-by-slice inspection remains the dominant practice for several reasons [1]. First, datasets tend to be very large, and rendering algorithms may not provide real-time response. More importantly, nearly everyone is familiar with 2D interaction, which is easily accomplished with a mouse and keyboard [1].

Interactive systems that enable visualization of and interaction with data volumes in fully immersive 3D environments have recently become a reality. In such systems, the visualization is provided by projection-based VR displays, and the interaction is almost always natural and intuitive, making use of special devices such as data gloves and 3D mice [1]. The argument for this kind of system is simple: if the problem is 3D, it seems obvious to implement a complete 3D solution, ensuring a true mapping between reality and virtual reality. However, these systems are expensive and require extensive training to use.

Table-top user interfaces are novel interfaces that transpose the computer desktop environment onto a familiar table setting. With the incorporation of multi-touch technology, table-top interfaces have become very powerful tools with which many users can freely exchange and manipulate information in a more intuitive manner. Through a series of programmable gestures, multi-touch technology can interact with 3D content more seamlessly than a mouse and keyboard combination. Moreover, with the development of Frustrated Total Internal Reflection multi-touch technology [3], such interfaces have become accessible to the average consumer. By designing the appropriate gestures, such an interface may be used to navigate within a volumetric dataset, providing users with a more intuitive means of managing their volumetric renderings.

Description

The project will consist of two main components. The first is the rendering aspect. We plan to visualize 3D Computed Tomography (CT) datasets. There are numerous algorithms we could use to visualize the dataset, from Marching Cubes [2] to a newer technique called “BT Volumes” [4], which segments the data into Bézier tetrahedra and ray-traces them to render the data. At this moment, we are leaning towards the Marching Cubes algorithm, due to prior experience implementing it. However, it might be advantageous to implement the “BT Volumes” algorithm, as it has been shown to improve upon both the rendering capabilities and the speed of Marching Cubes.
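To make this first component concrete, the sketch below shows how an iso-surface could be extracted from a CT volume with Marching Cubes. It is written in Python purely for illustration, assuming NumPy and scikit-image are available; the helper name extract_isosurface is ours, not a committed design.

    import numpy as np
    from skimage import measure

    def extract_isosurface(volume, iso_value, spacing=(1.0, 1.0, 1.0)):
        """Run Marching Cubes [2] over a 3D scalar field.

        volume    -- 3D array of CT intensities (e.g. Hounsfield units)
        iso_value -- scalar threshold defining the surface
        spacing   -- physical voxel spacing, so the mesh has real scale
        """
        verts, faces, normals, _ = measure.marching_cubes(
            volume, level=iso_value, spacing=spacing)
        return verts, faces, normals

    # Quick check on synthetic data: a sphere stands in for a CT scan.
    x, y, z = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
    volume = np.sqrt(x**2 + y**2 + z**2)
    verts, faces, normals = extract_isosurface(volume, iso_value=0.5)
    print(len(verts), "vertices,", len(faces), "triangles")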

Our second component will be the dynamic segmentation of data via a multi-touch interface. The core techniques have been described in [1], where a space mouse was used to implement a virtual eraser, among other tools, to dynamically segment a volumetric dataset. From the insights gained from this paper, we will design a set of gestures unique to the multi-touch interface that perform the same tasks. The multi-touch interface has been built using the techniques described in [3]. A key challenge, however, will be integrating the rendering aspects of our program with our interface. Finally, the proper design of these gestures will be crucial to exploiting all of the benefits inherent in a multi-touch interface.
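To illustrate how an eraser gesture could feed into the renderer, the sketch below discards the voxels near a touch point before the surface is re-extracted. This is our own simplified illustration, not the method of [1]: it assumes the touch point has already been projected into voxel coordinates, and the names erase_sphere and apply_mask are hypothetical.

    import numpy as np

    def erase_sphere(mask, center, radius):
        """Virtual-eraser sketch: hide all voxels within a sphere.

        mask   -- boolean array, True where a voxel is still visible
        center -- (i, j, k) voxel under the (projected) touch point
        radius -- eraser radius, in voxels
        """
        ii, jj, kk = np.indices(mask.shape)
        ci, cj, ck = center
        dist2 = (ii - ci)**2 + (jj - cj)**2 + (kk - ck)**2
        mask[dist2 <= radius**2] = False
        return mask

    def apply_mask(volume, mask, background=-1000.0):
        """Push erased voxels outside the iso-range (here: air, in
        Hounsfield units) so no surface is generated through them."""
        carved = volume.copy()
        carved[~mask] = background
        return carved

Dragging a finger would then amount to calling erase_sphere once per touch sample before the mesh is refreshed.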

Implementation

Milestones:

March 20th: By this date, we will have a program that is able to read CT-scan datasets. We will research how to read DICOM files; this will lead us either to implement a reader from scratch based on the DICOM specification or to incorporate an existing library into our main rendering program (a sketch of the latter appears below). At the same time, coding of our rendering program should already have started.
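Should we settle on an existing reader, the following sketch shows how a CT series might be loaded and stacked into a single volume. It assumes the pydicom library, named here only as one candidate; load_ct_series is an illustrative name.

    import os
    import numpy as np
    import pydicom

    def load_ct_series(directory):
        """Read every DICOM file in a directory into one 3D volume,
        sorted by slice position along the scanner's z-axis."""
        slices = [pydicom.dcmread(os.path.join(directory, name))
                  for name in os.listdir(directory)]
        slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
        volume = np.stack([s.pixel_array for s in slices]).astype(np.float32)
        # Convert raw stored values to Hounsfield units via DICOM tags.
        slope = float(slices[0].RescaleSlope)
        intercept = float(slices[0].RescaleIntercept)
        return volume * slope + intercept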

March 28th: A simple version of the Marching Cubes algorithm should be implemented by this time. We should also start planning how to render multiple iso-surface values (as sketched below). At this time, we should begin designing the appropriate multi-touch gestures; afterwards, we should verify that our multi-touch surface can recognize them.
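One simple way to handle multiple iso-surface values is to run the extraction once per threshold and draw each mesh as its own layer. The sketch below reuses extract_isosurface from the earlier sketch; the thresholds are placeholders, not values measured from our data.

    # Hypothetical thresholds, roughly skin and bone in Hounsfield units.
    iso_values = [-500.0, 300.0]

    # One mesh per iso-value; each layer gets its own color and opacity.
    surfaces = {v: extract_isosurface(volume, iso_value=v)
                for v in iso_values}
    for v, (verts, faces, normals) in surfaces.items():
        print("iso", v, ":", len(faces), "triangles")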

April 15th: Full integration between the rendering and interaction components should be achieved. Begin prototype testing.

April 19th: Prototype testing should be completed; we will then begin work on the presentation.

April 21st: Final Presentation due. Start working on Final Report.

April 30th: Final Report due.

Expected results

We expect that our table-top multi-touch interface will be able to dynamically segment volumetric renderings.

References

[1] Huff, R., Dietrich, C. A., Nedel, L. P., Freitas, C. M., Comba, J. L., and Olabarriaga, S. D. 2006. Erasing, digging and clipping in volumetric datasets with one or two hands. In Proceedings of the 2006 ACM International Conference on Virtual Reality Continuum and Its Applications (Hong Kong, China). VRCIA '06. ACM, New York, NY, 271-278.

[2] Lorensen, W. E. and Cline, H. E. 1987. Marching cubes: A high resolution 3D surface construction algorithm. Computer Graphics (Proceedings of SIGGRAPH '87) 21, 4 (July 1987), 163-169.

[3] Han, J. Y. 2005. Low-cost multi-touch sensing through frustrated total internal reflection. In Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology (UIST '05). ACM, New York, NY, 115-118.

[4] Kloetzli, J., Olano, M., and Rheingans, P. 2008. Interactive volume isosurface rendering using BT volumes. In Proceedings of the 2008 Symposium on Interactive 3D Graphics and Games (Redwood City, California, February 15-17, 2008). SI3D '08. ACM, New York, NY, 45-52.
