Date of Award
6-1-2009
Document Type
Thesis (Master's)
Department or Program
Department of Computer Science
First Advisor
Scot Drysdale
Abstract
Surface reconstruction is an area of computational geometry that has progressed rapidly over the last decade. Current algorithms and their implementations can reconstruct surfaces from a variety of inputs, and their accuracy and precision improve with each new development. All of these algorithms rely on heuristics to achieve a reconstruction. Much of this work consists of reconstructing a still object from point samples taken from the object's surface. We examine reconstructing an n-dimensional object and its motion by treating time as an (n + 1)st axis. Our input consists of (n-1)-dimensional scans taken over time and at different positions on the original object. This input is mapped into (n + 1) dimensions, where the (n + 1)st dimension is a scaled time axis, and is then fed into an existing surface reconstruction algorithm. A cross section of the reconstructed surface perpendicular to the time axis yields an approximation to the shape of the n-dimensional surface at the corresponding point in time. The intended application for this work is the reconstruction of medical images from scanning technology, such as MRI or CT, into moving 3d surfaces. We investigate reconstructing 2d moving surfaces through time as a preliminary step towards the moving 3d problem. Most of our effort in this thesis addresses the problem of computing a scaling factor for mapping time onto the (n + 1)st axis so as to minimize the number of scans needed to meet the sampling requirements of an existing surface reconstruction algorithm. We give three bounds, based on features of the moving 2d object, that are necessary to accomplish this.
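The core mapping described above can be sketched in a few lines. This is a minimal illustration, not the thesis's implementation: the scale factor, function names, and toy input below are all hypothetical, and a real pipeline would pass the lifted point cloud to an actual surface reconstruction algorithm rather than slicing the raw points.

```python
import math

# Hypothetical scale factor s for the time axis. The thesis derives bounds
# on this factor from features of the moving object; the value here is
# purely illustrative.
S = 2.0

def lift(scan_points, t, s=S):
    """Map an (n-1)-dimensional scan taken at time t into (n + 1) dimensions
    by appending a scaled time coordinate. Here n = 2: points from 1d scans
    of a moving 2d curve become points in 3d."""
    return [(x, y, s * t) for (x, y) in scan_points]

def cross_section(lifted, t0, s=S, tol=1e-9):
    """Approximate the 2d shape at time t0: keep lifted points whose scaled
    time coordinate matches s * t0, then drop the time axis."""
    return [(x, y) for (x, y, w) in lifted if abs(w - s * t0) <= tol]

# Toy input: a few samples from a unit circle, "scanned" at two times.
cloud = []
for t in (0.0, 0.5):
    scan = [(math.cos(a), math.sin(a)) for a in (0.0, math.pi / 2, math.pi)]
    cloud.extend(lift(scan, t))

# Slice at t = 0 recovers the three samples taken at that time.
slice0 = cross_section(cloud, 0.0)
```

In the actual method, the cross section is taken through the reconstructed surface, so it interpolates the object's shape even at times between scans; slicing the raw samples, as above, only recovers the scans themselves.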
Recommended Citation
Brash, LeeAnn T., "Surface Reconstruction through Time" (2009). Dartmouth College Master’s Theses. 12.
https://digitalcommons.dartmouth.edu/masters_theses/12
Comments
Originally posted in the Dartmouth College Computer Science Technical Report Series, number TR2009-648.