Shape Deformation Statistics and Regional Texture-Based Appearance Models for Segmentation

Last Modified
  • March 20, 2019
Creator
  • Vicory, Jared
    • Affiliation: College of Arts and Sciences, Department of Computer Science
Abstract
  • Transferring identified regions of interest (ROIs) from planning-time MRI images to the trans-rectal ultrasound (TRUS) images used to guide prostate biopsy is difficult because of the large difference in appearance between the two modalities, as well as the deformation of the prostate's shape caused by the TRUS transducer. This dissertation describes methods for addressing these difficulties by first estimating a patient's prostate shape after the transducer is applied and then locating it in the TRUS image using skeletal models (s-reps) of prostate shapes. First, I introduce a geometrically based method for interpolating discretely sampled s-reps into continuous objects. This interpolation is important for many tasks involving s-reps, including fitting them to new objects as well as the later applications described in this dissertation. The method is shown to be accurate for ellipsoids, for which an analytical solution is known. Next, I create a method for estimating a probability distribution on the difference between two shapes. Because s-reps live in a high-dimensional curved space, I use Principal Nested Spheres (PNS) to transform these representations into a flat space where standard statistical techniques can be applied. This method is shown to be effective both on synthetic data and for modeling the deformation of the prostate caused by the TRUS transducer. In cases where appearance is described by a large number of parameters, such as intensity combined with multiple texture features, it is computationally beneficial to reduce these large tuples of descriptors to a scalar value. Using the inherent localization properties of s-reps, I develop a method that uses regionally trained classifiers to turn an appearance tuple into the probability that it came from inside the prostate boundary. This method is shown to accurately discern inside appearances from outside appearances over a large majority of the prostate boundary. Finally, I combine these techniques into a deformable model-based segmentation framework to segment the prostate in TRUS. By applying the learned mean deformation to a patient's prostate and then deforming it so that voxels with a high probability of coming from the prostate's interior are also in the model's interior, I am able to generate prostate segmentations comparable to those of state-of-the-art methods.
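The Euclideanization idea in the abstract — moving curved-space shape data into a flat space so that standard statistics apply — can be illustrated with a much simpler stand-in than PNS: mapping points on a unit sphere into the tangent space at their mean via the Riemannian log map, then running PCA there. This is only a minimal sketch of the flattening concept, not PNS itself (PNS fits a full hierarchy of nested subspheres); all function names and the toy data below are illustrative.

```python
import numpy as np

def log_map(base, x):
    """Riemannian log map on the unit sphere: send unit vector x
    to the tangent space at unit vector base."""
    dot = np.clip(np.dot(base, x), -1.0, 1.0)
    theta = np.arccos(dot)          # geodesic distance from base to x
    if theta < 1e-12:
        return np.zeros_like(x)
    # Component of x orthogonal to base, rescaled to length theta
    v = x - dot * base
    return theta * v / np.linalg.norm(v)

def euclideanize(points):
    """Map points on the sphere to the flat tangent space at their
    (extrinsic) mean direction, then center them for flat-space stats."""
    mean = points.mean(axis=0)
    mean /= np.linalg.norm(mean)
    flat = np.array([log_map(mean, p) for p in points])
    return flat - flat.mean(axis=0)

# Toy data: noisy directions clustered around the north pole
rng = np.random.default_rng(0)
pts = rng.normal([0.0, 0.0, 1.0], 0.1, size=(50, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)

flat = euclideanize(pts)
# Standard PCA (via SVD) now applies in the flattened coordinates
_, singular_values, _ = np.linalg.svd(flat, full_matrices=False)
```

Every flattened point lies in the tangent plane at the mean direction, which is exactly what makes ordinary PCA meaningful afterward.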
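The regional-appearance step — collapsing a high-dimensional appearance tuple into a single interior probability — can be sketched with a per-region two-class Gaussian classifier: fit one Gaussian to tuples sampled just inside the boundary and one to tuples just outside, then report the posterior probability of "inside" under equal priors. This is a hypothetical stand-in, not the dissertation's trained classifiers; the class name and synthetic data are invented for illustration.

```python
import numpy as np

class RegionalAppearanceClassifier:
    """Two-class Gaussian model over appearance tuples for one
    boundary region; one such classifier would be trained per region."""

    def fit(self, inside, outside):
        # inside, outside: (n_samples, n_features) appearance tuples
        self.mu_in, self.cov_in = inside.mean(0), np.cov(inside.T)
        self.mu_out, self.cov_out = outside.mean(0), np.cov(outside.T)
        return self

    def _loglik(self, x, mu, cov):
        # Gaussian log-likelihood up to a shared constant
        d = x - mu
        inv = np.linalg.inv(cov)
        _, logdet = np.linalg.slogdet(cov)
        return -0.5 * (d @ inv * d).sum(-1) - 0.5 * logdet

    def prob_inside(self, x):
        """Posterior P(inside | appearance tuple), equal priors."""
        li = self._loglik(x, self.mu_in, self.cov_in)
        lo = self._loglik(x, self.mu_out, self.cov_out)
        return 1.0 / (1.0 + np.exp(lo - li))

# Synthetic 4-feature tuples (e.g. intensity plus texture features)
rng = np.random.default_rng(1)
inside = rng.normal(1.0, 0.3, size=(200, 4))    # interior-like tuples
outside = rng.normal(0.0, 0.3, size=(200, 4))   # exterior-like tuples

clf = RegionalAppearanceClassifier().fit(inside, outside)
p = clf.prob_inside(rng.normal(1.0, 0.3, size=(50, 4)))  # interior-like queries
```

The scalar output per voxel is what a deformable-model fit can then push toward the model interior, as the abstract describes.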
Date of publication
Resource type
  • Dissertation
Rights statement
  • In Copyright
Advisor
  • Pizer, Stephen M.
  • Paniagua, Beatriz
  • Marron, James Stephen
  • Niethammer, Marc
  • Fenster, Aaron
Degree
  • Doctor of Philosophy
Degree granting institution
  • University of North Carolina at Chapel Hill Graduate School
Graduation year
  • 2016