Weighted Distance Weighted Discrimination and Pairwise Variable Selection for Classification

Last Modified
  • March 20, 2019
Creator
  • Qiao, Xingye
    • Affiliation: College of Arts and Sciences, Department of Statistics and Operations Research
Abstract
  • Statistical machine learning has attracted a great deal of attention in recent years because of its broad applications across many fields. The driving statistical problem throughout this dissertation is classification. This dissertation covers two major topics in classification. The first topic is weighted Distance Weighted Discrimination (weighted DWD, or wDWD), an improved version of a recently proposed classification method. We show that significant improvements are available in several situations. Using our proposed optimal weighting schemes, we show that wDWD is Fisher consistent under the overall misclassification criterion. In addition, we propose three alternative criteria and provide the corresponding optimal weights or adaptive weighting schemes for each of them. Mathematical validation of these ideas is established through the High-Dimensional, Low Sample-Size (HDLSS) asymptotic properties of wDWD. An important contribution is the weakening of the assumptions of Hall et al. (2005) and Ahn et al. (2007). We then extend these results to the two-class setting. The HDLSS asymptotic properties of wDWD discussed here comprise two results: one concerns the misclassification rate of wDWD, and the other examines the angle between the DWD direction and the optimal classification direction. The second topic of this dissertation is variable selection for classification. The goal is to find variables that have weak marginal effects but lead to good classification when considered jointly. To accomplish this, we use a within-class permutation test called the Significance test of Joint Effect (SigJEff). The output of SigJEff is a set of variable pairs with statistically significant joint effects.
    To extend our scope to joint effects involving more than two variables, we introduce a new visualization approach for displaying multiscale joint effects, called the Multiscale Significance Display (MSD), and a general framework for variable selection procedures based on MSD, called Multiscale Variable Screening (MVS). MSD is a moving-window approach that evaluates the joint effects of the variables in each window, where the windows slide along an ordering of the variables. MVS seeks the best initial ordering in an iterative manner.
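The abstract does not spell out the mechanics of the permutation test. As a rough illustration only (not the actual SigJEff procedure), the following sketch tests the joint effect of one variable pair by comparing an observed class-separation statistic against its distribution under shuffled class labels; the choice of statistic (distance between class centroids in the two-variable subspace) and the label-shuffling scheme are assumptions for illustration.

```python
import random
from statistics import mean

def centroid_gap(class_a, class_b):
    """Euclidean distance between class centroids in the 2-D subspace
    spanned by one pair of variables (each sample is an (x1, x2) tuple)."""
    ca = (mean(p[0] for p in class_a), mean(p[1] for p in class_a))
    cb = (mean(p[0] for p in class_b), mean(p[1] for p in class_b))
    return ((ca[0] - cb[0]) ** 2 + (ca[1] - cb[1]) ** 2) ** 0.5

def permutation_pvalue(class_a, class_b, n_perm=2000, seed=0):
    """Permutation p-value for the joint effect of a variable pair.

    Shuffles class labels (a schematic stand-in for SigJEff's within-class
    permutation) and counts how often the permuted separation statistic
    reaches the observed one.
    """
    rng = random.Random(seed)
    observed = centroid_gap(class_a, class_b)
    pooled = list(class_a) + list(class_b)
    n_a = len(class_a)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # random relabeling of samples
        if centroid_gap(pooled[:n_a], pooled[n_a:]) >= observed:
            hits += 1
    # add-one correction keeps the p-value strictly positive
    return (hits + 1) / (n_perm + 1)
```

Scanning all pairs of variables with such a test and retaining those whose p-value clears a significance threshold yields a set of pairs with significant joint effects, which is the kind of output the abstract attributes to SigJEff.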
Rights statement
  • In Copyright
Advisor
  • Nobel, Andrew
  • Zhang, Hao
  • Liu, Yufeng
  • Marron, James Stephen
  • Kelly, Douglas
Degree
  • Doctor of Philosophy
Degree granting institution
  • University of North Carolina at Chapel Hill Graduate School
Graduation year
  • 2010
Access
  • This item is restricted from public view for 1 year after publication.
