Authors

Xinyu Tian

Type

Text | Dissertation

Advisor

Kuan, Pei Fen | Wang, Xuefeng | Zhu, Wei | Yu, Xiaxia

Date

2017-05-01

Keywords

Statistics | Bayesian, feature selection, LASSO, network constraint, proximal gradient, Stan

Department

Department of Applied Mathematics and Statistics

Language

en_US

Source

This work is sponsored by the Stony Brook University Graduate School in compliance with the requirements for the completion of the degree.

Identifier

http://hdl.handle.net/11401/77332

Publisher

The Graduate School, Stony Brook University: Stony Brook, NY.

Format

application/pdf | 122 pages

Abstract

High-dimensional datasets are now ubiquitous in biomedical research. Feature selection is an essential step in mining high-dimensional data: it reduces noise, guards against overfitting, and improves the interpretability of statistical models. Over the last few decades, numerous feature selection methods and algorithms have been proposed to handle various response types, correlation structures among predictors, and sparsity requirements; penalized methods, such as LASSO and its variants, are among the most efficient and popular in this area. In addition, genomic features, such as gene expression levels, are usually connected through an underlying biological network, and this network is an important supplement to the model, improving both performance and interpretability. In this study, we first extend the group LASSO to a network-constrained classification model and develop a modified proximal gradient algorithm for model fitting. In this algorithm, group LASSO regularization induces model sparsity, and a network constraint derived from the underlying network structure induces smoothness of the coefficients. The applicability of the proposed method is verified on both numerical examples and real gene expression data from TCGA. We further study the feature selection problem in models with Bayesian hierarchical structure. R. Tibshirani, who introduced LASSO in 1996, also noted that the linear LASSO can be interpreted as a Bayesian model with a Laplace prior on the coefficients, which sheds light on feature selection in Bayesian models. Compared to frequentist approaches, Bayesian models cope better with complex hierarchical structures in the data. On the one hand, we compare the performance of Laplace, horseshoe, and Gaussian priors in linear Bayesian models through extensive simulations. On the other, we extend the projection predictive feature selection scheme to group-wise selection and benchmark its feature selection performance and prediction accuracy against standard Bayesian methods. All Bayesian posterior parameters are estimated using Hamiltonian Monte Carlo as implemented in Stan.
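The abstract does not state the model's objective function; a plausible form for the network-constrained group LASSO classification model it describes (the symbols below, including the logistic loss \ell, the graph Laplacian L, and the tuning parameters \lambda_1, \lambda_2, are assumptions for illustration, not taken from the dissertation) is

    \min_{\beta}\; \ell(\beta) \;+\; \lambda_1 \sum_{g=1}^{G} \sqrt{p_g}\, \lVert \beta_g \rVert_2 \;+\; \lambda_2\, \beta^{\top} L \beta

where \beta_g collects the p_g coefficients of group g and L is the Laplacian of the biological network, so that \beta^{\top} L \beta = \sum_{(i,j) \in E} (\beta_i - \beta_j)^2 penalizes differences between coefficients of network-adjacent features, encouraging the smoothness the abstract mentions. A proximal gradient iteration then takes a gradient step on the smooth part \ell(\beta) + \lambda_2 \beta^{\top} L \beta and applies the proximal operator of the group penalty, i.e., group-wise soft thresholding, to induce sparsity.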
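As a concrete illustration of the Laplace-prior formulation, the following is a minimal Stan sketch of a Bayesian linear model with a double exponential (Laplace) prior on the coefficients; the variable names and the fixed shrinkage scale lambda are illustrative assumptions, not the dissertation's actual code.

    // Minimal sketch: Bayesian linear regression with a Laplace prior,
    // the Bayesian analogue of LASSO. All names here are illustrative.
    data {
      int<lower=1> N;           // number of observations
      int<lower=1> P;           // number of predictors
      matrix[N, P] X;           // design matrix
      vector[N] y;              // continuous response
      real<lower=0> lambda;     // prior scale controlling shrinkage (assumed fixed)
    }
    parameters {
      real alpha;               // intercept
      vector[P] beta;           // regression coefficients
      real<lower=0> sigma;      // residual standard deviation
    }
    model {
      beta ~ double_exponential(0, lambda);  // Laplace prior ~ LASSO penalty
      sigma ~ cauchy(0, 2.5);                // weakly informative scale prior
      y ~ normal(alpha + X * beta, sigma);   // Gaussian likelihood
    }

Swapping the double_exponential prior for a normal or horseshoe construction yields the Gaussian and horseshoe alternatives compared in the study; Stan's Hamiltonian Monte Carlo sampler then draws from the resulting posterior.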
