Inference: Dimension Reduction

Dimension reduction methods search for low-dimensional structure in high-dimensional data. Although unsupervised methods such as PCA have been used for over 80 years, questions of robustness, very high dimensionality (p >> n), nonlinear structure, and so on continue to provide rich sources of statistical and computational problems. Supervised methods such as Sliced Inverse Regression, which look for low-dimensional associations between predictors and the response, have been developed within the past 35 years, and many useful extensions continue to appear. Performing valid inference that correctly accounts for the impact of dimension reduction, including variable selection, also poses problems that have not been thoroughly explored.
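To make the unsupervised case concrete, here is a minimal sketch of PCA computed via the singular value decomposition. The data are simulated (a rank-2 signal plus noise) purely for illustration; nothing here comes from the working group's own analyses, and numpy is assumed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate n = 100 observations of p = 5 predictors with rank-2 structure.
n, p = 100, 5
scores = rng.normal(size=(n, 2))
loadings = rng.normal(size=(2, p))
X = scores @ loadings + 0.1 * rng.normal(size=(n, p))

# PCA: center the columns, then take the SVD of the centered matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Project onto the first two principal components.
Z = Xc @ Vt[:2].T

# Proportion of variance explained by each component.
explained = s**2 / np.sum(s**2)
```

Because the simulated signal is rank 2, the first two entries of `explained` capture almost all of the variance. Supervised methods such as Sliced Inverse Regression follow a similar eigen-decomposition pattern, but applied to the covariance of slice means of the standardized predictors (slicing on the response) rather than to the covariance of X itself.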

This working group will work on extending unsupervised and supervised methods to interesting new areas such as machine learning, functional data, discrete data, and complex data structures such as networks. We will also work on inferential issues.

Recent posts:

- Yifan Xu: On NMF clustering and K-means clustering by Kim & Park (2008)
- Naomi Altman: Bagging to improve the accuracy of a clustering procedure
- Naomi Altman: A unified view of matrix factorization methods
- Naomi Altman: Bayesian Variable Selection - Liechty
- Naomi Altman: Chuanhua Xing's slides
- ChuckPaulson: Updated LDA Slides
- edoh: Comparing SVD, PCA, Non-negative matrix factorization, and Diffusion maps