Special Guest Lecture: Dr. Jean Barbier


Guest Speaker: Jean Barbier – Associate Research Officer, The Abdus Salam International Centre for Theoretical Physics (ICTP)

Title: 0-1 Phase Transitions in High Dimensional Inference and Learning

Talk Abstract:
I will discuss paradigmatic statistical models of inference and learning from high-dimensional data, such as sparse PCA and the perceptron neural network, in the sub-linear sparsity regime. In this limit the underlying hidden signal, i.e., the low-rank matrix in PCA or the neural network weights, has a number of non-zero components that scales sub-linearly with the total dimension of the vector. I will provide explicit low-dimensional variational formulas for the asymptotic mutual information between the signal and the data in suitable sparse limits. In the setting of support recovery, these formulas imply sharp 0-1 phase transitions for the asymptotic minimum mean-square error (or, in the neural network setting, the generalization error). A similar phase transition was recently analyzed in the context of sparse high-dimensional linear regression by Reeves et al.
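For readers unfamiliar with the setup, the following is a minimal sketch (not from the talk) of the kind of model the abstract describes: a rank-one spiked matrix whose hidden signal has a number of non-zero entries scaling sub-linearly with the dimension. All parameter names (`n`, `rho`, `snr`) and the naive spectral estimator are our own illustrative choices, not Dr. Barbier's methods.

```python
import numpy as np

# Illustrative sketch only: a sparse spiked Wigner model, a standard toy
# version of sparse PCA. Parameters and estimator are assumptions for
# illustration, not taken from the talk.
rng = np.random.default_rng(0)

n = 2000        # ambient dimension
rho = 0.5       # sparsity exponent: the support size is k = n**rho
snr = 3.0       # signal-to-noise ratio of the rank-one spike

k = int(n**rho)  # number of non-zero components, sub-linear in n

# Hidden sparse signal: k non-zero entries of magnitude 1/sqrt(k),
# so that the signal has unit norm regardless of k.
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.choice([-1.0, 1.0], size=k) / np.sqrt(k)

# Observed data: rank-one spike plus symmetric Gaussian (Wigner) noise.
g = rng.standard_normal((n, n))
W = (g + g.T) / np.sqrt(2 * n)
Y = np.sqrt(snr) * np.outer(x, x) + W

# Naive support recovery: take the k largest entries (in magnitude)
# of the top eigenvector of Y, then measure overlap with the truth.
eigvals, eigvecs = np.linalg.eigh(Y)
v = eigvecs[:, -1]
est_support = np.argsort(np.abs(v))[-k:]

overlap = len(set(support) & set(est_support)) / k
print(f"k = {k} non-zeros out of n = {n}; support overlap = {overlap:.2f}")
```

The 0-1 phase transitions in the abstract concern how, in suitable sparse limits, the optimal (not this naive spectral) support-recovery error jumps abruptly between perfect and trivial as the signal-to-noise ratio crosses a threshold.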

Presentation Slides

Video