May 24-26, 2021
Registration is now closed.
The Fall 2020 SAMSI program on Numerical Analysis in Data Science (NADS) has seen exciting scientific progress by researchers in five working groups: (i) large-scale inverse problems and uncertainty quantification, (ii) global sensitivity analysis, (iii) randomized algorithms for matrices and data, (iv) reinforcement learning, and (v) dimensionality reduction in time series. The three-day transition workshop will be held virtually via Zoom and will highlight some of the recent advances made during this semester program. Additionally, in synergy with the NSF-funded research training grant ‘RTG: Randomized Numerical Analysis’ at NC State, the first day of the workshop will feature tutorials related to the working group themes, including randomized numerical linear algebra, parameter estimation, and sensitivity analysis.
TENTATIVE SCHEDULE
(Note: All times are EDT (New York). Talks will not be recorded.)
Monday, May 24, 2021
Time | Description | Speaker | Slides |
---|---|---|---|
9:15 | Welcome | David Banks, SAMSI; Mansoor Haider, NCSU | |
Tutorials co-sponsored by NCSU Randomized Numerical Analysis Research Training Group | TUTORIAL VIDEOS | | |
9:30 | The Ideas Behind Randomized Algorithms for Least Squares Problems | Ilse C.F. Ipsen, NCSU | |
11:00 | A Biased Introduction to Sensitivity Analysis | Pierre Gremaud, NCSU | |
12:00 | Lunch Break | | |
1:00 | Optimal Design of Experiments for Large-scale Bayesian Inverse Problems | Alen Alexanderian, NCSU | |
2:30 | Bayesian Inference and Uncertainty Propagation for Physical and Biological Models | Ralph Smith, NCSU | |
3:30 | Adjourn | | |
Tuesday, May 25, 2021
Time | Description | Speaker | Slides |
---|---|---|---|
Dimension Reduction in Time Series Working Group | | | |
9:30 | Fourier Method on Sufficient Dimension Reduction in Time Series | Priyan De Alwis, So. Illinois U. | |
9:52 | Dimension Reduction for Vector Autoregressive Models | Wiranthe Herath, So. Illinois U. | |
10:15 | Bayesian Copula Factor Autoregressive Models for Time Series Mixed Data | Hadi Safari, So. Illinois U. | |
10:38 | Compressed Bayesian Predictive Inference for Time Series Count Data | Rukayya Ibrahim, So. Illinois U. | |
Global Sensitivity Analysis Working Group | | | |
11:15 | Overview of Working Group Activities | Alen Alexanderian, NCSU | |
11:25 | Efficient Global Sensitivity Analysis for Rare Event Simulation | Mike Merritt, NCSU | |
11:45 | A Tailored Parameter Identifiability Approach for Polymerization Models in Wound Healing Applications | Katherine Pearce, NCSU | |
12:05 | All Explanations are Wrong, but Some are Useful: Using Global Sensitivity Analysis to Identify Gaps in Machine Learning Explainability | Erin Acquesta & Mike Smith, Sandia | |
12:25 | A Novel Generation of Mighty Estimators Based on Rank Statistics | Agnes Lagnoux & Thierry Klein, Toulouse | |
12:45 | Lunch Break | | |
Inverse Problems and Uncertainty Quantification Working Group | | | |
1:30 | Learning Methods for Inverse Problems | Silvia Gazzola, University of Bath | |
2:00 | Learning Regularization Parameters of Inverse Problems via Deep Neural Networks | Matthias Chung, Virginia Tech | |
2:30 | Hybrid Projection Method for Large-scale Inverse Problems with Mean Estimation in Hierarchical Gaussian Priors | Taewon Cho, Virginia Tech | |
2:45 | Bedrock Inversion and Hyper-Differential Sensitivity Analysis for the Shallow Ice Model | William Reese, NCSU | |
3:00 | Efficient Edge-preserving and Sparsity Promoting Methods for Large-scale, Time-dependent, Dynamic Inverse Problems | Mirjeta Pasha, Arizona State | |
3:30 | Adjourn | | |
Wednesday, May 26, 2021
Time | Description | Speaker | Slides |
---|---|---|---|
9:30 | Meeting – Program Leaders and SAMSI Directors | | |
Reinforcement Learning Working Group (joint with ARL Seminar, www.arlseminar.com) | | | |
10:30 | On Function Approximation in Reinforcement Learning: Optimism in the Face of Large State Space | Zhuoran Yang, Princeton | |
11:00 | Is Pessimism Provably Efficient for Offline RL? | Zhaoran Wang, Northwestern | |
11:30 | Solving Inverse Reinforcement Learning, Bootstrapping Bandits | Houssam Nassif, Amazon | |
12:00 | Lunch Break | | |
Randomized Numerical Linear Algebra Working Group | | | |
1:00 | Accelerating Stochastic Optimization of Separable Deep Neural Networks via Iterative Sampling Methods | Elizabeth Newman, Emory | |
1:30 | Randomized FEAST Algorithm for Generalized Hermitian Eigenvalue Problems with Probabilistic Error Analysis | Agnieszka Miedlar, U Kansas | |
2:00 | Randomized Algorithms for Rounding the Tensor Train Format | Grey Ballard, Wake Forest | |
2:30 | Closing Remarks | | |
2:45 | Adjourn | | |
Questions: email [email protected]