This course will be taught by David Banks, Professor of the Practice of Statistics, Duke University & Director, SAMSI
This course is being offered in conjunction with the SAMSI semester-long research program on Deep Learning. The course will start with a review of standard neural networks, and then progress to modern deep learning, including convolutional neural networks, recurrent neural networks, generative adversarial networks, and various kinds of autoencoders. We shall discuss training strategies, architecture search, regularization, and quantization.
There will be mathematics in the course, and a degree of mathematical sophistication is expected from the students, but the material will all be self-contained. The emphasis will be upon heuristics and applications. There will be projects and presentations at the end of the semester, and students will work on those in small groups. Each group will need to have at least one member who can program in Python or a comparable language.
Talk Title: Attacking the Curse of Dimensionality using Sums of Separable Functions
Speaker: Martin Mohlenkamp, Assoc. Professor, Dept. of Mathematics, Ohio University
Naive computations involving a function of many variables suffer from the curse of dimensionality: the computational cost grows exponentially with the number of variables. One approach to bypassing the curse is to approximate the function as a sum of products of functions of one variable and to compute in this format. When the variables are discrete indices, a function of many variables is called a tensor, and this approach amounts to approximating the tensor in the (so-called) canonical tensor format and computing within that format. In this talk I will describe how such approximations can be used in numerical analysis and in machine learning.
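As a minimal illustration of the idea in the abstract (the specific function and grid are assumptions for this sketch, not taken from the talk): a separable function such as f(x, y, z) = exp(-(x + y + z)) factors exactly into a product of one-variable functions, so the n-per-dimension grid of n^d values it defines can be stored with just d factors of length n. That is the rank-one case of the canonical tensor format; a general function is approximated by a short sum of such products.

```python
import numpy as np

# Hypothetical example: f(x, y, z) = exp(-(x + y + z)) is separable,
# since it equals exp(-x) * exp(-y) * exp(-z).
n, d = 50, 3
grid = np.linspace(0.0, 1.0, n)

# Full tensor of grid values: n**d entries, cost exponential in d.
full = np.exp(-(grid[:, None, None] + grid[None, :, None] + grid[None, None, :]))

# Canonical (rank-one) format: d one-dimensional factors, cost d * n.
factor = np.exp(-grid)
separable = np.einsum('i,j,k->ijk', factor, factor, factor)

print(np.allclose(full, separable))  # the two representations agree exactly
print(full.size, d * n)              # 125000 stored values versus 150
```

For a non-separable function the agreement would not be exact; one would instead fit a sum of several such rank-one terms and trade the number of terms against accuracy.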