
227-0434-10L  Mathematics of Information

Semester: Spring Semester 2018
Lecturers: H. Bölcskei
Periodicity: yearly recurring course
Language of instruction: English

Abstract: The class focuses on fundamental mathematical aspects of data sciences: Information theory (lossless and lossy compression), sampling theory, compressed sensing, dimensionality reduction (Johnson-Lindenstrauss Lemma), randomized algorithms for large-scale numerical linear algebra, approximation theory, neural networks as function approximators, mathematical foundations of deep learning.
Objective: After attending this lecture, participating in the exercise sessions, and working on the homework problem sets, students will have acquired a working knowledge of the most commonly used mathematical theories in data science. Students will also carry out a research project, either individually or in groups, with presentations at the end of the semester.
Content: 1. Information theory: Entropy, mutual information, lossy compression, rate-distortion theory, lossless compression, arithmetic coding, Lempel-Ziv compression
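
As a taste of this first topic, the entropy of a memoryless source (the fundamental limit for lossless compression, in bits per symbol) can be computed in a few lines. The following is an illustrative sketch, not part of the course material:

```python
import math

def entropy(p):
    """Shannon entropy in bits per symbol of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin is incompressible at 1 bit per toss; a biased coin can be
# compressed at a rate approaching its (smaller) entropy.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # ~0.469
```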

2. Signal representations: Frames in finite-dimensional spaces, frames in Hilbert spaces, wavelets, Gabor expansions
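
A standard toy example for this topic is the Mercedes-Benz frame in R^2, three unit vectors at 120-degree spacing: it is a tight frame, meaning its frame operator is a multiple of the identity. A small numerical check (an illustrative sketch, not course code):

```python
import math

# Mercedes-Benz frame: three unit vectors in R^2 at 120-degree spacing.
angles = [math.pi / 2 + k * 2 * math.pi / 3 for k in range(3)]
frame = [(math.cos(a), math.sin(a)) for a in angles]

# Frame operator S = sum_k v_k v_k^T; tightness means S = A * I.
S = [[sum(v[i] * v[j] for v in frame) for j in range(2)] for i in range(2)]
print(S)  # approximately [[1.5, 0.0], [0.0, 1.5]], i.e. frame bound A = 3/2
```

Tightness is what makes reconstruction from frame coefficients as simple as in an orthonormal basis, up to the factor A.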

3. Sampling theorems: The sampling theorem as a frame expansion, irregular sampling, multi-band sampling, density theorems, spectrum-blind sampling
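
The classical Shannon sampling theorem underlying this topic can be checked numerically: a bandlimited signal sampled above the Nyquist rate is recovered by sinc interpolation. The truncated series below is an illustrative sketch (parameter values are arbitrary), not course material:

```python
import math

def sinc(u):
    return 1.0 if u == 0 else math.sin(math.pi * u) / (math.pi * u)

f0, fs, N = 1.0, 8.0, 800          # 1 Hz sinusoid, sampled well above Nyquist
samples = {n: math.sin(2 * math.pi * f0 * n / fs) for n in range(-N, N + 1)}

def reconstruct(t):
    # Truncated Shannon series: x(t) ~ sum_n x(n/fs) * sinc(fs*t - n)
    return sum(x * sinc(fs * t - n) for n, x in samples.items())

t = 0.3
print(reconstruct(t), math.sin(2 * math.pi * f0 * t))  # nearly equal
```

The residual discrepancy comes purely from truncating the (infinite) interpolation series.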

4. Sparsity and compressed sensing: Uncertainty principles, recovery algorithms, Lasso, matching pursuits, compressed sensing, non-linear approximation, best k-term approximation, super-resolution
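
One of the recovery algorithms named here, matching pursuit, greedily builds a k-term approximation by repeatedly selecting the dictionary atom most correlated with the residual. A minimal sketch (illustrative only; with the standard basis as dictionary, the best k-term approximation is found exactly):

```python
def matching_pursuit(x, atoms, k):
    """Greedy k-term approximation of x over a list of unit-norm atoms."""
    residual = list(x)
    coeffs = {}
    for _ in range(k):
        # Select the atom most correlated with the current residual.
        scores = [sum(r * a for r, a in zip(residual, atom)) for atom in atoms]
        j = max(range(len(atoms)), key=lambda i: abs(scores[i]))
        coeffs[j] = coeffs.get(j, 0.0) + scores[j]
        residual = [r - scores[j] * a for r, a in zip(residual, atoms[j])]
    return coeffs, residual

# With the standard basis of R^4, MP recovers a 2-sparse signal exactly.
atoms = [tuple(1.0 if i == j else 0.0 for i in range(4)) for j in range(4)]
coeffs, residual = matching_pursuit([3.0, 0.0, -1.0, 0.0], atoms, k=2)
print(coeffs)    # {0: 3.0, 2: -1.0}
print(residual)  # [0.0, 0.0, 0.0, 0.0]
```

For redundant (non-orthonormal) dictionaries the greedy choice is no longer guaranteed optimal, which is where the recovery guarantees treated in the course come in.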

5. High-dimensional data and dimensionality reduction: Random projections, the Johnson-Lindenstrauss Lemma, sketching
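
The Johnson-Lindenstrauss Lemma says a scaled Gaussian random projection preserves pairwise distances up to small distortion with high probability. An illustrative sketch (dimensions and seed chosen arbitrarily, not course material):

```python
import math
import random

random.seed(0)
d, k = 1000, 300  # ambient dimension, sketch dimension

# Gaussian random projection, scaled so squared lengths are preserved in expectation.
R = [[random.gauss(0.0, 1.0) / math.sqrt(k) for _ in range(d)] for _ in range(k)]

def project(x):
    return [sum(r_i * x_i for r_i, x_i in zip(row, x)) for row in R]

def dist(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

x = [random.gauss(0, 1) for _ in range(d)]
y = [random.gauss(0, 1) for _ in range(d)]
ratio = dist(project(x), project(y)) / dist(x, y)
print(ratio)  # close to 1: the distance survives a 1000 -> 300 reduction
```

The striking part, made precise in the lemma, is that the sketch dimension needed depends only logarithmically on the number of points, not on d.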

6. Randomized algorithms for large-scale numerical linear algebra: Large-scale matrix computations, randomized algorithms for approximate matrix factorizations, matrix sketching, fast algorithms for large-scale FFTs
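
A core primitive behind randomized matrix factorizations is the randomized range finder: sketch Y = A·Omega with a thin Gaussian test matrix, orthonormalize Y to get Q, and approximate A by Q Q^T A. The pure-Python sketch below (illustrative only; the test matrix is a hand-built rank-2 example) captures an exactly low-rank A with negligible error:

```python
import random

random.seed(1)

def matmul(A, B):
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# A rank-2 test matrix: A = u a^T + w b^T, of size 6x6.
u = [1, 2, 3, 4, 5, 6]; w = [1, -1, 1, -1, 1, -1]
a = [2, 0, 1, 0, 0, 3]; b = [0, 1, 0, 2, 1, 0]
A = [[u[i] * a[j] + w[i] * b[j] for j in range(6)] for i in range(6)]

# Randomized range finder: Y = A * Omega, then orthonormalize Y's columns
# by Gram-Schmidt (dropping near-dependent directions) to obtain Q.
Omega = [[random.gauss(0, 1) for _ in range(3)] for _ in range(6)]
Y = matmul(A, Omega)
cols = [[Y[i][j] for i in range(6)] for j in range(3)]
Q = []  # orthonormal basis vectors for the sketched range
for c in cols:
    for q in Q:
        proj = sum(ci * qi for ci, qi in zip(c, q))
        c = [ci - proj * qi for ci, qi in zip(c, q)]
    norm = sum(ci * ci for ci in c) ** 0.5
    if norm > 1e-10:
        Q.append([ci / norm for ci in c])

QT_A = matmul(Q, A)  # Q stored row-wise, so this is Q^T A
A_hat = matmul([[q[i] for q in Q] for i in range(6)], QT_A)  # Q (Q^T A)
err = max(abs(A[i][j] - A_hat[i][j]) for i in range(6) for j in range(6))
print(err)  # tiny: a rank-2 matrix is captured by a 3-column sketch
```

The point, developed rigorously in the course, is that one pass of A against a thin random matrix suffices to find its (numerical) range.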

7. Mathematics of (deep) neural networks: Universal function approximation with single- and multi-layer networks, fundamental limits on compressibility of signal classes, Kolmogorov epsilon-entropy of signal classes, geometry of decision surfaces, convolutional neural networks, scattering networks
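
Approximation arguments for ReLU networks are often built from hand-constructed piecewise-linear pieces. A minimal illustrative sketch (not course material): a single hidden layer with two units represents |x| exactly, via |x| = ReLU(x) + ReLU(-x):

```python
def relu(z):
    return max(0.0, z)

def abs_net(x):
    """Single-hidden-layer ReLU network with two units computing |x| exactly.
    Hidden weights are +1 and -1 with zero biases; output weights are both 1."""
    hidden = [relu(1.0 * x), relu(-1.0 * x)]
    return 1.0 * hidden[0] + 1.0 * hidden[1]

print(abs_net(2.0), abs_net(-3.5))  # 2.0 3.5
```

Exact building blocks like this one are then composed and scaled to approximate richer function classes, which is where the compressibility and epsilon-entropy limits above enter.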
Lecture notes: Detailed lecture notes will be provided as we go along.
Prerequisites / Notice: This course is aimed at students with a background in basic linear algebra, analysis, and probability. We will, however, review the required mathematical basics in the exercise sessions throughout the semester.