Name | Prof. Dr. Helmut Bölcskei |
Field | Mathematical Information Science |
Address | Professorship for Mathematical Information Science, ETH Zürich, ETF E 122, Sternwartstrasse 7, 8092 Zürich, SWITZERLAND |
Telephone | +41 44 632 34 33 |
E-mail | hboelcskei@ethz.ch |
URL | https://www.mins.ee.ethz.ch/people/show/boelcskei |
Department | Information Technology and Electrical Engineering |
Relationship | Full Professor |
Number | Title | ECTS | Hours | Lecturers
---|---|---|---|---
227-0434-10L | Mathematics of Information | 8 credits | 3V + 2U + 2A | H. Bölcskei
Abstract | The class focuses on fundamental aspects of mathematical information science: Frame theory, sampling theory, sparsity, compressed sensing, uncertainty relations, spectrum-blind sampling, dimensionality reduction and sketching, randomized algorithms for large-scale sparse FFTs, inverse problems, (Kolmogorov) approximation theory, and information theory (lossless and lossy compression).
Learning objective | After attending this lecture, participating in the exercise sessions, and working on the homework problem sets, students will have acquired a working knowledge of the most commonly used mathematical theories in information science. Students will also have to carry out a research project, either individually or in groups, with presentations at the end of the semester.
Content | 1. Signal representations: Frames in finite-dimensional spaces, frames in Hilbert spaces, wavelets, Gabor expansions
2. Sampling theorems: The sampling theorem as a frame expansion, irregular sampling, multi-band sampling, density theorems, spectrum-blind sampling
3. Sparsity and compressed sensing: Uncertainty relations in sparse signal recovery, recovery algorithms, Lasso, matching pursuit algorithms, compressed sensing, super-resolution (an illustrative sketch of a matching pursuit recovery follows this course entry)
4. High-dimensional data and dimensionality reduction: Random projections, the Johnson-Lindenstrauss Lemma, sketching
5. Randomized algorithms for large-scale sparse FFTs
6. Approximation theory: Fundamental limits on compressibility of signal classes, Kolmogorov epsilon-entropy of signal classes, optimal encoding and decoding of signal classes
7. Information theory: Entropy, mutual information, lossy compression, rate-distortion theory, lossless compression, arithmetic coding, Lempel-Ziv compression
Lecture notes | Detailed lecture notes will be provided at the beginning of the semester. | ||||
Prerequisites / Notice | This course is aimed at students with a background in basic linear algebra, analysis, and probability. We will, however, review required mathematical basics throughout the semester in the exercise sessions. | ||||
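As a pointer to topic 3 in the course content above, the following is a minimal, self-contained sketch of orthogonal matching pursuit, one of the matching pursuit recovery algorithms listed there. It is not taken from the course material; the matrix name A, the measurement vector y, the sparsity level k, and the demo dimensions are illustrative assumptions only.

```python
# Minimal illustrative sketch (not from the course notes): orthogonal matching
# pursuit (OMP) for recovering a k-sparse vector x from measurements y = A x.
# All names (A, y, k) and dimensions below are assumptions chosen for the demo.
import numpy as np

def omp(A, y, k):
    """Greedy sparse recovery: select k columns of A and least-squares fit y on them."""
    m, n = A.shape
    residual = y.copy()
    support = []          # indices of the columns selected so far
    x_hat = np.zeros(n)
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares solve restricted to the selected support.
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x_hat = np.zeros(n)
        x_hat[support] = coeffs
        residual = y - A @ x_hat
    return x_hat

# Demo: a 3-sparse vector in R^128 recovered from 40 random Gaussian measurements.
rng = np.random.default_rng(0)
n, m, k = 128, 40, 3
A = rng.standard_normal((m, n)) / np.sqrt(m)
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x
x_rec = omp(A, y, k)
print("relative recovery error:", np.linalg.norm(x_rec - x) / np.linalg.norm(x))
```

With random Gaussian measurement matrices and sufficiently many measurements relative to the sparsity level, greedy selection of this kind typically recovers the support exactly; the corresponding uncertainty relations and recovery guarantees are among the topics of the lecture.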
401-5680-00L | Foundations of Data Science Seminar | 0 credits | | P. L. Bühlmann, H. Bölcskei, J. M. Buhmann, T. Hofmann, A. Krause, A. Lapidoth, H.‑A. Loeliger, M. H. Maathuis, N. Meinshausen, G. Rätsch, S. van de Geer
Abstract | Research colloquium | ||||
Learning objective |