Helmut Bölcskei: Catalogue data in Spring Semester 2019

Award: The Golden Owl
Name: Prof. Dr. Helmut Bölcskei
Field: Mathematical Information Science
Address:
Professur Math. Informationswiss.
ETH Zürich, ETF E 122
Sternwartstrasse 7
8092 Zürich
SWITZERLAND
Telephone: +41 44 632 34 33
E-mail: hboelcskei@ethz.ch
URL: https://www.mins.ee.ethz.ch/people/show/boelcskei
Department: Information Technology and Electrical Engineering
Relationship: Full Professor

Number: 227-0434-10L
Title: Mathematics of Information
ECTS: 8 credits
Hours: 3V + 2U + 2A
Lecturers: H. Bölcskei
Abstract: The class focuses on fundamental aspects of mathematical information science: frame theory, sampling theory, sparsity, compressed sensing, uncertainty relations, spectrum-blind sampling, dimensionality reduction and sketching, randomized algorithms for large-scale sparse FFTs, inverse problems, (Kolmogorov) approximation theory, and information theory (lossless and lossy compression).
Learning objective: After attending this lecture, participating in the exercise sessions, and working on the homework problem sets, students will have acquired a working knowledge of the most commonly used mathematical theories in information science. Students will also have to carry out a research project, either individually or in groups, with presentations at the end of the semester.
Content:
1. Signal representations: Frames in finite-dimensional spaces, frames in Hilbert spaces, wavelets, Gabor expansions

2. Sampling theorems: The sampling theorem as a frame expansion, irregular sampling, multi-band sampling, density theorems, spectrum-blind sampling

3. Sparsity and compressed sensing: Uncertainty relations in sparse signal recovery, recovery algorithms, Lasso, matching pursuit algorithms, compressed sensing, super-resolution

4. High-dimensional data and dimensionality reduction: Random projections, the Johnson-Lindenstrauss Lemma, sketching
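As a small illustration of topic 4, the sketch below applies a Gaussian random projection in the spirit of the Johnson-Lindenstrauss Lemma: pairwise distances between points survive the projection to a much lower dimension up to a small relative distortion. The dimensions, number of points, and random seed are arbitrary illustrative choices, not taken from the course.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d, k = 50, 1000, 300  # 50 points in R^1000, projected down to R^300

# Data points (here simply random, but any point cloud works).
X = rng.standard_normal((n, d))

# Gaussian projection matrix, scaled by 1/sqrt(k) so that squared
# Euclidean norms are preserved in expectation.
P = rng.standard_normal((k, d)) / np.sqrt(k)
Y = X @ P.T  # projected points, shape (n, k)

# Compare one pairwise distance before and after projection; the two
# values agree up to a small relative error with high probability.
orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Y[0] - Y[1])
print(orig, proj)
```

The key design point is that the projection is oblivious to the data: P is drawn independently of X, which is what makes such projections useful for sketching and streaming applications.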

5. Randomized algorithms for large-scale sparse FFTs

6. Approximation theory: Fundamental limits on compressibility of signal classes, Kolmogorov epsilon-entropy of signal classes, optimal encoding and decoding of signal classes

7. Information theory: Entropy, mutual information, lossy compression, rate-distortion theory, lossless compression, arithmetic coding, Lempel-Ziv compression
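As a minimal illustration of the entropy notion in topic 7, the following Python sketch computes the Shannon entropy of a discrete distribution; the example distributions are illustrative choices, not course material.

```python
import math

def entropy(p):
    """Shannon entropy of a discrete distribution p, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries exactly 1 bit per toss; a biased coin carries less,
# which is what makes its output sequence compressible.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # about 0.469
```

Entropy gives the fundamental limit for lossless compression: no code can achieve an expected rate below the source entropy, and arithmetic coding approaches it.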
Lecture notes: Detailed lecture notes will be provided at the beginning of the semester.
Prerequisites / Notice: This course is aimed at students with a background in basic linear algebra, analysis, and probability. We will, however, review the required mathematical basics in the exercise sessions throughout the semester.
Number: 401-5680-00L
Title: Foundations of Data Science Seminar
ECTS: 0 credits
Lecturers: P. L. Bühlmann, H. Bölcskei, J. M. Buhmann, T. Hofmann, A. Krause, A. Lapidoth, H.-A. Loeliger, M. H. Maathuis, N. Meinshausen, G. Rätsch, S. van de Geer
Abstract: Research colloquium
Learning objective: