227-0432-00L  Learning, Classification and Compression

Semester: Spring Semester 2021
Lecturers: E. Riegler
Periodicity: yearly recurring course
Language of instruction: English


Abstract: The course focuses on a theoretical treatment of learning theory and classification, together with an introduction to lossy and lossless compression for general sets and measures. We mainly take a probabilistic approach, in which an underlying distribution must be learned or compressed. The concepts acquired in the course are of broad and general interest in the data sciences.
Objective: After attending this lecture and participating in the exercise sessions, students will have acquired a working knowledge of learning theory, classification, and compression.
Content:
1. Learning Theory
(a) Framework of Learning
(b) Hypothesis Spaces and Target Functions
(c) Reproducing Kernel Hilbert Spaces
(d) Bias-Variance Tradeoff
(e) Estimation of Sample and Approximation Error

2. Classification
(a) Binary Classifier
(b) Support Vector Machines (separable case)
(c) Support Vector Machines (nonseparable case)
(d) Kernel Trick

3. Lossy and Lossless Compression
(a) Basics of Compression
(b) Compressed Sensing for General Sets and Measures
(c) Quantization and Rate Distortion Theory for General Sets and Measures
Lecture notes: Detailed lecture notes will be provided.
Prerequisites / Notice: This course is aimed at students with a solid background in measure theory and linear algebra, and basic knowledge of functional analysis.