252-0526-00L  Statistical Learning Theory

Semester: Spring Semester 2019
Lecturers: J. M. Buhmann
Periodicity: yearly recurring course
Language of instruction: English



Courses

Number | Title | Hours | Lecturers
252-0526-00 V | Statistical Learning Theory | 3 hrs | J. M. Buhmann
    Mon 14:15-16:00, HG E 5
    Tue 09:15-10:00, HG E 5
252-0526-00 U | Statistical Learning Theory | 2 hrs | J. M. Buhmann
    Mon 16:15-18:00, HG E 5
252-0526-00 A | Statistical Learning Theory | 1 hr | J. M. Buhmann

Catalogue data

Abstract: The course covers advanced methods of statistical learning:
Statistical learning theory; variational methods and optimization, e.g., maximum entropy techniques, information bottleneck, deterministic and simulated annealing; clustering for vectorial, histogram and relational data; model selection; graphical models.
Learning objective: The course surveys recent methods of statistical learning. The fundamentals of machine learning, as presented in the course "Introduction to Machine Learning", are expanded; in particular, the theory of statistical learning is discussed.
Content:
# Theory of estimators: How can we measure the quality of a statistical estimator? We already discussed bias and variance of estimators very briefly, but the interesting part is yet to come.
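
One familiar quantity that combines both effects is the mean squared error of an estimator $\hat\theta$ of a parameter $\theta$:

\[
\mathbb{E}\big[(\hat\theta - \theta)^2\big]
= \underbrace{\big(\mathbb{E}[\hat\theta] - \theta\big)^2}_{\text{bias}^2}
+ \underbrace{\mathbb{E}\big[(\hat\theta - \mathbb{E}[\hat\theta])^2\big]}_{\text{variance}}.
\]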

# Variational methods and optimization: We consider optimization approaches for problems where the optimizer is a probability distribution. Concepts we will discuss in this context include:

* Maximum Entropy
* Information Bottleneck
* Deterministic Annealing
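
As a small illustration of why the optimizer here is a probability distribution, consider the maximum entropy principle: among all distributions satisfying a moment constraint, the entropy-maximizing one is a Gibbs (exponential-family) distribution,

\[
\max_{p}\; -\sum_x p(x)\log p(x)
\quad \text{s.t.} \quad \sum_x p(x) f(x) = \mu,\;\; \sum_x p(x) = 1
\qquad\Longrightarrow\qquad
p^*(x) = \frac{\exp\big(\lambda f(x)\big)}{Z(\lambda)},
\]

with $Z(\lambda) = \sum_x \exp\big(\lambda f(x)\big)$ and the Lagrange multiplier $\lambda$ chosen so that the constraint on $f$ is met (the notation here is generic, not the script's).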

# Clustering: The problem of sorting data into groups without using training samples. This requires a definition of "similarity" between data points and adequate optimization procedures.
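
As a concrete illustration of one such optimization procedure, here is a minimal sketch of centroid-based clustering via deterministic annealing: soft assignments follow a Gibbs distribution at temperature T, and T is gradually lowered. The function name, parameter values and synthetic data are illustrative choices only, not the course's reference implementation.

```python
import numpy as np

def deterministic_annealing_kmeans(X, n_clusters=3, T_init=10.0, T_min=1e-3,
                                   cooling=0.9, n_inner=20, seed=0):
    """Soft centroid-based clustering with a decreasing temperature (sketch)."""
    rng = np.random.default_rng(seed)
    # start all centroids near the data mean; they separate as T decreases
    mu = X.mean(axis=0) + 1e-3 * rng.standard_normal((n_clusters, X.shape[1]))
    T = T_init
    while T > T_min:
        for _ in range(n_inner):
            # squared Euclidean distances, shape (n_samples, n_clusters)
            d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=2)
            # Gibbs assignment probabilities p(k|x) proportional to exp(-d2/T)
            logits = -d2 / T
            logits -= logits.max(axis=1, keepdims=True)  # numerical stability
            p = np.exp(logits)
            p /= p.sum(axis=1, keepdims=True)
            # centroid update: probability-weighted means of the data
            mu = (p.T @ X) / p.sum(axis=0)[:, None]
        T *= cooling  # cooling schedule
    return mu, p

# tiny synthetic usage example
rng = np.random.default_rng(1)
X = np.vstack([rng.standard_normal((50, 2)) + c for c in ([0, 0], [5, 5], [0, 5])])
centroids, soft_assignments = deterministic_annealing_kmeans(X, n_clusters=3)
```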

# Model selection: We have already discussed how to fit a model to a data set in ML I, which usually involved adjusting the parameters of a given type of model. Model selection refers to the question of how complex the chosen model should be. As we already know, simple and complex models each have their own advantages and drawbacks.
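
One standard way to make this trade-off explicit is a penalized criterion; the Bayesian Information Criterion is a familiar example (shown here only as an illustration; the lecture may use different or additional criteria):

\[
\mathrm{BIC}(M) = -2\log \hat{L}_M + k_M \log n,
\]

where $\hat{L}_M$ is the maximized likelihood of model $M$, $k_M$ its number of free parameters and $n$ the sample size; among candidate models, the one minimizing the criterion is selected.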

# Statistical physics models: approximate optimization approaches for large systems, which originate in statistical physics (free energy minimization applied to spin glasses and other models); sampling methods based on these models.
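
A central object in such approaches is the Gibbs distribution over configurations $s$ with cost $R(s)$ and its free energy (generic notation, used here only for orientation):

\[
p_\beta(s) = \frac{e^{-\beta R(s)}}{Z(\beta)},
\qquad Z(\beta) = \sum_s e^{-\beta R(s)},
\qquad F(\beta) = -\frac{1}{\beta}\log Z(\beta),
\]

where $\beta$ is the inverse temperature; annealing gradually increases $\beta$ so that $p_\beta$ concentrates on low-cost configurations.
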
Lecture notes: A draft of a script will be provided; transparencies of the lectures will be made available.
Literature:
Hastie, Tibshirani, Friedman: The Elements of Statistical Learning. Springer, 2001.

L. Devroye, L. Gyorfi, G. Lugosi: A Probabilistic Theory of Pattern Recognition. Springer, New York, 1996.
Prerequisites / Notice: Requirements:

* knowledge of the Machine Learning course
* basic knowledge of statistics, interest in statistical methods.

It is recommended that Introduction to Machine Learning (ML I) is taken first, but with a little extra effort Statistical Learning Theory can be followed without the introductory course.

Performance assessment

Performance assessment information (valid until the course unit is held again)
Performance assessment as a semester course
ECTS credits: 7 credits
Examiners: J. M. Buhmann
Type: session examination
Language of examination: English
Repetition: The performance assessment is only offered in the session after the course unit. Repetition only possible after re-enrolling.
Mode of examination: written, 180 minutes
Additional information on mode of examination: 70% session examination, 30% project; the final grade will be calculated as the weighted average of these two elements. As a compulsory continuous performance assessment task, the project must be passed on its own and has a bonus/penalty function.

The practical project is an integral part of the course (60 hours of work, 2 credits). Participation is mandatory.
Failing the project results in a failing grade for the overall examination of Statistical Learning Theory (252-0526-00S).

Students who fail to fulfil the project requirement have to de-register from the exam. Otherwise, they are not admitted to the exam and will be treated as a no-show.
Written aids: 2 A4 sheets (= 4 pages) of summary, and the script
This information can be updated until the beginning of the semester; information on the examination timetable is binding.

Learning materials

 
Main link: Information
Recording: Statistical Learning Theory recordings
Only public learning materials are listed.

Groups

No information on groups available.

Restrictions

There are no additional restrictions for the registration.

Offered in

Programme | Section | Type
CAS in Computer Science | Focus Courses and Electives | W
Computational Biology and Bioinformatics Master | Theory | W
DAS in Data Science | Machine Learning and Artificial Intelligence | W
Data Science Master | Core Electives | W
Electrical Engineering and Information Technology Master | Recommended Subjects | W
Electrical Engineering and Information Technology Master | Specialization Courses | W
Computer Science Master | Focus Elective Courses Visual Computing | W
Computer Science Master | Focus Elective Courses Computational Science | W
Computer Science Master | Elective Focus Courses General Studies | W
Mechanical Engineering Master | Robotics, Systems and Control | W
Computational Science and Engineering Master | Electives | W
Robotics, Systems and Control Master | Core Courses | W
Statistics Master | Statistical and Mathematical Courses | W