
Joachim M. Buhmann: Catalogue data in Spring Semester 2019

Name: Prof. Dr. Joachim M. Buhmann
Field: Computer Science (Information Science and Engineering)
Address:
Institut für Maschinelles Lernen
ETH Zürich, CAB G 69.2
Universitätstrasse 6
8092 Zürich
SWITZERLAND
Phone: +41 44 632 31 24
Fax: +41 44 632 15 62
E-mail: jbuhmann@inf.ethz.ch
URL: http://www.ml.inf.ethz.ch/
Department: Computer Science
Relationship: Full Professor

Number: 252-0526-00L
Title: Statistical Learning Theory
ECTS: 7 credits
Hours: 3V + 2U + 1A
Lecturers: J. M. Buhmann
Abstract: The course covers advanced methods of statistical learning:
statistical learning theory; variational methods and optimization, e.g., maximum entropy techniques, information bottleneck, deterministic and simulated annealing; clustering for vectorial, histogram, and relational data; model selection; graphical models.
Objective: The course surveys recent methods of statistical learning. The fundamentals of machine learning as presented in the course "Introduction to Machine Learning" are expanded, and in particular the theory of statistical learning is discussed.
Content: # Theory of estimators: How can we measure the quality of a statistical estimator? We have already discussed bias and variance of estimators briefly, but the interesting part is yet to come.

# Variational methods and optimization: We consider optimization approaches for problems where the optimizer is a probability distribution. Concepts we will discuss in this context include:

* Maximum Entropy
* Information Bottleneck
* Deterministic Annealing
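As a rough illustration of the first and third items above (a hypothetical sketch, not taken from the course materials): among all distributions with a fixed expected cost, the entropy-maximizing one has the Gibbs form p_i ∝ exp(-c_i / T), and deterministic annealing lowers the temperature T gradually:

```python
import numpy as np

def gibbs(costs, T):
    """Gibbs/Boltzmann distribution p_i ∝ exp(-c_i / T): the
    maximum-entropy distribution under an expected-cost constraint.
    Subtracting the minimum cost only rescales the weights; it is
    done here for numerical stability."""
    w = np.exp(-(costs - costs.min()) / T)
    return w / w.sum()

costs = np.array([1.0, 2.0, 4.0])
p_hot = gibbs(costs, 100.0)   # high T: near-uniform, entropy dominates
p_cold = gibbs(costs, 0.01)   # low T: mass concentrates on the minimum cost
```

Sweeping T continuously from high to low values is the schedule underlying deterministic annealing.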

# Clustering: The problem of sorting data into groups without using training samples. This requires a definition of "similarity" between data points and adequate optimization procedures.
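To make the clustering setting concrete, here is a minimal sketch (an illustration, not one of the algorithms covered in the course) that takes squared Euclidean distance as the dissimilarity and alternates assignment and centroid updates, k-means style:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal k-means: dissimilarity is squared Euclidean distance.
    Alternates an assignment step and a centroid step. Empty clusters
    are not handled; a robust implementation would reinitialize them."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = dist.argmin(axis=1)                      # assignment step
        centers = np.stack([X[labels == j].mean(axis=0)   # centroid step
                            for j in range(k)])
    return labels, centers

# Two well-separated synthetic blobs (assumed data, for illustration only)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.1, (10, 2)), rng.normal(10, 0.1, (10, 2))])
labels, centers = kmeans(X, 2)
```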

# Model selection: We have already discussed how to fit a model to a data set in ML I, which usually involved adjusting the parameters of a given type of model. Model selection refers to the question of how complex the chosen model should be. As we already know, both simple and complex models have their advantages and drawbacks.
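A toy sketch of the model-selection question (an illustration under assumed synthetic data, not part of the course materials): fit polynomials of increasing degree and choose the complexity by held-out validation error rather than training error:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 60)
y = np.sin(3 * x) + rng.normal(0, 0.1, 60)   # unknown truth plus noise
x_tr, y_tr = x[:40], y[:40]                  # training split
x_va, y_va = x[40:], y[40:]                  # held-out validation split

def val_error(degree):
    """Fit a polynomial of the given degree on the training split and
    return its mean squared error on the validation split."""
    coef = np.polyfit(x_tr, y_tr, degree)
    return np.mean((np.polyval(coef, x_va) - y_va) ** 2)

errors = {d: val_error(d) for d in range(1, 13)}
best_degree = min(errors, key=errors.get)    # complexity chosen by validation
```

A too-simple (degree-1) model underfits while very high degrees tend to fit the noise; typically the validation error is smallest at an intermediate degree.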

# Statistical physics models: approximate optimization approaches for large systems, which originate in statistical physics (free energy minimization applied to spin glasses and other models); sampling methods based on these models.
Lecture notes: A draft of lecture notes will be provided;
transparencies of the lectures will be made available.
Literature: Hastie, Tibshirani, Friedman: The Elements of Statistical Learning. Springer, 2001.

L. Devroye, L. Györfi, and G. Lugosi: A Probabilistic Theory of Pattern Recognition. Springer, New York, 1996.
Prerequisites / Notice: Requirements:

* knowledge of the Machine Learning course
* basic knowledge of statistics, interest in statistical methods

It is recommended to take Introduction to Machine Learning (ML I) first, but with a little extra effort Statistical Learning Theory can be followed without the introductory course.
Number: 252-0945-08L
Title: Doctoral Seminar Machine Learning (FS19) (registration restricted)
ECTS: 2 credits
Hours: 2S
Lecturers: J. M. Buhmann, T. Hofmann, A. Krause, G. Rätsch

Only for Computer Science Ph.D. students.

This doctoral seminar is intended for PhD students affiliated with the Institute for Machine Learning. Other PhD students who work on machine learning projects or related topics need approval by at least one of the organizers to register for the seminar.
Abstract: An essential aspect of any research project is dissemination of the findings arising from the study. Here we focus on oral communication, which includes: appropriate selection of material, preparation of the visual aids (slides and/or posters), and presentation skills.
Objective: The seminar participants should learn how to prepare and deliver scientific talks as well as to deal with technical questions. Participants are also expected to actively contribute to discussions during presentations by others, thus learning and practicing critical thinking skills.
Prerequisites / Notice: This doctoral seminar is intended for PhD students affiliated with the Institute for Machine Learning. Other PhD students who work on machine learning projects or related topics need approval by at least one of the organizers to register for the seminar.
Number: 401-5680-00L
Title: Foundations of Data Science Seminar
ECTS: 0 credits
Lecturers: P. L. Bühlmann, H. Bölcskei, J. M. Buhmann, T. Hofmann, A. Krause, A. Lapidoth, H.-A. Loeliger, M. H. Maathuis, N. Meinshausen, G. Rätsch, S. van de Geer
Abstract: Research colloquium
Objective: