Fan Yang: Catalogue data in Spring Semester 2023
Name | Prof. Dr. Fan Yang |
Field | Computer Science |
Address | Professur für Informatik, ETH Zürich, CAB G 19.1, Universitätstrasse 6, 8092 Zürich, SWITZERLAND |
E-mail | fan.yang@inf.ethz.ch |
Department | Computer Science |
Relationship | Assistant Professor (Tenure Track) |
Number | Title | ECTS | Hours | Lecturers
---|---|---|---|---
252-0220-00L | Introduction to Machine Learning. Preference is given to students in programmes in which the course is being offered; all other students will be waitlisted. Please do not contact Prof. Krause for any questions in this regard. If necessary, please contact studiensekretariat@inf.ethz.ch | 8 credits | 4V + 2U + 1A | A. Krause, F. Yang
Abstract | The course introduces the foundations of learning and making predictions based on data. |
Learning objective | The course will introduce the foundations of learning and making predictions from data. We will study basic concepts such as trading off goodness of fit against model complexity. We will discuss important machine learning algorithms used in practice, and provide hands-on experience in a course project. |
Content | - Linear regression (overfitting, cross-validation/bootstrap, model selection, regularization, [stochastic] gradient descent) - Linear classification: logistic regression (feature selection, sparsity, multi-class) - Kernels and the kernel trick (properties of kernels; applications to linear and logistic regression); k-nearest neighbor - Neural networks (backpropagation, regularization, convolutional neural networks) - Unsupervised learning (k-means, PCA, neural network autoencoders) - The statistical perspective (regularization as prior; loss as likelihood; learning as MAP inference) - Statistical decision theory (decision making based on statistical models and utility functions) - Discriminative vs. generative modeling (benefits and challenges in modeling joint vs. conditional distributions) - Bayes' classifiers (Naive Bayes, Gaussian Bayes; MLE) - Bayesian approaches to unsupervised learning (Gaussian mixtures, EM) |
Prerequisites / Notice | Designed to provide a basis for the following courses: - Advanced Machine Learning - Deep Learning - Probabilistic Artificial Intelligence - Seminar "Advanced Topics in Machine Learning" |
Competencies |
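As a flavour of the topics listed under Content for this course (linear regression, regularization, gradient descent), here is a minimal sketch of ridge regression fitted by full-batch gradient descent. This is an illustrative example, not course material; all names and data are made up.

```python
import numpy as np

def ridge_gd(X, y, lam=0.01, lr=0.1, steps=500):
    """Ridge regression fitted by full-batch gradient descent.

    Minimizes (1/n) * ||X w - y||^2 + lam * ||w||^2.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        # Gradient of the regularized squared-error objective.
        grad = (2.0 / n) * X.T @ (X @ w - y) + 2.0 * lam * w
        w -= lr * grad
    return w

# Toy data: y = 2*x0 - 1*x1 plus a little noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = X @ np.array([2.0, -1.0]) + 0.1 * rng.normal(size=200)

w = ridge_gd(X, y)  # recovers weights close to [2, -1]
```

The regularization term `lam * ||w||^2` shrinks the weights slightly toward zero, trading a little bias for lower variance, which is the goodness-of-fit vs. model-complexity trade-off mentioned in the learning objective.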
252-0945-16L | Doctoral Seminar Machine Learning (FS23). Only for Computer Science PhD students. This doctoral seminar is intended for PhD students affiliated with the Institute for Machine Learning. Other PhD students who work on machine learning projects or related topics need approval from at least one of the organizers to register for the seminar. | 2 credits | 1S | N. He, V. Boeva, J. M. Buhmann, R. Cotterell, T. Hofmann, A. Krause, M. Sachan, J. Vogt, F. Yang
Abstract | An essential aspect of any research project is dissemination of the findings arising from the study. Here we focus on oral communication, which includes: appropriate selection of material, preparation of the visual aids (slides and/or posters), and presentation skills. |
Learning objective | The seminar participants should learn how to prepare and deliver scientific talks as well as how to deal with technical questions. Participants are also expected to actively contribute to discussions during presentations by others, thus learning and practicing critical thinking skills. |
Prerequisites / Notice | This doctoral seminar of the Machine Learning Laboratory of ETH is intended for PhD students who work on a machine learning project, i.e., for the PhD students of the ML lab. |
263-3300-00L | Data Science Lab | 14 credits | 9P | A. Ilic, V. Boeva, R. Cotterell, J. Vogt, F. Yang
Abstract | In this class, we bring together data science applications provided by ETH researchers outside computer science and teams of computer science master's students. Two to three students will form a team working on data science/machine learning-related research topics provided by scientists in a diverse range of domains, such as astronomy, biology, and the social sciences. |
Learning objective | The goal of this class is for students to gain experience dealing with data science and machine learning applications "in the wild". Students are expected to go through the full process, from data cleaning and modeling through execution, debugging, error analysis, and quality/performance refinement. |
Prerequisites / Notice | Prerequisites: at least 8 KP must have been obtained under Data Analysis and at least 8 KP must have been obtained under Data Management and Processing. |
401-5680-00L | Foundations of Data Science Seminar | 0 credits | | P. L. Bühlmann, A. Bandeira, H. Bölcskei, S. van de Geer, F. Yang
Abstract | Research colloquium |
Learning objective |