Search result: Catalogue data in Spring Semester 2020
Mathematics Bachelor
Number | Title | Type | ECTS | Hours | Lecturers | |
---|---|---|---|---|---|
401-4944-20L | Mathematics of Data Science | W | 8 credits | 4G | A. Bandeira | |
Abstract | A mostly self-contained but fast-paced introductory master's-level course on various theoretical aspects of algorithms that aim to extract information from data. | |||||
Objective | Introduction to various mathematical aspects of Data Science. | |||||
Content | These topics lie at the intersection of (applied) mathematics with computer science, electrical engineering, statistics, and/or operations research. Each lecture will feature a couple of mathematical open problems related to Data Science. The main mathematical tools used will be probability and linear algebra, and a basic familiarity with these subjects is required. There will also be some graph theory, representation theory, and applied harmonic analysis, among others (knowledge of these tools is not assumed). The topics treated will include dimension reduction, manifold learning, sparse recovery, random matrices, approximation algorithms, community detection in graphs, and several others. | |||||
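As a concrete taste of the dimension-reduction topic listed above, here is a minimal sketch (not part of the official course materials) of a Johnson-Lindenstrauss-style random projection using only NumPy; the point count, dimensions, and seed are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: n points in ambient dimension d, projected down to k << d.
n, d, k = 100, 10_000, 500
X = rng.standard_normal((n, d))

# Gaussian projection scaled by 1/sqrt(k): squared norms are preserved in
# expectation, and the Johnson-Lindenstrauss lemma guarantees that pairwise
# distances are preserved up to a (1 +/- eps) factor with high probability
# once k is on the order of log(n) / eps^2.
P = rng.standard_normal((k, d)) / np.sqrt(k)
Y = X @ P.T

# Spot-check one pairwise distance before and after projection.
orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Y[0] - Y[1])
print(f"original distance {orig:.2f}, projected distance {proj:.2f}")
```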
Lecture notes | https://people.math.ethz.ch/~abandeira/TenLecturesFortyTwoProblems.pdf | |||||
Prerequisites / Notice | The main mathematical tools used will be probability, linear algebra (and real analysis), and a working knowledge of these subjects is required. In addition to these prerequisites, this class requires a certain degree of mathematical maturity, including abstract thinking and the ability to understand and write proofs. We encourage students who are interested in mathematical data science to take both this course and "227-0434-10L Mathematics of Information" taught by Prof. H. Bölcskei. The two courses are designed to be complementary. A. Bandeira and H. Bölcskei | |||||
252-0220-00L | Introduction to Machine Learning (limited number of participants; preference is given to students in programmes in which the course is offered, all other students will be waitlisted; please do not contact Prof. Krause about this, but contact studiensekretariat@inf.ethz.ch if necessary) | W | 8 credits | 4V + 2U + 1A | A. Krause | 
Abstract | The course introduces the foundations of learning and making predictions based on data. | |||||
Objective | The course will introduce the foundations of learning and making predictions from data. We will study basic concepts such as trading off goodness of fit against model complexity. We will discuss important machine learning algorithms used in practice, and provide hands-on experience in a course project. | |||||
Content |
- Linear regression (overfitting, cross-validation/bootstrap, model selection, regularization, [stochastic] gradient descent)
- Linear classification: logistic regression (feature selection, sparsity, multi-class)
- Kernels and the kernel trick (properties of kernels; applications to linear and logistic regression); k-nearest neighbor
- Neural networks (backpropagation, regularization, convolutional neural networks)
- Unsupervised learning (k-means, PCA, neural network autoencoders)
- The statistical perspective (regularization as prior; loss as likelihood; learning as MAP inference)
- Statistical decision theory (decision making based on statistical models and utility functions)
- Discriminative vs. generative modeling (benefits and challenges in modeling joint vs. conditional distributions)
- Bayes' classifiers (Naive Bayes, Gaussian Bayes; MLE)
- Bayesian approaches to unsupervised learning (Gaussian mixtures, EM)
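To make the first bullet above concrete, the following is a minimal, self-contained sketch (not the course's own code) of ridge-regularized linear regression fitted by plain gradient descent; the data, step size, and regularization strength are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: y = X @ w_true + noise.
n, d = 200, 5
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.1 * rng.standard_normal(n)

# Ridge objective: (1/n) * ||Xw - y||^2 + lam * ||w||^2,
# minimized here by plain (full-batch) gradient descent.
lam, lr = 0.1, 0.05
w = np.zeros(d)
for _ in range(500):
    grad = (2 / n) * X.T @ (X @ w - y) + 2 * lam * w
    w -= lr * grad

print("recovered:", np.round(w, 2))
print("true:     ", np.round(w_true, 2))
```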
Literature | Textbook: Kevin Murphy, Machine Learning: A Probabilistic Perspective, MIT Press | |||||
Prerequisites / Notice | Designed to provide a basis for the following courses:
- Advanced Machine Learning
- Deep Learning
- Probabilistic Artificial Intelligence
- Seminar "Advanced Topics in Machine Learning"
263-5300-00L | Guarantees for Machine Learning | W | 5 credits | 2V + 2A | F. Yang | 
Abstract | This course teaches classical and recent methods in statistics and optimization commonly used to prove theoretical guarantees for machine learning algorithms. The knowledge is then applied in project work that focuses on understanding phenomena in modern machine learning. | |||||
Objective | This course is aimed at advanced master's and doctoral students who want to understand and/or conduct independent research on theory for modern machine learning. For this purpose, students will learn common mathematical techniques from statistical learning theory. In independent project work, they then apply their knowledge and go through the process of critically questioning recently published work, finding relevant research questions, and learning how to effectively present research ideas to a professional audience. | |||||
Content | This course teaches some classical and recent methods in statistical learning theory aimed at proving theoretical guarantees for machine learning algorithms, including topics in
- concentration bounds, uniform convergence
- high-dimensional statistics (e.g. Lasso)
- prediction error bounds for non-parametric statistics (e.g. in kernel spaces)
- minimax lower bounds
- regularization via optimization
The project work focuses on active theoretical ML research that aims to understand modern phenomena in machine learning, including but not limited to
- how overparameterization could help generalization (interpolating models, linearized NNs)
- how overparameterization could help optimization (non-convex optimization, loss landscape)
- complexity measures and approximation-theoretic properties of randomly initialized and trained NNs
- generalization of robust learning (adversarial robustness, standard and robust error tradeoff)
- prediction with calibrated confidence (conformal prediction, calibration)
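As a small illustration of the first topic above (concentration bounds), here is a sketch, not taken from the course, that checks Hoeffding's inequality by simulation with NumPy; the sample size, threshold, and trial count are arbitrary assumed choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hoeffding: for X_1, ..., X_n i.i.d. in [0, 1],
#   P(|mean - E[X]| >= t) <= 2 * exp(-2 * n * t^2).
n, t, trials = 100, 0.1, 100_000
X = rng.uniform(0.0, 1.0, size=(trials, n))   # uniform on [0, 1], so E[X] = 0.5
deviations = np.abs(X.mean(axis=1) - 0.5)

empirical = (deviations >= t).mean()          # empirical tail probability
bound = 2 * np.exp(-2 * n * t**2)             # Hoeffding upper bound
print(f"empirical tail {empirical:.4f} <= Hoeffding bound {bound:.4f}")
```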
Prerequisites / Notice | Students must have a strong mathematical background (basic real analysis, probability theory, linear algebra) and good knowledge of core machine learning concepts as taught in courses such as "Introduction to Machine Learning" and "Regression"/"Statistical Modelling". It is also helpful to have taken a course in optimization or approximation theory. In addition to these prerequisites, this class requires a certain degree of mathematical maturity, including abstract thinking and the ability to understand and write proofs. | |||||
401-3502-20L | Reading Course (to start an individual reading course, contact an authorised supervisor and register the reading course in myStudies) | W | 2 credits | 4A | Supervisors | 
Abstract | For this Reading Course, proactive students make an individual agreement with a lecturer to acquire knowledge through independent literature study. | |||||
Objective | ||||||
401-3503-20L | Reading Course (to start an individual reading course, contact an authorised supervisor and register the reading course in myStudies) | W | 3 credits | 6A | Supervisors | 
Abstract | For this Reading Course, proactive students make an individual agreement with a lecturer to acquire knowledge through independent literature study. | |||||
Objective | ||||||
401-3504-20L | Reading Course (to start an individual reading course, contact an authorised supervisor and register the reading course in myStudies) | W | 4 credits | 9A | Supervisors | 
Abstract | For this Reading Course, proactive students make an individual agreement with a lecturer to acquire knowledge through independent literature study. | |||||
Objective |