Search results: Catalogue data in Autumn Semester 2019
Statistics Master The courses listed here belong to the curriculum of the Master's programme in Statistics. The corresponding credits do not count as mobility credits, even if certain course units cannot be taken at ETH Zurich. | ||||||
Number | Title | Type | ECTS | Hours | Lecturers | 
---|---|---|---|---|---|---|
401-3601-00L | Probability Theory At most one of the three Bachelor core courses 401-3461-00L Functional Analysis I, 401-3531-00L Differential Geometry I and 401-3601-00L Probability Theory can be credited in the Master's programme in Mathematics. | W | 10 credits | 4V + 1U | A.‑S. Sznitman | 
Abstract | Basics of probability theory and the theory of stochastic processes in discrete time | |||||
Learning objective | This course presents the basics of probability theory and the theory of stochastic processes in discrete time. The following topics are planned: basics in measure theory, random series, law of large numbers, weak convergence, characteristic functions, central limit theorem, conditional expectation, martingales, convergence theorems for martingales, Galton-Watson chain, transition probability, theorem of Ionescu Tulcea, Markov chains. | |||||
Content | This course presents the basics of probability theory and the theory of stochastic processes in discrete time. The following topics are planned: basics in measure theory, random series, law of large numbers, weak convergence, characteristic functions, central limit theorem, conditional expectation, martingales, convergence theorems for martingales, Galton-Watson chain, transition probability, theorem of Ionescu Tulcea, Markov chains. | |||||
Lecture notes | Available; will be sold in the course. | |||||
Literature | R. Durrett, Probability: Theory and Examples, Duxbury Press 1996. H. Bauer, Probability Theory, de Gruyter 1996. J. Jacod and P. Protter, Probability Essentials, Springer 2004. A. Klenke, Wahrscheinlichkeitstheorie, Springer 2006. D. Williams, Probability with Martingales, Cambridge University Press 1991. | |||||
401-3627-00L | High-Dimensional Statistics | W | 4 credits | 2V | P. L. Bühlmann | 
Abstract | "High-Dimensional Statistics" deals with modern methods and theory for statistical inference when the number of unknown parameters is of much larger order than the sample size. Statistical estimation and algorithms for complex models and aspects of multiple testing will be discussed. | |||||
Learning objective | Knowledge of methods and basic theory for high-dimensional statistical inference | |||||
Content | Lasso and group Lasso for high-dimensional linear and generalized linear models; additive models and many smooth univariate functions; non-convex loss functions and l1-regularization; stability selection, multiple testing and construction of p-values; undirected graphical modelling | |||||
Literature | Peter Bühlmann and Sara van de Geer (2011). Statistics for High-Dimensional Data: Methods, Theory and Applications. Springer. ISBN 978-3-642-20191-2. | |||||
Prerequisites / Notice | Knowledge of basic concepts in probability theory and intermediate knowledge of statistics (e.g. a course in linear models or computational statistics). | |||||
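As a toy illustration of the Lasso listed among the course topics above: the l1 penalty shrinks small coefficients exactly to zero. In the special case of an orthonormal design the solution is available in closed form as coordinate-wise soft-thresholding of the least-squares estimate. The sketch below (plain Python; function names are ours) shows only this special case, not the general coordinate-descent solver used in practice.

```python
# Lasso with an orthonormal design: when X^T X = I, the Lasso solution
# decouples into coordinate-wise soft-thresholding of the least-squares
# coefficients: beta_j = sign(b_j) * max(|b_j| - lam, 0).

def soft_threshold(b, lam):
    """Soft-thresholding operator, the building block of Lasso solvers."""
    if b > lam:
        return b - lam
    if b < -lam:
        return b + lam
    return 0.0

def lasso_orthonormal(beta_ls, lam):
    """Lasso solution when the design matrix is orthonormal."""
    return [soft_threshold(b, lam) for b in beta_ls]

# Small coefficients are set exactly to zero, which yields sparsity.
print(lasso_orthonormal([3.0, -0.5, 1.5], 1.0))  # [2.0, 0.0, 0.5]
```

This closed form is also why the Lasso, unlike ridge regression, performs variable selection: an entire coordinate is dropped once its least-squares estimate falls below the penalty level.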
401-3612-00L | Stochastic Simulation Does not take place this semester. | W | 5 credits | 3G | ||
Abstract | This course provides an introduction to statistical Monte Carlo methods. This includes applications of simulations in various fields (Bayesian statistics, statistical mechanics, operations research, financial mathematics), algorithms for the generation of random variables (accept-reject, importance sampling), estimating the precision, variance reduction, and an introduction to Markov chain Monte Carlo. | |||||
Learning objective | Stochastic simulation (also called the Monte Carlo method) is the experimental analysis of a stochastic model by implementing it on a computer. Probabilities and expected values can be approximated by averaging simulated values, and the central limit theorem gives an estimate of the error of this approximation. The course shows examples of the many applications of stochastic simulation and explains different algorithms used for simulation. These algorithms are illustrated with the statistical software R. | |||||
Content | Examples of simulations in different fields (computer science, statistics, statistical mechanics, operations research, financial mathematics). Generation of uniform random variables. Generation of random variables with arbitrary distributions (quantile transform, accept-reject, importance sampling), simulation of Gaussian processes and diffusions. The precision of simulations, methods for variance reduction. Introduction to Markov chains and Markov chain Monte Carlo (Metropolis-Hastings, Gibbs sampler, Hamiltonian Monte Carlo, reversible jump MCMC). | |||||
Lecture notes | A script will be available in English. | |||||
Literature | P. Glasserman, Monte Carlo Methods in Financial Engineering. Springer 2004. B. D. Ripley, Stochastic Simulation. Wiley, 1987. Ch. Robert and G. Casella, Monte Carlo Statistical Methods. Springer 2004 (2nd edition). | |||||
Prerequisites / Notice | Familiarity with basic concepts of probability theory (random variables, joint and conditional distributions, laws of large numbers and the central limit theorem) will be assumed. | |||||
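The course illustrates its algorithms with R; purely as an illustration of the accept-reject method listed above, here is a minimal sketch in Python (the function names and the Beta(2, 2) example are ours, not course material). A proposal sample is accepted with probability f(y)/(c·g(y)), where c bounds the density ratio f/g.

```python
import random

def accept_reject(target_pdf, proposal_sample, proposal_pdf, c, n):
    """Accept-reject sampling: draw Y ~ proposal, accept Y with
    probability f(Y) / (c * g(Y)); accepted draws follow the target."""
    samples = []
    while len(samples) < n:
        y = proposal_sample()
        if random.random() <= target_pdf(y) / (c * proposal_pdf(y)):
            samples.append(y)
    return samples

# Example: sample from Beta(2, 2), density f(x) = 6x(1-x) on [0, 1],
# using Uniform(0, 1) as the proposal; f/g is bounded by c = 1.5.
random.seed(1)
draws = accept_reject(lambda x: 6 * x * (1 - x),
                      random.random,
                      lambda x: 1.0,
                      1.5, 10000)
print(sum(draws) / len(draws))  # close to the Beta(2, 2) mean of 0.5
```

The expected acceptance rate is 1/c, so a tight bound c keeps the sampler efficient; this is exactly the trade-off the course discusses under "precision of simulations".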
401-4619-67L | Advanced Topics in Computational Statistics Does not take place this semester. | W | 4 credits | 2V | not specified | 
Abstract | This lecture covers selected advanced topics in computational statistics. This year the focus will be on graphical modelling. | |||||
Learning objective | Students learn the theoretical foundations of the selected methods, as well as practical skills to apply these methods and to interpret their outcomes. | |||||
Content | The main focus will be on graphical models in various forms: Markov properties of undirected graphs; belief propagation; hidden Markov models; structure estimation and parameter estimation; inference for high-dimensional data; causal graphical models | |||||
Prerequisites / Notice | We assume a solid background in mathematics, an introductory lecture in probability and statistics, and at least one more advanced course in statistics. | |||||
401-4633-00L | Data Analytics in Organisations and Business | W | 5 credits | 2V + 1U | I. Flückiger | 
Abstract | Covers the end-to-end process of data analytics in organisations and business, and how to transform data into insights for fact-based decisions. The process is presented from framing the business problem through to presenting the results and making decisions by the use of data analytics. For each topic, case studies from the financial services, healthcare and retail sectors will be presented. | |||||
Learning objective | The goal of this course is to give the students an understanding of the data analytics process in the business world, with special focus on the skills and techniques used besides the technical skills. The students will become familiar with the "business language", current problems and thinking in organisations and business, and the tools used. | |||||
Content | Framing the business problem; framing the analytics problem; data; methodology; model building; deployment; model lifecycle; soft skills for the statistical/mathematical professional | |||||
Lecture notes | Lecture notes will be available. | |||||
Prerequisites / Notice | Prerequisites: basic statistics, probability theory and regression | |||||
401-6217-00L | Using R for Data Analysis and Graphics (Part II) | W | 1.5 credits | 1G | M. Mächler | 
Abstract | The course provides the second part of an introduction to the statistical software R for scientists. Topics are data generation and selection, graphical functions, important statistical functions, types of objects, models, programming and writing functions. Note: this part builds on "Using R... (Part I)" but can be taken independently if the basics of R are already known. | |||||
Learning objective | The students will be able to use the software R efficiently for data analysis, graphics and simple programming. | |||||
Content | The course provides the second part of an introduction to the statistical software R (https://www.r-project.org/) for scientists. R is free software that contains a huge collection of functions with a focus on statistics and graphics. To use R, one has to learn the programming language R, at least at a rudimentary level, and the course aims to facilitate this by providing a basic introduction. Part II of the course builds on Part I and covers the following additional topics: elements of the R language (control structures: if, else, loops; lists; overview of R objects; attributes of R objects); more on R functions; applying functions to elements of vectors, matrices and lists; object-oriented programming with R (classes and methods); tailoring R (options); extending basic R (packages). The course focuses on practical work at the computer. We will make use of the graphical user interface RStudio: www.rstudio.org | |||||
Lecture notes | An Introduction to R: http://stat.ethz.ch/CRAN/doc/contrib/Lam-IntroductionToR_LHL.pdf | |||||
Prerequisites / Notice | Basic knowledge of R equivalent to "Using R... (Part I)" (= 401-6215-00L) is a prerequisite for this course. The course resources will be provided via the Moodle web learning platform. As of FS 2019, subscribing via myStudies should *automatically* make you a student participant of the Moodle course of this lecture, which is at https://moodle-app2.let.ethz.ch/course/view.php?id=11399 | |||||
401-0627-00L | Smoothing and Nonparametric Regression with Examples | W | 4 credits | 2G | S. Beran-Ghosh | 
Abstract | Starting with an overview of selected results from parametric inference, kernel smoothing will be introduced along with some asymptotic theory, optimal bandwidth selection, data-driven algorithms and some special topics. Examples from environmental research will be used for motivation, but the methods are also applicable elsewhere. | |||||
Learning objective | The students will learn about methods of kernel smoothing and the application of these concepts to data. The aim is to build sufficient interest in the topic and intuition, as well as the ability to apply the methods to various different datasets. | |||||
Content | Rough outline: parametric estimation methods, a selection of important results (maximum likelihood; method of least squares, regression and diagnostics); nonparametric curve estimation (density estimation, kernel regression, local polynomials, bandwidth selection; a selection of special topics, as time permits, such as rapid change points, mode estimation, robust smoothing, partial linear models, etc.); applications (potential areas of application will be discussed, such as change assessment, trend and surface estimation, probability and quantile curve estimation, and others). | |||||
Lecture notes | Brief summaries or outlines of some of the lecture material will be posted at https://www.wsl.ch/en/employees/ghosh.html. Note: the posted notes will tend to be just sketches; only the in-class lessons will contain complete information. Log-in: in order to have access to the posted notes, you will need the course user id and password, which will be given out on the first day of the lectures. | |||||
Literature | References: Statistical Inference, by S.D. Silvey, Chapman & Hall. Regression Analysis: Theory, Methods and Applications, by A. Sen and M. Srivastava, Springer. Density Estimation, by B.W. Silverman, Chapman and Hall. Kernel Smoothing, by M.P. Wand and M.C. Jones, Chapman and Hall. Local Polynomial Modelling and Its Applications, by J. Fan and I. Gijbels, Chapman & Hall. Nonparametric Simple Regression, by J. Fox, Sage Publications. Applied Smoothing Techniques for Data Analysis: The Kernel Approach with S-Plus Illustrations, by A.W. Bowman and A. Azzalini, Oxford University Press. Kernel Smoothing: Principles, Methods and Applications, by S. Ghosh, Wiley. Additional references will be given out in the lectures. | |||||
Prerequisites / Notice | Prerequisites: a background in linear algebra, calculus, probability and statistical inference, including estimation and testing. | |||||
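As a toy illustration of the kernel regression idea above (a sketch in plain Python with our own names and data, not course material): the Nadaraya-Watson estimator is a locally weighted average of the responses, with weights given by a kernel applied to the distance from the evaluation point, scaled by a bandwidth h.

```python
import math

def nadaraya_watson(x0, xs, ys, h):
    """Nadaraya-Watson estimator: a locally weighted average of the ys,
    with Gaussian kernel weights exp(-((x0 - x_i)/h)^2 / 2)."""
    weights = [math.exp(-0.5 * ((x0 - x) / h) ** 2) for x in xs]
    return sum(w * y for w, y in zip(weights, ys)) / sum(weights)

# Observations of y = x^2 on a grid; the estimate at x0 = 0.5 is a
# smoothed average of the nearby responses.
xs = [i / 10 for i in range(11)]
ys = [x ** 2 for x in xs]
print(round(nadaraya_watson(0.5, xs, ys, 0.1), 3))
```

The bandwidth h controls the bias-variance trade-off that optimal bandwidth selection addresses: a small h uses only very local points (low bias, high variance), a large h averages over almost the whole sample (high bias, low variance).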
447-6221-00L | Nichtparametrische Regression Does not take place this semester. Students of the University of Zurich (UZH) enrolled in the UZH Master's programme in Biostatistics cannot register for this course unit directly in myStudies. Forward the lecturer's written permission to participate to the Registrar's Office; a direct e-mail from the lecturer to kanzlei@ethz.ch is also accepted as consent. The Registrar's Office will then register you for the course. | W | 1 credit | 1G | ||
Abstract | The focus is on nonparametric estimation of probability densities and regression functions. These newer methods dispense with restrictive model assumptions such as "linear function". They require a weight function and a smoothing parameter. The emphasis is on one dimension; several dimensions and samples of curves are treated briefly. Exercises at the computer. | |||||
Learning objective | Knowledge of the estimation of probability densities and regression functions by means of various statistical methods. Understanding of the choice of the weight function and of the smoothing parameter, including automatic choices. Practical application to data sets at the computer. | |||||
447-6233-00L | Spatial Statistics Does not take place this semester. Students of the University of Zurich (UZH) enrolled in the UZH Master's programme in Biostatistics cannot register for this course unit directly in myStudies. Forward the lecturer's written permission to participate to the Registrar's Office; a direct e-mail from the lecturer to kanzlei@ethz.ch is also accepted as consent. The Registrar's Office will then register you for the course. | W | 1 credit | 1G | ||
Abstract | In many research fields, spatially referenced data are collected. When analysing such data, the focus is either on exploring their structure (dependence on explanatory variables, autocorrelation) and/or on spatial prediction. The course provides an introduction to geostatistical methods that are useful for such purposes. | |||||
Learning objective | The course will provide an overview of the basic concepts and stochastic models that are commonly used to model spatial data. In addition, the participants will learn a number of geostatistical techniques and acquire some familiarity with software that is useful for analysing spatial data. | |||||
Content | After an introductory discussion of the types of problems and the kinds of data that arise in environmental research, an introduction to linear geostatistics will be taught (models: stationary and intrinsic random processes, modelling large-scale spatial patterns by regression, modelling autocorrelation by the variogram; kriging: mean-square prediction of spatial data). The lectures will be complemented by data analyses that the participants carry out themselves. | |||||
Literature | P.J. Diggle & P.J. Ribeiro Jr. 2007. Model-based Geostatistics. Springer. | |||||
447-6245-00L | Data-Mining Does not take place this semester. Students of the University of Zurich (UZH) enrolled in the UZH Master's programme in Biostatistics cannot register for this course unit directly in myStudies. Forward the lecturer's written permission to participate to the Registrar's Office; a direct e-mail from the lecturer to kanzlei@ethz.ch is also accepted as consent. The Registrar's Office will then register you for the course. | W | 1 credit | 1G | ||
Abstract | Block course on "prediction problems", i.e. "supervised learning". Part 1, classification: logistic regression, linear/quadratic discriminant analysis, Bayes classifier; additive and tree models, further flexible ("nonparametric") methods. Part 2, flexible prediction: additive models, MARS, Y-transformation models (ACE, AVAS); projection pursuit regression (PPR), neural networks. | |||||
Learning objective | ||||||
Content | From the wide field of "data mining", this block course covers only so-called "prediction problems", i.e. "supervised learning". Part 1, classification, revisits logistic regression and linear/quadratic discriminant analysis (LDA/QDA) and extends them (within the framework of the "Bayes classifier") to (generalized) additive models ("GAM") and tree models ("CART"), and briefly to further flexible ("nonparametric") methods. Part 2, flexible prediction (continuous or class-valued response variable), comprises additive models, MARS, Y-transformation models (ACE, AVAS), projection pursuit regression (PPR) and neural networks. | |||||
Lecture notes | The course is based on the lecture notes. | |||||
Prerequisites / Notice | The exercises are carried out exclusively with the free, open-source software "R" (http://www.r-project.org), in which a short hands-on exercise at the end also serves as the final examination. | |||||
447-6257-00L | Wiederholte Messungen Does not take place this semester. Students of the University of Zurich (UZH) enrolled in the UZH Master's programme in Biostatistics cannot register for this course unit directly in myStudies. Forward the lecturer's written permission to participate to the Registrar's Office; a direct e-mail from the lecturer to kanzlei@ethz.ch is also accepted as consent. The Registrar's Office will then register you for the course. | W | 1 credit | 1G | ||
Abstract | Origin and structure of repeated measurements. Design and conduct of corresponding studies. Within- and between-subjects factors. Common covariance structures. Statistical methods of analysis: graphical displays, summary statistics approach, univariate and multivariate analysis of variance, linear mixed models. | |||||
Learning objective | Ability to recognize repeated measurements and to analyse them with adequate statistical methods. Correct handling of pseudoreplicates. | |||||
Lecture notes | Lecture notes will be handed out. | |||||
447-6191-00L | Statistical Analysis of Financial Data Does not take place this semester. Students of the University of Zurich (UZH) enrolled in the UZH Master's programme in Biostatistics cannot register for this course unit directly in myStudies. Forward the lecturer's written permission to participate to the Registrar's Office; a direct e-mail from the lecturer to kanzlei@ethz.ch is also accepted as consent. The Registrar's Office will then register you for the course. | W | 2 credits | 1G | ||
Abstract | Distributions for financial data. Volatility models: ARCH and GARCH models. Value at risk and expected shortfall. Portfolio theory: minimum-variance portfolio, efficient frontier, Sharpe ratio. Factor models: capital asset pricing model, macroeconomic factor models, fundamental factor models. Copulas: basic theory, Gaussian and t-copulas, Archimedean copulas, calibration of copulas. | |||||
Learning objective | Getting to know the typical properties of financial data and appropriate statistical models, including the corresponding functions in R. | |||||
447-6289-00L | Stichproben-Erhebungen Does not take place this semester. Students of the University of Zurich (UZH) enrolled in the UZH Master's programme in Biostatistics cannot register for this course unit directly in myStudies. Forward the lecturer's written permission to participate to the Registrar's Office; a direct e-mail from the lecturer to kanzlei@ethz.ch is also accepted as consent. The Registrar's Office will then register you for the course. | W | 2 credits | 1G | ||
Abstract | The elements of a sample survey are explained. The most important classical sampling designs (simple and stratified random sampling) with their estimators, estimation methods using auxiliary information, and the Horvitz-Thompson estimator are introduced. Data preparation, non-response and its treatment, variance estimation and analyses of survey data are discussed. | |||||
Learning objective | Knowledge of the elements and the course of a sample survey. Understanding of the random sampling paradigm. Knowledge of simple and stratified sampling strategies and the ability to apply the corresponding methods. Knowledge of more advanced methods for estimation, data preparation and analysis. | |||||
401-3628-14L | Bayesian Statistics | W | 4 credits | 2V | F. Sigrist | 
Abstract | Introduction to the Bayesian approach to statistics: decision theory, prior distributions, hierarchical Bayes models, empirical Bayes, Bayesian tests and model selection, Laplace approximation, Monte Carlo and Markov chain Monte Carlo methods. | |||||
Learning objective | Students understand the conceptual ideas behind Bayesian statistics and are familiar with common techniques used in Bayesian data analysis. | |||||
Content | Topics that we will discuss are: differences between the frequentist and Bayesian approach (decision theory, principles); priors (conjugate priors, noninformative priors, Jeffreys prior); tests and model selection (Bayes factors, hyper-g priors for regression); hierarchical models and empirical Bayes methods; computational methods (Laplace approximation, Monte Carlo and Markov chain Monte Carlo methods). | |||||
Lecture notes | A script will be available in English. | |||||
Literature | Christian Robert, The Bayesian Choice, 2nd edition, Springer 2007. A. Gelman et al., Bayesian Data Analysis, 3rd edition, Chapman & Hall 2013. Additional references will be given in the course. | |||||
Prerequisites / Notice | Familiarity with basic concepts of frequentist statistics and with basic concepts of probability theory (random variables, joint and conditional distributions, laws of large numbers and the central limit theorem) will be assumed. | |||||
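To make the notion of a conjugate prior listed above concrete (a minimal sketch with our own choice of example, not course material): with a Beta prior for a Bernoulli success probability, the posterior is again a Beta distribution, so the Bayesian update is pure bookkeeping and needs no numerical integration.

```python
# Conjugate Bayesian updating: with a Beta(a, b) prior on a success
# probability p and k successes observed in n Bernoulli trials, the
# posterior is Beta(a + k, b + n - k).

def beta_binomial_update(a, b, k, n):
    """Return the posterior Beta parameters and the posterior mean."""
    a_post, b_post = a + k, b + n - k
    return a_post, b_post, a_post / (a_post + b_post)

# Uniform prior Beta(1, 1), then observe 7 successes in 10 trials:
# the posterior is Beta(8, 4) with mean 8/12 = 2/3.
print(beta_binomial_update(1, 1, 7, 10))
```

The posterior mean interpolates between the prior mean and the sample proportion, with weights determined by the prior "pseudo-counts" a and b; this is the same mechanism that empirical Bayes methods exploit when estimating the prior from data.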
447-6273-00L | Bayes-Methoden Does not take place this semester. Students of the University of Zurich (UZH) enrolled in the UZH Master's programme in Biostatistics cannot register for this course unit directly in myStudies. Forward the lecturer's written permission to participate to the Registrar's Office; a direct e-mail from the lecturer to kanzlei@ethz.ch is also accepted as consent. The Registrar's Office will then register you for the course. | W | 2 credits | 2G | ||
Abstract | Conditional probability; Bayesian inference (conjugate distributions, HPD regions, linear and empirical methods); determination of the posterior distribution by simulation (Markov chain Monte Carlo with R2WinBUGS); introduction to multilevel hierarchical models. | |||||
Learning objective | ||||||
Content | Bayesian statistics is attractive because it makes it possible to take decisions under uncertainty where classical frequentist statistics fails. The course provides an introduction to Bayesian statistics; it is only moderately demanding mathematically, but it requires a certain change of perspective that should not be underestimated. | |||||
Literature | Gelman A., Carlin J.B., Stern H.S. and Rubin D.B., Bayesian Data Analysis, Chapman and Hall, 2nd edition, 2004. Kruschke J.K., Doing Bayesian Data Analysis, Elsevier, 2011. | |||||
Prerequisites / Notice | Prerequisites: basic knowledge of statistics; knowledge of R. | |||||
401-3913-01L | Mathematical Foundations for Finance | W | 4 credits | 3V + 2U | E. W. Farkas | 
Abstract | First introduction to the main modelling ideas and mathematical tools from mathematical finance | |||||
Learning objective | This course gives a first introduction to the main modelling ideas and mathematical tools from mathematical finance. It mainly aims at non-mathematicians who need an introduction to the main tools from stochastics used in mathematical finance. However, mathematicians who want to learn some basic modelling ideas and concepts for quantitative finance (before continuing with a more advanced course) may also find this of interest. The main emphasis will be on ideas, but important results will be given with (sometimes partial) proofs. | |||||
Content | Topics to be covered include: financial market models in finite discrete time; absence of arbitrage and martingale measures; valuation and hedging in complete markets; basics about Brownian motion; stochastic integration; stochastic calculus (Itô's formula, Girsanov transformation, Itô's representation theorem); the Black-Scholes formula. | |||||
Lecture notes | Lecture notes will be sold at the beginning of the course. | |||||
Literature | Lecture notes will be sold at the beginning of the course. Additional (background) references are given there. | |||||
Prerequisites / Notice | Prerequisites: results and facts from probability theory as in the book "Probability Essentials" by J. Jacod and P. Protter will be used freely. Especially participants without a direct mathematics background are strongly advised to familiarise themselves with those tools before (or very quickly during) the course. (A possible alternative to the above English textbook are the German lecture notes for the standard course "Wahrscheinlichkeitstheorie".) For those who are not sure about their background, we suggest looking at the exercises in Chapters 8, 9, 22-25 and 28 of the Jacod/Protter book. If these pose problems, you will have a hard time during the course, so be prepared. | |||||
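A minimal numerical companion to "financial market models in finite discrete time" (a sketch; the parameter values are our own example, not course material): in a one-period binomial model, absence of arbitrage yields a unique risk-neutral probability, and every claim can be both priced and replicated.

```python
# One-period binomial model: the stock moves from S0 to S0*u or S0*d.
# Absence of arbitrage (d < 1 + r < u) gives the unique risk-neutral
# probability q = ((1 + r) - d) / (u - d); a claim's price is its
# discounted expectation under q, and delta shares replicate it.

def binomial_price(S0, u, d, r, payoff):
    """Arbitrage-free price and replicating stock position of a claim."""
    q = ((1 + r) - d) / (u - d)            # risk-neutral probability
    H_u, H_d = payoff(S0 * u), payoff(S0 * d)
    price = (q * H_u + (1 - q) * H_d) / (1 + r)
    delta = (H_u - H_d) / (S0 * (u - d))   # hedge: shares of stock held
    return price, delta

# European call, strike 100, on a stock at 100 that moves to 120 or 90,
# with interest rate 0: q = 1/3, price = 20/3, hedge = 2/3 shares.
print(binomial_price(100, 1.2, 0.9, 0.0, lambda s: max(s - 100, 0)))
```

The same discounted-expectation principle, iterated over many periods and passed to a continuous-time limit, is what connects this toy model to the Black-Scholes formula covered at the end of the course.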
401-3901-00L | Mathematical Optimization | W | 11 credits | 4V + 2U | R. Zenklusen | 
Abstract | Mathematical treatment of diverse optimization techniques. | |||||
Learning objective | The goal of this course is to get a thorough understanding of various classical mathematical optimization techniques, with an emphasis on polyhedral approaches. In particular, we want students to develop a good understanding of some important problem classes in the field, of structural mathematical results linked to these problems, and of solution approaches based on this structural understanding. | |||||
Content | Key topics include: linear programming and polyhedra; flows and cuts; combinatorial optimization problems and techniques; equivalence between optimization and separation; a brief introduction to integer programming. | |||||
Literature | Bernhard Korte, Jens Vygen: Combinatorial Optimization. 6th edition, Springer, 2018. Alexander Schrijver: Combinatorial Optimization: Polyhedra and Efficiency. Springer, 2003 (3 volumes). Ravindra K. Ahuja, Thomas L. Magnanti, James B. Orlin: Network Flows: Theory, Algorithms, and Applications. Prentice Hall, 1993. Alexander Schrijver: Theory of Linear and Integer Programming. John Wiley, 1986. | |||||
Prerequisites / Notice | A solid background in linear algebra. | |||||
401-3619-69L | Mathematics Tools in Machine Learning | W | 4 credits | 2G | F. Balabdaoui | 
Abstract | The course reviews many essential mathematical tools used in statistical learning. The lectures cover the notions of hypothesis classes, sample complexity, PAC learnability, model validation and selection, as well as results on several well-known algorithms and their convergence. | |||||
Learning objective | In the exploding world of artificial intelligence and automated learning, there is an urgent need to go back to the basics of what is driving many of the well-established methods in statistical learning. Students attending the lectures will get acquainted with the main theoretical results needed to establish the theory of statistical learning. We start by defining what is meant by learning a task and a training sample, and the trade-off between choosing a large class of functions (hypotheses) to learn the task and the difficulty of estimating the unknown function (generating the observed sample). The course also covers the notion of learnability and the conditions under which it is possible to learn a task. In a second part, the lectures cover algorithmic aspects, where some well-known algorithms are described and their convergence proved. Through the exercise classes, the students deepen their understanding by applying the learned theory to new situations, examples and counterexamples. | |||||
Content | The course will cover the following subjects: definition of learning and formal learning models; uniform convergence; linear predictors; the bias-complexity trade-off; VC classes and the VC dimension; model selection and validation; convex learning problems; regularization and stability; stochastic gradient descent; support vector machines; kernels. | |||||
Literature | The course will be based on the book "Understanding Machine Learning: From Theory to Algorithms" by S. Shalev-Shwartz and S. Ben-David, which is available online through the ETH electronic library. Other good sources include: the book "Neural Network Learning: Theoretical Foundations" by Martin Anthony and Peter L. Bartlett, which can be borrowed from the ETH library; and the lecture notes on "Mathematics of Machine Learning" taught by Philippe Rigollet, available through the OpenCourseWare website of MIT. | |||||
Prerequisites / Notice | Being able to follow the lectures requires a solid background in probability theory and mathematical statistics. Notions of computation and convergence of algorithms can be helpful but are not required. | |||||
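To make the "definition of learning" above concrete (a small sketch with an example of our own, not course material): empirical risk minimization (ERM) over a finite hypothesis class picks the hypothesis with the smallest average 0-1 loss on the training sample; for finite classes, uniform convergence then guarantees PAC learnability.

```python
# Empirical risk minimization over a finite hypothesis class:
# choose the hypothesis with minimal empirical 0-1 risk on the sample.

def erm(hypotheses, sample):
    """Return the hypothesis minimizing empirical 0-1 risk."""
    def emp_risk(h):
        return sum(h(x) != y for x, y in sample) / len(sample)
    return min(hypotheses, key=emp_risk)

# Finite class of threshold classifiers on the line: h_t(x) = 1 iff x >= t.
thresholds = [0.0, 0.25, 0.5, 0.75, 1.0]
hypotheses = [lambda x, t=t: int(x >= t) for t in thresholds]
sample = [(0.1, 0), (0.3, 0), (0.6, 1), (0.9, 1)]
best = erm(hypotheses, sample)
print([best(x) for x, _ in sample])  # [0, 0, 1, 1]: zero empirical risk
```

The bias-complexity trade-off listed in the course content appears here directly: enlarging the hypothesis class can only lower the empirical risk, but it loosens the uniform-convergence guarantee that the empirical risk tracks the true risk.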
252-0535-00L | Advanced Machine Learning | W | 8 credits | 3V + 2U + 2A | J. M. Buhmann | 
Abstract | Machine learning algorithms provide analytical methods to search data sets for characteristic patterns. Typical tasks include the classification of data, function fitting and clustering, with applications in image and speech analysis, bioinformatics and exploratory data analysis. This course is accompanied by practical machine learning projects. | |||||
Learning objective | Students will be familiarized with advanced concepts and algorithms for supervised and unsupervised learning, and will reinforce the statistics knowledge which is indispensable for solving modeling problems under uncertainty. Key concepts are the generalization ability of algorithms and systematic approaches to modeling and regularization. Machine learning projects will provide an opportunity to test the machine learning algorithms on real-world data. | |||||
Content | The theory of fundamental machine learning concepts is presented in the lecture and illustrated with relevant applications. Students can deepen their understanding by solving both pen-and-paper and programming exercises, where they implement and apply famous algorithms to real-world data. Topics covered in the lecture include: fundamentals (What is data? Bayesian learning; computational learning theory); supervised learning (ensembles: bagging and boosting; max-margin methods; neural networks); unsupervised learning (dimensionality reduction techniques; clustering; mixture models; non-parametric density estimation; learning dynamical systems). | |||||
Lecture notes | No lecture notes, but slides will be made available on the course webpage. | |||||
Literature | C. Bishop, Pattern Recognition and Machine Learning. Springer, 2007. R. Duda, P. Hart, and D. Stork, Pattern Classification. John Wiley & Sons, 2nd edition, 2001. T. Hastie, R. Tibshirani, and J. Friedman, The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, 2001. L. Wasserman, All of Statistics: A Concise Course in Statistical Inference. Springer, 2004. | |||||
Prerequisites / Notice | The course requires solid basic knowledge in analysis, statistics and numerical methods for CSE, as well as practical programming experience for solving assignments. Students should have followed at least "Introduction to Machine Learning" or an equivalent course offered by another institution. PhD students are required to obtain a passing grade in the course (4.0 or higher, based on project and exam) to gain credit points. | |||||
227-0423-00L | Neural Network Theory | W | 4 credits | 2V + 1U | H. Bölcskei, E. Riegler | 
Abstract | The class focuses on fundamental mathematical aspects of neural networks, with an emphasis on deep networks: universal approximation theorems, capacity of separating surfaces, generalization, reproducing kernel Hilbert spaces, support vector machines, fundamental limits of deep neural network learning, dimension measures, feature extraction with scattering networks. | |||||
Learning objective | After attending this lecture, participating in the exercise sessions, and working on the homework problem sets, students will have acquired a working knowledge of the mathematical foundations of neural networks. | |||||
Content | 1. Universal approximation with single- and multi-layer networks 2. Geometry of decision surfaces 3. Separating capacity of nonlinear decision surfaces 4. Generalization 5. Reproducing kernel Hilbert spaces, support vector machines 6. Deep neural network approximation theory: fundamental limits on compressibility of signal classes, Kolmogorov epsilon-entropy of signal classes, covering numbers, fundamental limits of deep neural network learning 7. Learning of real-valued functions: pseudo-dimension, fat-shattering dimension, Vapnik-Chervonenkis dimension 8. Scattering networks | |||||
Lecture notes | Detailed lecture notes will be provided as we go along. | |||||
Prerequisites / Notice | This course is aimed at students with a strong mathematical background in general, and in linear algebra, analysis, and probability theory in particular.