Catalogue data in Spring Semester 2021

Data Science Master
Core Courses
Data Analysis
Information and Learning
Number | Title | Type | ECTS | Hours | Lecturers
227-0434-10L Mathematics of Information | W | 8 credits | 3V + 2U + 2A | H. Bölcskei
Abstract: The class focuses on mathematical aspects of

1. Information science: Sampling theorems, frame theory, compressed sensing, sparsity, super-resolution, spectrum-blind sampling, subspace algorithms, dimensionality reduction

2. Learning theory: Approximation theory, greedy algorithms, uniform laws of large numbers, Rademacher complexity, Vapnik-Chervonenkis dimension
Objective: The aim of the class is to familiarize the students with the most commonly used mathematical theories in data science, high-dimensional data analysis, and learning theory. The class consists of the lecture, exercise sessions with homework problems, and a research project, which can be carried out either individually or in groups. The research project consists of either 1. software development for the solution of a practical signal processing or machine learning problem, or 2. the analysis of a research paper, or 3. a theoretical research problem of suitable complexity. Students are welcome to propose their own project at the beginning of the semester. The outcomes of all projects have to be presented to the entire class at the end of the semester.
Content: Mathematics of Information

1. Signal representations: Frame theory, wavelets, Gabor expansions, sampling theorems, density theorems

2. Sparsity and compressed sensing: Sparse linear models, uncertainty relations in sparse signal recovery, super-resolution, spectrum-blind sampling, subspace algorithms (ESPRIT), estimation in the high-dimensional noisy case, Lasso

3. Dimensionality reduction: Random projections, the Johnson-Lindenstrauss Lemma

Mathematics of Learning

4. Approximation theory: Nonlinear approximation theory, best M-term approximation, greedy algorithms, fundamental limits on compressibility of signal classes, Kolmogorov-Tikhomirov epsilon-entropy of signal classes, optimal compression of signal classes

5. Uniform laws of large numbers: Rademacher complexity, Vapnik-Chervonenkis dimension, classes with polynomial discrimination
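As a rough indication of the level of item 5, one commonly used definition is sketched here (notation may differ from the lecture notes): for a function class \mathcal{F} and sample points x_1, ..., x_n, the empirical Rademacher complexity is
\[
\hat{\mathcal{R}}_n(\mathcal{F}) = \mathbb{E}_{\sigma}\left[ \sup_{f \in \mathcal{F}} \frac{1}{n} \sum_{i=1}^{n} \sigma_i f(x_i) \right],
\]
where the \sigma_i are i.i.d. random signs taking the values +1 and -1 with probability 1/2 each.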
Lecture notes: Detailed lecture notes will be provided at the beginning of the semester.
Prerequisites / Notice: This course is aimed at students with a background in basic linear algebra, analysis, statistics, and probability.

We encourage students who are interested in mathematical data science to take both this course and "401-4944-20L Mathematics of Data Science" by Prof. A. Bandeira. The two courses are designed to be complementary.

H. Bölcskei and A. Bandeira
Statistics
Number | Title | Type | ECTS | Hours | Lecturers
401-3632-00L Computational Statistics | W | 8 credits | 3V + 1U | M. Mächler
Abstract: We discuss modern statistical methods for data analysis, including methods for data exploration, prediction and inference. We pay attention to algorithmic aspects, theoretical properties and practical considerations. The class is hands-on and methods are applied using the statistical programming language R.
Objective: The student obtains an overview of modern statistical methods for data analysis, including their algorithmic aspects and theoretical properties. The methods are applied using the statistical programming language R.
Content: See the class website.
Prerequisites / Notice: At least one semester of (basic) probability and statistics.

Programming experience is helpful but not required.
Data Management
Number | Title | Type | ECTS | Hours | Lecturers
261-5110-00L Optimization for Data Science | W | 10 credits | 3V + 2U + 4A | B. Gärtner, D. Steurer, N. He
Abstract: This course provides an in-depth theoretical treatment of optimization methods that are particularly relevant in data science.
Objective: Understanding the theoretical guarantees (and their limits) of relevant optimization methods used in data science. Learning general paradigms to deal with optimization problems arising in data science.
Content: This course provides an in-depth theoretical treatment of optimization methods that are particularly relevant in machine learning and data science.

In the first part of the course, we will give a brief introduction to convex optimization, with some basic motivating examples from machine learning. Then we will analyse classical and more recent first- and second-order methods for convex optimization: gradient descent, Nesterov's accelerated method, proximal and splitting algorithms, subgradient descent, stochastic gradient descent, variance-reduced methods, Newton's method, and quasi-Newton methods. The emphasis will be on analysis techniques that occur repeatedly in convergence analyses for various classes of convex functions. We will also discuss some classical and recent theoretical results for nonconvex optimization.
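To fix ideas, a minimal sketch of the most elementary method listed above, plain gradient descent on a strongly convex quadratic; the matrix, step size, and iteration count below are illustrative assumptions, not course material:
```python
import numpy as np

# Hedged sketch: gradient descent on f(x) = 0.5 * x^T A x - b^T x,
# whose gradient is A x - b.  Converges for step < 2 / lambda_max(A).
def gradient_descent(A, b, step, iters):
    x = np.zeros_like(b)
    for _ in range(iters):
        x = x - step * (A @ x - b)   # move against the gradient
    return x

A = np.array([[3.0, 0.5], [0.5, 2.0]])   # symmetric positive definite
b = np.array([1.0, -1.0])
x_star = np.linalg.solve(A, b)           # exact minimizer, for comparison
x_gd = gradient_descent(A, b, step=0.2, iters=200)
print(np.allclose(x_gd, x_star, atol=1e-6))   # True
```
The analysis techniques mentioned above make such convergence statements precise for much broader classes of convex functions.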

In the second part, we discuss convex programming relaxations as a powerful and versatile paradigm for designing efficient algorithms to solve computational problems arising in data science. We will learn about this paradigm and develop a unified perspective on it through the lens of the sum-of-squares semidefinite programming hierarchy. As applications, we discuss non-negative matrix factorization, compressed sensing and sparse linear regression, matrix completion and phase retrieval, as well as robust estimation.
Prerequisites / Notice: As background, we require material taught in the course "252-0209-00L Algorithms, Probability, and Computing". It is not necessary that participants have actually taken the course, but they should be prepared to catch up if necessary.
Core Electives
Number | Title | Type | ECTS | Hours | Lecturers
151-0566-00L Recursive Estimation | W | 4 credits | 2V + 1U | R. D'Andrea
Abstract: Estimation of the state of a dynamic system based on a model and observations in a computationally efficient way.
Objective: Learn the basic recursive estimation methods and their underlying principles.
Content: Introduction to state estimation; probability review; Bayes' theorem; Bayesian tracking; extracting estimates from probability distributions; Kalman filter; extended Kalman filter; particle filter; observer-based control and the separation principle.
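To illustrate the flavor of the recursive methods listed above, here is a hedged one-dimensional Kalman filter sketch; the random-walk model, noise variances, and measurements are illustrative assumptions, not the course's reference implementation:
```python
# Hedged sketch: scalar Kalman filter for a random-walk state observed in noise.
def kalman_1d(measurements, q=1e-3, r=0.1, x0=0.0, p0=1.0):
    x, p = x0, p0                    # state estimate and its variance
    estimates = []
    for z in measurements:
        p = p + q                    # predict: x_k = x_{k-1} + w, Var[w] = q
        k = p / (p + r)              # Kalman gain for z = x + v, Var[v] = r
        x = x + k * (z - x)          # update with the measurement residual
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

print(kalman_1d([1.1, 0.9, 1.05, 0.98])[-1])
```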
Lecture notes: Available on the course website: Link
Prerequisites / Notice: Requirements: Introductory probability theory and matrix-vector algebra.
227-0150-00L Systems-on-Chip for Data Analytics and Machine Learning | W | 6 credits | 4G | L. Benini
Previously "Energy-Efficient Parallel Computing Systems for Data Analytics".
Abstract: Systems-on-chip architectures and related design issues with a focus on machine learning and data analytics applications. It will cover multi-cores, many-cores, vector engines, GP-GPUs, application-specific processors and heterogeneous compute accelerators. Special emphasis is given to energy-efficiency issues and hardware-software techniques for power and energy minimization.
Objective: To give an in-depth understanding of the links and dependencies between architectures and their energy-efficient implementation, and to provide comprehensive exposure to state-of-the-art systems-on-chip platforms for machine learning and data analytics. Practical experience will also be gained through exercises and mini-projects (hardware and software) assigned on specific topics.
Content: The course will cover advanced system-on-chip architectures, with an in-depth view on design challenges related to advanced silicon technology and state-of-the-art system integration options (nanometer silicon technology, novel storage devices, three-dimensional integration, advanced system packaging). The emphasis will be on programmable parallel architectures with application focus on machine learning and data analytics. The main SoC architectural families will be covered: namely, multi- and many-cores, GPUs, vector accelerators, application-specific processors, and heterogeneous platforms. The course will cover the complex design choices required to achieve scalability and energy proportionality. The course will also delve into system design, touching on hardware-software tradeoffs and full-system analysis and optimization taking into account non-functional constraints and quality metrics, such as power consumption, thermal dissipation, reliability and variability. The application focus will be on machine learning both in the cloud and at the edges (near-sensor analytics).
Lecture notes: Slides will be provided to accompany lectures. Pointers to scientific literature will be given. Exercise scripts and tutorials will be provided.
Literature: John L. Hennessy, David A. Patterson, Computer Architecture: A Quantitative Approach (The Morgan Kaufmann Series in Computer Architecture and Design), 6th edition, 2017.
Prerequisites / Notice: Knowledge of digital design at the level of "Design of Digital Circuits SS12" is required.

Knowledge of basic VLSI design at the level of "VLSI I: Architectures of VLSI Circuits" is required.
227-0155-00L Machine Learning on Microcontrollers (Restricted registration) | W | 6 credits | 3G | M. Magno, L. Benini
Number of participants limited to 40.
Registration in this class requires the permission of the instructors.
Abstract: Machine learning (ML) and artificial intelligence are pervading the digital society. Today, even low-power embedded systems are incorporating ML, becoming increasingly “smart”. This lecture gives an overview of ML methods and algorithms to process and extract useful near-sensor information in end-nodes of the “internet-of-things”, using low-power microcontrollers (ARM Cortex-M; RISC-V).
Objective: Learn how to process data from sensors and how to extract useful information with low-power microprocessors using ML techniques. We will analyze data coming from real low-power sensors (accelerometers, microphones, ExG bio-signals, cameras…). The main objective is to study in detail how machine learning algorithms can be adapted to the performance constraints and limited resources of low-power microcontrollers, turning them into tiny machine learning (TinyML) algorithms.
Content: The final goal of the course is a deep understanding of machine learning and its practical implementation on single- and multi-core microcontrollers, coupled with performance and energy efficiency analysis and optimization. The main topics of the course include:

- Sensors and sensor data acquisition with low power embedded systems

- Machine learning: overview of supervised and unsupervised learning, and in particular supervised learning (decision trees, random forests, support vector machines, artificial neural networks, deep learning, and convolutional networks)

- Low-power embedded systems and their architecture. Low Power microcontrollers (ARM-Cortex M) and RISC-V-based Parallel Ultra Low Power (PULP) systems-on-chip.

- Low power smart sensor system design: hardware-software tradeoffs, analysis, and optimization. Implementation and performance evaluation of ML in battery-operated embedded systems.

The laboratory exercises will show how to address concrete design problems, like motion and gesture recognition, emotion detection, and image and sound classification, using real sensor data and real MCU boards.

Presentations from Ph.D. students and the visit to the Digital Circuits and Systems Group will introduce current research topics and international research projects.
Lecture notes: Script and exercise sheets. Books will be suggested during the course.
Prerequisites / Notice: Good experience in C language programming. Microprocessors and computer architecture. Basics of digital signal processing. Some exposure to machine learning concepts is also desirable.
227-0224-00L Stochastic Systems | W | 4 credits | 2V + 1U | to be announced
Does not take place this semester.
Abstract: Probability. Stochastic processes. Stochastic differential equations. Ito calculus. Kalman filters. Stochastic optimal control. Applications in financial engineering.
Objective: Stochastic dynamic systems. Optimal control and filtering of stochastic systems. Examples in technology and finance.
Content:
- Stochastic processes
- Stochastic calculus (Ito)
- Stochastic differential equations
- Discrete time stochastic difference equations
- Stochastic processes AR, MA, ARMA, ARMAX, GARCH
- Kalman filter
- Stochastic optimal control
- Applications in finance and engineering
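As a small illustration of the process models listed above, a hedged sketch simulating an AR(1) recursion; parameter values are arbitrary assumptions:
```python
import numpy as np

# Hedged sketch: simulate x_t = a * x_{t-1} + e_t with Gaussian innovations.
rng = np.random.default_rng(0)
a, sigma, n = 0.8, 1.0, 1000
x = np.zeros(n)
for t in range(1, n):
    x[t] = a * x[t - 1] + sigma * rng.standard_normal()

# For |a| < 1 the stationary variance is sigma^2 / (1 - a^2) (about 2.78 here).
print(x.var())
```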
Lecture notes: H. P. Geering et al., Stochastic Systems, Measurement and Control Laboratory, 2007, and handouts.
227-0420-00L Information Theory II | W | 6 credits | 4G | A. Lapidoth, S. M. Moser
Abstract: This course builds on Information Theory I. It introduces additional topics in single-user communication, connections between Information Theory and Statistics, and Network Information Theory.
Objective: The course's objective is to introduce the students to additional information measures and to equip them with the tools that are needed to conduct research in Information Theory as it relates to Communication Networks and to Statistics.
Content: Sanov's Theorem, Rényi entropy and guessing, differential entropy, maximum entropy, the Gaussian channel, the entropy-power inequality, the broadcast channel, the multiple-access channel, Slepian-Wolf coding, the Gelfand-Pinsker problem, and Fisher information.
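Two of the listed notions in their standard form (following the notation of the Cover & Thomas reference below): the differential entropy of a random vector X with density f, and the entropy-power inequality for independent X, Y in R^n,
\[
h(X) = -\int f(x) \log f(x)\, dx, \qquad e^{2h(X+Y)/n} \ge e^{2h(X)/n} + e^{2h(Y)/n}.
\]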
Lecture notes: n/a
Literature: T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd edition, Wiley, 2006.
Prerequisites / Notice: A basic introductory course on Information Theory.
227-0424-00L Model- and Learning-Based Inverse Problems in Imaging | W | 4 credits | 2V + 1P | V. Vishnevskiy
Abstract: Reconstruction is an inverse problem which estimates images from noisy measurements. Model-based reconstructions use analytical models of the imaging process and priors. Data-based methods directly approximate inversion using training data. Combining these two approaches yields physics-aware neural nets and state-of-the-art imaging accuracy (MRI, US, CT, microscopy, non-destructive imaging).
Objective: The goal of this course is to introduce the mathematical models of imaging experiments and to practice the implementation of numerical methods that solve the corresponding inverse problem. Students will learn how to improve reconstruction accuracy by introducing prior knowledge in the form of regularization models and training data. Furthermore, students will practice incorporating imaging model knowledge into deep neural networks.
Content: The course is based on the following fundamental fields: (i) numerical linear algebra, (ii) mathematical statistics and learning theory, (iii) convex optimization and (iv) signal processing. The first part of the course introduces classical linear and nonlinear methods for image reconstruction. The second part considers data-based regularization and covers modern deep learning approaches to inverse problems in imaging. Finally, we introduce advances in the actively developing field of experimental design in biomedical imaging (i.e. how to conduct an experiment in a way that enables the most accurate reconstruction).

1. Introduction: Examples of inverse problems, general introduction. Refresh prerequisites.

2. Linear algebra in imaging: Refresh prerequisites. Demonstrate properties of operators employed in imaging.

3. Linear inverse problems and regularization: Classical theory of inverse problems. Introduce notion of ill-posedness and regularization.

4. Compressed sensing: Sparsity, basis-CS, TV-CS. Notion of analysis and synthesis forms of reconstruction problems. Application of PGD and ADMM to reconstruction (a minimal code sketch follows this outline).

5. Advanced priors and model selection: Total generalized variation, GMM priors, vectorial TV, low-rank, and tensor models. Stein's unbiased risk estimator.

6. Dictionary and prior learning: Classical dictionary learning. Gentle intro to machine learning. A lot of technical details about patch models.

7. Deep learning in image reconstruction: Generic convolutional-NN models (AUTOMAP, residual filtering, U-nets). Discussion of the data generation process. Characterizing the difference between model- and data-based reconstruction methods. Mode averaging.

8. Loop unrolling and physics-aware networks for reconstruction: Autograd, variational networks, a lot of examples and intuition. Show how to use them efficiently, e.g. adding preconditioners, attention, etc.

9. Generative models and uncertainty quantification: Amortized posterior, variational autoencoders, adversarial learning. Estimation uncertainty quantification.

10. Invertible networks for estimation: Gradient flows in networks, invertible neural networks for estimation problems.

11. Experimental design in imaging: Acquisition optimization for continuous models. How far can we exploit autograd?

12. Signal sampling optimization in MRI. Reinforcement learning: Acquisition optimization for discrete models. REINFORCE and policy gradients, variance minimization for discrete variables (RELAX, REBAR). Cartesian under-sampling pattern design.

13. Summary and exam preparation.
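To make the compressed-sensing unit above concrete, a minimal sketch of proximal gradient descent (ISTA) for a synthesis-form sparse reconstruction problem; problem sizes, sparsity pattern, and regularization weight are illustrative assumptions, not course material:
```python
import numpy as np

# Hedged sketch: ISTA (proximal gradient descent) for
#   min_x 0.5 * ||A x - y||^2 + lam * ||x||_1
def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, iters=500):
    L = np.linalg.norm(A, 2) ** 2               # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - y)                # gradient of the smooth term
        x = soft_threshold(x - grad / L, lam / L)   # proximal step
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100); x_true[[3, 17, 42]] = [1.0, -2.0, 0.5]
y = A @ x_true                                  # noiseless measurements
print(np.nonzero(np.abs(ista(A, y, lam=0.1)) > 0.1)[0])   # typically [3, 17, 42]
```
ADMM, also listed in that unit, attacks the same objective via variable splitting.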
Lecture notes: Lecture slides with references will be provided during the course.
Prerequisites / Notice: Students are expected to know the basics of (i) numerical linear algebra, (ii) applied methods of convex optimization, (iii) computational statistics, (iv) Matlab and Python.
227-0427-10L Advanced Signal Analysis, Modeling, and Machine Learning | W | 6 credits | 4G | H.-A. Loeliger
Abstract: The course develops a selection of topics pivoting around graphical models (factor graphs), state space methods, sparsity, and pertinent algorithms.
Objective: The course develops a selection of topics pivoting around factor graphs, state space methods, and pertinent algorithms:
- factor graphs and message passing algorithms
- hidden Markov models
- linear state space models, Kalman filtering, and recursive least squares
- Gaussian message passing
- Gibbs sampling, particle filter
- recursive local polynomial fitting & applications
- parameter learning by expectation maximization
- sparsity and spikes
- binary control and digital-to-analog conversion
- duality and factor graph transforms
Lecture notes: Lecture notes are provided.
Prerequisites / Notice: Solid mathematical foundations (especially in probability, estimation, and linear algebra) as provided by the course "Introduction to Estimation and Machine Learning".
227-0432-00L Learning, Classification and Compression | W | 4 credits | 2V + 1U | E. Riegler
Abstract: The course takes a theoretical approach to learning theory and classification, and gives an introduction to lossy and lossless compression for general sets and measures. We will mainly focus on a probabilistic approach, where an underlying distribution must be learned/compressed. The concepts acquired in the course are of broad and general interest in the data sciences.
Objective: After attending this lecture and participating in the exercise sessions, students will have acquired a working knowledge of learning theory, classification, and compression.
Content:
1. Learning Theory
(a) Framework of Learning
(b) Hypothesis Spaces and Target Functions
(c) Reproducing Kernel Hilbert Spaces
(d) Bias-Variance Tradeoff
(e) Estimation of Sample and Approximation Error

2. Classification
(a) Binary Classifier
(b) Support Vector Machines (separable case)
(c) Support Vector Machines (nonseparable case)
(d) Kernel Trick

3. Lossy and Lossless Compression
(a) Basics of Compression
(b) Compressed Sensing for General Sets and Measures
(c) Quantization and Rate Distortion Theory for General Sets and Measures
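For context on item 3(c), the classical single-source rate-distortion function (the course treats the more general setting of arbitrary sets and measures) is
\[
R(D) = \min_{p(\hat{x} \mid x)\,:\; \mathbb{E}[d(X, \hat{X})] \le D} I(X; \hat{X}).
\]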
Lecture notes: Detailed lecture notes will be provided.
Prerequisites / Notice: This course is aimed at students with a solid background in measure theory and linear algebra and basic knowledge in functional analysis.
227-0558-00L Principles of Distributed Computing | W | 7 credits | 2V + 2U + 2A | R. Wattenhofer, M. Ghaffari
Abstract: We study the fundamental issues underlying the design of distributed systems: communication, coordination, fault-tolerance, locality, parallelism, self-organization, symmetry breaking, synchronization, uncertainty. We explore essential algorithmic ideas and lower bound techniques.
Objective: Distributed computing is essential in modern computing and communications systems. Examples are on the one hand large-scale networks such as the Internet, and on the other hand multiprocessors such as your new multi-core laptop. This course introduces the principles of distributed computing, emphasizing the fundamental issues underlying the design of distributed systems and networks: communication, coordination, fault-tolerance, locality, parallelism, self-organization, symmetry breaking, synchronization, uncertainty. We explore essential algorithmic ideas and lower bound techniques, basically the "pearls" of distributed computing. We will cover a fresh topic every week.
Content: Distributed computing models and paradigms, e.g. message passing, shared memory, synchronous vs. asynchronous systems, time and message complexity, peer-to-peer systems, small-world networks, social networks, sorting networks, wireless communication, and self-organizing systems.

Distributed algorithms, e.g. leader election, coloring, covering, packing, decomposition, spanning trees, mutual exclusion, store and collect, arrow, ivy, synchronizers, diameter, all-pairs-shortest-path, wake-up, and lower bounds
Lecture notes: Available. Our course script is used at dozens of other universities around the world.
Literature: Lecture notes by Roger Wattenhofer. These lecture notes are used at about a dozen different universities around the world.

Distributed Computing: Fundamentals, Simulations and Advanced Topics
Hagit Attiya, Jennifer Welch.
McGraw-Hill Publishing, 1998, ISBN 0-07-709352-6

Introduction to Algorithms
Thomas Cormen, Charles Leiserson, Ronald Rivest.
The MIT Press, 1998, ISBN 0-262-53091-0 or 0-262-03141-8

Dissemination of Information in Communication Networks
Juraj Hromkovic, Ralf Klasing, Andrzej Pelc, Peter Ruzicka, Walter Unger.
Springer-Verlag, Berlin Heidelberg, 2005, ISBN 3-540-00846-2

Introduction to Parallel Algorithms and Architectures: Arrays, Trees, Hypercubes
Frank Thomson Leighton.
Morgan Kaufmann Publishers Inc., San Francisco, CA, 1991, ISBN 1-55860-117-1

Distributed Computing: A Locality-Sensitive Approach
David Peleg.
Society for Industrial and Applied Mathematics (SIAM), 2000, ISBN 0-89871-464-8
Prerequisites / Notice: Interest in algorithmic problems. (No particular course needed.)
227-0560-00L Deep Learning for Autonomous Driving (Restricted registration) | W | 6 credits | 3V + 2P | D. Dai, A. Liniger
Registration in this class requires the permission of the instructors.
Class size will be limited to 80 students.
Please send an email to Dengxin Dai <Link> about your courses/projects that are related to machine learning, computer vision, and Robotics.
Abstract: Autonomous driving has moved from the realm of science fiction to a very real possibility during the past twenty years, largely due to rapid developments of deep learning approaches, automotive sensors, and microprocessor capacity. This course covers the core techniques required for building a self-driving car, especially the practical use of deep learning through this theme.
Objective: Students will learn about the fundamental aspects of a self-driving car. They will also learn to use modern automotive sensors and HD navigational maps, and to implement, train and debug their own deep neural networks in order to gain a deep understanding of cutting-edge research in autonomous driving tasks, including perception, localization and control.

After attending this course, students will:
1) understand the core technologies of building a self-driving car;
2) have a good overview over the current state of the art in self-driving cars;
3) be able to critically analyze and evaluate current research in this area;
4) be able to implement basic systems for multiple autonomous driving tasks.
Content: We will focus on teaching the following topics centered on autonomous driving: deep learning, automotive sensors, multimodal driving datasets, road scene perception, ego-vehicle localization, path planning, and control.

The course covers the following main areas:

I) Foundation
a) Fundamentals of a self-driving car
b) Fundamentals of deep-learning


II) Perception
a) Semantic segmentation and lane detection
b) Depth estimation with images and sparse LiDAR data
c) 3D object detection with images and LiDAR data
d) Object tracking and Lane Detection

III) Localization
a) GPS-based and Vision-based Localization
b) Visual Odometry and Lidar Odometry

IV) Path Planning and Control
a) Path planning for autonomous driving
b) Motion planning and vehicle control
c) Imitation learning and reinforcement learning for self driving cars

The exercise projects will involve training complex neural networks and applying them on real-world, multimodal driving datasets. In particular, students should be able to develop systems that deal with the following problems:
- Sensor calibration and synchronization to obtain multimodal driving data;
- Semantic segmentation and depth estimation with deep neural networks (a minimal model sketch follows this list);
- 3D object detection and tracking in LiDAR point clouds
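As a rough indication of what the deep-network exercises involve, a purely illustrative PyTorch sketch of a fully convolutional per-pixel classifier; the number of classes, layer sizes, and dummy inputs are assumptions, not the course's exercise code:
```python
import torch
import torch.nn as nn

# Hedged sketch: a tiny fully convolutional network producing per-pixel class scores.
num_classes = 19                                      # assumed label-set size
model = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, num_classes, kernel_size=1),        # per-pixel logits
)

x = torch.randn(1, 3, 128, 256)                       # dummy camera image (N, C, H, W)
target = torch.zeros(1, 128, 256, dtype=torch.long)   # dummy per-pixel labels
logits = model(x)                                     # shape (1, num_classes, 128, 256)
loss = nn.CrossEntropyLoss()(logits, target)
print(logits.shape, float(loss))
```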
Lecture notes: The lecture slides will be provided as a PDF.
Prerequisites / Notice: This is an advanced grad-level course. Students must have taken courses on machine learning and computer vision or have acquired equivalent knowledge. Students are expected to have a solid mathematical foundation, in particular in linear algebra, multivariate calculus, and probability. All practical exercises will require basic knowledge of Python and will use libraries such as PyTorch, scikit-learn and scikit-image.
252-0211-00L Information Security | W | 8 credits | 4V + 3U | D. Basin, S. Capkun
Abstract: This course provides an introduction to Information Security. The focus is on fundamental concepts and models, basic cryptography, protocols and system security, and privacy and data protection. While the emphasis is on foundations, case studies will be given that examine different realizations of these ideas in practice.
Objective: Master fundamental concepts in Information Security and their application to system building. (See objectives listed below for more details.)
Content:
1. Introduction and Motivation (OBJECTIVE: Broad conceptual overview of information security): Motivation: implications of IT on society/economy, Classical security problems, Approaches to defining security and security goals, Abstractions, assumptions, and trust, Risk management and the human factor, Course overview.
2. Foundations of Cryptography (OBJECTIVE: Understand basic cryptographic mechanisms and applications): Introduction, Basic concepts in cryptography: Overview, Types of Security, computational hardness, Abstraction of channel security properties, Symmetric encryption, Hash functions, Message authentication codes, Public-key distribution, Public-key cryptosystems, Digital signatures, Application case studies, Comparison of encryption at different layers, VPN, SSL, Digital payment systems, blind signatures, e-cash, Time stamping.
3. Key Management and Public-key Infrastructures (OBJECTIVE: Understand the basic mechanisms relevant in an Internet context): Key management in distributed systems, Exact characterization of requirements, the role of trust, Public-key Certificates, Public-key Infrastructures, Digital evidence and non-repudiation, Application case studies, Kerberos, X.509, PGP.
4. Security Protocols (OBJECTIVE: Understand network-oriented security, i.e. how to employ building blocks to secure applications in (open) networks): Introduction, Requirements/properties, Establishing shared secrets, Principal and message origin authentication, Environmental assumptions, Dolev-Yao intruder model and variants, Illustrative examples, Formal models and reasoning, Trace-based interleaving semantics, Inductive verification, or model-checking for falsification, Techniques for protocol design, Application case study 1: from Needham-Schroeder Shared-Key to Kerberos, Application case study 2: from DH to IKE.
5. Access Control and Security Policies (OBJECTIVES: Study system-oriented security, i.e., policies, models, and mechanisms): Motivation (relationship to CIA, relationship to Crypto) and examples, Concepts: policies versus models versus mechanisms, DAC and MAC, Modeling formalism, Access Control Matrix Model, Role-Based Access Control, Bell-LaPadula, Harrison-Ruzzo-Ullman, Information flow, Chinese Wall, Biba, Clark-Wilson, System mechanisms: Operating Systems, Hardware Security Features, Reference Monitors, File-system protection, Application case studies.
6. Anonymity and Privacy (OBJECTIVE: Examine protection goals beyond standard CIA and corresponding mechanisms): Motivation and Definitions, Privacy, policies and policy languages, mechanisms, problems, Anonymity: simple mechanisms (pseudonyms, proxies), Application case studies: mix networks and crowds.
7. Larger application case study: GSM, mobility.
252-0526-00L Statistical Learning Theory | W | 8 credits | 3V + 2U + 2A | J. M. Buhmann, C. Cotrini Jimenez
Abstract: The course covers advanced methods of statistical learning:

- Variational methods and optimization.
- Deterministic annealing.
- Clustering for diverse types of data.
- Model validation by information theory.
Objective: The course surveys recent methods of statistical learning. The fundamentals of machine learning, as presented in the courses "Introduction to Machine Learning" and "Advanced Machine Learning", are expanded from the perspective of statistical learning.
Content:
- Variational methods and optimization. We consider optimization approaches for problems where the optimizer is a probability distribution. We will discuss concepts like maximum entropy, information bottleneck, and deterministic annealing.

- Clustering. This is the problem of sorting data into groups without using training samples. We discuss alternative notions of "similarity" between data points and adequate optimization procedures.

- Model selection and validation. This refers to the question of how complex the chosen model should be. In particular, we present an information theoretic approach for model validation.

- Statistical physics models. We discuss approaches for approximately optimizing large systems, which originate in statistical physics (free energy minimization applied to spin glasses and other models). We also study sampling methods based on these models.
Lecture notes: A draft of a script will be provided. Lecture slides will be made available.
Literature: Hastie, Tibshirani, Friedman: The Elements of Statistical Learning, Springer, 2001.

L. Devroye, L. Gyorfi, and G. Lugosi: A probabilistic theory of pattern recognition. Springer, New York, 1996
Prerequisites / Notice: Knowledge of machine learning (Introduction to Machine Learning and/or Advanced Machine Learning). Basic knowledge of statistics.
252-0538-00L Shape Modeling and Geometry Processing | W | 8 credits | 2V + 1U + 4A | O. Sorkine Hornung
Abstract: This course covers the fundamentals and some of the latest developments in geometric modeling and geometry processing. Topics include surface modeling based on point clouds and polygonal meshes, mesh generation, surface reconstruction, mesh fairing and parameterization, discrete differential geometry, interactive shape editing, topics in digital shape fabrication.
Objective: The students will learn how to design, program and analyze algorithms and systems for interactive 3D shape modeling and geometry processing.
Content: Recent advances in 3D geometry processing have created a plenitude of novel concepts for the mathematical representation and interactive manipulation of geometric models. This course covers the fundamentals and some of the latest developments in geometric modeling and geometry processing. Topics include surface modeling based on point clouds and triangle meshes, mesh generation, surface reconstruction, mesh fairing and parameterization, discrete differential geometry, interactive shape editing and digital shape fabrication.
Lecture notes: Slides and course notes.
Prerequisites / Notice: Visual Computing, Computer Graphics or an equivalent class. Experience with C++ programming. Solid background in linear algebra and analysis. Some knowledge of differential geometry, computational geometry and numerical methods is helpful but not a strict requirement.
252-0579-00L 3D Vision | W | 5 credits | 3G + 1A | M. Pollefeys, V. Larsson
Abstract: The course covers camera models and calibration, feature tracking and matching, camera motion estimation via simultaneous localization and mapping (SLAM) and visual odometry (VO), epipolar and multi-view geometry, structure-from-motion, (multi-view) stereo, augmented reality, and image-based (re-)localization.
Objective: After attending this course, students will:
1. understand the core concepts for recovering 3D shape of objects and scenes from images and video.
2. be able to implement basic systems for vision-based robotics and simple virtual/augmented reality applications.
3. have a good overview over the current state-of-the art in 3D vision.
4. be able to critically analyze and assess current research in this area.
Content: The goal of this course is to teach the core techniques required for robotic and augmented reality applications: How to determine the motion of a camera and how to estimate the absolute position and orientation of a camera in the real world. This course will introduce the basic concepts of 3D Vision in the form of short lectures, followed by student presentations discussing the current state-of-the-art. The main focus of this course is student projects on 3D Vision topics, with an emphasis on robotic vision and virtual and augmented reality applications.
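One compact relation behind several of the listed topics, stated here for orientation: for corresponding image points x and x' in two views, the fundamental matrix F satisfies the epipolar constraint
\[
x'^{\top} F x = 0,
\]
and with known calibration matrices K, K' the essential matrix E = K'^{\top} F K factors as E = [t]_{\times} R into the relative rotation and translation.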
252-3005-00L Natural Language Processing (Restricted registration) | W | 5 credits | 2V + 1U + 1A | R. Cotterell
Number of participants limited to 400.
Abstract: This course presents topics in natural language processing with an emphasis on modern techniques, primarily focusing on statistical and deep learning approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems.
Objective: The objective of the course is to learn the basic concepts in the statistical processing of natural languages. The course will be project-oriented so that the students can also gain hands-on experience with state-of-the-art tools and techniques.
Content: This course presents an introduction to general topics and techniques used in natural language processing today, primarily focusing on statistical approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems.
Literature: Jacob Eisenstein: Introduction to Natural Language Processing (Adaptive Computation and Machine Learning series).
261-5130-00L Research in Data Science (Restricted registration) | W | 6 credits | 13A | Professors
Only for Data Science MSc.
Abstract: Independent work under the supervision of a core or adjunct faculty member of data science.
Objective: Independent work under the supervision of a core or adjunct faculty member of data science. Approval of the director of studies is required for a non-DS professor.
Content: Project done under the supervision of an approved professor.
Prerequisites / Notice: Only students who have passed at least one core course in Data Management and Processing and one core course in Data Analysis can start with a research project.

A project description must be submitted at the start of the project to the studies administration.
263-0007-00L Advanced Systems Lab (Restricted registration) | W | 8 credits | 3V + 2U + 2A | M. Püschel, C. Zhang
Only for master students, otherwise a special permission by the study administration of D-INFK is required.
Abstract: This course introduces the student to the foundations and state-of-the-art techniques in developing high performance software for mathematical functionality occurring in various fields in computer science. The focus is on optimizing for a single core and includes optimizing for the memory hierarchy, for special instruction sets, and the possible use of automatic performance tuning.
Objective: Software performance (i.e., runtime) arises through the complex interaction of algorithm, its implementation, the compiler used, and the microarchitecture the program is run on. The first goal of the course is to provide the student with an understanding of this "vertical" interaction, and hence software performance, for mathematical functionality. The second goal is to teach a systematic strategy how to use this knowledge to write fast software for numerical problems. This strategy will be trained in several homeworks and a semester-long group project.
Content: The fast evolution and increasing complexity of computing platforms pose a major challenge for developers of high performance software for engineering, science, and consumer applications: it becomes increasingly harder to harness the available computing power. Straightforward implementations may lose as much as one or two orders of magnitude in performance. On the other hand, creating optimal implementations requires the developer to have an understanding of algorithms, capabilities and limitations of compilers, and the target platform's architecture and microarchitecture.

This interdisciplinary course introduces the student to the foundations and state-of-the-art techniques in high performance mathematical software development using important functionality such as matrix operations, transforms, filters, and others as examples. The course will explain how to optimize for the memory hierarchy, take advantage of special instruction sets, and other details of current processors that require optimization. The concept of automatic performance tuning is introduced. The focus is on optimization for a single core; thus, the course complements others on parallel and distributed computing.

Finally a general strategy for performance analysis and optimization is introduced that the students will apply in group projects that accompany the course.
Prerequisites / Notice: Solid knowledge of the C programming language and matrix algebra.
263-0008-00L Computational Intelligence Lab | W | 8 credits | 2V + 2U + 3A | T. Hofmann
Only for master students, otherwise a special permission by the study administration of D-INFK is required.
Abstract: This laboratory course teaches fundamental concepts in computational science and machine learning with a special emphasis on matrix factorization and representation learning. The class covers techniques like dimension reduction, data clustering, sparse coding, and deep learning as well as a wide spectrum of related use cases and applications.
Objective: Students acquire fundamental theoretical concepts and methodologies from machine learning and how to apply these techniques to build intelligent systems that solve real-world problems. They learn to successfully develop solutions to application problems by following the key steps of modeling, algorithm design, implementation and experimental validation.

This lab course has a strong focus on practical assignments. Students work in groups of three to four people, to develop solutions to three application problems: 1. Collaborative filtering and recommender systems, 2. Text sentiment classification, and 3. Road segmentation in aerial imagery.

For each of these problems, students submit their solutions to an online evaluation and ranking system, and get feedback in terms of numerical accuracy and computational speed. In the final part of the course, students combine and extend one of their previous promising solutions, and write up their findings in an extended abstract in the style of a conference paper.

(Disclaimer: The offered projects may be subject to change from year to year.)
Content: See the course description.
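As a minimal illustration of the matrix-factorization theme behind the collaborative-filtering project described above, a hedged numpy sketch of a rank-k approximation via truncated SVD; the dense toy rating matrix is a stand-in, since the actual project data and evaluation are defined by the course:
```python
import numpy as np

# Hedged sketch: best rank-k approximation of a toy "ratings" matrix via SVD.
rng = np.random.default_rng(0)
R = rng.integers(1, 6, size=(20, 15)).astype(float)   # toy user-item ratings

k = 3
U, s, Vt = np.linalg.svd(R, full_matrices=False)
R_k = (U[:, :k] * s[:k]) @ Vt[:k, :]                  # rank-k reconstruction

print(np.linalg.norm(R - R_k) / np.linalg.norm(R))    # relative approximation error
```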
263-2925-00L Program Analysis for System Security and Reliability | W | 7 credits | 2V + 1U + 3A | M. Vechev
Abstract: Security issues in modern systems (blockchains, datacenters, deep learning, etc.) result in billions of losses due to hacks and system downtime. This course introduces fundamental techniques (ranging from automated analysis, machine learning, synthesis, zero-knowledge, and their combinations) that can be applied in practice so as to build more secure and reliable modern systems.
Objective:
* Understand the fundamental techniques used to create modern security and reliability analysis engines that are used worldwide.

* Understand how symbolic techniques are combined with machine learning (e.g., deep learning, reinforcement learning) so as to create new kinds of learning-based analyzers.

* Understand how to quantify and fix security and reliability issues in modern deep learning models.

* Understand open research questions from both theoretical and practical perspectives.
Content: Please see Link for detailed course content.
263-3710-00L Machine Perception (Restricted registration) | W | 8 credits | 3V + 2U + 2A | O. Hilliges, S. Tang
Number of participants limited to 200.
Abstract: Recent developments in neural networks (aka “deep learning”) have drastically advanced the performance of machine perception systems in a variety of areas including computer vision, robotics, and intelligent UIs. This course is a deep dive into deep learning algorithms and architectures with applications to a variety of perceptual tasks.
Objective: Students will learn about fundamental aspects of modern deep learning approaches for perception. Students will learn to implement, train and debug their own neural networks and gain a detailed understanding of cutting-edge research in learning-based computer vision, robotics and HCI. The final project assignment will involve training a complex neural network architecture and applying it on a real-world dataset of human activity.

The core competency acquired through this course is a solid foundation in deep-learning algorithms to process and interpret human input into computing systems. In particular, students should be able to develop systems that deal with the problem of recognizing people in images, detecting and describing body parts, inferring their spatial configuration, performing action/gesture recognition from still images or image sequences, also considering multi-modal data, among others.
Content: We will focus on teaching how to set up the problem of machine perception, the learning algorithms, network architectures, and advanced deep learning concepts, in particular probabilistic deep learning models.

The course covers the following main areas:
I) Foundations of deep-learning.
II) Probabilistic deep-learning for generative modelling of data (latent variable models, generative adversarial networks and auto-regressive models).
III) Deep learning in computer vision, human-computer interaction and robotics.

Specific topics include: 
I) Deep learning basics:
a) Neural Networks and training (i.e., backpropagation)
b) Feedforward Networks
c) Timeseries modelling (RNN, GRU, LSTM)
d) Convolutional Neural Networks for classification
II) Probabilistic Deep Learning:
a) Latent variable models (VAEs); an illustrative training objective is sketched after this topic list
b) Generative adversarial networks (GANs)
c) Autoregressive models (PixelCNN, PixelRNN, TCNs)
III) Deep Learning techniques for machine perception:
a) Fully Convolutional architectures for dense per-pixel tasks (i.e., instance segmentation)
b) Pose estimation and other tasks involving human activity
c) Deep reinforcement learning
IV) Case studies from research in computer vision, HCI, robotics and signal processing
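For item II(a), the standard variational lower bound (ELBO) maximized when training a VAE, stated here for orientation:
\[
\log p_{\theta}(x) \ge \mathbb{E}_{q_{\phi}(z \mid x)}\big[\log p_{\theta}(x \mid z)\big] - \mathrm{KL}\big(q_{\phi}(z \mid x)\,\|\,p(z)\big).
\]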
Literature: Deep Learning.
Book by Ian Goodfellow, Yoshua Bengio, and Aaron Courville.
Prerequisites / Notice: ***
In accordance with the ETH Covid-19 master plan the lecture will be fully virtual. Details on the course website.
***

This is an advanced grad-level course that requires a background in machine learning. Students are expected to have a solid mathematical foundation, in particular in linear algebra, multivariate calculus, and probability. The course will focus on state-of-the-art research in deep-learning and will not repeat basics of machine learning

Please take note of the following conditions:
1) The number of participants is limited to 200 students (MSc and PhDs).
2) Students must have taken the exam in Machine Learning (252-0535-00) or have acquired equivalent knowledge
3) All practical exercises will require basic knowledge of Python and will use libraries such as Pytorch, scikit-learn and scikit-image. We will provide introductions to Pytorch and other libraries that are needed but will not provide introductions to basic programming or Python.

The following courses are strongly recommended as prerequisite:
* "Visual Computing" or "Computer Vision"

The course will be assessed by a final written examination in English. No course materials or electronic devices can be used during the examination. Note that the examination will be based on the contents of the lectures, the associated reading materials and the exercises.
263-3855-00L Cloud Computing Architecture | W | 9 credits | 3V + 2U + 3A | G. Alonso, A. Klimovic
Abstract: Cloud computing hosts a wide variety of online services that we use on a daily basis, including web search, social networks, and video streaming. This course will cover how datacenter hardware, systems software, and application frameworks are designed for the cloud.
Objective: After successful completion of this course, students will be able to: 1) reason about performance, energy efficiency, and availability tradeoffs in the design of cloud system software, 2) describe how datacenter hardware is organized and explain why it is organized as such, 3) implement cloud applications as well as analyze and optimize their performance.
Content: In this course, we study how datacenter hardware, systems software, and applications are designed at large scale for the cloud. The course covers topics including server design, cluster management, large-scale storage systems, serverless computing, data analytics frameworks, and performance analysis.
Lecture notes: Lecture slides will be available on the course website.
Prerequisites / Notice: Undergraduate courses in 1) computer architecture and 2) operating systems, distributed systems, and/or database systems are strongly recommended.
263-4400-00L Advanced Graph Algorithms and Optimization | W | 8 credits | 3V + 1U + 3A | R. Kyng, M. Probst
Abstract: This course will cover a number of advanced topics in optimization and graph algorithms.
Objective: The course will take students on a deep dive into modern approaches to
graph algorithms using convex optimization techniques.

By studying convex optimization through the lens of graph algorithms,
students should develop a deeper understanding of fundamental
phenomena in optimization.

The course will cover some traditional discrete approaches to various graph
problems, especially flow problems, and then contrast these approaches
with modern, asymptotically faster methods based on combining convex
optimization with spectral and combinatorial graph theory.
Content: Students should leave the course understanding key
concepts in optimization such as first and second-order optimization,
convex duality, multiplicative weights and dual-based methods,
acceleration, preconditioning, and non-Euclidean optimization.

Students will also be familiarized with central techniques in the
development of graph algorithms in the past 15 years, including graph
decomposition techniques, sparsification, oblivious routing, and
spectral and combinatorial preconditioning.
Prerequisites / Notice: This course is targeted toward masters and doctoral students with an
interest in theoretical computer science.

Students should be comfortable with design and analysis of algorithms, probability, and linear algebra.

Having passed the course Algorithms, Probability, and Computing (APC) is highly recommended, but not formally required. If you are not
sure whether you're ready for this class or not, please consult the
instructor.
263-5000-00L Computational Semantics for Natural Language Processing (Restricted registration) | W | 6 credits | 2V + 1U + 2A | M. Sachan
Limited number of participants: 80. Last cancellation/deregistration date for this graded semester performance: Friday, 26 March 2021! Please note that after that date no deregistration will be accepted and the course will be considered as "fail".
Abstract: This course presents an introduction to natural language processing (NLP) with an emphasis on computational semantics, i.e. the process of constructing and reasoning with meaning representations of natural language text.
Objective: The objective of the course is to learn about various topics in computational semantics and its importance in natural language processing methodology and research. Exercises and the project will be key parts of the course, so the students will be able to gain hands-on experience with state-of-the-art techniques in the field.
Content: We will take a modern view of the topic, and focus on various statistical and deep learning approaches for computational semantics. We will also give an overview of the primary areas of research in language processing and discuss how the computational semantics view can help us make advances in NLP.
Lecture notes: Lecture slides will be made available at the course web site.
Literature: No textbook is required, but there will be regularly assigned readings from the research literature, linked to the course website.
Prerequisites / Notice: The student should have successfully completed a graduate-level class in machine learning (252-0220-00L), deep learning (263-3210-00L) or natural language processing (252-3005-00L) before. Similar courses from other universities are acceptable too.
263-5300-00L Guarantees for Machine Learning (Restricted registration) | W | 7 credits | 3G + 3A | F. Yang
Number of participants limited to 30.

Last cancellation/deregistration date for this graded semester performance: 17 March 2021! Please note that after that date no deregistration will be accepted and a "no show" will appear on your transcript.
Abstract: This course is aimed at advanced master and doctorate students who want to conduct independent research on theory for modern machine learning (ML). It teaches classical and recent methods in statistical learning theory commonly used to prove theoretical guarantees for ML algorithms. The knowledge is then applied in independent project work that focuses on understanding modern ML phenomena.
Objective: Learning objectives:

- acquire enough mathematical background to understand a good fraction of theory papers published in the typical ML venues. For this purpose, students will learn common mathematical techniques from statistics and optimization in the first part of the course and apply this knowledge in the project work
- critically examine recently published work in terms of relevance and determine impactful (novel) research problems. This will be an integral part of the project work and involves experimental as well as theoretical questions
- find and outline an approach (some subproblem) to prove a conjectured theorem. This will be practiced in lectures / exercise and homeworks and potentially in the final project.
- effectively communicate and present the problem motivation, new insights and results to a technical audience. This will be primarily learned via the final presentation and report as well as during peer-grading of peer talks.
Content: This course touches upon foundational methods in statistical learning theory aimed at proving theoretical guarantees for machine learning algorithms, covering the following topics:
- concentration bounds (a standard example is stated after this list)
- uniform convergence and empirical process theory
- high-dimensional statistics (e.g. sparsity)
- regularization for non-parametric statistics (e.g. in RKHS, neural networks)
- implicit regularization via gradient descent (e.g. margins, early stopping)
- minimax lower bounds

The project work focuses on current theoretical ML research that aims to understand modern phenomena in machine learning, including but not limited to
- how overparameterization could help generalization (RKHS, NN)
- how overparameterization could help optimization (non-convex optimization, loss landscape)
- complexity measures and approximation theoretic properties of randomly initialized and trained NN
- generalization of robust learning (adversarial robustness, standard and robust error tradeoff, distribution shift)
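As an example of the first listed topic (concentration bounds), Hoeffding's inequality: for independent random variables X_1, ..., X_n with X_i in [a, b],
\[
\mathbb{P}\left( \left| \frac{1}{n} \sum_{i=1}^{n} \big( X_i - \mathbb{E}[X_i] \big) \right| \ge t \right) \le 2 \exp\!\left( -\frac{2 n t^2}{(b-a)^2} \right).
\]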
Prerequisites / Notice: It’s absolutely necessary for students to have a strong mathematical background (basic real analysis, probability theory, linear algebra) and good knowledge of core concepts in machine learning taught in courses such as “Introduction to Machine Learning”, “Regression” / “Statistical Modelling”. In addition to these prerequisites, this class requires a high degree of mathematical maturity, including abstract thinking and the ability to understand and write proofs.

Students have usually taken a subset of Fundamentals of Mathematical Statistics, Probabilistic AI, Neural Network Theory, Optimization for Data Science, Advanced ML, Statistical Learning Theory, Probability Theory (D-MATH)
401-0674-00L Numerical Methods for Partial Differential Equations | W | 10 credits | 2G + 2U + 2P + 4A | R. Hiptmair
Not meant for BSc/MSc students of mathematics.
Abstract: Derivation, properties, and implementation of fundamental numerical methods for a few key partial differential equations: convection-diffusion, heat equation, wave equation, conservation laws. Implementation in C++ based on a finite element library.
Objective: Main skills to be acquired in this course:
* Ability to implement fundamental numerical methods for the solution of partial differential equations efficiently.
* Ability to modify and adapt numerical algorithms guided by awareness of their mathematical foundations.
* Ability to select and assess numerical methods in light of the predictions of theory
* Ability to identify features of a PDE (= partial differential equation) based model that are relevant for the selection and performance of a numerical algorithm.
* Ability to understand research publications on theoretical and practical aspects of numerical methods for partial differential equations.
* Skills in the efficient implementation of finite element methods on unstructured meshes.

This course is neither a course on the mathematical foundations and numerical analysis of methods nor a course that merely teaches recipes and how to apply software packages.
Content:
1 Second-Order Scalar Elliptic Boundary Value Problems
1.2 Equilibrium Models: Examples
1.3 Sobolev spaces
1.4 Linear Variational Problems
1.5 Equilibrium Models: Boundary Value Problems
1.6 Diffusion Models (Stationary Heat Conduction)
1.7 Boundary Conditions
1.8 Second-Order Elliptic Variational Problems
1.9 Essential and Natural Boundary Conditions
2 Finite Element Methods (FEM)
2.2 Principles of Galerkin Discretization
2.3 Case Study: Linear FEM for Two-Point Boundary Value Problems
2.4 Case Study: Triangular Linear FEM in Two Dimensions
2.5 Building Blocks of General Finite Element Methods
2.6 Lagrangian Finite Element Methods
2.7 Implementation of Finite Element Methods
2.7.1 Mesh Generation and Mesh File Format
2.7.2 Mesh Information and Mesh Data Structures
2.7.2.1 LehrFEM++ Mesh: Container Layer
2.7.2.2 LehrFEM++ Mesh: Topology Layer
2.7.2.3 LehrFEM++ Mesh: Geometry Layer
2.7.3 Vectors and Matrices
2.7.4 Assembly Algorithms
2.7.4.1 Assembly: Localization
2.7.4.2 Assembly: Index Mappings
2.7.4.3 Distribute Assembly Schemes
2.7.4.4 Assembly: Linear Algebra Perspective
2.7.5 Local Computations
2.7.5.1 Analytic Formulas for Entries of Element Matrices
2.7.5.2 Local Quadrature
2.7.6 Treatment of Essential Boundary Conditions
2.8 Parametric Finite Element Methods
3 FEM: Convergence and Accuracy
3.1 Abstract Galerkin Error Estimates
3.2 Empirical (Asymptotic) Convergence of Lagrangian FEM
3.3 A Priori (Asymptotic) Finite Element Error Estimates
3.4 Elliptic Regularity Theory
3.5 Variational Crimes
3.6 FEM: Duality Techniques for Error Estimation
3.7 Discrete Maximum Principle
3.8 Validation and Debugging of Finite Element Codes
4 Beyond FEM: Alternative Discretizations [dropped]
5 Non-Linear Elliptic Boundary Value Problems [dropped]
6 Second-Order Linear Evolution Problems
6.1 Time-Dependent Boundary Value Problems
6.2 Parabolic Initial-Boundary Value Problems
6.3 Linear Wave Equations
7 Convection-Diffusion Problems [dropped]
8 Numerical Methods for Conservation Laws
8.1 Conservation Laws: Examples
8.2 Scalar Conservation Laws in 1D
8.3 Conservative Finite Volume (FV) Discretization
8.4 Timestepping for Finite-Volume Methods
8.5 Higher-Order Conservative Finite-Volume Schemes
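For orientation only, here is a minimal sketch, in Python/NumPy rather than the C++/LehrFEM++ setting used in the course, of the assembly steps outlined in Chapter 2 (element matrices, local-to-global index mapping, essential boundary conditions) for piecewise-linear finite elements on a 1D two-point boundary value problem; the mesh, right-hand side and variable names are illustrative assumptions and not course material.

```python
import numpy as np

# Minimal sketch (not course material): piecewise-linear FEM for
# -u''(x) = f(x) on (0,1) with u(0) = u(1) = 0, assembled cell by cell.
def f(x):
    return np.pi**2 * np.sin(np.pi * x)     # illustrative right-hand side; exact solution is sin(pi x)

n_el = 50                                    # number of mesh cells
nodes = np.linspace(0.0, 1.0, n_el + 1)
A = np.zeros((n_el + 1, n_el + 1))           # dense for brevity; real codes use sparse storage
b = np.zeros(n_el + 1)

for k in range(n_el):                        # assembly: loop over cells, add local contributions
    h = nodes[k + 1] - nodes[k]
    A_loc = (1.0 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])            # element (stiffness) matrix
    b_loc = 0.5 * h * f(0.5 * (nodes[k] + nodes[k + 1])) * np.ones(2)   # midpoint-rule element load
    idx = [k, k + 1]                         # local-to-global index mapping
    A[np.ix_(idx, idx)] += A_loc
    b[idx] += b_loc

# Essential (Dirichlet) boundary conditions: drop the two boundary nodes.
free = np.arange(1, n_el)
u = np.zeros(n_el + 1)
u[free] = np.linalg.solve(A[np.ix_(free, free)], b[free])
print("max nodal error vs. sin(pi x):", np.abs(u - np.sin(np.pi * nodes)).max())
```

The cell-oriented assembly loop is the point of the sketch: the same local-to-global pattern carries over to triangular meshes in 2D, where sparse matrix formats and a mesh data structure become essential.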
Lecture notesThe lecture will be taught in flipped classroom format:
- Video tutorials for all thematic units will be published online.
- Tablet notes accompanying the videos will be made available to the audience as PDF.
- A comprehensive lecture document will cover all aspects of the course.
LiteratureChapters of the following books provide supplementary reading
(detailed references in course material):

* D. Braess: Finite Elemente,
Theorie, schnelle Löser und Anwendungen in der Elastizitätstheorie, Springer 2007 (available online).
* S. Brenner and R. Scott. Mathematical theory of finite element methods, Springer 2008 (available online).
* A. Ern and J.-L. Guermond. Theory and Practice of Finite Elements, volume 159 of Applied Mathematical Sciences. Springer, New York, 2004.
* Ch. Großmann and H.-G. Roos: Numerical Treatment of Partial Differential Equations, Springer 2007.
* W. Hackbusch. Elliptic Differential Equations. Theory and Numerical Treatment, volume 18 of Springer Series in Computational Mathematics. Springer, Berlin, 1992.
* P. Knabner and L. Angermann. Numerical Methods for Elliptic and Parabolic Partial Differential Equations, volume 44 of Texts in Applied Mathematics. Springer, Heidelberg, 2003.
* S. Larsson and V. Thomée. Partial Differential Equations with Numerical Methods, volume 45 of Texts in Applied Mathematics. Springer, Heidelberg, 2003.
* R. LeVeque. Finite Volume Methods for Hyperbolic Problems. Cambridge Texts in Applied Mathematics. Cambridge University Press, Cambridge, UK, 2002.

However, study of the supplementary literature is not important for following the course.
Prerequisites / NoticeMastery of basic calculus and linear algebra is taken for granted.
Familiarity with fundamental numerical methods (solution methods for linear systems of equations, interpolation, approximation, numerical quadrature, numerical integration of ODEs) is essential.

Important: Coding skills and experience in C++ are essential.

Homework assignments involve substantial coding, partly based on a C++ finite element library. The written examination will be computer based and will comprise coding tasks.
401-3052-10LGraph Theory Information W10 credits4V + 1UB. Sudakov
AbstractBasics, trees, Cayley's formula, matrix tree theorem, connectivity, theorems of Mader and Menger, Eulerian graphs, Hamilton cycles, theorems of Dirac, Ore, Erdős-Chvátal, matchings, theorems of Hall, König, Tutte, planar graphs, Euler's formula, Kuratowski's theorem, graph colorings, Brooks' theorem, 5-colorings of planar graphs, list colorings, Vizing's theorem, Ramsey theory, Turán's theorem
ObjectiveThe students will get an overview over the most fundamental questions concerning graph theory. We expect them to understand the proof techniques and to use them autonomously on related problems.
Lecture notesThe lecture will be given only at the blackboard.
LiteratureWest, D.: "Introduction to Graph Theory"
Diestel, R.: "Graph Theory"

Further literature links will be provided in the lecture.
Prerequisites / NoticeStudents are expected to have a mathematical background and should be able to write rigorous proofs.
401-3602-00LApplied Stochastic Processes Information W8 credits3V + 1UV. Tassion
AbstractPoisson processes; renewal processes; Markov chains in discrete and in continuous time; some applications.
ObjectiveStochastic processes are a way to describe and study the behaviour of systems that evolve in some random way. In this course, the evolution will be with respect to a scalar parameter interpreted as time, so that we discuss the temporal evolution of the system. We present several classes of stochastic processes, analyse their properties and behaviour and show by some examples how they can be used. The main emphasis is on theory; in that sense, "applied" should be understood to mean "applicable".
LiteratureR. N. Bhattacharya and E. C. Waymire, "Stochastic Processes with Applications", SIAM (2009), available online: Link
R. Durrett, "Essentials of Stochastic Processes", Springer (2012), available online: Link
M. Lefebvre, "Applied Stochastic Processes", Springer (2007), available online: Link
S. I. Resnick, "Adventures in Stochastic Processes", Birkhäuser (2005)
Prerequisites / NoticePrerequisites are familiarity with (measure-theoretic) probability theory as it is treated in the course "Probability Theory" (401-3601-00L).
401-4632-15LCausality Information W4 credits2GC. Heinze-Deml
AbstractIn statistics, we are used to searching for the best predictors of some random variable. In many situations, however, we are interested in predicting a system's behavior under manipulations. For such an analysis, we require knowledge about the underlying causal structure of the system. In this course, we study concepts and theory behind causal inference.
ObjectiveAfter this course, you should be able to
- understand the language and concepts of causal inference
- know the assumptions under which one can infer causal relations from observational and/or interventional data
- describe and apply different methods for causal structure learning
- given data and a causal structure, derive causal effects and predictions of interventional experiments
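As a hedged illustration of the last objective, the following sketch simulates a toy linear structural causal model with one confounder and shows that regressing on the treatment alone gives a biased effect estimate, while adjusting for the confounder recovers the true causal effect; the model, coefficients and variable names are invented for illustration and are not course material.

```python
import numpy as np

# Toy linear SCM (illustrative): Z -> X, Z -> Y, X -> Y, true causal effect of X on Y is 1.5.
rng = np.random.default_rng(0)
n = 100_000
Z = rng.normal(size=n)                        # confounder
X = 0.8 * Z + rng.normal(size=n)              # treatment
Y = 1.5 * X + 2.0 * Z + rng.normal(size=n)    # outcome

# Naive regression of Y on X alone is biased by the confounder Z.
beta_naive = np.linalg.lstsq(np.column_stack([X, np.ones(n)]), Y, rcond=None)[0][0]

# Adjusting for Z (regress Y on X and Z) recovers the causal effect in this model.
beta_adj = np.linalg.lstsq(np.column_stack([X, Z, np.ones(n)]), Y, rcond=None)[0][0]

print(f"naive estimate: {beta_naive:.2f}, adjusted estimate: {beta_adj:.2f} (truth: 1.5)")
```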
Prerequisites / NoticePrerequisites: basic knowledge of probability theory and regression
401-4656-21LDeep Learning in Scientific Computing Restricted registration - show details
Aimed at students in a Master's Programme in Mathematics, Engineering and Physics.
W6 credits2V + 1US. Mishra
AbstractMachine learning, particularly deep learning, is being increasingly applied to perform, enhance and accelerate computer simulations of models in science and engineering. This course aims to present a highly topical selection of themes in the general area of deep learning in scientific computing, with an emphasis on the application of deep learning algorithms to systems modeled by PDEs.
ObjectiveThe objective of this course will be to introduce students to advanced applications of deep learning in scientific computing. The focus will be on the design and implementation of algorithms as well as on the underlying theory that guarantees reliability of the algorithms. We will provide several examples of applications in science and engineering where deep learning based algorithms outperform state of the art methods.
ContentA selection of the following topics will be presented in the lectures.

1. Issues with traditional methods for scientific computing such as Finite Element, Finite Volume etc, particularly for PDE models with high-dimensional state and parameter spaces.

2. Introduction to Deep Learning: Artificial Neural networks, Supervised learning, Stochastic gradient descent algorithms for training, different architectures: Convolutional Neural Networks, Recurrent Neural Networks, ResNets.

3. Theoretical Foundations: Universal approximation properties of the Neural networks, Bias-Variance decomposition, Bounds on approximation and generalization errors.

4. Supervised deep learning for solution fields and observables of high-dimensional parametric PDEs. Use of low-discrepancy sequences and multi-level training to reduce generalization error.

5. Uncertainty Quantification for PDEs with supervised learning algorithms.

6. Deep Neural Networks as Reduced order models and prediction of solution fields.

7. Active Learning algorithms for PDE constrained optimization.

8. Recurrent Neural Networks and prediction of time series for dynamical systems.

9. Physics Informed Neural networks (PINNs) for the forward problem for PDEs. Applications to high-dimensional PDEs (see the sketch after this topic list).

10. PINNs for inverse problems for PDEs, parameter identification, optimal control and data assimilation.

All the algorithms will be illustrated on a variety of PDEs: diffusion models, Black-Scholes type PDEs from finance, wave equations, Euler and Navier-Stokes equations, hyperbolic systems of conservation laws, Dispersive PDEs among others.
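As a hedged, minimal illustration of topic 9 above (and not of the specific methods or code used in the course), the following PyTorch sketch trains a small physics-informed neural network for the 1D Poisson problem -u''(x) = pi^2 sin(pi x) with zero boundary values; the network size, optimizer, sampling and number of iterations are arbitrary choices.

```python
import math
import torch
import torch.nn as nn

# Minimal PINN sketch (illustrative only): solve -u''(x) = pi^2 sin(pi x) on (0,1), u(0)=u(1)=0.
torch.manual_seed(0)
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 32), nn.Tanh(), nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def pde_residual(x):
    x = x.requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    return d2u + math.pi**2 * torch.sin(math.pi * x)   # residual of -u'' = f

x_bnd = torch.tensor([[0.0], [1.0]])                   # Dirichlet boundary points
for step in range(3000):
    opt.zero_grad()
    x_int = torch.rand(128, 1)                         # random collocation points in (0,1)
    loss = (pde_residual(x_int) ** 2).mean() + (net(x_bnd) ** 2).mean()
    loss.backward()
    opt.step()

x_test = torch.linspace(0, 1, 101).unsqueeze(1)
err = (net(x_test).squeeze() - torch.sin(math.pi * x_test).squeeze()).abs().max().item()
print("max error after training:", err)
```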
Lecture notesLecture notes will be provided at the end of the course.
LiteratureAll the material in the course is based on research articles written in the last 1-2 years. The relevant references will be provided.
Prerequisites / NoticeThe students should be familiar with numerical methods for PDEs, for instance in courses such as Numerical Methods for PDEs for CSE, Numerical analysis of Elliptic and Parabolic PDEs, Numerical methods for hyperbolic PDEs, Computational methods for Engineering Applications.

Some familiarity with basic concepts in machine learning will be beneficial. The exercises in the course rely on standard machine learning frameworks such as KERAS, TENSORFLOW or PYTORCH. So, competence in Python is helpful.
401-4944-20LMathematics of Data Science
Does not take place this semester.
W8 credits4GA. Bandeira
AbstractMostly self-contained, but fast-paced, introductory masters level course on various theoretical aspects of algorithms that aim to extract information from data.
ObjectiveIntroduction to various mathematical aspects of Data Science.
ContentThese topics lie in overlaps of (Applied) Mathematics with: Computer Science, Electrical Engineering, Statistics, and/or Operations Research. Each lecture will feature a couple of Mathematical Open Problem(s) related to Data Science. The main mathematical tools used will be Probability and Linear Algebra, and a basic familiarity with these subjects is required. There will also be some Graph Theory, Representation Theory, Applied Harmonic Analysis, among others (although knowledge of these tools is not assumed). The topics treated will include Dimension reduction, Manifold learning, Sparse recovery, Random Matrices, Approximation Algorithms, Community detection in graphs, and several others.
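As a hedged illustration of the dimension-reduction topic mentioned above (not of the course's own material), the sketch below applies a Gaussian random projection to high-dimensional points and checks how well pairwise distances are preserved; the dimensions and sample sizes are arbitrary.

```python
import numpy as np
from scipy.spatial.distance import pdist

# Illustrative sketch: Gaussian random projection for dimension reduction.
rng = np.random.default_rng(0)
n, d, k = 200, 10_000, 400                 # 200 points in R^10000, projected to R^400
X = rng.normal(size=(n, d))

P = rng.normal(size=(d, k)) / np.sqrt(k)   # random projection, scaled to preserve norms on average
Y = X @ P

ratios = pdist(Y) / pdist(X)               # ratio of projected to original pairwise distances
print("distance distortion: min %.3f, max %.3f" % (ratios.min(), ratios.max()))
```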
Lecture notesLink
Prerequisites / NoticeThe main mathematical tools used will be Probability, Linear Algebra (and real analysis), and a working knowledge of these subjects is required. In addition to these prerequisites, this class requires a certain degree of mathematical maturity, including abstract thinking and the ability to understand and write proofs.


We encourage students who are interested in mathematical data science to take both this course and "227-0434-10L Mathematics of Information" taught by Prof. H. Bölcskei. The two courses are designed to be complementary.
A. Bandeira and H. Bölcskei
401-6102-00LMultivariate Statistics
Does not take place this semester.
W4 credits2Gnot available
AbstractMultivariate Statistics deals with joint distributions of several random variables. This course introduces the basic concepts and provides an overview over classical and modern methods of multivariate statistics. We will consider the theory behind the methods as well as their applications.
ObjectiveAfter the course, you should be able to:
- describe the various methods and the concepts and theory behind them
- identify adequate methods for a given statistical problem
- use the statistical software "R" to efficiently apply these methods
- interpret the output of these methods
ContentVisualization / Principal component analysis / Multidimensional scaling / The multivariate Normal distribution / Factor analysis / Supervised learning / Cluster analysis
Lecture notesNone
LiteratureThe course will be based on class notes and books that are available electronically via the ETH library.
Prerequisites / NoticeTarget audience: This course is the more theoretical version of "Applied Multivariate Statistics" (401-0102-00L) and is targeted at students with a math background.

Prerequisite: A basic course in probability and statistics.

Note: The courses 401-0102-00L and 401-6102-00L are mutually exclusive. You may register for at most one of these two course units.
402-0448-01LQuantum Information Processing I: Concepts
This theory part QIP I together with the experimental part 402-0448-02L QIP II (both offered in the Spring Semester) combine to form the core course in experimental physics "Quantum Information Processing" (10 ECTS credits in total). This applies to the Master's degree programme in Physics.
W5 credits2V + 1UP. Kammerlander
AbstractThe course will cover the key concepts and ideas of quantum information processing, including descriptions of quantum algorithms which give the quantum computer the power to compute problems outside the reach of any classical supercomputer.
Key concepts such as quantum error correction will be described. These ideas provide fundamental insights into the nature of quantum states and measurement.
ObjectiveBy the end of the course students are able to explain the basic mathematical formalism of quantum mechanics and apply it to quantum information processing problems. They are able to adapt and apply these concepts and methods to analyse and discuss quantum algorithms and other quantum information-processing protocols.
ContentThe topics covered in the course will include quantum circuits, gate decomposition and universal sets of gates, efficiency of quantum circuits, quantum algorithms (Shor, Grover, Deutsch-Jozsa, ...), error correction, fault-tolerant design, entanglement, teleportation and dense coding, teleportation of gates, and cryptography.
Lecture notesMore details to follow.
LiteratureQuantum Computation and Quantum Information
Michael Nielsen and Isaac Chuang
Cambridge University Press
Prerequisites / NoticeA good understanding of linear algebra is recommended.
701-0104-00LStatistical Modelling of Spatial DataW3 credits2GA. J. Papritz
AbstractIn environmental sciences one often deals with spatial data. When analysing such data the focus is either on exploring their structure (dependence on explanatory variables, autocorrelation) and/or on spatial prediction. The course provides an introduction to geostatistical methods that are useful for such analyses.
ObjectiveThe course will provide an overview of the basic concepts and stochastic models that are used to model spatial data. In addition, participants will learn a number of geostatistical techniques and acquire familiarity with R software that is useful for analyzing spatial data.
ContentAfter an introductory discussion of the types of problems and the kind of data that arise in environmental research, an introduction into linear geostatistics (models: stationary and intrinsic random processes, modelling large-scale spatial patterns by linear regression, modelling autocorrelation by variogram; kriging: mean square prediction of spatial data) will be taught. The lectures will be complemented by data analyses that the participants have to do themselves.
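The following is a hedged sketch of one concept mentioned above, the empirical variogram: it simulates a stationary random field with an exponential covariance and computes binned semivariances. The course itself works in R; this Python version and all parameter values are purely illustrative.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

# Illustrative sketch (the course uses R): simulate a stationary random field with
# exponential covariance and compute the empirical (binned) semivariogram.
rng = np.random.default_rng(1)
n, sill, range_par, nugget = 400, 2.0, 0.3, 0.2        # illustrative variogram parameters
coords = rng.uniform(size=(n, 2))                      # random locations in the unit square

h = squareform(pdist(coords))                          # pairwise distance matrix
C = sill * np.exp(-h / range_par) + nugget * np.eye(n) # covariance with a nugget effect
z = np.linalg.cholesky(C) @ rng.normal(size=n)         # one realisation of the field

# Empirical semivariogram: average of 0.5*(z_i - z_j)^2 within distance bins.
d_ij = pdist(coords)
g_ij = 0.5 * pdist(z[:, None], metric="sqeuclidean")
bins = np.linspace(0, 0.8, 9)
which = np.digitize(d_ij, bins)
for b in range(1, len(bins)):
    mask = which == b
    if mask.any():
        print(f"lag {(bins[b-1] + bins[b]) / 2:.2f}: gamma = {g_ij[mask].mean():.2f}")
```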
Lecture notesSlides, descriptions of the problems for the data analyses and solutions to them will be provided.
LiteratureP.J. Diggle & P.J. Ribeiro Jr. 2007. Model-based Geostatistics. Springer.
Prerequisites / NoticeFamiliarity with linear regression analysis (e.g. equivalent to the first part of the course 401-0649-00L Applied Statistical Regression) and with the software R (e.g. 401-6215-00L Using R for Data Analysis and Graphics (Part I), 401-6217-00L Using R for Data Analysis and Graphics (Part II)) is required for attending the course.
Interdisciplinary Electives
NumberTitleTypeECTSHoursLecturers
101-0478-00LMeasurement and Modelling of Travel BehaviourW6 credits4GK. W. Axhausen
AbstractComprehensive introduction to survey methods in transport planning and modeling of travel behavior, using advanced discrete choice models.
ObjectiveEnabling the student to understand and apply the various measurement approaches and models for modelling travel behaviour.
ContentBehavioral model and measurement; travel diary, design process, hypothetical markets, discrete choice model, parameter estimation, pattern of travel behaviour, market segments, simulation, advanced discrete choice models
Lecture notesVarious papers and notes are distributed during the course.
103-0228-00LMultimedia Cartography
Prerequisite: Successful completion of Cartography III (103-0227-00L).
W4 credits3GR. Sieber
AbstractFocus of this course is on the realization of an atlas project in a small team. During the first part of the course, the necessary organizational, creative and technological basics will be provided. At the end of the course, the interactive atlas projects will be presented by the team members.
ObjectiveThe goal of this course is to provide the students the theoretical background, knowledge and practical skills necessary to plan, design and create an interactive Web atlas based on modern Web technologies.
ContentThis course will cover the following topics:

- Web map design
- Project management
- Graphical user interfaces in Web atlases
- Interactions in map and atlas applications
- Web standards
- Programming interactive Web applications
- Use of software libraries
- Cartographic Web services
- Code repository
- Copyright and the Internet
Lecture notesLecture notes and additional material are available on Moodle.
Literature- Cartwright, William; Peterson, Michael P. and Georg Gartner (2007); Multimedia Cartography, Springer, Heidelberg
Prerequisites / NoticePrerequisites: Successful completion of Cartography III (103-0227-00L).
Previous knowledge in Web programming.

The students are expected to
- present their work in progress on a regular basis
- present their atlas project at the end of the course
- keep records of all the work done
- document all individual contributions to the project
103-0247-00LMobile GIS and Location-Based ServicesW5 credits4GP. Kiefer
AbstractThe course introduces students to the theoretical and technological background of mobile geographic information systems and location-based services. In lab sessions students acquire competences in mobile GIS design and implementation.
ObjectiveStudents will
- learn about the implications of mobility on GIS
- get a detailed overview on research fields related to mobile GIS
- get an overview on current mobile GIS and LBS technology, and learn how to assess new technologies in this fast-moving field
- achieve an integrated view of Geospatial Web Services and mobile GIS
- acquire competences in mobile GIS design and implementation
Content- LBS and mobile GIS: architectures, market, applications, and application development
- Development for Android
- Introduction to augmented reality development
- Mobile decision-making, context, personalization, and privacy
- Mobile human computer interaction and user interfaces
- Mobile behavior interpretation
Prerequisites / NoticeElementary programming skills (Java)
227-0945-10LCell and Molecular Biology for Engineers II
This course is part II of a two-semester course.
Knowledge of part I is required.
W3 credits2GC. Frei
AbstractThe course gives an introduction into cellular and molecular biology, specifically for students with a background in engineering. The focus will be on the basic organization of eukaryotic cells, molecular mechanisms and cellular functions. Textbook knowledge will be combined with results from recent research and technological innovations in biology.
ObjectiveAfter completing this course, engineering students will be able to apply their previous training in the quantitative and physical sciences to modern biology. Students will also learn the principles how biological models are established, and how these models can be tested.
ContentLectures will include the following topics (part I and II): DNA, chromosomes, genome engineering, RNA, proteins, genetics, synthetic biology, gene expression, membrane structure and function, vesicular traffic, cellular communication, energy conversion, cytoskeleton, cell cycle, cellular growth, apoptosis, autophagy, cancer and stem cells.

In addition, 4 journal clubs will be held, where recent publications will be discussed (2 journal clubs in part I and 2 journal clubs in part II). For each journal club, students (alone or in groups of up to three students) have to write a summary and discussion of the publication. These written documents will be graded and count as 40% for the final grade.
Lecture notesScripts of all lectures will be available.
Literature"Molecular Biology of the Cell" (6th edition) by Alberts, Johnson, Lewis, Morgan, Raff, Roberts, and Walter.
227-0391-00LMedical Image Analysis
Basic knowledge of computer vision would be helpful.
W3 credits2GE. Konukoglu, M. A. Reyes Aguirre
AbstractIt is the objective of this lecture to introduce the basic concepts used
in Medical Image Analysis. In particular the lecture focuses on shape
representation schemes, segmentation techniques, machine learning based predictive models and various image registration methods commonly used in Medical Image Analysis applications.
ObjectiveThis lecture aims to give an overview of the basic concepts of Medical Image Analysis and its application areas.
Prerequisites / NoticePrerequisites:
Basic concepts of mathematical analysis and linear algebra.

Preferred:
Basic knowledge of computer vision and machine learning would be helpful.

The course will be held in English.
261-5113-00LComputational Challenges in Medical Genomics Information Restricted registration - show details
Number of participants limited to 20.
W2 credits2SA. Kahles, G. Rätsch
AbstractThis seminar discusses recent relevant contributions to the fields of computational genomics, algorithmic bioinformatics, statistical genetics and related areas. Each participant will hold a presentation and lead the subsequent discussion.
ObjectivePreparing and holding a scientific presentation in front of peers is a central part of working in the scientific domain. In this seminar, the participants will learn how to efficiently summarize the relevant parts of a scientific publication, critically reflect on its contents, and summarize it for presentation to an audience. The necessary skills to successfully present the key points of existing research work are the same as those needed to communicate one's own research ideas.
In addition to holding a presentation, each student will both contribute to as well as lead a discussion section on the topics presented in the class.
ContentThe topics covered in the seminar are related to recent computational challenges that arise from the fields of genomics and biomedicine, including but not limited to genomic variant interpretation, genomic sequence analysis, compressive genomics tasks, single-cell approaches, privacy considerations, statistical frameworks, etc.
Both recently published works contributing novel ideas to the areas mentioned above as well as seminal contributions from the past are amongst the list of selected papers.
Prerequisites / NoticeKnowledge of algorithms and data structures and interest in applications in genomics and computational biomedicine.
261-5120-00LMachine Learning for Health Care Information Restricted registration - show details
Number of participants limited to 150.
W5 credits3P + 1AV. Boeva, G. Rätsch, J. Vogt
AbstractThe course will review the most relevant methods and applications of Machine Learning in Biomedicine, discuss the main challenges they present and their current technical problems.
ObjectiveDuring the last years, we have observed a rapid growth in the field of Machine Learning (ML), mainly due to improvements in ML algorithms, the increase of data availability and a reduction in computing costs. This growth is having a profound impact on biomedical applications, where the great variety of tasks and data types enables us to benefit from ML algorithms in many different ways. In this course we will review the most relevant methods and applications of ML in biomedicine, discuss the main challenges they present and their current technical solutions.
ContentThe course will consist of four topic clusters that will cover the most relevant applications of ML in Biomedicine:
1) Structured time series: Temporal time series of structured data often appear in biomedical datasets, presenting challenges as containing variables with different periodicities, being conditioned by static data, etc.
2) Medical notes: Vast amounts of medical observations are stored in the form of free text; we will analyze strategies for extracting knowledge from them.
3) Medical images: Images are a fundamental piece of information in many medical disciplines. We will study how to train ML algorithms with them.
4) Genomics data: ML in genomics is still an emerging subfield, but given that genomics data are arguably the most extensive and complex datasets that can be found in biomedicine, it is expected that many relevant ML applications will arise in the near future. We will review and discuss current applications and challenges.
Prerequisites / NoticeData Structures & Algorithms, Introduction to Machine Learning, Statistics/Probability, Programming in Python, Unix Command Line

Relation to Course 261-5100-00 Computational Biomedicine: This course is a continuation of the previous course with new topics related to medical data and machine learning. The format of Computational Biomedicine II will also be different. It is helpful but not essential to attend Computational Biomedicine before attending Computational Biomedicine II.
262-0200-00LBayesian Phylodynamics – Taming the BEASTW4 credits2G + 2AT. Stadler, T. Vaughan
AbstractHow fast is COVID-19 spreading at the moment? How fast was Ebola spreading in West Africa? Where and when did these epidemic outbreaks start? How can we construct the phylogenetic tree of great apes, and did gene flow occur between different apes? At the end of the course, students will have designed, performed, presented, and discussed their own phylodynamic data analysis to answer such questions.
ObjectiveAttendees will extend their knowledge of Bayesian phylodynamics obtained in the “Computational Biology” class (636-0017-00L) and will learn how to apply this theory to real world data. The main theoretical concepts introduced are:
* Bayesian statistics
* Phylogenetic and phylodynamic models
* Markov Chain Monte Carlo methods
Attendees will apply these concepts to a number of applications yielding biological insight into:
* Epidemiology
* Pathogen evolution
* Macroevolution of species
ContentDuring the first part of the block course, the theoretical concepts of Bayesian phylodynamics will be presented by us as well as leading international researchers in that area. The presentations will be followed by attendees using the software package BEAST v2 to apply these theoretical concepts to empirical data. We will use previously published datasets on e.g. COVID-19, Ebola, Zika, Yellow Fever, Apes, and Penguins for analysis. Examples of these practical tutorials are available on Link.
In the second part of the block course, students choose an empirical dataset of genetic sequencing data and possibly some non-genetic metadata. They then design and conduct a research project in which they perform Bayesian phylogenetic analyses of their dataset. A final written report on the research project has to be submitted after the block course for grading.
Lecture notesAll material will be available on Link.
LiteratureThe following books provide excellent background material:
• Drummond, A. & Bouckaert, R. 2015. Bayesian evolutionary analysis with BEAST.
• Yang, Z. 2014. Molecular Evolution: A Statistical Approach.
• Felsenstein, J. 2003. Inferring Phylogenies.
More detailed information is available on Link.
Prerequisites / NoticeThis class builds upon the content which we teach in the Computational Biology class (636-0017-00L). Attendees must have either taken the Computational Biology class or acquired the content elsewhere.
636-0702-00LStatistical Models in Computational BiologyW6 credits2V + 1U + 2AN. Beerenwinkel
AbstractThe course offers an introduction to graphical models and their application to complex biological systems. Graphical models combine a statistical methodology with efficient algorithms for inference in settings of high dimension and uncertainty. The unifying graphical model framework is developed and used to examine several classical and topical computational biology methods.
ObjectiveThe goal of this course is to establish the common language of graphical models for applications in computational biology and to see this methodology at work for several real-world data sets.
ContentGraphical models are a marriage between probability theory and graph theory. They combine the notion of probabilities with efficient algorithms for inference among many random variables. Graphical models play an important role in computational biology, because they explicitly address two features that are inherent to biological systems: complexity and uncertainty. We will develop the basic theory and the common underlying formalism of graphical models and discuss several computational biology applications. Topics covered include conditional independence, Bayesian networks, Markov random fields, Gaussian graphical models, EM algorithm, junction tree algorithm, model selection, Dirichlet process mixture, causality, the pair hidden Markov model for sequence alignment, probabilistic phylogenetic models, phylo-HMMs, microarray experiments and gene regulatory networks, protein interaction networks, learning from perturbation experiments, time series data and dynamic Bayesian networks. Some of the biological applications will be explored in small data analysis problems as part of the exercises.
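Since the EM algorithm is among the listed topics, here is a hedged, self-contained sketch (not course material) of EM for a two-component univariate Gaussian mixture on synthetic data; the initial values and data-generating parameters are invented.

```python
import numpy as np
from scipy.stats import norm

# Illustrative EM sketch for a two-component univariate Gaussian mixture.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.5, 700)])     # synthetic data

w, mu, sd = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])   # initial guess
for _ in range(200):
    # E-step: posterior responsibility of each component for each observation
    dens = w * norm.pdf(x[:, None], loc=mu, scale=sd)
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: update weights, means and standard deviations
    nk = resp.sum(axis=0)
    w = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print("weights:", w.round(2), "means:", mu.round(2), "sds:", sd.round(2))
```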
Lecture notesno
Literature- Airoldi EM (2007) Getting started in probabilistic graphical models. PLoS Comput Biol 3(12): e252. doi:10.1371/journal.pcbi.0030252
- Bishop CM. Pattern Recognition and Machine Learning. Springer, 2007.
- Durbin R, Eddy S, Krogh A, Mitchison G. Biological Sequence Analysis. Cambridge University Press, 2004
263-3501-00LFuture Internet Information
Takes place for the last time!
W7 credits1V + 1U + 4AA. Singla
AbstractThis course will discuss recent advances in networking, with a focus on the Internet, with topics ranging from the algorithmic design of applications like video streaming to the likely near-future of satellite-based networking.
ObjectiveThe goals of the course are to build on basic undergraduate-level networking, and provide an understanding of the tradeoffs and existing technology in the design of large, complex networked systems, together with concrete experience of the challenges through a series of lab exercises.
ContentThe focus of the course is on principles, architectures, protocols, and applications used in modern networked systems. Example topics include:

- How video streaming services like Netflix work, and research on improving their performance.
- How Web browsing could be made faster
- How the Internet's protocols are improving
- Exciting developments in satellite-based networking (ala SpaceX)
- The role of data centers in powering Internet services

A series of programming assignments will form a substantial part of the course grade.
Lecture notesLecture slides will be made available at the course Web site: Link
LiteratureNo textbook is required, but there will be regularly assigned readings from the research literature, linked to the course Web site: Link.
Prerequisites / NoticeAn undergraduate class covering the basics of networking, such as Internet routing and TCP. At ETH, Computer Networks (252-0064-00L) and Communication Networks (227-0120-00L) suffice. Similar courses from other universities are acceptable too.
261-5111-00LAsset Management: Advanced Investments (University of Zurich)
Does not take place this semester.
No enrolment to this course at ETH Zurich. Book the corresponding module directly at UZH.
UZH Module Code: MFOEC207

Mind the enrolment deadlines at UZH:
Link
W3 credits2VUniversity lecturers
AbstractComprehension and application of advanced portfolio theory
ObjectiveComprehension and application of advanced portfolio theory
ContentThe theoretical part of the lecture consists of the topics listed below.

- Standard Markowitz Model and Extensions
MV Optimization, MV with Liabilities and CAPM.
- The Crux with MV
Resampling, regression, Black-Litterman, Bayesian, shrinkage, constrained and robust optimization.
- Downside and Coherent Risk Measures
Definition of risk measures, MV optimization under VaR and ES constraints.
- Risk Budgeting
Equal risk contribution, most diversified portfolio and other concentration indices
- Regime Switching and Asset Allocation
An introduction to regime switching models and its intuition.
- Strategic Asset Allocation
Introducing a continuous-time framework, solving the HJB equation and the classical Merton problem.
363-1000-00LFinancial EconomicsW3 credits2VA. Bommier, C. Daminato
AbstractThis is a theoretical course on the economics of financial decision making, at the crossroads between Microeconomics and Finance. It discusses portfolio choice theory, risk sharing, market equilibrium and asset pricing.
ObjectiveThe objective is to make students familiar with the economics of financial decision making and to develop their intuition regarding the determination of asset prices and the notion of optimal risk sharing. However, this is not a practical training for traders. Moreover, the lecture doesn't cover topics such as market irrationality or systemic risk.

After completing this course:
1. Students will be familiar with the economics of financial decision making and develop their intuition regarding the determination of asset prices;
2. Students will understand the intuition of market equilibrium. They will be able to solve the market equilibrium in a simple model and derive the prices of assets.
3. Students will be familiar with the representation of attitudes towards risk. They will be able to explain how risk, wealth and agents’ preferences affect the demand for assets.
4. Students will understand the notion of risk diversification.
5. Students will understand the notion of optimal risk sharing.
ContentThe following topics will be discussed:
1. Introduction to financial assets: The first lecture provides an overview of most common financial assets. We will also discuss the formation of asset prices and the role of markets in the valuation of these assets.

2. Option valuation: this lecture focuses on options, which are a certain type of financial asset. You will learn about arbitrage, which is a key notion to understand the valuation of options. This lecture will give you the intuition of the mechanisms underlying the pricing of assets in more general settings.

3. Introduction to the economic analysis of asset markets: this chapter will familiarize you with the notion of market equilibrium and the role it plays concerning asset pricing. Relying on economic theory, we will consider the properties of the market equilibrium: In which cases does the equilibrium exist? Is it optimal? How does it depend on individual’s wealth and preferences? The concepts defined in this chapter are essential to understand the following parts of the course.

4. A simplified approach to asset markets: based on the notions introduced in the previous lectures, you will learn about the key concepts necessary to understand financial markets, such as market completeness and the no-arbitrage theorem.

5. Choice under uncertainty: this class covers fundamental concepts concerning agents’ decisions when facing risk. These models are crucial to understand how the demand for financial assets originates.

6. Demand for risk: Building up on the previous chapters, we will study portfolio choice in a simplified setting. We will discuss how asset demand varies with risk, agent’s preferences and wealth.

7. Asset prices in a simplified context: We will focus on the portfolio choices of an investor, in a particular setting called mean-variance analysis. The mean-variance analysis will be a first step to introduce the notion of risk diversification, which is essential in finance.

8. Risk sharing and insurance: in this lecture, you will understand that risk can be shared among different agents and how, under certain conditions, this sharing can be optimal. You will learn about the distinction between individual idiosyncratic risk and macroeconomic risk.

9. Risk sharing and asset prices in a market equilibrium: this course builds up on previous lessons and presents the consumption-based Capital Asset Pricing Model (CAPM). The focus will be on how consumption, assets and prices are determined in equilibrium.
LiteratureMain reading material:

- "Investments", by Z. Bodie, A. Kane and A. Marcus, for the
introductory part of the course (see chapters 20 and 21 in
particular).
- "Finance and the Economics of Uncertainty" by G. Demange and G. Laroque, Blackwell, 2006.
- "The Economics of Risk and Time", by C. Gollier, MIT Press, 2001.

Other readings:
- "Intermediate Financial Theory" by J.-P. Danthine and J.B. Donaldson.
- Ingersoll, J., E., Theory of Financial Decision Making, Rowman and Littlefield Publishers.
- Leroy S and J. Werner, Principles of Financial Economics, Cambridge University Press, 2001
Prerequisites / NoticeBasic mathematical skills are needed (calculus, linear algebra, convex analysis). Students must be able to solve simple optimization problems (e.g. Lagrangian methods). Some knowledge of microeconomics would help but is not compulsory. The basics will be covered in class.
363-1091-00LSocial Data ScienceW3 credits2GD. Garcia Becerra
AbstractSocial Data Science is introduced as a set of techniques to analyze human behaviour and social interaction through digital traces.
The course focuses both on the fundamentals and applications of Data Science in the Social Sciences, including technologies for data retrieval, processing, and analysis with the aim to derive insights that are interpretable from a wider theoretical perspective.
ObjectiveA successful participant of this course will be able to
- understand a wide variety of techniques to retrieve digital trace data from online data sources
- store, process, and summarize online data for quantitative analysis
- perform statistical analyses to test hypotheses, derive insights, and formulate predictions
- implement streamlined software that integrates data retrieval, processing, statistical analysis, and visualization
- interpret the results of data analysis with respect to theoretical and testable principles of human behavior
- understand the limitations of observational data analysis with respect to data volume, statistical power, and external validity
ContentSocial Data Science (SDS) provides a broad approach to the quantitative analysis of human behavior through digital trace data.
SDS integrates the implementation of data retrieval and processing, the application of statistical analysis methods, and the interpretation of results to derive insights into human behavior at high resolutions and large scales.
The motivation of SDS stems from theories in the Social Sciences, which are addressed with respect to societal phenomena and formulated as principles that can be tested against empirical data.
Data retrieval in SDS is performed in an automated manner, accessing online databases and programming interfaces that capture the digital traces of human behavior.
Data processing is computerized with calibrated methods that quantify human behavior, for example constructing social networks or measuring emotional expression.
These quantities are used in statistical analyses to both test hypotheses and explore new aspects of human behavior.

The course starts with an introduction to Social Data Science and the R statistical language, followed by three content blocks: collective behavior, sentiment analysis, and social network analysis. The course ends with a datathon that sets the starting point of final student projects.

The course will cover various examples of the application of SDS:
- Search trends to measure information seeking
- Popularity and social impact
- Evaluation of sentiment analysis techniques
- Quantitative analysis of emotions and social media sharing
- Twitter social network analysis

The lectures include theoretical foundations of the application of digital trace data in the Social Sciences, as well as practical examples of data retrieval, processing, and analysis cases in the R statistical language from a literate programming perspective.
The block course contains lectures and exercise sessions during the morning and afternoon of five days.
Exercise classes provide practical skills and discuss the solutions to exercises that build on the concepts and methods presented in the previous lectures.
Lecture notesThe lecture slides will be available on the Moodle platform, for registered students only.
LiteratureSee handouts. Specific literature is provided for download, for registered students only.
Prerequisites / NoticeParticipants of the course should have some basic background in statistics and programming, and an interest to learn about human behavior from a quantitative perspective.

Prior knowledge of advanced R, information retrieval, or information systems is not necessary.

Exercise sessions build on technical and theoretical content explained in the lectures. Students need a working laptop with Internet access to perform guided exercises.
363-1100-00LRisk Case Study Challenge Restricted registration - show details
Does not take place this semester.
W3 credits2S
AbstractThis seminar provides master students at ETH with the challenging opportunity to work on a real risk-modelling and risk-management case in close collaboration with a Risk Center Corporate Partner. The Corporate Partner for the Spring 2021 Edition will be announced soon.
ObjectiveDuring the challenge students acquire a basic understanding of
o The insurance and reinsurance business
o Risk management and risk modelling
o The role of operational risk management

as well as learn to frame a real risk-related business case together with a case manager from the Corporate Partner. Students learn to coordinate as a group. They also learn to integrate and learn from business insights in order to elaborate a solution for their case. Finally, students communicate their solution to an assembly of professionals from the Corporate Partner.
ContentStudents work on a real-world, risk-related case. The case is based on a business-relevant topic. Topics are provided by experts from the Risk Center's Corporate Partners. While gaining substantial insights into the industry's risk modeling and management, students explore the case or problem on their own. They work in teams and develop solutions. The cases allow students to use logical problem-solving skills with an emphasis on evidence and application. Cases offer students the opportunity to apply their scientific knowledge. Typically, the risk-related cases can be complex, contain ambiguities, and may be addressed in more than one way. During the seminar, students visit the Corporate Partner’s headquarters, conduct interviews with members of the management team as well as internal and external experts, and finally present their results in a professional manner.
Prerequisites / NoticePlease apply for this course via the official website (Link). Apply no later than February 20, 2021.
The number of participants is limited to 16.
401-3629-00LQuantitative Risk Management Information W4 credits2V + 1UP. Cheridito
AbstractThis course introduces methods from probability theory and statistics that can be used to model financial risks. Topics addressed include loss distributions, risk measures, extreme value theory, multivariate models, copulas, dependence structures and operational risk.
ObjectiveThe goal is to learn the most important methods from probability theory and statistics used in financial risk modeling.
Content1. Introduction
2. Basic Concepts in Risk Management
3. Empirical Properties of Financial Data
4. Financial Time Series
5. Extreme Value Theory
6. Multivariate Models
7. Copulas and Dependence
8. Operational Risk
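As a hedged illustration of two of the topics above, empirical risk measures and extreme value theory, the sketch below computes Value-at-Risk and Expected Shortfall from simulated losses and fits a generalized Pareto distribution to threshold exceedances; the loss distribution, confidence level and threshold are invented.

```python
import numpy as np
from scipy.stats import genpareto

# Illustrative sketch: empirical VaR/ES and a peaks-over-threshold GPD tail fit.
rng = np.random.default_rng(0)
losses = rng.standard_t(4, size=50_000)               # heavy-tailed synthetic losses

alpha = 0.99
var_emp = np.quantile(losses, alpha)                  # empirical Value-at-Risk
es_emp = losses[losses > var_emp].mean()              # empirical Expected Shortfall

# Peaks-over-threshold: fit a GPD to exceedances over a high threshold.
u = np.quantile(losses, 0.95)
exceedances = losses[losses > u] - u
xi, loc, beta = genpareto.fit(exceedances, floc=0.0)  # fix location at 0 for the POT model

print(f"VaR {alpha:.0%}: {var_emp:.2f}, ES {alpha:.0%}: {es_emp:.2f}, GPD shape xi: {xi:.2f}")
```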
Lecture notesCourse material is available on Link
LiteratureQuantitative Risk Management: Concepts, Techniques and Tools
AJ McNeil, R Frey and P Embrechts
Princeton University Press, Princeton, 2015 (Revised Edition)
Link
Prerequisites / NoticeThe course corresponds to the Risk Management requirement for the SAA ("Aktuar SAV Ausbildung") as well as for the Master of Science UZH-ETH in Quantitative Finance.
401-3888-00LIntroduction to Mathematical Finance Information
A related course is 401-3913-01L Mathematical Foundations for Finance (3V+2U, 4 ECTS credits). Although both courses can be taken independently of each other, only one will be recognised for credits in the Bachelor and Master degree. In other words, it is not allowed to earn credit points with one for the Bachelor and with the other for the Master degree.
W10 credits4V + 1UD. Possamaï
AbstractThis is an introductory course on the mathematics for investment, hedging, portfolio management, asset pricing and financial derivatives in discrete-time financial markets. We discuss arbitrage, completeness, risk-neutral pricing and utility maximisation. We prove the fundamental theorem of asset pricing and the hedging duality theorems, and also study convex duality in utility maximization.
ObjectiveThis is an introductory course on the mathematics for investment, hedging, portfolio management, asset pricing and financial derivatives in discrete-time financial markets. We discuss arbitrage, completeness, risk-neutral pricing and utility maximisation, and maybe other topics. We prove the fundamental theorem of asset pricing and the hedging duality theorems in discrete time, and also study convex duality in utility maximization.
ContentThis course focuses on discrete-time financial markets. It presumes a knowledge of measure-theoretic probability theory (as taught e.g. in the course "Probability Theory"). The course is offered every year in the Spring semester.

This course is the first of a sequence of two courses on mathematical finance. The second course "Mathematical Finance" (MF II), 401-4889-00, focuses on continuous-time models. It is advisable that the present course, MF I, is taken prior to MF II.

For an overview of courses offered in the area of mathematical finance, see Link.
Lecture notesThe course is based on different parts from different textbooks as well as on original research literature. Lecture notes will not be available.
LiteratureLiterature:

Michael U. Dothan, "Prices in Financial Markets", Oxford University Press

Hans Föllmer and Alexander Schied, "Stochastic Finance: An Introduction in Discrete Time", de Gruyter

Marek Capinski and Ekkehard Kopp, "Discrete Models of Financial Markets", Cambridge University Press

Robert J. Elliott and P. Ekkehard Kopp, "Mathematics of Financial Markets", Springer
Prerequisites / NoticeA related course is "Mathematical Foundations for Finance" (MFF), 401-3913-01. Although both courses can be taken independently of each other, only one will be given credit points for the Bachelor and the Master degree. In other words, it is also not possible to earn credit points with one for the Bachelor and with the other for the Master degree.

This course is the first of a sequence of two courses on mathematical finance. The second course "Mathematical Finance" (MF II), 401-4889-00, focuses on continuous-time models. It is advisable that the present course, MF I, is taken prior to MF II.

For an overview of courses offered in the area of mathematical finance, see Link.
401-3936-00LData Analytics for Non-Life Insurance Pricing Restricted registration - show details W4 credits2VC. M. Buser, M. V. Wüthrich
AbstractWe study statistical methods in supervised learning for non-life insurance pricing such as generalized linear models, generalized additive models, Bayesian models, neural networks, classification and regression trees, random forests and gradient boosting machines.
ObjectiveThe student is familiar with classical actuarial pricing methods as well as with modern machine learning methods for insurance pricing and prediction.
ContentWe present the following chapters:
- generalized linear models (GLMs)
- generalized additive models (GAMs)
- neural networks
- credibility theory
- classification and regression trees (CARTs)
- bagging, random forests and boosting
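A hedged sketch of the first chapter's idea, a Poisson GLM for claims frequencies with an exposure offset, fitted to synthetic data; the rating factors, coefficients and the use of Python/statsmodels are illustrative assumptions and not the course's own material or software.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Illustrative Poisson GLM for claim counts with log link and an exposure offset.
rng = np.random.default_rng(0)
n = 20_000
df = pd.DataFrame({
    "age": rng.integers(18, 80, n),
    "urban": rng.integers(0, 2, n),
    "exposure": rng.uniform(0.2, 1.0, n),          # policy years at risk
})
lam = df["exposure"] * np.exp(-2.0 + 0.01 * (60 - df["age"]) + 0.4 * df["urban"])
df["claims"] = rng.poisson(lam)                    # synthetic claim counts

glm = smf.glm("claims ~ age + urban", data=df,
              family=sm.families.Poisson(),
              exposure=df["exposure"]).fit()       # exposure enters as log-offset
print(glm.params)                                  # estimated frequency-model coefficients
```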
Lecture notesThe lecture notes are available from:
Link
Prerequisites / NoticeThis course will be held in English and counts towards the diploma of "Aktuar SAV".
For the latter, see details under Link

Good knowledge in probability theory, stochastic processes and statistics is assumed.
401-4658-00LComputational Methods for Quantitative Finance: PDE Methods Information Restricted registration - show details W6 credits3V + 1UC. Marcati, A. Stein
AbstractIntroduction to principal methods of option pricing. Emphasis on PDE-based methods. Prerequisites: MATLAB and Python programming and knowledge of numerical mathematics at ETH BSc level.
ObjectiveIntroduce the main methods for efficient numerical valuation of derivative contracts in a Black-Scholes market as well as in incomplete markets due to Levy processes or stochastic volatility models. Develop implementations of pricing methods in MATLAB and Python. Finite-Difference/Finite-Element based methods for the solution of the pricing integro-differential equation.
Content1. Review of option pricing. Wiener and Levy price process models. Deterministic, local and stochastic
volatility models.
2. Finite Difference Methods for option pricing. Relation to bi- and multinomial trees. European contracts (a minimal sketch is given after this list).
3. Finite Difference methods for Asian, American and Barrier type contracts.
4. Finite element methods for European and American style contracts.
5. Pricing under local and stochastic volatility in Black-Scholes Markets.
6. Finite Element Methods for option pricing under Levy processes. Treatment of
integrodifferential operators.
7. Stochastic volatility models for Levy processes.
8. Techniques for multidimensional problems. Baskets in a Black-Scholes setting and
stochastic volatility models in Black Scholes and Levy markets.
9. Introduction to sparse grid option pricing techniques.
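As referenced in item 2 above, here is a hedged sketch of an explicit finite-difference scheme for a European call in the Black-Scholes model; grid sizes and market parameters are invented, and the course's own schemes and code may differ.

```python
import numpy as np

# Illustrative explicit finite-difference scheme for a European call in the
# Black-Scholes model, marching backwards in time from the payoff.
S_max, K, r, sigma, T = 200.0, 100.0, 0.05, 0.2, 1.0
M, N = 200, 4000                         # space and time steps (N large enough for stability)
dS, dt = S_max / M, T / N
j = np.arange(M + 1)                     # grid indices, S_j = j * dS

V = np.maximum(j * dS - K, 0.0)          # terminal condition: call payoff
a = 0.5 * dt * (sigma**2 * j**2 - r * j)
b = 1.0 - dt * (sigma**2 * j**2 + r)
c = 0.5 * dt * (sigma**2 * j**2 + r * j)

for n in range(N):                       # time-marching from maturity back to today
    V_new = V.copy()
    V_new[1:-1] = a[1:-1] * V[:-2] + b[1:-1] * V[1:-1] + c[1:-1] * V[2:]
    V_new[0] = 0.0                                        # S = 0 boundary
    V_new[-1] = S_max - K * np.exp(-r * (n + 1) * dt)     # large-S boundary
    V = V_new

print("price at S = 100:", V[M // 2])    # close to the exact Black-Scholes value of about 10.45
```

Explicit schemes like this one need a restrictive time-step for stability; implicit or finite-element based discretizations, as treated in the course, relax that restriction.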
Lecture notesThere will be English lecture notes as well as MATLAB or Python software for registered participants in the course.
LiteratureMain reference (course text):
N. Hilber, O. Reichmann, Ch. Schwab and Ch. Winter: Computational Methods for Quantitative Finance, Springer Finance, Springer, 2013.

Supplementary texts:
R. Cont and P. Tankov : Financial Modelling with Jump Processes, Chapman and Hall Publ. 2004.

Y. Achdou and O. Pironneau : Computational Methods for Option Pricing, SIAM Frontiers in Applied Mathematics, SIAM Publishers, Philadelphia 2005.

D. Lamberton and B. Lapeyre : Introduction to stochastic calculus Applied to Finance (second edition), Chapman & Hall/CRC Financial Mathematics Series, Taylor & Francis Publ. Boca Raton, London, New York 2008.

J.-P. Fouque, G. Papanicolaou and K.-R. Sircar : Derivatives in Financial Markets with Stochastic Volatility, Cambridge University Press, Cambridge, 2000.
Prerequisites / NoticeKnowledge of Numerical Analysis/ Scientific Computing Techniques
corresponding roughly to BSc MATH or BSc RW/CSE at ETH is expected.
Basic programming skills in MATLAB or Python are required for the exercises,
and are _not_ taught in this course.
401-8915-00LAdvanced Financial Economics (University of Zurich)
No enrolment to this course at ETH Zurich. Book the corresponding module directly at UZH.
UZH Module Code: MFOEC206

Mind the enrolment deadlines at UZH:
Link
W6 credits4GUniversity lecturers
AbstractPortfolio Theory, CAPM, Financial Derivatives, Incomplete Markets, Corporate Finance, Behavioural Finance, Evolutionary Finance
ObjectiveStudents should get familiar with the cornerstones of modern financial economics.
Prerequisites / NoticeThis course replaces "Advanced Financial Economics" (MFOEC105), which will be discontinued. Students who have taken "Advanced Financial Economics" (MFOEC105) in the past, are not allowed to book this course "Advanced Financial Economics" (MFOEC206).

There will be a podcast for this lecture.
701-0412-00LClimate SystemsW3 credits2GS. I. Seneviratne, L. Gudmundsson
AbstractThis course introduces the most important physical components of the climate system and their interactions. The mechanisms of anthropogenic climate change are analysed against the background of climate history and variability. Those completing the course will be in a position to identify and explain simple problems in the area of climate systems.
ObjectiveStudents are able
- to describe the most important physical components of the global climate system and sketch their interactions
- to explain the mechanisms of anthropogenic climate change
- to identify and explain simple problems in the area of climate systems
Lecture notesCopies of the slides are provided in electronic form.
LiteratureA comprehensive list of references is provided in the class. Two books are
particularly recommended:
- Hartmann, D., 2016: Global Physical Climatology. Academic Press, London, 485 pp.
- Peixoto, J.P. and A.H. Oort, 1992: Physics of Climate. American Institute of Physics, New York, 520 pp.
Prerequisites / NoticeTeaching: Sonia I. Seneviratne & Lukas Gudmundsson, with several keynotes on special topics by other professors
Course taught in German/English, slides in English
701-1216-00LNumerical Modelling of Weather and Climate Information W4 credits3GC. Schär, J. Vergara Temprado, M. Wild
AbstractThe course provides an introduction to weather and climate models. It discusses how these models are built addressing both the dynamical core and the physical parameterizations, and it provides an overview of how these models are used in numerical weather prediction and climate research. As a tutorial, students conduct a term project and build a simple atmospheric model using the language PYTHON.
ObjectiveAt the end of this course, students understand how weather and climate models are formulated from the governing physical principles, and how they are used for climate and weather prediction purposes.
ContentThe course provides an introduction into the following themes: numerical methods (finite differences and spectral methods); adiabatic formulation of atmospheric models (vertical coordinates, hydrostatic approximation); parameterization of physical processes (e.g. clouds, convection, boundary layer, radiation); atmospheric data assimilation and weather prediction; predictability (chaos-theory, ensemble methods); climate models (coupled atmospheric, oceanic and biogeochemical models); climate prediction. Hands-on experience with simple models will be acquired in the tutorials.
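As a hedged, minimal illustration of the finite-difference theme (in the spirit of, but not identical to, the PYTHON term project), the sketch below advects a Gaussian tracer with a first-order upwind scheme on a periodic 1D domain; all parameters are invented.

```python
import numpy as np

# Illustrative first-order upwind scheme for 1D linear advection u_t + c u_x = 0
# on a periodic domain; parameters are invented.
nx, c, L = 200, 10.0, 1.0e6              # grid points, wind speed (m/s), domain length (m)
dx = L / nx
dt = 0.8 * dx / c                        # CFL number 0.8 < 1 for stability
x = np.arange(nx) * dx
u = np.exp(-((x - 0.3 * L) ** 2) / (2 * (0.05 * L) ** 2))   # Gaussian tracer blob

for _ in range(1000):
    u = u - c * dt / dx * (u - np.roll(u, 1))   # upwind difference for c > 0, periodic via roll

print("tracer maximum after advection:", u.max())   # < 1 because upwind diffuses numerically
```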
Lecture notesSlides and lecture notes will be made available at
Link
LiteratureList of literature will be provided.
Prerequisites / NoticePrerequisites: to follow this course, you need some basic background in atmospheric science, numerical methods (e.g., "Numerische Methoden in der Umweltphysik", 701-0461-00L) as well as experience in programming. Previous experience with PYTHON is useful but not required.
701-1226-00LInter-Annual Phenomena and Their Prediction Information W2 credits2GC. Appenzeller
AbstractThis course provides an overview of the current ability to understand and predict intra-seasonal and inter-annual climate variability in the tropical and extra-tropical region and provides insights on how operational weather and climate services are organized.
ObjectiveStudents will acquire an understanding of the key atmosphere and ocean processes involved, will gain experience in analyzing and predicting sub-seasonal to inter-annual variability and learn how operational weather and climate services are organised and how scientific developments can improve these services.
ContentThe course covers the following topics:

Part 1:
- Introduction, some basic concepts and examples of sub-seasonal and inter-annual variability
- Weather and climate data and the statistical concepts used for analysing inter-annual variability (e.g. correlation analysis, teleconnection maps, EOF analysis; a small EOF sketch is given after this content list)

Part 2:
- Inter-annual variability in the tropical region (e.g. ENSO, MJO)
- Inter-annual variability in the extra-tropical region (e.g. Blocking, NAO, PNA, regimes)

Part 3:
- Prediction of inter-annual variability (statistical methods, ensemble prediction systems, monthly and seasonal forecasts, seamless forecasts)
- Verification and interpretation of probabilistic forecast systems
- Climate change and inter-annual variability

Part 4:
- Scientific challenges for operational weather and climate services
- A visit to the forecasting centre of MeteoSwiss
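As a hedged illustration of the EOF analysis mentioned in Part 1 (not the course's own data or code), the sketch below computes empirical orthogonal functions of a synthetic space-time anomaly field via the singular value decomposition; the field and its dimensions are invented.

```python
import numpy as np

# Illustrative EOF analysis via SVD of an anomaly matrix (time x space), synthetic data.
rng = np.random.default_rng(0)
ntime, nspace = 480, 150                                 # e.g. 40 years of monthly data, 150 grid points
pattern = np.sin(np.linspace(0, np.pi, nspace))          # one dominant spatial pattern
index = np.sin(2 * np.pi * np.arange(ntime) / 60.0)      # slowly varying "climate index"
field = np.outer(index, pattern) + 0.3 * rng.normal(size=(ntime, nspace))

anom = field - field.mean(axis=0)                        # remove the time mean at each grid point
U, s, Vt = np.linalg.svd(anom, full_matrices=False)
explained = s**2 / np.sum(s**2)                          # fraction of variance per EOF

print("variance explained by EOF1: %.0f%%" % (100 * explained[0]))
# Vt[0] is the leading spatial pattern (EOF1); U[:, 0] * s[0] is its principal-component time series.
```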
Lecture notesA pdf version of the slides will be available at
Link
LiteratureReferences are given during the lecture.
701-1252-00LClimate Change Uncertainty and Risk: From Probabilistic Forecasts to Economics of Climate Adaptation Restricted registration - show details
Number of participants limited to 50.

Waiting list until 05.03.2021.
W3 credits2V + 1UD. N. Bresch, R. Knutti
AbstractThe course introduces the concepts of predictability, probability, uncertainty and probabilistic risk modelling and their application to climate modeling and the economics of climate adaptation.
ObjectiveStudents will acquire knowledge in uncertainty and risk quantification (probabilistic modelling) and an understanding of the economics of climate adaptation. They will be able to construct their own uncertainty and risk assessment models (in Python); a basic understanding of scientific programming is therefore a prerequisite of the course.
ContentThe first part of the course covers methods to quantify uncertainty in detecting and attributing human influence on climate change and to generate probabilistic climate change projections on global to regional scales. Model evaluation, calibration and structural error are discussed. In the second part, the quantification of risks associated with local climate impacts and the economics of different baskets of climate adaptation options are assessed, leading to informed decisions on how to optimally allocate resources. Such pre-emptive risk management allows a mix of prevention, preparation, response, recovery, and (financial) risk transfer actions to be evaluated, resulting in an optimal balance of public and private contributions to risk management and aiming at a more resilient society.
The course provides an introduction to the following themes:
1) basics of probabilistic modelling and quantification of uncertainty, from global climate change to local impacts of extreme events (a minimal Monte Carlo sketch follows this list)
2) methods to optimize and constrain model parameters using observations
3) risk management from identification (perception) and understanding (assessment, modelling) to actions (prevention, preparation, response, recovery, risk transfer)
4) basics of economic evaluation, economic decision making in the presence of climate risks and pre-emptive risk management to optimally allocate resources
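As a hedged, minimal sketch of theme 1 (the hazard distribution, damage curve and all parameter values below are invented for illustration and are not the course's risk model), expected annual damage could be estimated by Monte Carlo simulation:

    import numpy as np

    rng = np.random.default_rng(42)

    # Invented toy model: annual peak hazard intensity and a simple damage function.
    n_years = 100_000
    hazard = rng.gumbel(loc=20.0, scale=5.0, size=n_years)   # e.g. peak wind gust [m/s]
    exposure = 1e6                                           # asset value [CHF]

    def damage_fraction(intensity, threshold=25.0, scale=15.0):
        # Fraction of asset value lost as a function of hazard intensity (toy curve).
        return np.clip((intensity - threshold) / scale, 0.0, 1.0)

    losses = exposure * damage_fraction(hazard)
    expected_annual_damage = losses.mean()
    p99_loss = np.quantile(losses, 0.99)   # a simple tail-risk measure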
Lecture notesPowerpoint slides will be made available.
LiteratureMany papers for in-depth study will be referred to during the lecture.
Prerequisites / NoticeHands-on experience with probabilistic climate models and risk models will be acquired in the tutorials; hence a good understanding of scientific programming, in Python (the teaching language, object oriented) or a similar language, is a prerequisite of the course. A basic understanding of the climate system, e.g. as covered in the course 'Klimasysteme', is required, as well as a beginner level in statistical and time series analysis.

Examination: graded tutorials during the semester (benotete Semesterleistung)
701-1270-00LHigh Performance Computing for Weather and ClimateW3 credits3GO. Fuhrer
AbstractState-of-the-art weather and climate simulations rely on large and complex software running on supercomputers. This course focuses on programming methods and tools for understanding, developing and optimizing the computational aspects of weather and climate models. Emphasis will be placed on the foundations of parallel computing, practical exercises and emerging trends such as using GPUs.
ObjectiveAfter attending this course, students will be able to:
- Understand a broad variety of high performance computing concepts relevant for weather and climate simulations
- Work with weather and climate simulation codes that run on large supercomputers
ContentHPC Overview:
- Why do weather and climate require HPC?
- Today's HPC: Beowulf-style clusters, massively parallel architectures, hybrid computing, accelerators
- Scaling / Parallel efficiency
- Algorithmic motifs in weather and climate

Writing HPC code:
- Data locality and single node efficiency
- Shared memory parallelism with OpenMP
- Distributed memory parallelism with MPI (a minimal sketch follows this list)
- GPU computing
- High-level programming and domain-specific languages
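As a hedged illustration of distributed-memory parallelism with MPI (a toy partial-sum reduction, not taken from any weather or climate code; it assumes the mpi4py package and an MPI launcher are available):

    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    # Each rank owns a slice of a 1D domain and computes a partial sum.
    n_global = 1_000_000
    n_local = n_global // size
    local = np.arange(rank * n_local, (rank + 1) * n_local, dtype=np.float64)

    local_sum = local.sum()
    total = comm.allreduce(local_sum, op=MPI.SUM)   # combine partial results on all ranks

    if rank == 0:
        print("global sum:", total)

Such a script would typically be launched with something like mpirun -n 4 python partial_sum.py, where the file name is, of course, illustrative.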
Literature- Introduction to High Performance Computing for Scientists and Engineers, G. Hager and G. Wellein, CRC Press, 2011
- Computer Organization and Design, D.A. Patterson and J.L. Hennessy
- Parallel Computing, A. Grama, A. Gupta, G. Karypis, V. Kumar (Link)
- Parallel Programming in MPI and OpenMP, V. Eijkhout (Link)
Prerequisites / Notice- fundamentals of numerical analysis and atmospheric modeling
- basic experience in a programming language (C/C++, Fortran, Python, …)
- experience using command line interfaces in *nix environments (e.g., Unix, Linux)
227-0395-00LNeural SystemsW6 credits2V + 1U + 1AR. Hahnloser, M. F. Yanik, B. Grewe
AbstractThis course introduces principles of information processing in neural systems. It covers basic neuroscience for engineering students, experimental techniques used in animal research, and methods for inferring neural mechanisms. Students learn about neural information processing and basic principles of natural intelligence and their impact on artificially intelligent systems.
ObjectiveThis course introduces
- Basic neurophysiology and mathematical descriptions of neurons
- Methods for dissecting animal behavior
- Neural recordings in intact nervous systems and information decoding principles
- Methods for manipulating the state and activity in selective neuron types
- Neuromodulatory systems and their computational roles
- Reward circuits and reinforcement learning
- Imaging methods for reconstructing the synaptic networks among neurons
- Birdsong and language
- Neurobiological principles for machine learning.
ContentFrom active membranes to propagation of action potentials. From synaptic physiology to synaptic learning rules. From receptive fields to neural population decoding. From fluorescence imaging to connectomics. Methods for reading and manipulating neural ensembles. From classical conditioning to reinforcement learning. From the visual system to deep convolutional networks. Brain architectures for learning and memory. From birdsong to computational linguistics.
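As a hedged aside on the mathematical descriptions of neurons mentioned in the objectives (the parameter values below are generic textbook choices, not course material), a leaky integrate-and-fire neuron can be simulated in a few lines of Python:

    # Generic leaky integrate-and-fire model (illustrative, not course-specific).
    dt, T = 0.1e-3, 0.5                      # time step [s], simulated duration [s]
    tau_m, R, V_rest = 20e-3, 1e7, -70e-3    # membrane time constant [s], resistance [Ohm], rest [V]
    V_th, V_reset = -50e-3, -70e-3           # spike threshold and reset potential [V]
    I = 2.5e-9                               # constant input current [A]

    V = V_rest
    spike_times = []
    for step in range(int(T / dt)):
        V += dt / tau_m * (-(V - V_rest) + R * I)   # leaky integration of the input
        if V >= V_th:                               # threshold crossing -> emit a spike
            spike_times.append(step * dt)
            V = V_reset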
Prerequisites / NoticeBefore taking this course, students are encouraged to complete "Bioelectronics and Biosensors" (227-0393-10L).

As part of the exercises for this class, students are expected to complete a programming or literature review project to be defined at the beginning of the semester.
227-0973-00LTranslational NeuromodelingW8 credits3V + 2U + 1AK. Stephan
AbstractThis course provides a systematic introduction to Translational Neuromodeling (the development of mathematical models for diagnostics of brain diseases) and their application to concrete clinical questions (Computational Psychiatry/Psychosomatics). It focuses on a generative modeling strategy and teaches (hierarchical) Bayesian models of neuroimaging data and behaviour, incl. exercises.
ObjectiveTo obtain an understanding of the goals, concepts and methods of Translational Neuromodeling and Computational Psychiatry/Psychosomatics, particularly with regard to Bayesian models of neuroimaging (fMRI, EEG) and behavioural data.
ContentThis course provides a systematic introduction to Translational Neuromodeling (the development of mathematical models for inferring mechanisms of brain diseases from neuroimaging and behavioural data) and their application to concrete clinical questions (Computational Psychiatry/Psychosomatics). The first part of the course will introduce disease concepts from psychiatry and psychosomatics, their history, and clinical priority problems. The second part of the course concerns computational modeling of neuronal and cognitive processes for clinical applications. A particular focus is on Bayesian methods and generative models, for example, dynamic causal models for inferring neuronal processes from neuroimaging data, and hierarchical Bayesian models for inference on cognitive processes from behavioural data. The course discusses the mathematical and statistical principles behind these models, illustrates their application to various psychiatric diseases, and outlines a general research strategy based on generative models.

Lecture topics include:
1. Introduction to Translational Neuromodeling and Computational Psychiatry/Psychosomatics
2. Psychiatric nosology
3. Pathophysiology of psychiatric disease mechanisms
4. Principles of Bayesian inference and generative modeling
5. Variational Bayes (VB)
6. Bayesian model selection
7. Markov Chain Monte Carlo techniques (MCMC)
8. Bayesian frameworks for understanding psychiatric and psychosomatic diseases
9. Generative models of fMRI data
10. Generative models of electrophysiological data
11. Generative models of behavioural data
12. Computational concepts of schizophrenia, depression and autism
13. Model-based predictions about individual patients

Practical exercises include mathematical derivations and the implementation of specific models and inference methods. In additional project work, students are required to use one of the examples discussed in the course as a basis for developing their own generative model and use it for simulations and/or inference in application to a clinical question. Group work (up to 3 students) is required.
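As a hedged, self-contained sketch of the Bayesian machinery behind topics 4 and 6 (a conjugate beta-binomial toy problem with invented behavioural counts, not one of the course's generative models), two hypotheses about a response rate can be compared via their marginal likelihoods:

    from math import exp, lgamma

    def log_beta(a, b):
        # log of the Beta function B(a, b)
        return lgamma(a) + lgamma(b) - lgamma(a + b)

    def log_evidence(k, n, a, b):
        # Log marginal likelihood of k successes in n trials under a Beta(a, b) prior.
        log_binom = lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
        return log_binom + log_beta(a + k, b + n - k) - log_beta(a, b)

    # Invented toy data: 14 correct responses out of 20 trials.
    k, n = 14, 20
    # Hypothesis 1: flat prior on the response rate; hypothesis 2: prior concentrated near chance (0.5).
    log_ev1 = log_evidence(k, n, 1.0, 1.0)
    log_ev2 = log_evidence(k, n, 20.0, 20.0)
    bayes_factor = exp(log_ev1 - log_ev2)   # > 1 favours the flat-prior hypothesis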
LiteratureSee TNU website:
Link
Prerequisites / NoticeGood knowledge of principles of statistics, good programming skills (MATLAB, Julia, or Python)
227-1032-00LNeuromorphic Engineering II Information
Information for UZH students:
Enrolment to this course unit only possible at ETH. No enrolment to module INI405 at UZH.

Please mind the ETH enrolment deadlines for UZH students: Link
W6 credits5GT. Delbrück, G. Indiveri, S.‑C. Liu
AbstractThis course teaches the basics of analog chip design and layout with an emphasis on neuromorphic circuits, which are introduced in the fall semester course "Neuromorphic Engineering I".
ObjectiveDesign of a neuromorphic circuit for implementation with CMOS technology.
ContentThis course teaches the basics of analog chip design and layout with an emphasis on neuromorphic circuits, which are introduced in the autumn semester course "Neuromorphic Engineering I".

The principles of CMOS processing technology are presented. Using a set of inexpensive software tools for simulation, layout and verification suitable for neuromorphic circuits, participants learn to simulate circuits at the transistor level and to create their layouts at the mask level. Important issues in the layout of neuromorphic circuits will be explained and illustrated with examples. In the latter part of the semester, students simulate and lay out a neuromorphic chip. Schematics of basic building blocks will be provided. The layout will then be fabricated and tested by the students during the following fall semester.
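Purely as a hedged numerical aside (a common textbook form of the subthreshold nMOS drain current; the values of I0, kappa and the bias voltages are illustrative and not course-specific), the exponential current-voltage behaviour exploited by many neuromorphic circuits can be evaluated as:

    import numpy as np

    UT = 0.025      # thermal voltage [V] at room temperature
    I0 = 1e-15      # off-current scale [A] (illustrative)
    kappa = 0.7     # subthreshold slope factor (illustrative)

    def subthreshold_current(Vg, Vs, Vd):
        # Drain current of an nMOS in weak inversion, all voltages referenced to the bulk [V].
        return I0 * np.exp(kappa * Vg / UT) * (np.exp(-Vs / UT) - np.exp(-Vd / UT))

    # Sweep the gate voltage with the source grounded and the drain in saturation.
    Vg = np.linspace(0.0, 0.4, 81)
    I = subthreshold_current(Vg, Vs=0.0, Vd=0.3)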
LiteratureS.-C. Liu et al.: Analog VLSI Circuits and Principles; software documentation.
Prerequisites / NoticePrerequisites: Neuromorphic Engineering I strongly recommended
227-1034-00LComputational Vision (University of Zurich)
No enrolment to this course at ETH Zurich. Book the corresponding module directly at UZH.
UZH Module Code: INI402

Mind the enrolment deadlines at UZH:
Link
W6 credits2V + 1UD. Kiper
AbstractThis course focuses on neural computations that underlie visual perception. We study how visual signals are processed in the retina, LGN and visual cortex, and examine the morphology and functional architecture of cortical circuits responsible for pattern, motion, color, and three-dimensional vision.
ObjectiveThis course considers the operation of circuits in the process of neural computations. The evolution of neural systems will be considered to demonstrate how neural structures and mechanisms are optimised for energy capture, transduction, transmission and representation of information. Canonical brain circuits will be described as models for the analysis of sensory information. The concept of receptive fields will be introduced and their role in coding spatial and temporal information will be considered. The constraints of the bandwidth of neural channels and the mechanisms of normalization by neural circuits will be discussed. The visual system will form the basis of case studies in the computation of form, depth, and motion. The role of multiple channels and collective computations for object recognition will be considered. Coordinate transformations of space and time by cortical and subcortical mechanisms will be analysed. The means by which sensory and motor systems are integrated to allow for adaptive behaviour will be considered.
ContentSee the Objective above.
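As a hedged illustration of the receptive-field concept described in the Objective (a generic difference-of-Gaussians centre-surround model with invented parameters, not material from the course):

    import numpy as np

    def dog_receptive_field(size=41, sigma_center=2.0, sigma_surround=6.0):
        # Centre-surround (difference-of-Gaussians) receptive field on a square grid.
        ax = np.arange(size) - size // 2
        xx, yy = np.meshgrid(ax, ax)
        r2 = xx**2 + yy**2
        center = np.exp(-r2 / (2 * sigma_center**2)) / (2 * np.pi * sigma_center**2)
        surround = np.exp(-r2 / (2 * sigma_surround**2)) / (2 * np.pi * sigma_surround**2)
        return center - surround

    rf = dog_receptive_field()
    stimulus = np.random.default_rng(1).standard_normal(rf.shape)
    response = float(np.sum(rf * stimulus))   # linear response of the model cell to the stimulus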
LiteratureBooks: (recommended references, not required)
1. An Introduction to Natural Computation, D. Ballard (Bradford Books, MIT Press) 1997.
2. The Handbook of Brain Theory and Neural Networks, M. Arbib (editor), (MIT Press) 1995.
851-0252-06LIntroduction to Social Networks: Theory, Methods and Applications
This course is intended for students interested in data analysis and with basic knowledge of inferential statistics.
W3 credits2GC. Stadtfeld, U. Brandes
AbstractHumans are connected by various social relations. When aggregated, we speak of social networks. This course discusses how social networks are structured, how they change over time and how they affect the individuals that they connect. It integrates social theory with practical knowledge of cutting-edge statistical methods and applications from a number of scientific disciplines.
ObjectiveThe aim is to enable students to contribute to social networks research and to be discriminating consumers of modern literature on social networks. Students will acquire a thorough understanding of social networks theory (1), practical skills in cutting-edge statistical methods (2) and their applications in a number of scientific fields (3).
In particular, at the end of the course students will
- Know the fundamental theories in social networks research (1)
- Understand core concepts of social networks and their relevance in different contexts (1, 3)
- Be able to describe and visualize network data in the R environment (2) (a language-agnostic sketch follows this list)
- Understand differences regarding the analysis and collection of network data compared with other types of survey data (2)
- Know state-of-the-art inferential statistical methods and how they are used in R (2)
- Be familiar with the core empirical studies in social networks research (2, 3)
- Know how network methods can be employed in a variety of scientific disciplines (3)
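The course itself works in R; purely as a hedged, language-agnostic illustration of describing a small network (invented friendship ties, here using the Python package networkx), basic descriptives might look like:

    import networkx as nx

    # Invented toy friendship network.
    G = nx.Graph([("Ana", "Ben"), ("Ben", "Chris"), ("Chris", "Ana"), ("Chris", "Dana")])

    density = nx.density(G)                  # share of possible ties that are present
    degrees = dict(G.degree())               # number of friends per person
    clustering = nx.average_clustering(G)    # tendency of friends' friends to be friends
    triangles = sum(nx.triangles(G).values()) // 3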
851-0254-00LNetwork Science Project
Does not take place this semester.
It is advisable to take at least one of 851-0252-06 Introduction to Social Networks, 851-0252-15 Network Analysis, or 851-0252-13 Network Modeling beforehand.

Proficiency in programming and data analysis are helpful but can be compensated for by a firm understanding of the foundations relevant for the particular study.
W3 credits2PU. Brandes, C. Stadtfeld
AbstractStudy project involving network data in a selected field.
ObjectivePractical experience with, and a contextual understanding of, the links between a research question, domain-specific theory, and computational methods in network science.
ContentIndividually or in small groups, students carry out a project in which an original research question is addressed using network data. While network approaches are increasingly common in domains from archaeology and digital media to transportation and zoology, applications are often driven by the availability of (found, observational) data.

Special emphasis is therefore placed on the consideration of domain-specific theory and the possibility to adapt data collection and mathematical methods accordingly. Studies may vary by domain of interest and the relative importance of theory, data, methods, implementation issues, and other aspects. In particular, the focus may be on data collection instruments or theory-inspired method development and implementation.
Prerequisites / NoticeProject topics will be introduced during an initial meeting on Friday, March 5, 16:15-17:45, in WEP J 11. Subsequent meetings with the respective project teams will be by appointment.
851-0586-03LApplied Network Science: Sports Networks Restricted registration - show details
Number of participant limited to 20
W3 credits2SU. Brandes
AbstractWe study applications of network science methods, this time in the domain of sports. Topics are selected for diversity in research questions and techniques, with applications such as passing networks, team rankings, and career trajectories. Student teams present results from the recent literature, possibly with replication, in a mini-conference shortly before the start of EURO 2020 [sic].
ObjectiveNetwork science as a paradigm is entering domains from engineering to the humanities, but application is tricky. Through examples from recent research on sports, sports administration, and the sociology of sports, students learn to appreciate that, and how, context matters. They will be able to assess the appropriateness of approaches for substantive research problems, and especially when and why quantitative approaches are or are not suitable.
LiteratureOriginal research articles will be introduced in the first session. General introduction:

Wäsche, Dickson, Woll & Brandes (2017). Social Network Analysis in Sport Research: An Emerging Paradigm. European Journal for Sport and Society 14(2):138-165. DOI: 10.1080/16138171.2017.1318198
851-0739-01LSequencing Legal DNA: NLP for Law and Political Economy
Particularly suitable for students of D-INFK, D-ITET, D-MTEC
W3 credits2VE. Ash
AbstractThis course explores the application of natural language processing techniques to texts in law, politics, and the news media.
ObjectiveStudents will be introduced to a broad array of tools in natural language processing (NLP). They will learn to evaluate and apply NLP tools to a variety of problems. The applications will focus on social-science contexts, including law, politics, and the news media. Topics include text classification, topic modeling, transformers, model explanation, and bias in language.
ContentNLP technologies have the potential to assist judges and other decision-makers by making tasks more efficient and consistent. On the other hand, language choices could be biased toward some groups, and automated systems could entrench those biases.

We will explore the use of NLP for social science research, not just in the law but also in politics, the economy, and culture. We will explore, critique, and integrate the emerging set of tools for debiasing language models and think carefully about how notions of fairness should be applied in this domain.
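As a hedged sketch of the text-classification topic listed in the objective (toy documents and labels invented for illustration, not a dataset used in the course), a bag-of-words baseline could look like:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Invented toy corpus: does a sentence come from a judicial opinion or a press release?
    texts = [
        "the court finds the defendant liable for damages",
        "the appeal is dismissed and costs are awarded",
        "the company announced record quarterly earnings today",
        "the minister praised the new infrastructure programme",
    ]
    labels = ["law", "law", "news", "news"]

    clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
    clf.fit(texts, labels)
    print(clf.predict(["the tribunal awarded costs to the plaintiff"]))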
Prerequisites / NoticeSome programming experience in Python is required, and some experience with NLP is highly recommended.
851-0739-02LSequencing Legal DNA: NLP for Law and Political Economy (Course Project)
This is the optional course project for "Sequencing Legal DNA: NLP for Law and Political Economy."

Please register only if attending the lecture course or with consent of the instructor.

Some programming experience in Python is required, and some experience with text mining is highly recommended.
W2 credits2VE. Ash
AbstractThis is the companion course for extra credit for a course project, for the course "Sequencing Legal DNA: NLP for Law and Political Economy".
ObjectiveStudents will be introduced to a broad array of tools in natural language processing (NLP). They will learn to evaluate and apply NLP tools to a variety of problems. The applications will focus on social-science contexts, including law, politics, and the news media. Topics include text classification, topic modeling, transformers, model explanation, and bias in language.
851-0740-00LBig Data, Law, and Policy Restricted registration - show details
Number of participants limited to 35.
Students will be informed by 1.3.2021 the latest.
W3 credits2SS. Bechtold
AbstractThis course introduces students to societal perspectives on the big data revolution. Discussing important contributions from machine learning and data science, the course explores their legal, economic, ethical, and political implications in the past, present, and future.
ObjectiveThis course is intended both for students of machine learning and data science who want to reflect on the societal implications of their field, and for students from other disciplines who want to explore the societal impact of data sciences. The course will first discuss some of the methodological foundations of machine learning, followed by a discussion of research papers and real-world applications where big data and societal values may clash. Potential topics include the implications of big data for privacy, liability, insurance, health systems, voting, and democratic institutions, as well as the use of predictive algorithms for price discrimination and the criminal justice system. Guest speakers, weekly readings and reaction papers ensure a lively debate among participants from various backgrounds.
860-0033-00LBig Data for Public Policy Information Restricted registration - show details
Only for Master students and PhD students.
W3 credits2GE. Ash, M. Guillot
AbstractThis course provides an introduction to big data methods for public policy analysis. Students will put these techniques to work on a course project using real-world data, to be designed and implemented in consultation with the instructors.
ObjectiveMany policy problems involve prediction. For example, a budget office might want to predict the number of applications for benefits payments next month, based on labor market conditions this month. This course provides a hands-on introduction to the "big data" techniques for making such predictions.
ContentMany policy problems involve prediction. For example, a budget office might want to predict the number of applications for benefits payments next month, based on labor market conditions this month. This course provides a hands-on introduction to the "big data" techniques for making such predictions. These techniques include:

-- procuring big datasets, especially through web scraping or API interfaces, including social media data;
-- pre-processing and dimension reduction of massive datasets for tractable computation;
-- machine learning for predicting outcomes, including how to select and tune the model, evaluate model performance using held-out test data, and report results;
-- interpreting machine learning model predictions to understand what is going on inside the black box;
-- data visualization including interactive web apps.

Students will put these techniques to work on a course project using real-world data, to be designed and implemented in consultation with the instructors.
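As a hedged sketch of the prediction workflow described above (synthetic data standing in for labor-market indicators; all variable names are invented), selecting a model and evaluating it on held-out test data could look like:

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_absolute_error
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in: monthly labor-market indicators -> benefit applications next month.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((500, 5))                    # e.g. unemployment rate, vacancies, ...
    y = 1000 + 200 * X[:, 0] - 50 * X[:, 1] + rng.normal(0, 30, 500)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
    print("held-out MAE:", mean_absolute_error(y_test, model.predict(X_test)))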
Data Science Lab
NumberTitleTypeECTSHoursLecturers
263-3300-00LData Science Lab Restricted registration - show details
Only for Data Science MSc.
W14 credits9PC. Zhang, V. Boeva, R. Cotterell, J. Vogt, F. Yang
AbstractIn this class, we bring together data science applications provided by ETH researchers outside computer science and teams of computer science master's students. Two to three students will form a team working on data science/machine learning-related research topics provided by scientists in a diverse range of domains such as astronomy, biology, social sciences etc.
ObjectiveThe goal of this class is for students to gain experience of dealing with data science and machine learning applications "in the wild". Students are expected to go through the full process, from data cleaning and modeling to execution, debugging, error analysis, and quality/performance refinement.
Prerequisites / NoticePrerequisites: At least 8 KP must have been obtained under Data Analysis and at least 8 KP must have been obtained under Data Management and Processing.
Seminar
NumberTitleTypeECTSHoursLecturers
261-5113-00LComputational Challenges in Medical Genomics Information Restricted registration - show details
Number of participants limited to 20.
W2 credits2SA. Kahles, G. Rätsch
AbstractThis seminar discusses recent relevant contributions to the fields of computational genomics, algorithmic bioinformatics, statistical genetics and related areas. Each participant will hold a presentation and lead the subsequent discussion.
ObjectivePreparing and holding a scientific presentation in front of peers is a central part of working in the scientific domain. In this seminar, the participants will learn how to efficiently summarize the relevant parts of a scientific publication, critically reflect on its contents, and summarize it for presentation to an audience. The skills needed to successfully present the key points of existing research work are the same as those needed to communicate one's own research ideas.
In addition to holding a presentation, each student will both contribute to as well as lead a discussion section on the topics presented in the class.
ContentThe topics covered in the seminar are related to recent computational challenges that arise from the fields of genomics and biomedicine, including but not limited to genomic variant interpretation, genomic sequence analysis, compressive genomics tasks, single-cell approaches, privacy considerations, statistical frameworks, etc.
Both recently published works contributing novel ideas to the areas mentioned above as well as seminal contributions from the past are amongst the list of selected papers.
Prerequisites / NoticeKnowledge of algorithms and data structures and interest in applications in genomics and computational biomedicine.
263-5225-00LAdvanced Topics in Machine Learning and Data Science Information Restricted registration - show details
Number of participants limited to 20.

The deadline for deregistering expires at the end of the fourth week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar.
W2 credits2SF. Perez Cruz
AbstractIn this seminar, recent papers of the machine learning and data science literature are presented and discussed. Possible topics cover statistical models, machine learning algorithms and its applications.
ObjectiveThe seminar “Advanced Topics in Machine Learning and Data Science” familiarizes students with recent developments in machine learning and data science. Recently published articles, as well as influential papers, have to be presented and critically reviewed. The students will learn how to structure a scientific presentation, which covers the motivation, key ideas and main results of a scientific paper. An important goal of the seminar presentation is to summarize the essential ideas of the paper in sufficient depth for the audience to be able to follow its main conclusion, especially why the article is (or is not) worth attention. The presentation style will play an important role and should reach the level of professional scientific presentations.
ContentThe seminar will cover a number of recent papers which have emerged as important contributions to the machine learning and data science literatures. The topics will vary from year to year but are centered on methodological issues in machine learning and its application, not only to text or images but also to other scientific domains such as medicine, climate or physics.
LiteratureThe papers will be presented in the first session of the seminar.
401-3620-21LStudent Seminar in Statistics: Statistical Network Modeling Information Restricted registration - show details
Number of participants limited to 48.
Mainly for students from the Mathematics Bachelor and Master Programmes who, in addition to the introductory course unit 401-2604-00L Probability and Statistics, have taken at least one core or elective course in statistics. Also offered in the Master Programmes in Statistics and Data Science.
W4 credits2SP. L. Bühlmann, M. Azadkia
AbstractNetwork models can be used to analyze non-iid data because their structure incorporates interconnectedness between the individuals. We introduce networks, describe them mathematically, and consider applications.
ObjectiveNetwork models can be used to analyze non-iid data because their structure incorporates interconnectedness between the individuals. The participants of the seminar acquire knowledge to formulate and analyze network models and to apply them in examples.
LiteratureE. D. Kolaczyk and G. Csárdi. Statistical analysis of network data with R. Springer, Cham, Switzerland, second edition, 2020.

Tianxi Li, Elizaveta Levina, and Ji Zhu. Network cross-validation by edge sampling, 2020. Preprint arXiv:1612.04717.

Tianxi Li, Elizaveta Levina, and Ji Zhu. Community models for partially observed networks from surveys, 2020. Preprint arXiv:2008.03652.

Tianxi Li, Elizaveta Levina, and Ji Zhu. Prediction Models for Network-Linked Data, 2018. Preprint arXiv:1602.01192.
Prerequisites / NoticeEvery class will consist of an oral presentation highlighting key ideas of selected book chapters, given by a pair of students. Another two students will be responsible for asking questions during the presentation and for providing a discussion of the presented concepts and ideas, including pros and cons, at the end. Finally, an additional two students are responsible for evaluating the quality of the presentations/discussions and providing constructive feedback for improvement.
GESS Science in Perspective
NumberTitleTypeECTSHoursLecturers
851-0740-00LBig Data, Law, and Policy Restricted registration - show details
Number of participants limited to 35.
Students will be informed by 1.3.2021 the latest.
W3 credits2SS. Bechtold
AbstractThis course introduces students to societal perspectives on the big data revolution. Discussing important contributions from machine learning and data science, the course explores their legal, economic, ethical, and political implications in the past, present, and future.
ObjectiveThis course is intended both for students of machine learning and data science who want to reflect on the societal implications of their field, and for students from other disciplines who want to explore the societal impact of data sciences. The course will first discuss some of the methodological foundations of machine learning, followed by a discussion of research papers and real-world applications where big data and societal values may clash. Potential topics include the implications of big data for privacy, liability, insurance, health systems, voting, and democratic institutions, as well as the use of predictive algorithms for price discrimination and the criminal justice system. Guest speakers, weekly readings and reaction papers ensure a lively debate among participants from various backgrounds.
» see Science in Perspective: Type A: Enhancement of Reflection Capability
» Recommended Science in Perspective (Type B) for D-INFK
» see Science in Perspective: Language Courses ETH/UZH
Master's Thesis
NumberTitleTypeECTSHoursLecturers
261-0800-00LMaster's Thesis
The minimal prerequisites for the Master’s thesis registration are:
- Completed Bachelor’s program
- All additional requirements completed (additional requirements, if any, are listed in the admission decree)
- Minimum requirements of the course categories Data Analysis and Data Management fulfilled, and 50 credits obtained overall in the course category Core Courses
- Data Science Lab (14 credits) completed
O30 credits64DProfessors
AbstractThe Master's thesis concludes the study program and demonstrates the students' ability to use the knowledge and skills acquired during Master’s studies to solve a complex data science problem.
ObjectiveTo work independently and to produce a scientifically structured work.