Search result: Catalogue data in Spring Semester 2020

Cyber Security Master
Field of Specialization
Core Courses
Number | Title | Type | ECTS | Hours | Lecturers
263-4660-00L Applied Cryptography (Restricted registration)
Number of participants limited to 150.
W | 8 credits | 3V + 2U + 2P | K. Paterson
AbstractThis course will introduce the basic primitives of cryptography, using rigorous syntax and game-based security definitions. The course will show how these primitives can be combined to build cryptographic protocols and systems.
Learning objectiveThe goal of the course is to put students' understanding of cryptography on sound foundations, to enable them to start to build well-designed cryptographic systems, and to expose them to some of the pitfalls that arise when doing so.
ContentBasic symmetric primitives (block ciphers, modes, hash functions); generic composition; AEAD; basic secure channels; basic public key primitives (encryption, signature, DH key exchange); ECC; randomness; applications.
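As a brief illustration of the AEAD topic listed above, here is a minimal sketch of authenticated encryption with AES-GCM in Python; it assumes the third-party "cryptography" package is installed and is an editorial example, not part of the course material.

# Minimal AEAD sketch (AES-GCM); illustrative only, assumes the third-party
# "cryptography" package: pip install cryptography
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit symmetric key
aesgcm = AESGCM(key)
nonce = os.urandom(12)                      # 96-bit nonce; must never repeat under the same key
aad = b"header-v1"                          # associated data: authenticated but not encrypted

ciphertext = aesgcm.encrypt(nonce, b"attack at dawn", aad)
plaintext = aesgcm.decrypt(nonce, ciphertext, aad)   # raises InvalidTag if anything was tampered with
assert plaintext == b"attack at dawn"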
LiteratureTextbook: Boneh and Shoup, “A Graduate Course in Applied Cryptography”, https://crypto.stanford.edu/~dabo/cryptobook/BonehShoup_0_4.pdf.
Prerequisites / NoticeIdeally, students will have taken the D-INFK Bachelors course “Information Security" or an equivalent course at Bachelors level.
Electives
Number | Title | Type | ECTS | Hours | Lecturers
252-0408-00L Cryptographic Protocols | W | 6 credits | 2V + 2U + 1A | M. Hirt, U. Maurer
AbstractThe course presents a selection of hot research topics in cryptography. The choice of topics varies and may include provable security, interactive proofs, zero-knowledge protocols, secret sharing, secure multi-party computation, e-voting, etc.
Learning objectiveIntroduction to a very active research area with many gems and paradoxical results. Spark interest in fundamental problems.
ContentThe course presents a selection of hot research topics in cryptography. The choice of topics varies and may include provable security, interactive proofs, zero-knowledge protocols, secret sharing, secure multi-party computation, e-voting, etc.
Lecture notesThe lecture notes are in German, but they are not required, as the entire course material is also documented in other course material (in English).
Prerequisites / NoticeA basic understanding of fundamental cryptographic concepts
(as taught for example in the course Information Security or
in the course Cryptography Foundations) is useful, but not required.
263-2925-00L Program Analysis for System Security and Reliability | W | 6 credits | 2V + 1U + 2A | P. Tsankov
AbstractSecurity issues in modern systems (blockchains, datacenters, AI) result in billions of losses due to hacks. This course introduces the security issues in modern systems and state-of-the-art automated techniques for building secure and reliable systems. The course has a practical focus and covers systems built by successful ETH spin-offs.
Learning objective* Learn about security issues in modern systems -- blockchains, smart contracts, AI-based systems (e.g., autonomous cars), data centers -- and why they are challenging to address.

* Understand how the latest automated analysis techniques work, both discrete and probabilistic.

* Understand how these techniques combine with machine-learning methods, both supervised and unsupervised.

* Understand how to use these methods to build reliable and secure modern systems.

* Learn about new open problems that if solved can lead to research and commercial impact.
ContentPart I: Security of Blockchains

- We will cover existing blockchains (e.g., Ethereum, Bitcoin), how they work, what the core security issues are, and how these have led to massive financial losses.
- We will show how to extract useful information about smart contracts and transactions using interactive analysis frameworks for querying blockchains (e.g. Google's Ethereum BigQuery).
- We will discuss the state-of-the-art security tools (e.g., https://securify.ch) for ensuring that smart contracts are free of security vulnerabilities.
- We will study the latest automated reasoning systems (e.g., https://verx.ch) for checking custom (temporal) properties of smart contracts and illustrate their operation on real-world use cases.
- We will study how the underlying methods for automated reasoning and testing (e.g., abstract interpretation, symbolic execution, fuzzing) are used to build such tools.


Part II: Security of Datacenters and Networks

- We will show how to ensure that datacenters and ISPs are secured using declarative reasoning methods (e.g., Datalog). We will also see how to automatically synthesize secure configurations (e.g. using SyNET and NetComplete) which lead to desirable behaviors, thus automating the job of the network operator and avoiding critical errors.
- We will discuss how to apply modern discrete probabilistic inference (e.g., PSI and Bayonet) to reason about probabilistic network properties (e.g., the probability of a packet reaching a destination if links fail).


Part III: Machine Learning for Security

- We will discuss how machine learning models for structured prediction are used to address security tasks, including de-obfuscation of binaries (Debin: https://debin.ai), Android APKs (DeGuard: http://apk-deguard.com) and JavaScript (JSNice: http://jsnice.org).
- We will study how to leverage program abstractions in combination with clustering techniques to learn security rules for cryptography APIs from large codebases.
- We will study how to automatically learn to identify security vulnerabilities related to the handling of untrusted inputs (cross-site scripting, SQL injection, path traversal, remote code execution) from large codebases.


To gain a deeper understanding, the course will involve a hands-on programming project where the methods studied in the class will be applied.
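To give a flavour of the abstract-interpretation technique mentioned in Part I, the following is a hypothetical toy interval analysis for arithmetic expressions in Python; it only illustrates the idea and is unrelated to the actual analyses behind tools such as Securify or VerX.

# Toy interval abstract interpretation for arithmetic expressions with + and *.
# Expressions are nested tuples, e.g. ("+", ("var", "x"), ("const", 3)).
# Illustrative sketch only, not the analysis used by any real tool.

def add(a, b):
    return (a[0] + b[0], a[1] + b[1])

def mul(a, b):
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

def evaluate(expr, env):
    """Return an interval (lo, hi) over-approximating the value of expr."""
    op = expr[0]
    if op == "const":
        return (expr[1], expr[1])
    if op == "var":
        return env[expr[1]]          # interval assumed for each input variable
    left, right = evaluate(expr[1], env), evaluate(expr[2], env)
    return add(left, right) if op == "+" else mul(left, right)

# x in [0, 10], y in [-2, 2]: bound x * y + 3 without running the program.
env = {"x": (0, 10), "y": (-2, 2)}
print(evaluate(("+", ("*", ("var", "x"), ("var", "y")), ("const", 3)), env))
# -> (-17, 23)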
263-4600-00L Formal Methods for Information Security | W | 5 credits | 2V + 1U + 1A | R. Sasse, C. Sprenger
AbstractThe course focuses on formal methods for the modelling and analysis of security protocols for critical systems, ranging from authentication protocols for network security to electronic voting protocols and online banking.
Learning objectiveThe students will learn the key ideas and theoretical foundations of formal modelling and analysis of security protocols. The students will complement their theoretical knowledge by solving practical exercises, completing a small project, and using state-of-the-art tools.
ContentThe course treats formal methods mainly for the modelling and analysis of security protocols. Cryptographic protocols (such as SSL/TLS, SSH, Kerberos, SAML single-sign on, and IPSec) form the basis for secure communication and business processes. Numerous attacks on published protocols show that the design of cryptographic protocols is extremely error-prone. A rigorous analysis of these protocols is therefore indispensable, and manual analysis is insufficient. The lectures cover the theoretical basis for the (tool-supported) formal modeling and analysis of such protocols. Specifically, we discuss their operational semantics, the formalization of security properties, and techniques and algorithms for their verification.

In addition to the classical security properties for confidentiality and authentication, we will study strong secrecy and privacy properties. We will discuss electronic voting protocols, and RFID protocols (a staple of the Internet of Things), where these properties are central. The accompanying tutorials provide an opportunity to apply the theory and tools to concrete protocols. Moreover, we will discuss methods to abstract and refine security protocols and the link between symbolic protocol models and cryptographic models.

Furthermore, we will also present a security notion for general systems based on non-interference as well as language-based information flow security where non-interference is enforced via a type system.
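To hint at what a symbolic (Dolev-Yao-style) message model looks like, here is a deliberately tiny Python sketch that closes an intruder's knowledge under decomposition rules; the terms, rules and example protocol are invented for illustration, and real analyses in this area rely on dedicated verification tools rather than code like this.

# Toy Dolev-Yao-style closure of intruder knowledge under decomposition:
# pairs can be split, and a ciphertext can be opened if its key is known.
# Terms are nested tuples: ("pair", a, b), ("enc", msg, key), or atom strings.
def closure(knowledge):
    known = set(knowledge)
    changed = True
    while changed:
        changed = False
        for term in list(known):
            new = set()
            if isinstance(term, tuple) and term[0] == "pair":
                new = {term[1], term[2]}
            elif isinstance(term, tuple) and term[0] == "enc" and term[2] in known:
                new = {term[1]}
            if not new <= known:
                known |= new
                changed = True
    return known

# The intruder observes {secret}_k paired with k sent in the clear: a broken protocol.
observed = {("pair", ("enc", "secret", "k"), "k")}
print("secret" in closure(observed))   # True: the secrecy property fails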
263-4656-00L Digital Signatures | W | 4 credits | 2V + 1A | D. Hofheinz
AbstractDigital signatures as one central cryptographic building block. Different security goals and security definitions for digital signatures, followed by a variety of popular and fundamental signature schemes with their security analyses.
Learning objectiveThe student knows a variety of techniques to construct and analyze the security of digital signature schemes. This includes modularity as a central tool of constructing secure schemes, and reductions as a central tool to proving the security of schemes.
ContentWe will start with several definitions of security for signature schemes, and investigate the relations among them. We will proceed to generic (but inefficient) constructions of secure signatures, and then move on to a number of efficient schemes based on concrete computational hardness assumptions. On the way, we will get to know paradigms such as hash-then-sign, one-time signatures, and chameleon hashing as central tools to construct secure signatures.
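As an illustration of the one-time-signature paradigm mentioned above, here is a rough Python sketch of Lamport's one-time scheme over SHA-256; it is an editorial toy example (each key pair may sign at most one message) and omits all practical considerations.

# Sketch of a Lamport one-time signature over SHA-256 (illustrative only).
import os, hashlib

H = lambda data: hashlib.sha256(data).digest()

def keygen():
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]   # secret preimages
    pk = [(H(a), H(b)) for a, b in sk]                            # their hashes
    return sk, pk

def sign(sk, message):
    digest = H(message)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return [sk[i][bit] for i, bit in enumerate(bits)]             # reveal one preimage per bit

def verify(pk, message, signature):
    digest = H(message)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return all(H(sig) == pk[i][bit] for i, (sig, bit) in enumerate(zip(signature, bits)))

sk, pk = keygen()
sig = sign(sk, b"hello")
assert verify(pk, b"hello", sig) and not verify(pk, b"goodbye", sig)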
LiteratureJonathan Katz, "Digital Signatures."
Prerequisites / NoticeIdeally, students will have taken the D-INFK Bachelors course "Information Security" or an equivalent course at Bachelors level.
Seminar
Number | Title | Type | ECTS | Hours | Lecturers
263-4651-00L Current Topics in Cryptography (Restricted registration)
Number of participants limited to 24.

The deadline for deregistering expires at the end of the second week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar.
W | 2 credits | 2S | D. Hofheinz, U. Maurer, K. Paterson
AbstractIn this seminar course, students present and discuss a variety of recent research papers in Cryptography.
Learning objectiveIndependent study of scientific literature and assessment of its contributions as well as learning and practicing presentation techniques.
ContentThe course lecturers will provide a list of papers from which students will select.
LiteratureThe reading list will be published on the course website.
Prerequisites / NoticeIdeally, students will have taken the D-INFK Bachelors course “Information Security" or an equivalent course at Bachelors level. Ideally, they will have attended or will attend in parallel the Masters course in "Applied Cryptography”.
Semester Project
Number | Title | Type | ECTS | Hours | Lecturers
260-0100-00L Semester Project
Only for Cyber Security MSc
W | 12 credits | 26A | Professors
AbstractThe Semester Project provides students with the opportunity to apply acquired knowledge and skills.
Learning objectiveThe students can gain hands-on experience by independently solving a technical-scientific problem.
Prerequisites / NoticePrerequisites: At least one core course in Cyber Security and one interfocus course must have been completed successfully.
Minor
Computational Science
Core Courses
Number | Title | Type | ECTS | Hours | Lecturers
401-3632-00L Computational Statistics | W | 8 credits | 3V + 1U | M. H. Maathuis
AbstractWe discuss modern statistical methods for data analysis, including methods for data exploration, prediction and inference. We pay attention to algorithmic aspects, theoretical properties and practical considerations. The class is hands-on and methods are applied using the statistical programming language R.
Learning objectiveThe student obtains an overview of modern statistical methods for data analysis, including their algorithmic aspects and theoretical properties. The methods are applied using the statistical programming language R.
ContentSee the class website
Prerequisites / NoticeAt least one semester of (basic) probability and statistics.

Programming experience is helpful but not required.
Electives
Number | Title | Type | ECTS | Hours | Lecturers
252-0526-00L Statistical Learning Theory | W | 7 credits | 3V + 2U + 1A | J. M. Buhmann, C. Cotrini Jimenez
AbstractThe course covers advanced methods of statistical learning:

- Variational methods and optimization.
- Deterministic annealing.
- Clustering for diverse types of data.
- Model validation by information theory.
Learning objectiveThe course surveys recent methods of statistical learning. The fundamentals of machine learning, as presented in the courses "Introduction to Machine Learning" and "Advanced Machine Learning", are expanded from the perspective of statistical learning.
Content- Variational methods and optimization. We consider optimization approaches for problems where the optimizer is a probability distribution. We will discuss concepts like maximum entropy, information bottleneck, and deterministic annealing.

- Clustering. This is the problem of sorting data into groups without using training samples. We discuss alternative notions of "similarity" between data points and adequate optimization procedures.

- Model selection and validation. This refers to the question of how complex the chosen model should be. In particular, we present an information theoretic approach for model validation.

- Statistical physics models. We discuss approaches for approximately optimizing large systems, which originate in statistical physics (free energy minimization applied to spin glasses and other models). We also study sampling methods based on these models.
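As a small illustration of the deterministic-annealing idea listed above, the following numpy sketch runs temperature-controlled soft clustering (soft k-means with a cooling schedule); the data and schedule are invented for demonstration and the snippet is not part of the course material.

# Rough sketch of deterministic annealing for clustering (soft k-means with
# a temperature parameter); data and cooling schedule are made up.
import numpy as np

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
centers = X[rng.choice(len(X), 2, replace=False)].copy()

for T in [10.0, 5.0, 2.0, 1.0, 0.5, 0.1]:            # cooling schedule
    for _ in range(20):
        # Gibbs-style soft assignments: p(k|x) proportional to exp(-||x - c_k||^2 / T)
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        logits = -d2 / T
        logits -= logits.max(axis=1, keepdims=True)   # numerical stability
        p = np.exp(logits)
        p /= p.sum(axis=1, keepdims=True)
        # Update centers as probability-weighted means
        centers = (p.T @ X) / p.sum(axis=0)[:, None]

print(np.round(centers, 2))     # typically ends up near the two true cluster means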
Lecture notesA draft of a script will be provided. Lecture slides will be made available.
LiteratureHastie, Tibshirani, Friedman: The Elements of Statistical Learning, Springer, 2001.

L. Devroye, L. Gyorfi, and G. Lugosi: A probabilistic theory of pattern recognition. Springer, New York, 1996
Prerequisites / NoticeKnowledge of machine learning (introduction to machine learning and/or advanced machine learning)
Basic knowledge of statistics.
261-5120-00L Machine Learning for Health Care (Restricted registration)
Number of participants limited to 150.
W | 5 credits | 3P + 1A | G. Rätsch, J. Vogt, V. Boeva
AbstractThe course will review the most relevant methods and applications of Machine Learning in Biomedicine, discuss the main challenges they present and their current technical problems.
Learning objectiveDuring the last years, we have observed rapid growth in the field of Machine Learning (ML), mainly due to improvements in ML algorithms, increased data availability and a reduction in computing costs. This growth is having a profound impact on biomedical applications, where the great variety of tasks and data types enables us to benefit from ML algorithms in many different ways. In this course we will review the most relevant methods and applications of ML in biomedicine, and discuss the main challenges they present and their current technical solutions.
ContentThe course will consist of four topic clusters that will cover the most relevant applications of ML in Biomedicine:
1) Structured time series: Temporal time series of structured data often appear in biomedical datasets, presenting challenges as containing variables with different periodicities, being conditioned by static data, etc.
2) Medical notes: Vast amounts of medical observations are stored in the form of free text; we will analyze strategies for extracting knowledge from them.
3) Medical images: Images are a fundamental piece of information in many medical disciplines. We will study how to train ML algorithms with them.
4) Genomics data: ML in genomics is still an emerging subfield, but given that genomics data are arguably the most extensive and complex datasets that can be found in biomedicine, it is expected that many relevant ML applications will arise in the near future. We will review and discuss current applications and challenges.
Prerequisites / NoticeData Structures & Algorithms, Introduction to Machine Learning, Statistics/Probability, Programming in Python, Unix Command Line

Relation to Course 261-5100-00 Computational Biomedicine: This course is a continuation of the previous course with new topics related to medical data and machine learning. The format of Computational Biomedicine II will also be different. It is helpful but not essential to attend Computational Biomedicine before attending Computational Biomedicine II.
263-5300-00L Guarantees for Machine Learning (Restricted registration) | W | 5 credits | 2V + 2A | F. Yang
AbstractThis course teaches classical and recent methods in statistics and optimization commonly used to prove theoretical guarantees for machine learning algorithms. The knowledge is then applied in project work that focuses on understanding phenomena in modern machine learning.
Learning objectiveThis course is aimed at advanced master and doctorate students who want to understand and/or conduct independent research on theory for modern machine learning. For this purpose, students will learn common mathematical techniques from statistical learning theory. In independent project work, they then apply their knowledge and go through the process of critically questioning recently published work, finding relevant research questions and learning how to effectively present research ideas to a professional audience.
ContentThis course teaches some classical and recent methods in statistical learning theory aimed at proving theoretical guarantees for machine learning algorithms, including topics in

- concentration bounds, uniform convergence
- high-dimensional statistics (e.g. Lasso)
- prediction error bounds for non-parametric statistics (e.g. in kernel spaces)
- minimax lower bounds
- regularization via optimization

The project work focuses on active theoretical ML research that aims to understand modern phenomena in machine learning, including but not limited to

- how overparameterization could help generalization (interpolating models, linearized NNs)
- how overparameterization could help optimization (non-convex optimization, loss landscape)
- complexity measures and approximation-theoretic properties of randomly initialized and trained NNs
- generalization of robust learning (adversarial robustness, standard and robust error tradeoff)
- prediction with calibrated confidence (conformal prediction, calibration)
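As a small numerical illustration of the concentration-bound topic in the list above, the following numpy snippet compares the empirical deviation probability of a sample mean with Hoeffding's bound; all parameters are arbitrary.

# Empirically check Hoeffding's inequality for the mean of n bounded variables:
# P(|mean - 1/2| >= t) <= 2 * exp(-2 * n * t^2) for X_i uniform on [0, 1].
import numpy as np

rng = np.random.default_rng(0)
n, t, trials = 100, 0.1, 100_000

samples = rng.uniform(0.0, 1.0, size=(trials, n))
deviations = np.abs(samples.mean(axis=1) - 0.5)
empirical = (deviations >= t).mean()
bound = 2 * np.exp(-2 * n * t ** 2)

print(f"empirical {empirical:.4f} <= Hoeffding bound {bound:.4f}")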
Prerequisites / NoticeIt’s absolutely necessary for students to have a strong mathematical background (basic real analysis, probability theory, linear algebra) and good knowledge of core concepts in machine learning taught in courses such as “Introduction to Machine Learning”, “Regression”/ “Statistical Modelling”. It's also helpful to have heard an optimization course or approximation theoretic course. In addition to these prerequisites, this class requires a certain degree of mathematical maturity—including abstract thinking and the ability to understand and write proofs.
Distributed Systems
Core Courses
Number | Title | Type | ECTS | Hours | Lecturers
227-0558-00L Principles of Distributed Computing | W | 7 credits | 2V + 2U + 2A | R. Wattenhofer, M. Ghaffari
AbstractWe study the fundamental issues underlying the design of distributed systems: communication, coordination, fault-tolerance, locality, parallelism, self-organization, symmetry breaking, synchronization, uncertainty. We explore essential algorithmic ideas and lower bound techniques.
Learning objectiveDistributed computing is essential in modern computing and communications systems. Examples are on the one hand large-scale networks such as the Internet, and on the other hand multiprocessors such as your new multi-core laptop. This course introduces the principles of distributed computing, emphasizing the fundamental issues underlying the design of distributed systems and networks: communication, coordination, fault-tolerance, locality, parallelism, self-organization, symmetry breaking, synchronization, uncertainty. We explore essential algorithmic ideas and lower bound techniques, basically the "pearls" of distributed computing. We will cover a fresh topic every week.
ContentDistributed computing models and paradigms, e.g. message passing, shared memory, synchronous vs. asynchronous systems, time and message complexity, peer-to-peer systems, small-world networks, social networks, sorting networks, wireless communication, and self-organizing systems.

Distributed algorithms, e.g. leader election, coloring, covering, packing, decomposition, spanning trees, mutual exclusion, store and collect, arrow, ivy, synchronizers, diameter, all-pairs-shortest-path, wake-up, and lower bounds
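To make the flavour of such algorithms concrete, here is a hypothetical round-based simulation of Chang-Roberts leader election on a unidirectional ring (only identifiers larger than a node's own are forwarded); it is an editorial toy model, not course code.

# Toy simulation of Chang-Roberts leader election on a unidirectional ring:
# the maximum identifier survives a full round trip and its owner becomes leader.
def elect_leader(ids):
    n = len(ids)
    pending = {i: [ids[i]] for i in range(n)}        # messages to send in the next round
    leader = None
    while leader is None:
        inbox = {i: [] for i in range(n)}
        for i, msgs in pending.items():
            inbox[(i + 1) % n].extend(msgs)           # send clockwise
        pending = {i: [] for i in range(n)}
        for i, msgs in inbox.items():
            for u in msgs:
                if u == ids[i]:
                    leader = u                        # own id came back: node i is the leader
                elif u > ids[i]:
                    pending[i].append(u)              # forward larger ids, swallow smaller ones
    return leader

assert elect_leader([3, 7, 1, 9, 4]) == 9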
Lecture notesAvailable. Our course script is used at dozens of other universities around the world.
LiteratureLecture notes by Roger Wattenhofer. These lecture notes are used at about a dozen different universities around the world.

Distributed Computing: Fundamentals, Simulations and Advanced Topics
Hagit Attiya, Jennifer Welch.
McGraw-Hill Publishing, 1998, ISBN 0-07-709352-6

Introduction to Algorithms
Thomas Cormen, Charles Leiserson, Ronald Rivest.
The MIT Press, 1998, ISBN 0-262-53091-0 or 0-262-03141-8

Dissemination of Information in Communication Networks
Juraj Hromkovic, Ralf Klasing, Andrzej Pelc, Peter Ruzicka, Walter Unger.
Springer-Verlag, Berlin Heidelberg, 2005, ISBN 3-540-00846-2

Introduction to Parallel Algorithms and Architectures: Arrays, Trees, Hypercubes
Frank Thomson Leighton.
Morgan Kaufmann Publishers Inc., San Francisco, CA, 1991, ISBN 1-55860-117-1

Distributed Computing: A Locality-Sensitive Approach
David Peleg.
Society for Industrial and Applied Mathematics (SIAM), 2000, ISBN 0-89871-464-8
Prerequisites / NoticeCourse pre-requisites: Interest in algorithmic problems. (No particular course needed.)
263-3800-00L Advanced Operating Systems | W | 7 credits | 2V + 2U + 2A | D. A. Cock, T. Roscoe
AbstractThis course is intended to give students a thorough understanding of design and implementation issues for modern operating systems, with a particular emphasis on the challenges of modern hardware features. We will cover key design issues in implementing an operating system, such as memory management, scheduling, protection, inter-process communication, device drivers, and file systems.
Learning objectiveThe goals of the course are, firstly, to give students:

1. A broader perspective on OS design than that provided by knowledge of Unix or Windows, building on the material in a standard undergraduate operating systems class

2. Practical experience in dealing directly with the concurrency, resource management, and abstraction problems confronting OS designers and implementers

3. A glimpse into future directions for the evolution of OS and computer hardware design
ContentThe course is based on practical implementation work, in C and assembly language, and requires solid knowledge of both. The work is mostly carried out in teams of 3-4, using real hardware, and is a mixture of team milestones and individual projects which fit together into a complete system at the end. Emphasis is also placed on a final report which details the complete finished artifact, evaluates its performance, and discusses the choices the team made while building it.
Prerequisites / NoticeThe course is based around a milestone-oriented project, where students work in small groups to implement major components of a microkernel-based operating system. The final assessment will be a combination grades awarded for milestones during the course of the project, a final written report on the work, and a set of test cases run on the final code.
Elective Courses
Number | Title | Type | ECTS | Hours | Lecturers
252-0312-00L Ubiquitous Computing | W | 4 credits | 2V + 1A | C. Holz, F. Mattern, S. Mayer
AbstractUnlike desktop computing, ubiquitous computing occurs anytime and everywhere, using any device, in any location, and in any format. Computers exist in different forms, from watches and phones to refrigerators or pairs of glasses.
Main topics: Smart environments, IoT, mobiles & wearables, context & location, sensing & tracking, computer vision on embedded systems, health monitoring, fabrication.
Learning objectiveUnlike desktop computing, ubiquitous computing occurs anytime and everywhere, using any device, in any location, and in any format. Computers exist in different forms, from watches and phones to refrigerators or pairs of glasses.
Main topics: Smart environments, IoT, mobiles & wearables, context & location, sensing & tracking, computer vision on embedded systems, health monitoring, fabrication.
Lecture notesCopies of slides will be made available
LiteratureWill be provided in the lecture. To put you in the mood:
Mark Weiser: The Computer for the 21st Century. Scientific American, September 1991, pp. 94-104
252-0437-00L Distributed Algorithms | W | 5 credits | 3V + 1A | F. Mattern
AbstractModels of distributed computations, time space diagrams, virtual time, logical clocks and causality, wave algorithms, parallel and distributed graph traversal, consistent snapshots, mutual exclusion, election and symmetry breaking, distributed termination detection, garbage collection in distributed systems, monitoring distributed systems, global predicates.
Learning objectiveBecome acquainted with models and algorithms for distributed systems.
ContentDistributed algorithms are methods characterized by the fact that several autonomous processes work concurrently on parts of a common problem in a cooperative way, and that the information exchange required for this takes place exclusively via messages. Such algorithms are used in distributed systems in which no shared memory exists and in which the transmission time of messages can, in general, not be neglected. Since no process ever has a current, consistent view of the global state, this leads to interesting problems.
Among others, the following topics are covered:
Models of distributed computations; space-time diagrams; virtual time; logical clocks and causality; wave algorithms; distributed and parallel graph traversal; computation of consistent snapshots; mutual exclusion; election and symmetry breaking; distributed termination detection; garbage collection in distributed systems; observing distributed systems; computation of global predicates.
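As a minimal illustration of the logical-clocks topic above, here is a hedged Python sketch of Lamport clocks; the process names and event sequence are invented.

# Minimal Lamport logical clocks: local events increment the clock,
# and a receive takes the max of local and message timestamps plus one.
class Process:
    def __init__(self, name):
        self.name, self.clock = name, 0

    def local_event(self):
        self.clock += 1
        return self.clock

    def send(self):
        self.clock += 1
        return self.clock                  # timestamp travels with the message

    def receive(self, msg_timestamp):
        self.clock = max(self.clock, msg_timestamp) + 1
        return self.clock

p, q = Process("P"), Process("Q")
p.local_event()                # P: 1
t = p.send()                   # P: 2, message carries timestamp 2
q.local_event()                # Q: 1
q.receive(t)                   # Q: max(1, 2) + 1 = 3, so the send happens-before the receive
print(p.clock, q.clock)        # 2 3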
Literature- F. Mattern: Verteilte Basisalgorithmen, Springer-Verlag
- G. Tel: Topics in Distributed Algorithms, Cambridge University Press
- G. Tel: Introduction to Distributed Algorithms, Cambridge University Press, 2nd edition
- A.D. Kshemkalyani, M. Singhal: Distributed Computing, Cambridge University Press
- N. Lynch: Distributed Algorithms, Morgan Kaufmann Publ
252-0817-00L Distributed Systems Laboratory
In the Master's programme, at most 10 credits can be earned with labs on top of the interfocus courses. Additional labs will be listed on the Addendum.
W | 10 credits | 9P | G. Alonso, T. Hoefler, F. Mattern, A. Singla, R. Wattenhofer, C. Zhang
AbstractThis course involves participation in a substantial development and/or evaluation project involving distributed systems technology. Projects are available in a wide range of areas: from web services to ubiquitous computing, including wireless networks, ad-hoc networks, and distributed applications on mobile phones.
Learning objectiveStudents acquire practical knowledge about technologies from the area of distributed systems.
ContentThis course involves participation in a substantial development and/or evaluation project involving distributed systems technology. Projects are available in a wide range of areas: from web services to ubiquitous computing, including wireless networks, ad-hoc networks, and distributed applications on mobile phones. The objective of the project is for the students to gain hands-on experience with real products and the latest technology in distributed systems. There is no lecture associated with the course.
For information about the course or the available projects, please contact Prof. Mattern, Prof. Wattenhofer, Prof. Roscoe or Prof. G. Alonso.
263-3710-00L Machine Perception (Restricted registration)
Number of participants limited to 200.
W | 5 credits | 2V + 1U + 1A | O. Hilliges
AbstractRecent developments in neural networks (aka “deep learning”) have drastically advanced the performance of machine perception systems in a variety of areas including computer vision, robotics, and intelligent UIs. This course is a deep dive into deep learning algorithms and architectures with applications to a variety of perceptual tasks.
Learning objectiveStudents will learn about fundamental aspects of modern deep learning approaches for perception. Students will learn to implement, train and debug their own neural networks and gain a detailed understanding of cutting-edge research in learning-based computer vision, robotics and HCI. The final project assignment will involve training a complex neural network architecture and applying it on a real-world dataset of human activity.

The core competency acquired through this course is a solid foundation in deep-learning algorithms to process and interpret human input into computing systems. In particular, students should be able to develop systems that deal with the problem of recognizing people in images, detecting and describing body parts, inferring their spatial configuration, performing action/gesture recognition from still images or image sequences, also considering multi-modal data, among others.
ContentWe will focus on teaching how to set up the problem of machine perception, the learning algorithms, network architectures and advanced deep learning concepts, in particular probabilistic deep learning models.

The course covers the following main areas:
I) Foundations of deep-learning.
II) Probabilistic deep-learning for generative modelling of data (latent variable models, generative adversarial networks and auto-regressive models).
III) Deep learning in computer vision, human-computer interaction and robotics.

Specific topics include: 
I) Deep learning basics:
a) Neural Networks and training (i.e., backpropagation)
b) Feedforward Networks
c) Timeseries modelling (RNN, GRU, LSTM)
d) Convolutional Neural Networks for classification
II) Probabilistic Deep Learning:
a) Latent variable models (VAEs)
b) Generative adversarial networks (GANs)
c) Autoregressive models (PixelCNN, PixelRNN, TCNs)
III) Deep Learning techniques for machine perception:
a) Fully Convolutional architectures for dense per-pixel tasks (i.e., instance segmentation)
b) Pose estimation and other tasks involving human activity
c) Deep reinforcement learning
IV) Case studies from research in computer vision, HCI, robotics and signal processing
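As a small, self-contained illustration of the "Neural Networks and training (backpropagation)" item above, here is a numpy sketch of a one-hidden-layer network trained on XOR; it is an editorial example and not connected to the course exercises, which use TensorFlow.

# Tiny one-hidden-layer network trained with backpropagation on XOR (numpy only).
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for step in range(5000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass for the binary cross-entropy loss
    d_out = (out - y) / len(X)
    dW2, db2 = h.T @ d_out, d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * (1 - h ** 2)        # tanh'(z) = 1 - tanh(z)^2
    dW1, db1 = X.T @ d_h, d_h.sum(axis=0)
    # Gradient descent update
    for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        param -= 0.5 * grad

print(np.round(out.ravel(), 2))    # typically close to [0, 1, 1, 0]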
LiteratureDeep Learning
Book by Ian Goodfellow and Yoshua Bengio
Prerequisites / NoticeThis is an advanced grad-level course that requires a background in machine learning. Students are expected to have a solid mathematical foundation, in particular in linear algebra, multivariate calculus, and probability. The course will focus on state-of-the-art research in deep-learning and will not repeat basics of machine learning

Please take note of the following conditions:
1) The number of participants is limited to 200 students (MSc and PhDs).
2) Students must have taken the exam in Machine Learning (252-0535-00) or have acquired equivalent knowledge
3) All practical exercises will require basic knowledge of Python and will use libraries such as TensorFlow, scikit-learn and scikit-image. We will provide introductions to TensorFlow and other libraries that are needed but will not provide introductions to basic programming or Python.

The following courses are strongly recommended as prerequisite:
* "Visual Computing" or "Computer Vision"

The course will be assessed by a final written examination in English. No course materials or electronic devices can be used during the examination. Note that the examination will be based on the contents of the lectures, the associated reading materials and the exercises.
Information Systems
Core Courses
Number | Title | Type | ECTS | Hours | Lecturers
263-2925-00L Program Analysis for System Security and Reliability | W | 6 credits | 2V + 1U + 2A | P. Tsankov
AbstractSecurity issues in modern systems (blockchains, datacenters, AI) result in billions of losses due to hacks. This course introduces the security issues in modern systems and state-of-the-art automated techniques for building secure and reliable systems. The course has a practical focus and covers systems built by successful ETH spin-offs.
Learning objective* Learn about security issues in modern systems -- blockchains, smart contracts, AI-based systems (e.g., autonomous cars), data centers -- and why they are challenging to address.

* Understand how the latest automated analysis techniques work, both discrete and probabilistic.

* Understand how these techniques combine with machine-learning methods, both supervised and unsupervised.

* Understand how to use these methods to build reliable and secure modern systems.

* Learn about new open problems that if solved can lead to research and commercial impact.
ContentPart I: Security of Blockchains

- We will cover existing blockchains (e.g., Ethereum, Bitcoin), how they work, what the core security issues are, and how these have led to massive financial losses.
- We will show how to extract useful information about smart contracts and transactions using interactive analysis frameworks for querying blockchains (e.g. Google's Ethereum BigQuery).
- We will discuss the state-of-the-art security tools (e.g., https://securify.ch) for ensuring that smart contracts are free of security vulnerabilities.
- We will study the latest automated reasoning systems (e.g., https://verx.ch) for checking custom (temporal) properties of smart contracts and illustrate their operation on real-world use cases.
- We will study how the underlying methods for automated reasoning and testing (e.g., abstract interpretation, symbolic execution, fuzzing) are used to build such tools.


Part II: Security of Datacenters and Networks

- We will show how to ensure that datacenters and ISPs are secured using declarative reasoning methods (e.g., Datalog). We will also see how to automatically synthesize secure configurations (e.g. using SyNET and NetComplete) which lead to desirable behaviors, thus automating the job of the network operator and avoiding critical errors.
- We will discuss how to apply modern discrete probabilistic inference (e.g., PSI and Bayonet) to reason about probabilistic network properties (e.g., the probability of a packet reaching a destination if links fail).


Part III: Machine Learning for Security

- We will discuss how machine learning models for structured prediction are used to address security tasks, including de-obfuscation of binaries (Debin: https://debin.ai), Android APKs (DeGuard: http://apk-deguard.com) and JavaScript (JSNice: http://jsnice.org).
- We will study how to leverage program abstractions in combination with clustering techniques to learn security rules for cryptography APIs from large codebases.
- We will study how to automatically learn to identify security vulnerabilities related to the handling of untrusted inputs (cross-site scripting, SQL injection, path traversal, remote code execution) from large codebases.


To gain a deeper understanding, the course will involve a hands-on programming project where the methods studied in the class will be applied.
Elective Courses
Number | Title | Type | ECTS | Hours | Lecturers
252-0312-00L Ubiquitous Computing | W | 4 credits | 2V + 1A | C. Holz, F. Mattern, S. Mayer
AbstractUnlike desktop computing, ubiquitous computing occurs anytime and everywhere, using any device, in any location, and in any format. Computers exist in different forms, from watches and phones to refrigerators or pairs of glasses.
Main topics: Smart environments, IoT, mobiles & wearables, context & location, sensing & tracking, computer vision on embedded systems, health monitoring, fabrication.
Learning objectiveUnlike desktop computing, ubiquitous computing occurs anytime and everywhere, using any device, in any location, and in any format. Computers exist in different forms, from watches and phones to refrigerators or pairs of glasses.
Main topics: Smart environments, IoT, mobiles & wearables, context & location, sensing & tracking, computer vision on embedded systems, health monitoring, fabrication.
Lecture notesCopies of slides will be made available
LiteratureWill be provided in the lecture. To put you in the mood:
Mark Weiser: The Computer for the 21st Century. Scientific American, September 1991, pp. 94-104
Software Engineering
Core Courses
Number | Title | Type | ECTS | Hours | Lecturers
263-2925-00L Program Analysis for System Security and Reliability | W | 6 credits | 2V + 1U + 2A | P. Tsankov
AbstractSecurity issues in modern systems (blockchains, datacenters, AI) result in billions of losses due to hacks. This course introduces the security issues in modern systems and state-of-the-art automated techniques for building secure and reliable systems. The course has a practical focus and covers systems built by successful ETH spin-offs.
Learning objective* Learn about security issues in modern systems -- blockchains, smart contracts, AI-based systems (e.g., autonomous cars), data centers -- and why they are challenging to address.

* Understand how the latest automated analysis techniques work, both discrete and probabilistic.

* Understand how these techniques combine with machine-learning methods, both supervised and unsupervised.

* Understand how to use these methods to build reliable and secure modern systems.

* Learn about new open problems that if solved can lead to research and commercial impact.
ContentPart I: Security of Blockchains

- We will cover existing blockchains (e.g., Ethereum, Bitcoin), how they work, what the core security issues are, and how these have led to massive financial losses.
- We will show how to extract useful information about smart contracts and transactions using interactive analysis frameworks for querying blockchains (e.g. Google's Ethereum BigQuery).
- We will discuss the state-of-the-art security tools (e.g., https://securify.ch) for ensuring that smart contracts are free of security vulnerabilities.
- We will study the latest automated reasoning systems (e.g., https://verx.ch) for checking custom (temporal) properties of smart contracts and illustrate their operation on real-world use cases.
- We will study how the underlying methods for automated reasoning and testing (e.g., abstract interpretation, symbolic execution, fuzzing) are used to build such tools.


Part II: Security of Datacenters and Networks

- We will show how to ensure that datacenters and ISPs are secured using declarative reasoning methods (e.g., Datalog). We will also see how to automatically synthesize secure configurations (e.g. using SyNET and NetComplete) which lead to desirable behaviors, thus automating the job of the network operator and avoiding critical errors.
- We will discuss how to apply modern discrete probabilistic inference (e.g., PSI and Bayonet) to reason about probabilistic network properties (e.g., the probability of a packet reaching a destination if links fail).


Part III: Machine Learning for Security

- We will discuss how machine learning models for structured prediction are used to address security tasks, including de-obfuscation of binaries (Debin: https://debin.ai), Android APKs (DeGuard: http://apk-deguard.com) and JavaScript (JSNice: http://jsnice.org).
- We will study how to leverage program abstractions in combination with clustering techniques to learn security rules for cryptography APIs from large codebases.
- We will study how to automatically learn to identify security vulnerabilities related to the handling of untrusted inputs (cross-site scripting, SQL injection, path traversal, remote code execution) from large codebases.


To gain a deeper understanding, the course will involve a hands-on programming project where the methods studied in the class will be applied.
Elective Courses
In spring 2020 there will be no course offered in this category.
Theoretical Computer Science
Core Courses
Number | Title | Type | ECTS | Hours | Lecturers
261-5110-00L Optimization for Data Science | W | 8 credits | 3V + 2U + 2A | B. Gärtner, D. Steurer
AbstractThis course provides an in-depth theoretical treatment of optimization methods that are particularly relevant in data science.
Learning objectiveUnderstanding the theoretical guarantees (and their limits) of relevant optimization methods used in data science. Learning general paradigms to deal with optimization problems arising in data science.
ContentThis course provides an in-depth theoretical treatment of optimization methods that are particularly relevant in machine learning and data science.

In the first part of the course, we will first give a brief introduction to convex optimization, with some basic motivating examples from machine learning. Then we will analyse classical and more recent first and second order methods for convex optimization: gradient descent, projected gradient descent, subgradient descent, stochastic gradient descent, Nesterov's accelerated method, Newton's method, and Quasi-Newton methods. The emphasis will be on analysis techniques that occur repeatedly in convergence analyses for various classes of convex functions. We will also discuss some classical and recent theoretical results for nonconvex optimization.
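As a minimal illustration of the first-order methods discussed above, the following numpy sketch runs gradient descent with step size 1/L on a least-squares objective; the problem data are random and the snippet is purely illustrative.

# Gradient descent on a smooth convex objective f(x) = 0.5 * ||Ax - b||^2;
# with step size 1/L (L = largest eigenvalue of A^T A) the suboptimality shrinks monotonically.
import numpy as np

rng = np.random.default_rng(0)
A, b = rng.normal(size=(50, 10)), rng.normal(size=50)
L = np.linalg.eigvalsh(A.T @ A).max()          # smoothness constant
x_star = np.linalg.lstsq(A, b, rcond=None)[0]  # reference solution
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)

x = np.zeros(10)
for k in range(200):
    grad = A.T @ (A @ x - b)
    x -= grad / L                               # step size 1/L
    if k % 50 == 0:
        print(k, f(x) - f(x_star))              # suboptimality decreasing towards 0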

In the second part, we discuss convex programming relaxations as a powerful and versatile paradigm for designing efficient algorithms to solve computational problems arising in data science. We will learn about this paradigm and develop a unified perspective on it through the lens of the sum-of-squares semidefinite programming hierarchy. As applications, we are discussing non-negative matrix factorization, compressed sensing and sparse linear regression, matrix completion and phase retrieval, as well as robust estimation.
Prerequisites / NoticeAs background, we require material taught in the course "252-0209-00L Algorithms, Probability, and Computing". It is not necessary that participants have actually taken the course, but they should be prepared to catch up if necessary.
Elective Courses
Number | Title | Type | ECTS | Hours | Lecturers
252-1424-00L Models of Computation | W | 6 credits | 2V + 2U + 1A | M. Cook
AbstractThis course surveys many different models of computation: Turing Machines, Cellular Automata, Finite State Machines, Graph Automata, Circuits, Tilings, Lambda Calculus, Fractran, Chemical Reaction Networks, Hopfield Networks, String Rewriting Systems, Tag Systems, Diophantine Equations, Register Machines, Primitive Recursive Functions, and more.
Learning objectiveThe goal of this course is to become acquainted with a wide variety of models of computation, to understand how models help us to understand the modeled systems, and to be able to develop and analyze models appropriate for new systems.
ContentThis course surveys many different models of computation: Turing Machines, Cellular Automata, Finite State Machines, Graph Automata, Circuits, Tilings, Lambda Calculus, Fractran, Chemical Reaction Networks, Hopfield Networks, String Rewriting Systems, Tag Systems, Diophantine Equations, Register Machines, Primitive Recursive Functions, and more.
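To give one concrete example from this list, here is a hypothetical few-line simulator of an elementary cellular automaton (Rule 110); the initial condition is arbitrary.

# Elementary cellular automaton (Rule 110) on a fixed-width row with wrap-around.
def step(cells, rule=110):
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

row = [0] * 31 + [1] + [0] * 31          # single live cell in the middle
for _ in range(16):
    print("".join("#" if c else "." for c in row))
    row = step(row)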
272-0302-00L Approximation and Online Algorithms | W | 5 credits | 2V + 1U + 1A | H.-J. Böckenhauer, D. Komm
AbstractThis lecture deals with approximative algorithms for hard optimization problems and algorithmic approaches for solving online problems as well as the limits of these approaches.
Learning objectiveGet a systematic overview of different methods for designing approximative algorithms for hard optimization problems and online problems. Get to know methods for showing the limitations of these approaches.
ContentApproximation algorithms are one of the most successful techniques for attacking hard optimization problems. Here, we study the so-called approximation ratio, i.e., the ratio of the cost of the computed approximate solution to the cost of an optimal one (which cannot be computed efficiently).
For an online problem, the whole instance is not known in advance; it arrives piecewise, and for every such piece a corresponding part of the final output must be produced. The quality of an algorithm for such an online problem is measured by the competitive ratio, i.e., the ratio of the cost of the computed solution to the cost of an optimal solution that could be given if the whole input were known in advance.

The contents of this lecture are
- the classification of optimization problems by the reachable approximation ratio,
- systematic methods to design approximation algorithms (e.g., greedy strategies, dynamic programming, linear programming relaxation),
- methods to show non-approximability,
- classic online problem like paging or scheduling problems and corresponding algorithms,
- randomized online algorithms,
- the design and analysis principles for online algorithms, and
- limits of the competitive ratio and the advice complexity as a way to do a deeper analysis of the complexity of online problems.
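As a concrete instance of the approximation-ratio concept described above, the following Python sketch computes the classic maximal-matching 2-approximation for minimum vertex cover; the example graph is invented.

# Classic 2-approximation for minimum vertex cover: take both endpoints of a
# greedily computed maximal matching.  The cover is at most twice optimal,
# because any cover must contain at least one endpoint of every matched edge.
def vertex_cover_2approx(edges):
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:   # edge not yet covered: add it to the matching
            cover.update((u, v))
    return cover

edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (4, 5)]
print(vertex_cover_2approx(edges))   # a cover of size at most 2 * OPT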
LiteratureThe lecture is based on the following books:

J. Hromkovic: Algorithmics for Hard Problems, Springer, 2004

D. Komm: An Introduction to Online Computation: Determinism, Randomization, Advice, Springer, 2016

Additional literature:

A. Borodin, R. El-Yaniv: Online Computation and Competitive Analysis, Cambridge University Press, 1998
263-4400-00L Advanced Graph Algorithms and Optimization (Restricted registration)
Number of participants limited to 30.
W | 5 credits | 3G + 1A | R. Kyng
AbstractThis course will cover a number of advanced topics in optimization and graph algorithms.
Learning objectiveThe course will take students on a deep dive into modern approaches to
graph algorithms using convex optimization techniques.

By studying convex optimization through the lens of graph algorithms,
students should develop a deeper understanding of fundamental
phenomena in optimization.

The course will cover some traditional discrete approaches to various graph
problems, especially flow problems, and then contrast these approaches
with modern, asymptotically faster methods based on combining convex
optimization with spectral and combinatorial graph theory.
ContentStudents should leave the course understanding key
concepts in optimization such as first and second-order optimization,
convex duality, multiplicative weights and dual-based methods,
acceleration, preconditioning, and non-Euclidean optimization.

Students will also be familiarized with central techniques in the
development of graph algorithms in the past 15 years, including graph
decomposition techniques, sparsification, oblivious routing, and
spectral and combinatorial preconditioning.
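As a sketch of the multiplicative-weights idea mentioned above, here is a toy "prediction with expert advice" loop in numpy; the losses, learning rate and horizon are invented for illustration.

# Toy multiplicative weights update (MWU) for learning from experts:
# each expert's weight is multiplied by (1 - eta * loss) in every round.
import numpy as np

rng = np.random.default_rng(0)
n_experts, rounds, eta = 5, 500, 0.1
weights = np.ones(n_experts)
total_alg, total_losses = 0.0, np.zeros(n_experts)

for t in range(rounds):
    p = weights / weights.sum()                  # play experts proportionally to their weights
    losses = rng.uniform(0, 1, n_experts)        # adversary's losses in [0, 1]
    losses[0] *= 0.3                             # expert 0 is secretly better
    total_alg += p @ losses
    total_losses += losses
    weights *= (1 - eta * losses)                # multiplicative update

print(total_alg, total_losses.min())             # the algorithm stays close to the best expert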
Prerequisites / NoticeThis course is targeted toward masters and doctoral students with an
interest in theoretical computer science.

Students should be comfortable with design and analysis of algorithms, probability, and linear algebra.

Having passed the course Algorithms, Probability, and Computing (APC) is highly recommended, but not formally required. If you are not
sure whether you're ready for this class or not, please consult the
instructor.
401-3052-05L Graph Theory | W | 5 credits | 2V + 1U | B. Sudakov
AbstractBasic notions, trees, spanning trees, Cayley's formula, vertex and edge connectivity, 2-connectivity, Mader's theorem, Menger's theorem, Eulerian graphs, Hamilton cycles, Dirac's theorem, matchings, theorems of Hall, König and Tutte, planar graphs, Euler's formula, basic non-planar graphs, graph colorings, greedy colorings, Brooks' theorem, 5-colorings of planar graphs
Learning objectiveThe students will get an overview over the most fundamental questions concerning graph theory. We expect them to understand the proof techniques and to use them autonomously on related problems.
Lecture notesThe lecture will be given at the blackboard only.
LiteratureWest, D.: "Introduction to Graph Theory"
Diestel, R.: "Graph Theory"

Further literature links will be provided in the lecture.
Prerequisites / NoticeStudents are expected to have a mathematical background and should be able to write rigorous proofs.


NOTICE: This course unit was previously offered as 252-1408-00L Graphs and Algorithms.
401-3903-11L Geometric Integer Programming | W | 6 credits | 2V + 1U | J. Paat
AbstractInteger programming is the task of minimizing a linear function over all the integer points in a polyhedron. This lecture introduces the key concepts of an algorithmic theory for solving such problems.
Learning objectiveThe purpose of the lecture is to provide a geometric treatment of the theory of integer optimization.
ContentKey topics are:

- Lattice theory and the polynomial time solvability of integer optimization problems in fixed dimension.

- Structural properties of integer sets that reveal other parameters affecting the complexity of integer problems

- Duality theory for integer optimization problems from the vantage point of lattice free sets.
Lecture notesnot available, blackboard presentation
LiteratureLecture notes will be provided.

Other helpful materials include

Bertsimas, Weismantel: Optimization over Integers, 2005

and

Schrijver: Theory of linear and integer programming, 1986.
Prerequisites / Notice"Mathematical Optimization" (401-3901-00L)
Visual Computing
Core Courses
Number | Title | Type | ECTS | Hours | Lecturers
252-0538-00L Shape Modeling and Geometry Processing | W | 6 credits | 2V + 1U + 2A | O. Sorkine Hornung
AbstractThis course covers the fundamentals and some of the latest developments in geometric modeling and geometry processing. Topics include surface modeling based on point clouds and polygonal meshes, mesh generation, surface reconstruction, mesh fairing and parameterization, discrete differential geometry, interactive shape editing, topics in digital shape fabrication.
Learning objectiveThe students will learn how to design, program and analyze algorithms and systems for interactive 3D shape modeling and geometry processing.
ContentRecent advances in 3D geometry processing have created a plenitude of novel concepts for the mathematical representation and interactive manipulation of geometric models. This course covers the fundamentals and some of the latest developments in geometric modeling and geometry processing. Topics include surface modeling based on point clouds and triangle meshes, mesh generation, surface reconstruction, mesh fairing and parameterization, discrete differential geometry, interactive shape editing and digital shape fabrication.
Lecture notesSlides and course notes
Prerequisites / NoticePrerequisites:
Visual Computing, Computer Graphics or an equivalent class. Experience with C++ programming. Solid background in linear algebra and analysis. Some knowledge of differential geometry, computational geometry and numerical methods is helpful but not a strict requirement.
Elective Courses
Number | Title | Type | ECTS | Hours | Lecturers
252-0526-00L Statistical Learning Theory | W | 7 credits | 3V + 2U + 1A | J. M. Buhmann, C. Cotrini Jimenez
AbstractThe course covers advanced methods of statistical learning:

- Variational methods and optimization.
- Deterministic annealing.
- Clustering for diverse types of data.
- Model validation by information theory.
Learning objectiveThe course surveys recent methods of statistical learning. The fundamentals of machine learning, as presented in the courses "Introduction to Machine Learning" and "Advanced Machine Learning", are expanded from the perspective of statistical learning.
Content- Variational methods and optimization. We consider optimization approaches for problems where the optimizer is a probability distribution. We will discuss concepts like maximum entropy, information bottleneck, and deterministic annealing.

- Clustering. This is the problem of sorting data into groups without using training samples. We discuss alternative notions of "similarity" between data points and adequate optimization procedures.

- Model selection and validation. This refers to the question of how complex the chosen model should be. In particular, we present an information theoretic approach for model validation.

- Statistical physics models. We discuss approaches for approximately optimizing large systems, which originate in statistical physics (free energy minimization applied to spin glasses and other models). We also study sampling methods based on these models.
Lecture notesA draft of a script will be provided. Lecture slides will be made available.
LiteratureHastie, Tibshirani, Friedman: The Elements of Statistical Learning, Springer, 2001.

L. Devroye, L. Gyorfi, and G. Lugosi: A probabilistic theory of pattern recognition. Springer, New York, 1996
Prerequisites / NoticeKnowledge of machine learning (introduction to machine learning and/or advanced machine learning)
Basic knowledge of statistics.
252-0570-00L Game Programming Laboratory
In the Master's programme, at most 10 credits can be earned with labs on top of the interfocus courses. Additional labs will be listed on the Addendum.
W | 10 credits | 9P | B. Sumner
AbstractThe goal of this course is the in-depth understanding of the technology and programming underlying computer games. Students gradually design and develop a computer game in small groups and get acquainted with the art of game programming.
Learning objectiveThe goal of this new course is to acquaint students with the
technology and art of programming modern three-dimensional computer
games.
ContentThis course addresses modern three-dimensional computer game technology. During the course, small groups of students will design and develop a computer game. Focus will be put on technical aspects of game development, such as rendering, cinematography, interaction, physics, animation, and AI. In addition, we will cultivate creative thinking for advanced gameplay and visual effects.

The "laboratory" format involves a practical, hands-on approach with traditional lectures. We will meet once a week to discuss technical issues and to track progress. For development we use MonoGames, which is a collection of libraries and tools that facilitate game development. While development will take place on PCs, we will ultimately deployour games on the Xbox One console.

At the end of the course we will present our results to the public.
Lecture notesGame Design Workshop: A Playcentric Approach to Creating Innovative Games by Tracy Fullerton
Prerequisites / NoticeThe number of participants is limited.

Prerequisites include:

- Good programming skills (Java, C++, C#, etc.)

- CG experience: Students should have taken, at a minimum, Visual
Computing. Higher level courses are recommended, such as Introduction
to Computer Graphics, Surface Representations and Geometric Modeling,
and Physically-based Simulation in Computer Graphics.
252-0579-00L 3D Vision | W | 5 credits | 3G + 1A | M. Pollefeys, V. Larsson
AbstractThe course covers camera models and calibration, feature tracking and matching, camera motion estimation via simultaneous localization and mapping (SLAM) and visual odometry (VO), epipolar and multi-view geometry, structure-from-motion, (multi-view) stereo, augmented reality, and image-based (re-)localization.
Learning objectiveAfter attending this course, students will:
1. understand the core concepts for recovering 3D shape of objects and scenes from images and video.
2. be able to implement basic systems for vision-based robotics and simple virtual/augmented reality applications.
3. have a good overview over the current state-of-the art in 3D vision.
4. be able to critically analyze and assess current research in this area.
ContentThe goal of this course is to teach the core techniques required for robotic and augmented reality applications: How to determine the motion of a camera and how to estimate the absolute position and orientation of a camera in the real world. This course will introduce the basic concepts of 3D Vision in the form of short lectures, followed by student presentations discussing the current state-of-the-art. The main focus of this course are student projects on 3D Vision topics, with an emphasis on robotic vision and virtual and augmented reality applications.
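As a minimal illustration of the camera models mentioned above, the following numpy snippet projects a few 3D points into an image using an assumed pinhole intrinsic matrix and camera pose; all numbers are invented.

# Pinhole camera projection: x = K [R | t] X (homogeneous), followed by perspective division.
import numpy as np

K = np.array([[800.0, 0.0, 320.0],      # focal lengths and principal point (invented)
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.array([0.0, 0.0, 4.0])   # camera pose: 4 units in front of the scene

X = np.array([[0.0, 0.0, 0.0],          # 3D points in world coordinates
              [1.0, 0.5, 0.0],
              [-1.0, -0.5, 1.0]])

X_cam = (R @ X.T).T + t                  # world -> camera coordinates
x_hom = (K @ X_cam.T).T                  # camera -> homogeneous pixel coordinates
pixels = x_hom[:, :2] / x_hom[:, 2:3]    # perspective division
print(pixels)                            # 2D pixel positions of the three points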
252-5706-00L Mathematical Foundations of Computer Graphics and Vision | W | 5 credits | 2V + 1U + 1A | M. R. Oswald, C. Öztireli
AbstractThis course presents the fundamental mathematical tools and concepts used in computer graphics and vision. Each theoretical topic is introduced in the context of practical vision or graphic problems, showcasing its importance in real-world applications.
Learning objectiveThe main goal is to equip the students with the key mathematical tools necessary to understand state-of-the-art algorithms in vision and graphics. In addition to the theoretical part, the students will learn how to use these mathematical tools to solve a wide range of practical problems in visual computing. After successfully completing this course, the students will be able to apply these mathematical concepts and tools to practical industrial and academic projects in visual computing.
ContentThe theory behind various mathematical concepts and tools will be introduced, and their practical utility will be showcased in diverse applications in computer graphics and vision. The course will cover topics in sampling, reconstruction, approximation, optimization, robust fitting, differentiation, quadrature and spectral methods. Applications will include 3D surface reconstruction, camera pose estimation, image editing, data projection, character animation, structure-aware geometry processing, and rendering.
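
To make the "robust fitting" topic concrete, here is a small illustrative Python sketch (not course code; the data, threshold, and iteration count are arbitrary) contrasting ordinary least squares with a basic RANSAC loop on outlier-contaminated data.

```python
# Illustrative sketch: least squares vs. a simple RANSAC loop for robust
# line fitting. Data and thresholds are made up for demonstration.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: points on y = 2x + 1 with noise, plus gross outliers.
x = rng.uniform(0, 10, 100)
y = 2.0 * x + 1.0 + rng.normal(0, 0.2, 100)
y[:20] += rng.uniform(10, 30, 20)          # 20 outliers

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    A = np.column_stack([xs, np.ones_like(xs)])
    return np.linalg.lstsq(A, ys, rcond=None)[0]

def ransac_line(xs, ys, iters=200, tol=0.5):
    """Fit a line to 2 random points repeatedly; keep the largest consensus set."""
    best_inliers = np.zeros(len(xs), dtype=bool)
    for _ in range(iters):
        i, j = rng.choice(len(xs), size=2, replace=False)
        a, b = fit_line(xs[[i, j]], ys[[i, j]])
        inliers = np.abs(ys - (a * xs + b)) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return fit_line(xs[best_inliers], ys[best_inliers])   # refit on inliers

print("plain least squares:", fit_line(x, y))    # biased by the outliers
print("RANSAC estimate:    ", ransac_line(x, y)) # close to (2, 1)
```
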
263-3710-00LMachine Perception Information Restricted registration - show details
Number of participants limited to 200.
W5 credits2V + 1U + 1AO. Hilliges
AbstractRecent developments in neural networks (aka “deep learning”) have drastically advanced the performance of machine perception systems in a variety of areas including computer vision, robotics, and intelligent UIs. This course is a deep dive into deep learning algorithms and architectures with applications to a variety of perceptual tasks.
Learning objectiveStudents will learn about fundamental aspects of modern deep learning approaches for perception. Students will learn to implement, train and debug their own neural networks and gain a detailed understanding of cutting-edge research in learning-based computer vision, robotics and HCI. The final project assignment will involve training a complex neural network architecture and applying it on a real-world dataset of human activity.

The core competency acquired through this course is a solid foundation in deep-learning algorithms to process and interpret human input into computing systems. In particular, students should be able to develop systems that recognize people in images, detect and describe body parts, infer their spatial configuration, and perform action/gesture recognition from still images or image sequences, possibly using multi-modal data.
ContentWe will focus on teaching how to set up machine perception problems, the learning algorithms, network architectures, and advanced deep-learning concepts, in particular probabilistic deep learning models (a minimal worked example follows the topic list below).

The course covers the following main areas:
I) Foundations of deep-learning.
II) Probabilistic deep-learning for generative modelling of data (latent variable models, generative adversarial networks and auto-regressive models).
III) Deep learning in computer vision, human-computer interaction and robotics.

Specific topics include: 
I) Deep learning basics:
a) Neural Networks and training (i.e., backpropagation)
b) Feedforward Networks
c) Timeseries modelling (RNN, GRU, LSTM)
d) Convolutional Neural Networks for classification
II) Probabilistic Deep Learning:
a) Latent variable models (VAEs)
b) Generative adversarial networks (GANs)
c) Autoregressive models (PixelCNN, PixelRNN, TCNs)
III) Deep Learning techniques for machine perception:
a) Fully convolutional architectures for dense per-pixel tasks (e.g., instance segmentation)
b) Pose estimation and other tasks involving human activity
c) Deep reinforcement learning
IV) Case studies from research in computer vision, HCI, robotics and signal processing
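
The worked example referenced above: a deliberately tiny NumPy sketch (not course code) of topic I.a, a two-layer network trained by backpropagation on the XOR toy problem; the architecture and hyperparameters are arbitrary choices.

```python
# Tiny illustrative sketch (not course code): a two-layer network trained
# by hand-written backpropagation on the XOR toy problem.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)       # XOR targets

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)       # hidden layer
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)       # output layer
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(10000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass for the binary cross-entropy loss
    dlogits = (p - y) / len(X)                # dL/d(pre-sigmoid output)
    dW2 = h.T @ dlogits;  db2 = dlogits.sum(0)
    dh = dlogits @ W2.T * (1 - h ** 2)        # tanh derivative
    dW1 = X.T @ dh;       db1 = dh.sum(0)
    # Gradient-descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(np.round(p, 3))   # predictions should be close to [0, 1, 1, 0]
```

The course exercises use libraries such as TensorFlow (see the notice below); the hand-written gradients here only illustrate what backpropagation computes.
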
LiteratureDeep Learning
Book by Ian Goodfellow, Yoshua Bengio and Aaron Courville (MIT Press, 2016)
Prerequisites / NoticeThis is an advanced grad-level course that requires a background in machine learning. Students are expected to have a solid mathematical foundation, in particular in linear algebra, multivariate calculus, and probability. The course will focus on state-of-the-art research in deep learning and will not repeat the basics of machine learning.

Please take note of the following conditions:
1) The number of participants is limited to 200 students (MSc and PhDs).
2) Students must have taken the exam in Machine Learning (252-0535-00) or have acquired equivalent knowledge.
3) All practical exercises will require basic knowledge of Python and will use libraries such as TensorFlow, scikit-learn and scikit-image. We will provide introductions to TensorFlow and other libraries that are needed but will not provide introductions to basic programming or Python.

The following courses are strongly recommended as prerequisite:
* "Visual Computing" or "Computer Vision"

The course will be assessed by a final written examination in English. No course materials or electronic devices can be used during the examination. Note that the examination will be based on the contents of the lectures, the associated reading materials and the exercises.
263-5806-00LComputational Models of Motion for Character Animation and Robotics Information W6 credits2V + 2U + 1AS. Coros, M. Bächer, B. Thomaszewski
AbstractThis course covers fundamentals of physics-based modelling and numerical optimization from the perspective of character animation and robotics applications. The methods discussed in class derive their theoretical underpinnings from applied mathematics, control theory and computational mechanics, and they will be richly illustrated using examples such as locomotion controllers and crowd simulation.
Learning objectiveStudents will learn how to represent, model and algorithmically control the behavior of animated characters and real-life robots. The lectures are accompanied by programming assignments (written in C++) and a capstone project.
ContentOptimal control and trajectory optimization; multibody systems; kinematics; forward and inverse dynamics; constrained and unconstrained numerical optimization; mass-spring models for crowd simulation; FEM; compliant systems; sim-to-real; robotic manipulation of elastically-deforming objects.
Prerequisites / NoticeExperience with C++ programming, numerical linear algebra and multivariate calculus. Some background in physics-based modeling, kinematics and dynamics is helpful, but not necessary.
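
As a hedged illustration of the kinematics and numerical-optimization themes (written in Python for brevity, whereas the course assignments are in C++), the sketch below computes the forward kinematics of an assumed planar two-link arm and solves inverse kinematics with damped Gauss-Newton steps; it is a toy model, not course material.

```python
# Illustrative sketch (assumed toy model, not the course's C++ codebase):
# forward kinematics of a planar 2-link arm and inverse kinematics posed
# as unconstrained least squares, solved with damped Gauss-Newton steps.
import numpy as np

L1, L2 = 1.0, 0.7                     # assumed link lengths

def forward_kinematics(q):
    """End-effector position for joint angles q = (q1, q2)."""
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def jacobian(q):
    """Analytic Jacobian d(end-effector)/d(q)."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

target = np.array([1.2, 0.6])         # desired (reachable) end-effector position
q = np.array([0.5, 1.0])              # initial guess, away from singularities
for _ in range(100):
    err = forward_kinematics(q) - target
    J = jacobian(q)
    # Damped Gauss-Newton step on 0.5 * ||FK(q) - target||^2
    q -= np.linalg.solve(J.T @ J + 0.01 * np.eye(2), J.T @ err)

print("reached:", forward_kinematics(q), "target:", target)
```
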
227-1034-00LComputational Vision (University of Zurich)
No enrolment to this course at ETH Zurich. Book the corresponding module directly at UZH.
UZH Module Code: INI402

Mind the enrolment deadlines at UZH:
https://www.uzh.ch/cmsssl/en/studies/application/mobilitaet.html
W6 credits2V + 1UD. Kiper
AbstractThis course focuses on neural computations that underlie visual perception. We study how visual signals are processed in the retina, LGN and visual cortex. We study the morphology and functional architecture of cortical circuits responsible for pattern, motion, color, and three-dimensional vision.
Learning objectiveThis course considers the operation of circuits in the process of neural computations. The evolution of neural systems will be considered to demonstrate how neural structures and mechanisms are optimised for energy capture, transduction, transmission and representation of information. Canonical brain circuits will be described as models for the analysis of sensory information. The concept of receptive fields will be introduced and their role in coding spatial and temporal information will be considered. The constraints of the bandwidth of neural channels and the mechanisms of normalization by neural circuits will be discussed.
The visual system will form the basis of case studies in the computation of form, depth, and motion. The role of multiple channels and collective computations for object recognition will
be considered. Coordinate transformations of space and time by cortical and subcortical mechanisms will be analysed. The means by which sensory and motor systems are integrated to allow for adaptive behaviour will be considered.
ContentSee the learning objective above.
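
By way of illustration (an assumed example, not course material), the sketch below implements the classic difference-of-Gaussians model of a centre-surround receptive field and evaluates its response to a uniform patch versus a small bright spot; all parameters are made-up values.

```python
# Illustrative sketch: a centre-surround (difference-of-Gaussians) receptive
# field, the classic linear model of retinal/LGN responses. Parameters are
# arbitrary illustration values.
import numpy as np

def dog_receptive_field(size=21, sigma_c=1.5, sigma_s=4.0):
    """2D difference-of-Gaussians: excitatory centre minus inhibitory surround."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    r2 = xx ** 2 + yy ** 2
    center = np.exp(-r2 / (2 * sigma_c ** 2)) / (2 * np.pi * sigma_c ** 2)
    surround = np.exp(-r2 / (2 * sigma_s ** 2)) / (2 * np.pi * sigma_s ** 2)
    return center - surround

rf = dog_receptive_field()

# Response = inner product of the receptive field with an image patch.
uniform_patch = np.ones_like(rf)                              # no contrast
spot_patch = np.zeros_like(rf); spot_patch[8:13, 8:13] = 1.0  # bright spot on centre
print("uniform:", np.sum(rf * uniform_patch))  # ~0: centre and surround nearly cancel
print("spot:   ", np.sum(rf * spot_patch))     # > 0: centre is driven more than surround
```
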
LiteratureBooks: (recommended references, not required)
1. An Introduction to Natural Computation, D. Ballard (Bradford Books, MIT Press) 1997.
2. The Handbook of Brain Theory and Neural Networks, M. Arbib (editor), (MIT Press) 1995.
Interfocus Courses
NumberTitleTypeECTSHoursLecturers
263-0007-00LAdvanced Systems Lab Information Restricted registration - show details
Only for master students, otherwise a special permission by the study administration of D-INFK is required.
W8 credits3V + 2U + 2AM. Püschel, C. Zhang
AbstractThis course introduces the student to the foundations and state-of-the-art techniques in developing high performance software for mathematical functionality occurring in various fields in computer science. The focus is on optimizing for a single core and includes optimizing for the memory hierarchy, for special instruction sets, and the possible use of automatic performance tuning.
Learning objectiveSoftware performance (i.e., runtime) arises through the complex interaction of an algorithm, its implementation, the compiler used, and the microarchitecture the program is run on. The first goal of the course is to provide the student with an understanding of this "vertical" interaction, and hence software performance, for mathematical functionality. The second goal is to teach a systematic strategy for using this knowledge to write fast software for numerical problems. This strategy will be practiced in several homework assignments and a semester-long group project.
ContentThe fast evolution and increasing complexity of computing platforms pose a major challenge for developers of high performance software for engineering, science, and consumer applications: it becomes increasingly harder to harness the available computing power. Straightforward implementations may lose as much as one or two orders of magnitude in performance. On the other hand, creating optimal implementations requires the developer to have an understanding of algorithms, capabilities and limitations of compilers, and the target platform's architecture and microarchitecture.

This interdisciplinary course introduces the student to the foundations and state-of-the-art techniques in high performance mathematical software development using important functionality such as matrix operations, transforms, filters, and others as examples. The course will explain how to optimize for the memory hierarchy, take advantage of special instruction sets, and other details of current processors that require optimization. The concept of automatic performance tuning is introduced. The focus is on optimization for a single core; thus, the course complements others on parallel and distributed computing.

Finally, a general strategy for performance analysis and optimization is introduced, which the students will apply in group projects that accompany the course.
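
To illustrate the performance gap mentioned above, here is a rough Python sketch comparing a textbook triple-loop matrix multiplication with an optimized BLAS call. Note that the course itself works in C, and in this Python version most of the gap comes from interpreter overhead, so the measured ratio only illustrates the general point that straightforward implementations can be orders of magnitude slower.

```python
# Rough illustration only: naive triple-loop matrix multiply vs. an
# optimized BLAS call (via NumPy). The matrix size is kept small so the
# naive version finishes in a few seconds.
import time
import numpy as np

n = 120
A = np.random.rand(n, n)
B = np.random.rand(n, n)

def naive_matmul(A, B):
    """Textbook triple loop with no regard for the memory hierarchy or SIMD."""
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            s = 0.0
            for k in range(n):
                s += A[i, k] * B[k, j]
            C[i, j] = s
    return C

t0 = time.perf_counter(); C1 = naive_matmul(A, B); t1 = time.perf_counter()
C2 = A @ B;                                        t2 = time.perf_counter()
print(f"naive loops: {t1 - t0:.3f}s, optimized BLAS: {t2 - t1:.6f}s")
print("results agree:", np.allclose(C1, C2))
```
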
Prerequisites / NoticeSolid knowledge of the C programming language and matrix algebra.
263-0008-00LComputational Intelligence Lab
Only for master students, otherwise a special permission by the study administration of D-INFK is required.
W8 credits2V + 2U + 3AT. Hofmann
AbstractThis laboratory course teaches fundamental concepts in computational science and machine learning with a special emphasis on matrix factorization and representation learning. The class covers techniques like dimension reduction, data clustering, sparse coding, and deep learning as well as a wide spectrum of related use cases and applications.
Learning objectiveStudents acquire fundamental theoretical concepts and methodologies from machine learning and learn how to apply these techniques to build intelligent systems that solve real-world problems. They learn to successfully develop solutions to application problems by following the key steps of modeling, algorithm design, implementation and experimental validation.

This lab course has a strong focus on practical assignments. Students work in groups of three to four people, to develop solutions to three application problems: 1. Collaborative filtering and recommender systems, 2. Text sentiment classification, and 3. Road segmentation in aerial imagery.

For each of these problems, students submit their solutions to an online evaluation and ranking system, and get feedback in terms of numerical accuracy and computational speed. In the final part of the course, students combine and extend one of their previous promising solutions, and write up their findings in an extended abstract in the style of a conference paper.

(Disclaimer: The offered projects may be subject to change from year to year.)
Contentsee course description
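
As a hedged illustration of the matrix-factorization theme behind the collaborative-filtering project (an assumed toy setup, not a course solution), the sketch below builds a low-rank approximation of a synthetic ratings matrix via a truncated SVD.

```python
# Illustrative sketch (assumed toy data, not a course solution): low-rank
# matrix factorization of a ratings matrix via truncated SVD.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "users x items" ratings generated from a rank-3 model plus noise.
U_true = rng.normal(size=(50, 3))
V_true = rng.normal(size=(3, 40))
R = U_true @ V_true + 0.1 * rng.normal(size=(50, 40))

# Rank-k approximation from the SVD: keep the k largest singular values.
k = 3
U, s, Vt = np.linalg.svd(R, full_matrices=False)
R_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

rmse = np.sqrt(np.mean((R - R_hat) ** 2))
print(f"rank-{k} reconstruction RMSE: {rmse:.3f}")   # roughly at the noise level
```

Real recommender data is sparse (most ratings are missing), so the project typically calls for factorization methods that handle missing entries rather than a plain SVD.
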
Free Electives
All Master level courses offered by ETH Zurich, EPF Lausanne and the University of Zurich may be chosen.
» Course Catalogue of ETH Zurich
GESS Science in Perspective
» Recommended GESS Science in Perspective (Type B) for D-INFK.
» see GESS Science in Perspective: Type A: Enhancement of Reflection Capability
» see GESS Science in Perspective: Language Courses ETH/UZH
Internship
NumberTitleTypeECTSHoursLecturers
260-0700-00LInternship
Only for Cyber Security MSc
E-0 creditsexternal organisers
AbstractAn Internship provides opportunities to gain experience in an industrial environment and it creates a network of contacts.
Learning objectivesee above
Master's Thesis
NumberTitleTypeECTSHoursLecturers
260-0800-00LMaster's Thesis Restricted registration - show details
Only students who fulfill the following criteria are allowed to begin with their master thesis:
a. successful completion of the bachelor programme;
b. fulfilling of any additional requirements necessary to gain admission to the master programme.
O30 credits64DProfessors
AbstractThe Master thesis completes the master programme and is an independent scientific project.
Learning objectiveThe Master’s thesis shall demonstrate that students are able to use the knowledge and skills acquired during Master’s studies to solve a complex cyber security problem.