Search result: Catalogue data in Spring Semester 2023
Computer Science Master
Majors
Major in Data Management Systems
Core Courses
Number | Title | Type | ECTS | Hours | Lecturers
---|---|---|---|---|---
263-3855-00L | Cloud Computing Architecture | W | 9 credits | 3V + 2U + 3A | G. Alonso, A. Klimovic
Abstract | Cloud computing hosts a wide variety of online services that we use on a daily basis, including web search, social networks, and video streaming. This course will cover how datacenter hardware, systems software, and application frameworks are designed for the cloud.
Learning objective | After successful completion of this course, students will be able to: 1) reason about performance, energy efficiency, and availability tradeoffs in the design of cloud system software, 2) describe how datacenter hardware is organized and explain why it is organized as such, 3) implement cloud applications as well as analyze and optimize their performance.
Content | In this course, we study how datacenter hardware, systems software, and applications are designed at large scale for the cloud. The course covers topics including server design, cluster management, large-scale storage systems, serverless computing, data analytics frameworks, and performance analysis.
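The performance-analysis topic listed above often centers on tail latency of cloud services. As an illustrative sketch (not course material; the helper name and sample data are hypothetical), a nearest-rank percentile over request latency samples:

```python
def percentile(samples, p):
    """Return the p-th percentile (0-100) using nearest-rank on sorted data."""
    ordered = sorted(samples)
    # Nearest-rank: ceil(p/100 * n), clamped to a valid index.
    rank = max(1, -(-p * len(ordered) // 100))  # ceiling division via floor of negation
    return ordered[min(rank, len(ordered)) - 1]

# Hypothetical per-request latencies; the two slow requests dominate the tail.
latencies_ms = [12, 15, 11, 250, 14, 13, 16, 12, 300, 14]
p50 = percentile(latencies_ms, 50)  # median-like: unaffected by outliers
p99 = percentile(latencies_ms, 99)  # tail: captures the slowest request
```

The contrast between p50 and p99 is the usual motivation for tail-latency-aware datacenter design.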
Lecture notes | Lecture slides will be available on the course website.
Prerequisites / Notice | Undergraduate courses in 1) computer architecture and 2) operating systems, distributed systems, and/or database systems are strongly recommended.
Elective Courses
Number | Title | Type | ECTS | Hours | Lecturers
263-3800-00L | Advanced Operating Systems | W | 7 credits | 2V + 2U + 2A | D. Cock, T. Roscoe
Abstract | This course is intended to give students a thorough understanding of design and implementation issues for modern operating systems, with a particular emphasis on the challenges posed by modern hardware features. We will cover key design issues in implementing an operating system, such as memory management, scheduling, protection, inter-process communication, device drivers, and file systems.
Learning objective | The goals of the course are to give students: 1. A broader perspective on OS design than that provided by knowledge of Unix or Windows, building on the material in a standard undergraduate operating systems class 2. Practical experience in dealing directly with the concurrency, resource management, and abstraction problems confronting OS designers and implementers 3. A glimpse into future directions for the evolution of OS and computer hardware design
Content | The course is based on practical implementation work, in C and assembly language, and requires solid knowledge of both. The work is mostly carried out in teams of 3-4, using real hardware, and is a mixture of team milestones and individual projects which fit together into a complete system at the end. Emphasis is also placed on a final report which details the complete finished artifact, evaluates its performance, and discusses the choices the team made while building it.
Prerequisites / Notice | The course is based around a milestone-oriented project, where students work in small groups to implement major components of a microkernel-based operating system. The final assessment will be a combination of grades awarded for milestones during the course of the project, a final written report on the work, and a set of test cases run on the final code.
227-0558-00L | Principles of Distributed Computing | W | 7 credits | 2V + 2U + 2A | R. Wattenhofer
Abstract | We study the fundamental issues underlying the design of distributed systems: communication, coordination, fault-tolerance, locality, parallelism, self-organization, symmetry breaking, synchronization, uncertainty. We explore essential algorithmic ideas and lower bound techniques.
Learning objective | Distributed computing is essential in modern computing and communications systems. Examples are on the one hand large-scale networks such as the Internet, and on the other hand multiprocessors such as your new multi-core laptop. This course introduces the principles of distributed computing, emphasizing the fundamental issues underlying the design of distributed systems and networks: communication, coordination, fault-tolerance, locality, parallelism, self-organization, symmetry breaking, synchronization, uncertainty. We explore essential algorithmic ideas and lower bound techniques, basically the "pearls" of distributed computing. We will cover a fresh topic every week.
Content | Distributed computing models and paradigms, e.g. message passing, shared memory, synchronous vs. asynchronous systems, time and message complexity, peer-to-peer systems, small-world networks, social networks, sorting networks, wireless communication, and self-organizing systems. Distributed algorithms, e.g. leader election, coloring, covering, packing, decomposition, spanning trees, mutual exclusion, store and collect, arrow, ivy, synchronizers, diameter, all-pairs-shortest-path, wake-up, and lower bounds.
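As a hedged illustration of the leader-election topic listed above (a centrally simulated sketch, not the course's formal message-passing treatment), a round-based maximum-id election on a unidirectional ring: each node repeatedly adopts the largest id it has heard from its predecessor, and after n rounds all nodes agree on the leader.

```python
def ring_leader_election(ids):
    """Simulate max-id leader election on a unidirectional ring of n nodes.

    token[i] is the largest id node i has seen so far. In each round every
    node learns its predecessor's current value; after n rounds the maximum
    id has propagated all the way around, so all nodes agree on the leader.
    """
    n = len(ids)
    token = list(ids)
    for _ in range(n):
        # All nodes update simultaneously (synchronous rounds).
        token = [max(token[(i - 1) % n], token[i]) for i in range(n)]
    assert len(set(token)) == 1  # every node has converged on the same id
    return token[0]
```

In the distributed setting each of these rounds costs one message per node, giving the familiar O(n^2)-message worst case that motivates the lower-bound discussions in such courses.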
Lecture notes | Available.
Literature | Lecture Notes by Roger Wattenhofer. These lecture notes are used at about a dozen universities throughout the world. Mastering Distributed Algorithms, Roger Wattenhofer, Inverted Forest Publishing, 2020, ISBN 979-8628688267. Distributed Computing: Fundamentals, Simulations and Advanced Topics, Hagit Attiya, Jennifer Welch, McGraw-Hill Publishing, 1998, ISBN 0-07-709352-6. Introduction to Algorithms, Thomas Cormen, Charles Leiserson, Ronald Rivest, The MIT Press, 1998, ISBN 0-262-53091-0 or 0-262-03141-8. Dissemination of Information in Communication Networks, Juraj Hromkovic, Ralf Klasing, Andrzej Pelc, Peter Ruzicka, Walter Unger, Springer-Verlag, Berlin Heidelberg, 2005, ISBN 3-540-00846-2. Introduction to Parallel Algorithms and Architectures: Arrays, Trees, Hypercubes, Frank Thomson Leighton, Morgan Kaufmann Publishers Inc., San Francisco, CA, 1991, ISBN 1-55860-117-1. Distributed Computing: A Locality-Sensitive Approach, David Peleg, Society for Industrial and Applied Mathematics (SIAM), 2000, ISBN 0-89871-464-8.
Prerequisites / Notice | Course prerequisites: Interest in algorithmic problems. (No particular course needed.)
Major in Machine Intelligence
Core Courses
Number | Title | Type | ECTS | Hours | Lecturers
261-5110-00L | Optimization for Data Science | W | 10 credits | 3V + 2U + 4A | B. Gärtner, N. He
Abstract | This course provides an in-depth theoretical treatment of optimization methods that are relevant in data science.
Learning objective | Understanding the guarantees and limits of relevant optimization methods used in data science. Learning theoretical paradigms and techniques to deal with optimization problems arising in data science.
Content | This course provides an in-depth theoretical treatment of classical and modern optimization methods that are relevant in data science. After a general discussion about the role that optimization has in the process of learning from data, we give an introduction to the theory of (convex) optimization. Based on this, we present and analyze algorithms in the following four categories: first-order methods (gradient and coordinate descent, Frank-Wolfe, subgradient and mirror descent, stochastic and incremental gradient methods); second-order methods (Newton and quasi-Newton methods); non-convexity (local convergence, provable global convergence, cone programming, convex relaxations); min-max optimization (extragradient methods). The emphasis is on the motivations and design principles behind the algorithms, on provable performance bounds, and on the mathematical tools and techniques to prove them. The goal is to equip students with a fundamental understanding about why optimization algorithms work, and what their limits are. This understanding will be of help in selecting suitable algorithms in a given application, but providing concrete practical guidance is not our focus.
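To make the first-order methods listed above concrete, a minimal sketch of plain gradient descent on a smooth convex function (illustrative only; the objective, step size, and iteration count are chosen for this toy example, with the step below the usual 1/L bound for an L-smooth function):

```python
def gradient_descent(grad, x0, step, iters):
    """Plain gradient descent: x_{t+1} = x_t - step * grad(x_t)."""
    x = x0
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2*(x - 3).
# f is 2-smooth, so any step below 1/L = 0.5 converges; we use 0.1.
x_star = gradient_descent(lambda x: 2 * (x - 3), x0=0.0, step=0.1, iters=200)
```

Each iteration contracts the distance to the minimizer by a constant factor here, the linear-rate behavior that such a course proves for strongly convex objectives.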
Prerequisites / Notice | A solid background in analysis and linear algebra; some background in theoretical computer science (computational complexity, analysis of algorithms); the ability to understand and write mathematical proofs.
263-3710-00L | Machine Perception | W | 8 credits | 3V + 2U + 2A | O. Hilliges, J. Song
Abstract | Recent developments in neural networks have drastically advanced the performance of machine perception systems in a variety of areas including computer vision, robotics, and human shape modeling. This course is a deep dive into deep learning algorithms and architectures with applications to a variety of perceptual and generative tasks.
Learning objective | Students will learn about fundamental aspects of modern deep learning approaches for perception and generation. Students will learn to implement, train and debug their own neural networks and gain a detailed understanding of cutting-edge research in learning-based computer vision, robotics, and shape modeling. The optional final project assignment will involve training a complex neural network architecture and applying it to a real-world dataset. The core competency acquired through this course is a solid foundation in deep-learning algorithms to process and interpret human-centric signals. In particular, students should be able to develop systems that deal with the problem of recognizing people in images, detecting and describing body parts, inferring their spatial configuration, performing action/gesture recognition from still images or image sequences, also considering multi-modal data, among others.
Content | We will focus on teaching: how to set up the problem of machine perception, the learning algorithms, network architectures, and advanced deep learning concepts, in particular probabilistic deep learning models. The course covers the following main areas: I) Foundations of deep learning. II) Advanced topics like probabilistic generative modeling of data (latent variable models, generative adversarial networks, auto-regressive models, invertible neural networks, diffusion models). III) Deep learning in computer vision, human-computer interaction, and robotics. Specific topics include: I) Introduction to Deep Learning: a) Neural Networks and training (i.e., backpropagation) b) Feedforward Networks c) Timeseries modelling (RNN, GRU, LSTM) d) Convolutional Neural Networks II) Advanced topics: a) Latent variable models (VAEs) b) Generative adversarial networks (GANs) c) Autoregressive models (PixelCNN, PixelRNN, TCN, Transformer) d) Invertible Neural Networks / Normalizing Flows e) Coordinate-based networks (neural implicit surfaces, NeRF) f) Diffusion models III) Applications in machine perception and computer vision: a) Fully Convolutional architectures for dense per-pixel tasks (i.e., instance segmentation) b) Pose estimation and other tasks involving human activity c) Neural shape modeling (implicit surfaces, neural radiance fields) d) Deep Reinforcement Learning and Applications in Physics-Based Behavior Modeling
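As a toy illustration of the backpropagation topic in part I above (a pure-Python sketch with a hypothetical dataset, not the course's framework-based exercises), training a single sigmoid neuron with squared loss:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_neuron(data, epochs=2000, lr=0.5):
    """Train one sigmoid neuron (weight w, bias b) by stochastic gradient
    descent on the squared loss L = (a - y)^2 / 2, where a = sigmoid(w*x + b)."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in data:
            a = sigmoid(w * x + b)         # forward pass
            delta = (a - y) * a * (1 - a)  # backward pass: dL/da * da/dz
            w -= lr * delta * x            # chain rule: dz/dw = x
            b -= lr * delta                # chain rule: dz/db = 1
    return w, b

# Learn an illustrative "is x positive?" mapping from four labeled points.
w, b = train_neuron([(-2, 0), (-1, 0), (1, 1), (2, 1)])
pred = sigmoid(w * 1.5 + b)  # should be confidently above 0.5
```

The same forward/backward structure, repeated layer by layer, is what full backpropagation computes in the deep networks this course covers.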
Literature | Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville
Prerequisites / Notice | This is an advanced grad-level course that requires a background in machine learning. Students are expected to have a solid mathematical foundation, in particular in linear algebra, multivariate calculus, and probability. The course will focus on state-of-the-art research in deep learning and will not repeat the basics of machine learning. Please take note of the following conditions: 1) Students must have taken the exam in Machine Learning (252-0535-00) or have acquired equivalent knowledge 2) All practical exercises will require basic knowledge of Python and will use libraries such as Pytorch, scikit-learn, and scikit-image. We will provide introductions to Pytorch and other libraries that are needed but will not provide introductions to basic programming or Python. The following courses are strongly recommended as prerequisites: * "Visual Computing" or "Computer Vision" The course will be assessed by a final written examination in English. No course materials or electronic devices can be used during the examination. Note that the examination will be based on the contents of the lectures, the associated reading materials, and the exercises. The exam will be a 3-hour end-of-term exam and take place at the end of the teaching period.
Elective Courses
Number | Title | Type | ECTS | Hours | Lecturers
252-0526-00L | Statistical Learning Theory | W | 8 credits | 3V + 2U + 2A | J. M. Buhmann
Abstract | The course covers advanced methods of statistical learning: - Variational methods and optimization. - Deterministic annealing. - Clustering for diverse types of data. - Model validation by information theory.
Learning objective | The course surveys recent methods of statistical learning. The fundamentals of machine learning, as presented in the courses "Introduction to Machine Learning" and "Advanced Machine Learning", are expanded from the perspective of statistical learning.
Content | - Variational methods and optimization. We consider optimization approaches for problems where the optimizer is a probability distribution. We will discuss concepts like maximum entropy, information bottleneck, and deterministic annealing. - Clustering. This is the problem of sorting data into groups without using training samples. We discuss alternative notions of "similarity" between data points and adequate optimization procedures. - Model selection and validation. This refers to the question of how complex the chosen model should be. In particular, we present an information theoretic approach for model validation. - Statistical physics models. We discuss approaches for approximately optimizing large systems, which originate in statistical physics (free energy minimization applied to spin glasses and other models). We also study sampling methods based on these models.
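The clustering topic above can be grounded in the simplest hard-clustering baseline, of which deterministic annealing is a smoothed relative. A hedged sketch (1-D toy data, hypothetical values) of Lloyd's k-means:

```python
def kmeans_1d(points, centers, iters=20):
    """Lloyd's k-means on 1-D data: assign each point to its nearest
    center, then move each center to the mean of its assigned points."""
    for _ in range(iters):
        clusters = {c: [] for c in range(len(centers))}
        for p in points:
            # Hard assignment: each point belongs to exactly one center.
            nearest = min(range(len(centers)), key=lambda c: abs(p - centers[c]))
            clusters[nearest].append(p)
        # Update step: a center keeps its position if it attracted no points.
        centers = [sum(v) / len(v) if v else centers[c]
                   for c, v in clusters.items()]
    return sorted(centers)

# Two well-separated groups around 1.0 and 10.0.
centers = kmeans_1d([1.0, 1.2, 0.8, 9.8, 10.0, 10.2], centers=[0.0, 5.0])
```

Deterministic annealing replaces the hard assignment with a temperature-controlled soft assignment, which is the connection the course develops.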
Lecture notes | A draft of a script will be provided. Lecture slides will be made available.
Literature | Hastie, Tibshirani, Friedman: The Elements of Statistical Learning, Springer, 2001. L. Devroye, L. Gyorfi, and G. Lugosi: A Probabilistic Theory of Pattern Recognition, Springer, New York, 1996.
Prerequisites / Notice | Knowledge of machine learning (introduction to machine learning and/or advanced machine learning). Basic knowledge of statistics.
252-0579-00L | 3D Vision | W | 5 credits | 3G + 1A | M. Pollefeys, D. B. Baráth
Abstract | The course covers camera models and calibration, feature tracking and matching, camera motion estimation via simultaneous localization and mapping (SLAM) and visual odometry (VO), epipolar and multi-view geometry, structure-from-motion, (multi-view) stereo, augmented reality, and image-based (re-)localization.
Learning objective | After attending this course, students will: 1. understand the core concepts for recovering 3D shape of objects and scenes from images and video. 2. be able to implement basic systems for vision-based robotics and simple virtual/augmented reality applications. 3. have a good overview over the current state-of-the-art in 3D vision. 4. be able to critically analyze and assess current research in this area.
Content | The goal of this course is to teach the core techniques required for robotic and augmented reality applications: How to determine the motion of a camera and how to estimate the absolute position and orientation of a camera in the real world. This course will introduce the basic concepts of 3D Vision in the form of short lectures, followed by student presentations discussing the current state-of-the-art. The main focus of this course is student projects on 3D Vision topics, with an emphasis on robotic vision and virtual and augmented reality applications.
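The camera models and calibration topics above rest on the pinhole projection model. A minimal sketch (assuming square pixels, no lens distortion, and illustrative intrinsic values):

```python
def project(point3d, f, cx, cy):
    """Project a 3-D point in camera coordinates onto the image plane with a
    pinhole model: u = f*X/Z + cx, v = f*Y/Z + cy.

    f is the focal length in pixels and (cx, cy) the principal point; a real
    calibration would also estimate skew and distortion, omitted here.
    """
    X, Y, Z = point3d
    return (f * X / Z + cx, f * Y / Z + cy)

# A point one unit right, two up, four ahead of an assumed 800px-focal camera.
u, v = project((1.0, 2.0, 4.0), f=800.0, cx=320.0, cy=240.0)
```

Inverting this mapping from many such observations is exactly what calibration, SLAM, and structure-from-motion do.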
261-5120-00L | Machine Learning for Health Care | W | 5 credits | 2V + 2A | V. Boeva, J. Vogt, M. Kuznetsova
Abstract | The course will review the most relevant methods and applications of Machine Learning in Biomedicine, discuss the main challenges they present and their current technical solutions.
Learning objective | During the last years, we have observed a rapid growth in the field of Machine Learning (ML), mainly due to improvements in ML algorithms, the increase of data availability and a reduction in computing costs. This growth is having a profound impact in biomedical applications, where the great variety of tasks and data types enables us to benefit from ML algorithms in many different ways. In this course we will review the most relevant methods and applications of ML in biomedicine, discuss the main challenges they present and their current technical solutions.
Content | The course will consist of four topic clusters that will cover the most relevant applications of ML in Biomedicine: 1) Structured time series: Temporal time series of structured data often appear in biomedical datasets, presenting challenges as containing variables with different periodicities, being conditioned by static data, etc. 2) Medical notes: Vast amounts of medical observations are stored in the form of free text; we will analyze strategies for extracting knowledge from them. 3) Medical images: Images are a fundamental piece of information in many medical disciplines. We will study how to train ML algorithms with them. 4) Genomics data: ML in genomics is still an emerging subfield, but given that genomics data are arguably the most extensive and complex datasets that can be found in biomedicine, it is expected that many relevant ML applications will arise in the near future. We will review and discuss current applications and challenges.
Prerequisites / Notice | Data Structures & Algorithms, Introduction to Machine Learning, Statistics/Probability, Programming in Python, Unix Command Line Relation to Course 261-5100-00 Computational Biomedicine: This course is a continuation of the previous course with new topics related to medical data and machine learning. The format of Computational Biomedicine II will also be different. It is helpful but not essential to attend Computational Biomedicine before attending Computational Biomedicine II.
263-5000-00L | Computational Semantics for Natural Language Processing | W | 6 credits | 2V + 1U + 2A | M. Sachan
Abstract | This course presents an introduction to Natural Language Processing (NLP) with an emphasis on computational semantics, i.e., the process of constructing and reasoning with meaning representations of natural language text.
Learning objective | The objective of the course is to learn about various topics in computational semantics and its importance in natural language processing methodology and research. Exercises and the project will be key parts of the course so the students will be able to gain hands-on experience with state-of-the-art techniques in the field.
Content | We will take a modern view of the topic, and focus on various statistical and deep learning approaches for computational semantics. We will also overview various primary areas of research in language processing and discuss how the computational semantics view can help us make advances in NLP.
Lecture notes | Lecture slides will be made available at the course Web site.
Literature | No textbook is required, but there will be regularly assigned readings from research literature, linked to the course website.
Prerequisites / Notice | The student should have successfully completed a graduate-level class in machine learning (252-0220-00L), deep learning (263-3210-00L) or natural language processing (252-3005-00L) before. Similar courses from other universities are acceptable too.
263-5051-00L | AI Center Projects in Machine Learning Research Last cancellation/deregistration date for this ungraded semester performance: Friday, 17 March 2023! Please note that after that date no deregistration will be accepted and the course will be considered as "fail". | W | 4 credits | 2V + 1A | A. Ilic, N. Davoudi, M. El-Assady, F. Engelmann, S. Gashi, T. Kontogianni, A. Marx, B. Moseley, G. Ramponi, X. Shen, M. Sorbaro Sindaci
Abstract | The course will give students an overview of selected topics in advanced machine learning that are currently subjects of active research. The course concludes with a final project.
Learning objective | The overall objective is to give students a concrete idea of what working in contemporary machine learning research is like and inform them about current research performed at ETH. In this course, students will be able to get an overview of current research topics in different specialized areas. In the final project, students will be able to build experience in practical aspects of machine learning research, including research literature, aspects of implementation, and reproducibility challenges.
Content | The course will be structured as sections taught by different postdocs specialized in the relevant fields. Each section will showcase an advanced research topic in machine learning, first introducing it and motivating it in the context of current technological or scientific advancement, then providing practical applications that students can experiment with, ideally with the aim of reproducing a known result in the specific field. A tentative list of topics for this year: - fully supervised 3D scene understanding - weakly supervised 3D scene understanding - causal discovery - biological and artificial neural networks - reinforcement learning - visual text analytics - human-centered AI - representation learning. The last weeks of the course will be reserved for the implementation of the final project. The students will be assigned group projects in one of the presented areas, based on their preferences. The outcomes will be made into a scientific poster and students will be asked to present the projects to the other groups in a joint poster session.
Prerequisites / Notice | Participants should have basic knowledge about machine learning and statistics (e.g. Introduction to Machine Learning course or equivalent) and programming.
263-5052-00L | Interactive Machine Learning: Visualization & Explainability | W | 5 credits | 3G + 1A | M. El-Assady
Abstract | Visual Analytics supports the design of human-in-the-loop interfaces that enable human-machine collaboration. In this course, we will go through the fundamentals of designing interactive visualizations, later applying them to explain and interact with machine learning models.
Learning objective | The goal of the course is to introduce techniques for interactive information visualization and to apply these on understanding, diagnosing, and refining machine learning models.
Content | Interactive, mixed-initiative machine learning promises to combine the efficiency of automation with the effectiveness of humans for a collaborative decision-making and problem-solving process. This can be facilitated through co-adaptive visual interfaces. This course will first introduce the foundations of information visualization design based on data characteristics, e.g., high-dimensional, geo-spatial, relational, temporal, and textual data. Second, we will discuss interaction techniques and explanation strategies to enable explainable machine learning with the tasks of understanding, diagnosing, and refining machine learning models. Tentative list of topics: 1. Visualization and Perception 2. Interaction and Explanation 3. Systems Overview
Lecture notes | Course material will be provided in form of slides.
Literature | Will be provided during the course.
Prerequisites / Notice | Basic understanding of machine learning as taught at the Bachelor's level.
263-5255-00L | Foundations of Reinforcement Learning | W | 7 credits | 3V + 3A | N. He
Abstract | Reinforcement learning (RL) has been in the limelight of many recent breakthroughs in artificial intelligence. This course focuses on theoretical and algorithmic foundations of reinforcement learning, through the lens of optimization, modern approximation, and learning theory. The course targets M.S. students with strong research interests in reinforcement learning, optimization, and control.
Learning objective | This course aims to provide students with an advanced introduction of RL theory and algorithms as well as bring them near the frontier of this active research field. By the end of the course, students will be able to - Identify the strengths and limitations of various reinforcement learning algorithms; - Formulate and solve sequential decision-making problems by applying relevant reinforcement learning tools; - Generalize or discover "new" applications, algorithms, or theories of reinforcement learning towards conducting independent research on the topic.
Content | Basic topics include fundamentals of Markov decision processes, approximate dynamic programming, linear programming and primal-dual perspectives of RL, model-based and model-free RL, policy gradient and actor-critic algorithms, Markov games and multi-agent RL. If time allows, we will also discuss advanced topics such as batch RL, inverse RL, causal RL, etc. The course keeps strong emphasis on in-depth understanding of the mathematical modeling and theoretical properties of RL algorithms.
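As a small worked sketch of the dynamic-programming foundations mentioned above (a toy two-state MDP with assumed transition and reward values, illustrative only), value iteration:

```python
def value_iteration(P, R, gamma=0.9, iters=200):
    """Synchronous value iteration for a finite MDP.

    P[s][a] is a list of (probability, next_state) pairs; R[s][a] is the
    expected immediate reward. Applies the Bellman optimality operator
    V(s) <- max_a [ R(s,a) + gamma * E[V(s')] ] until near convergence.
    """
    n = len(P)
    V = [0.0] * n
    for _ in range(iters):
        V = [max(R[s][a] + gamma * sum(p * V[s2] for p, s2 in P[s][a])
                 for a in range(len(P[s])))
             for s in range(n)]
    return V

# Two states, two actions each: "stay" or "switch" (deterministic moves).
P = [[[(1.0, 0)], [(1.0, 1)]],   # from state 0: stay in 0, or go to 1
     [[(1.0, 1)], [(1.0, 0)]]]   # from state 1: stay in 1, or go to 0
R = [[0.0, 0.0],                 # state 0 earns nothing
     [1.0, 0.0]]                 # staying in state 1 yields reward 1
V = value_iteration(P, R)        # converges near V* = [9, 10] for gamma = 0.9
```

The gamma-contraction of the Bellman operator, which guarantees this convergence, is a typical first theorem in such a course.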
Lecture notes | Lecture notes will be posted on Moodle.
Literature | Dynamic Programming and Optimal Control, Vol I & II, Dimitri Bertsekas. Reinforcement Learning: An Introduction, Second Edition, Richard Sutton and Andrew Barto. Algorithms for Reinforcement Learning, Csaba Szepesvári. Reinforcement Learning: Theory and Algorithms, Alekh Agarwal, Nan Jiang, Sham M. Kakade.
Prerequisites / Notice | Students are expected to have strong mathematical background in linear algebra, probability theory, optimization, and machine learning.
263-5351-00L | Machine Learning for Genomics The deadline for deregistering expires at the end of the third week of the semester. Students who are still registered after that date, but do not provide project work, do not participate in paper presentation sessions and/or do not show up for the exam, will officially fail the course. | W | 5 credits | 2V + 1U + 1A | V. Boeva | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | The course reviews solutions that machine learning provides to the most challenging questions in human genomics. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Over the last few years, the parallel development of machine learning methods and molecular profiling technologies for human cells, such as sequencing, has created an extremely powerful tool for gaining insight into cellular mechanisms in healthy and diseased contexts. In this course, we will discuss state-of-the-art machine learning methodology that solves, or attempts to solve, common problems in human genomics. At the end of the course, you will be familiar with (1) classical and advanced machine learning architectures used in genomics, (2) bioinformatics analysis of human genomic and transcriptomic data, and (3) the data types used in this field. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | - Short introduction to major concepts of molecular biology: DNA, genes, genome, central dogma, transcription factors, epigenetic code, DNA methylation, signaling pathways - Prediction of transcription factor binding sites, open chromatin, histone marks, promoters, nucleosome positioning (convolutional neural networks, position weight matrices) - Prediction of variant effects and gene expression (hidden Markov models, topic models) - Deconvolution of mixed signal - DNA, RNA and protein folding (RNN, LSTM, transformers) - Data imputation for single cell RNA-seq data, clustering and annotation (diffusion and methods on graphs) - Batch correction (autoencoders, optimal transport) - Survival analysis (Cox proportional hazard model, regularization penalties, multi-omics, multi-tasking) | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
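As a small illustration of the position-weight-matrix approach to predicting transcription factor binding sites listed above, the following sketch scans a DNA sequence for the best-scoring motif window. The motif probabilities and the sequence are invented for the example, not taken from a real transcription factor.

```python
import math

# One dict of base probabilities per motif position (a length-3 toy motif).
pwm = [
    {"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
    {"A": 0.1, "C": 0.1, "G": 0.7, "T": 0.1},
    {"A": 0.1, "C": 0.7, "G": 0.1, "T": 0.1},
]
background = 0.25  # uniform background model

def pwm_score(window):
    """Log-odds score of a window against the background model."""
    return sum(math.log2(col[base] / background)
               for col, base in zip(pwm, window))

def best_hit(sequence):
    """Slide the PWM along the sequence; return (best_score, offset)."""
    k = len(pwm)
    return max((pwm_score(sequence[i:i + k]), i)
               for i in range(len(sequence) - k + 1))

score, offset = best_hit("TTAGCTT")  # "AGC" at offset 2 matches the motif
```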
Prerequisites / Notice | Introduction to Machine Learning, Statistics/Probability, Programming in Python, Unix Command Line | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-5352-00L | Advanced Formal Language Theory | W | 6 credits | 4G + 1A | R. Cotterell | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course serves as an introduction to various advanced topics in formal language theory. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The objective of the course is to learn and understand a variety of topics in advanced formal language theory. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | This course serves as an introduction to various advanced topics in formal language theory. The primary focus of the course is on weighted formalisms, which can easily be applied in machine learning. Topics include finite-state machines as well as the algorithms that are commonly used for their manipulation. We will also cover weighted context-free grammars, weighted tree automata, and weighted mildly context-sensitive formalisms. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
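To make the notion of a weighted formalism concrete, the following sketch computes the weight that a small weighted finite-state acceptor assigns to a string in the real (probability) semiring, via the forward algorithm. The machine itself is invented for the example.

```python
# transitions[state][symbol] = list of (next_state, weight).
transitions = {
    0: {"a": [(0, 0.5), (1, 0.5)]},
    1: {"b": [(1, 1.0)]},
}
initial = {0: 1.0}  # initial-state weights
final = {1: 1.0}    # final-state weights

def string_weight(s):
    """Sum, over all accepting paths for s, of the product of weights."""
    alpha = dict(initial)
    for symbol in s:
        new_alpha = {}
        for state, w in alpha.items():
            for nxt, tw in transitions.get(state, {}).get(symbol, []):
                new_alpha[nxt] = new_alpha.get(nxt, 0.0) + w * tw
        alpha = new_alpha
    return sum(w * final.get(state, 0.0) for state, w in alpha.items())
```

Swapping the (+, *) operations for other semiring operations (e.g. max and + in the tropical semiring) yields shortest-path-style computations with the same code shape.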
263-5353-10L | Philosophy of Language and Computation II (with Case Study) | W | 5 credits | 2V + 1U + 1A | R. Cotterell, J. L. Gastaldi | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Understand the philosophical underpinnings of language-based artificial intelligence. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | This graduate class, taught like a seminar, is designed to help you understand the philosophical underpinnings of modern work in natural language processing (NLP), most of which is centered around statistical machine learning applied to natural language data. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | This graduate class, taught like a seminar, is designed to help you understand the philosophical underpinnings of modern work in natural language processing (NLP), most of which is centered around statistical machine learning applied to natural language data. The course is a year-long journey, but the second half (Spring 2023) does not depend on the first (Fall 2022) and thus either half may be taken independently. In each semester, we divide the class time into three modules. Each module is centered around a philosophical topic. After discussing logical, structuralist, and generative approaches to language in the first semester, in the second semester we will focus on information, language games, and pragmatics. The modules will be four weeks long. During the first two weeks of a module, we will read and discuss original texts and supplementary criticism. During the second two weeks, we will read recent NLP papers and discuss how the authors of those works are building on philosophical insights into our conception of language—perhaps implicitly or unwittingly. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | The literature will be provided by the instructors on the class website | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-5354-00L | Large Language Models | W | 8 credits | 3V + 2U + 2A | R. Cotterell, M. Sachan, F. Tramèr, C. Zhang | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Large language models have become one of the most commonly deployed NLP inventions. In the past half-decade, their integration into core natural language processing tools has dramatically increased the performance of such tools, and they have entered the public discourse surrounding artificial intelligence. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | To understand the mathematical foundations of large language models as well as how to implement them. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | We start with the probabilistic foundations of language models, i.e., covering what constitutes a language model from a formal, theoretical perspective. We then discuss how to construct and curate training corpora, and introduce many of the neural-network architectures often used to instantiate language models at scale. The course covers aspects of systems programming, discussion of privacy and harms, as well as applications of language models in NLP and beyond. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
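The probabilistic view of a language model (a distribution over strings) can be sketched with a maximum-likelihood bigram model. The toy corpus below is invented; large language models replace the count table with a neural network but assign probabilities to strings in the same chain-rule fashion.

```python
from collections import Counter

corpus = ["<s> the cat sat </s>", "<s> the dog sat </s>"]

bigrams = Counter()
unigrams = Counter()
for sentence in corpus:
    tokens = sentence.split()
    unigrams.update(tokens[:-1])           # count each token as a context
    bigrams.update(zip(tokens, tokens[1:]))

def prob(sentence):
    """P(sentence) as the product of bigram conditional probabilities."""
    tokens = sentence.split()
    p = 1.0
    for prev, cur in zip(tokens, tokens[1:]):
        if unigrams[prev] == 0:
            return 0.0
        p *= bigrams[(prev, cur)] / unigrams[prev]
    return p
```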
Literature | The lecture notes will be supplemented with various readings from the literature. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
227-0434-10L | Mathematics of Information | W | 8 credits | 3V + 2U + 2A | H. Bölcskei | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | The class focuses on mathematical aspects of 1. Information science: Sampling theorems, frame theory, compressed sensing, sparsity, super-resolution, spectrum-blind sampling, subspace algorithms, dimensionality reduction 2. Learning theory: Approximation theory, greedy algorithms, uniform laws of large numbers, Rademacher complexity, Vapnik-Chervonenkis dimension | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The aim of the class is to familiarize the students with the most commonly used mathematical theories in data science, high-dimensional data analysis, and learning theory. The class consists of the lecture and exercise sessions with homework problems. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Mathematics of Information 1. Signal representations: Frame theory, wavelets, Gabor expansions, sampling theorems, density theorems 2. Sparsity and compressed sensing: Sparse linear models, uncertainty relations in sparse signal recovery, super-resolution, spectrum-blind sampling, subspace algorithms (ESPRIT), estimation in the high-dimensional noisy case, Lasso 3. Dimensionality reduction: Random projections, the Johnson-Lindenstrauss Lemma Mathematics of Learning 4. Approximation theory: Nonlinear approximation theory, best M-term approximation, greedy algorithms, fundamental limits on compressibility of signal classes, Kolmogorov-Tikhomirov epsilon-entropy of signal classes, optimal compression of signal classes 5. Uniform laws of large numbers: Rademacher complexity, Vapnik-Chervonenkis dimension, classes with polynomial discrimination | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
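As a pocket illustration of item 3 above, here is a random Gaussian projection in the spirit of the Johnson-Lindenstrauss Lemma; scaling by 1/sqrt(k) makes the projection approximately norm-preserving with high probability. The dimensions and seed are arbitrary choices for the sketch.

```python
import math
import random

random.seed(0)
d, k = 1000, 200  # ambient and target dimension

# Random Gaussian projection matrix, scaled so norms are preserved in expectation.
proj = [[random.gauss(0, 1) / math.sqrt(k) for _ in range(d)]
        for _ in range(k)]

def project(x):
    return [sum(row[i] * x[i] for i in range(d)) for row in proj]

def norm(x):
    return math.sqrt(sum(v * v for v in x))

x = [random.gauss(0, 1) for _ in range(d)]
distortion = norm(project(x)) / norm(x)  # concentrates near 1
```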
Lecture notes | Detailed lecture notes will be provided at the beginning of the semester. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | This course is aimed at students with a background in basic linear algebra, analysis, statistics, and probability. We encourage students who are interested in mathematical data science to take both this course and "401-4944-20L Mathematics of Data Science" by Prof. A. Bandeira. The two courses are designed to be complementary. H. Bölcskei and A. Bandeira | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
401-3632-00L | Computational Statistics | W | 8 credits | 3V + 1U | M. Mächler | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | We discuss modern statistical methods for data analysis, including methods for data exploration, prediction and inference. We pay attention to algorithmic aspects, theoretical properties and practical considerations. The class is hands-on and methods are applied using the statistical programming language R. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The student obtains an overview of modern statistical methods for data analysis, including their algorithmic aspects and theoretical properties. The methods are applied using the statistical programming language R. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | See the class website | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | At least one semester of (basic) probability and statistics. Programming experience is helpful but not required. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Major in Secure and Reliable Systems | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Core Courses | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Number | Title | Type | ECTS | Hours | Lecturers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-2815-00L | Automated Software Testing Last cancellation/deregistration date for this graded semester performance: 17 March 2023! Please note that after that date no deregistration will be accepted and the course will be considered as "fail". | W | 7 credits | 2V + 1U + 3A | Z. Su | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course introduces students to classic and modern techniques for the automated testing and analysis of software systems for reliability, security, and performance. It covers both techniques and their applications in various domains (e.g., compilers, databases, theorem provers, operating systems, machine/deep learning, and mobile applications), focusing on the latest, important results. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | * Learn fundamental and practical techniques for software testing and analysis * Understand the challenges, open issues and opportunities across a variety of domains (security/systems/compilers/databases/mobile/AI/education) * Understand how latest automated testing and analysis techniques work * Gain conceptual and practical experience in techniques/tools for reliability, security, and performance * Learn how to perform original and impactful research in this area | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The course will be organized into the following components: (1) classic and modern testing and analysis techniques (coverage metrics, mutation testing, metamorphic testing, combinatorial testing, symbolic execution, fuzzing, static analysis, etc.), (2) latest results on techniques and applications from diverse domains, and (3) open challenges and opportunities. A major component of this course is a class project. All students (individually or in two-person teams) are expected to select and complete a course project. Ideally, the project is original research related in a broad sense to automated software testing and analysis. Potential project topics will also be suggested by the teaching staff. Students must select a project and write a one- or two-page proposal describing why the proposed project is interesting and giving a work schedule. Students will also write a final report describing the project and prepare a 20-30 minute presentation at the end of the course. The due dates for the project proposal, final report, and project presentation will be announced. The course will cover results from the Advanced Software Technologies (AST) Lab at ETH as well as notable results elsewhere, providing good opportunities for potential course project topics as well as MSc project/thesis topics. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | Lecture notes/slides and other lecture materials/handouts will be available online. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Reading material and links to tools will be published on the course website. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | The prerequisites for this course are some programming and algorithmic experience. Background and experience in software engineering, programming languages/compilers, and security (as well as operating systems and databases) can be beneficial. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-2925-00L | Program Analysis for System Security and Reliability Does not take place this semester. | W | 7 credits | 2V + 1U + 3A | M. Vechev | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Security issues in modern systems (blockchains, datacenters, deep learning, etc.) result in billions of losses due to hacks and system downtime. This course introduces fundamental techniques (ranging over automated analysis, machine learning, synthesis, zero-knowledge proofs, differential privacy, and their combinations) that can be applied in practice to build more secure and reliable modern systems. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | * Understand the fundamental techniques used to create modern security and reliability analysis engines that are used worldwide. * Understand how symbolic techniques are combined with machine learning (e.g., deep learning, reinforcement learning) to create new kinds of learning-based analyzers. * Understand how to quantify and fix security and reliability issues in modern deep learning models. * Understand open research questions from both theoretical and practical perspectives. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Please see: https://www.sri.inf.ethz.ch/teaching/pass2022 for detailed course content. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-4660-00L | Applied Cryptography | W | 8 credits | 3V + 2U + 2P | K. Paterson, F. Günther | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course will introduce the basic primitives of cryptography, using rigorous syntax and game-based security definitions. The course will show how these primitives can be combined to build cryptographic protocols and systems. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The goal of the course is to put students' understanding of cryptography on sound foundations, to enable them to start to build well-designed cryptographic systems, and to expose them to some of the pitfalls that arise when doing so. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Basic symmetric primitives (block ciphers, modes, hash functions); generic composition; AEAD; basic secure channels; basic public-key primitives (encryption, signatures, DH key exchange); ECC; randomness; applications. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
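Generic composition can be illustrated by encrypt-then-MAC, sketched here with a one-time pad as the cipher and HMAC-SHA256 as the MAC. This is a toy construction for illustration only, not a real AEAD scheme.

```python
import hashlib
import hmac
import secrets

def encrypt_then_mac(enc_key, mac_key, plaintext):
    """Encrypt-then-MAC: encrypt first, then authenticate the ciphertext."""
    # One-time pad as a stand-in cipher; enc_key must be at least as long
    # as the plaintext and must never be reused.
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, enc_key))
    tag = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
    return ciphertext, tag

def decrypt(enc_key, mac_key, ciphertext, tag):
    # Verify the MAC before decrypting; reject on mismatch.
    expected = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed")
    return bytes(c ^ k for c, k in zip(ciphertext, enc_key))

enc_key = secrets.token_bytes(32)  # pad key, at least message length
mac_key = secrets.token_bytes(32)
ct, tag = encrypt_then_mac(enc_key, mac_key, b"attack at dawn")
```

MAC-then-encrypt and encrypt-and-MAC are the other classic compositions; encrypt-then-MAC is the one with a clean generic security proof.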
Literature | Textbook: Boneh and Shoup, “A Graduate Course in Applied Cryptography”, http://toc.cryptobook.us/book.pdf. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Students should have taken the D-INFK Bachelor's course “Information Security" (252-0211-00) or an alternative first course covering cryptography at a similar level. / In this course, we will use Moodle for content delivery: https://moodle-app2.let.ethz.ch/course/view.php?id=19644. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Elective Courses | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Number | Title | Type | ECTS | Hours | Lecturers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
252-0408-00L | Cryptographic Protocols | W | 6 credits | 2V + 2U + 1A | M. Hirt | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | In a cryptographic protocol, a set of parties wants to achieve some common goal, while some of the parties are dishonest. The most prominent example of a cryptographic protocol is multi-party computation, where the parties compute an arbitrary (but fixed) function of their inputs, while maintaining the secrecy of the inputs and the correctness of the outputs even if some of the parties try to cheat. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | To know and understand a selection of cryptographic protocols and to be able to analyze and prove their security and efficiency. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The selection of considered protocols varies. Currently, we consider multi-party computation, secret-sharing, broadcast and Byzantine agreement. We look at both the synchronous and the asynchronous communication model, and focus on simple protocols as well as on highly efficient protocols. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
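The simplest form of the secret-sharing mentioned above is additive n-out-of-n sharing over Z_p; Shamir's threshold scheme, the workhorse of multi-party computation, generalizes this idea. The modulus below is an arbitrary choice for the sketch.

```python
import random

p = 2**61 - 1  # a prime modulus, chosen arbitrarily for the example

def share(secret, n):
    """Split `secret` into n additive shares; all n are needed to reconstruct."""
    shares = [random.randrange(p) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % p)  # force the shares to sum to secret
    return shares

def reconstruct(shares):
    return sum(shares) % p

# Any n-1 shares are uniformly random and reveal nothing about the secret.
```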
Lecture notes | We provide handouts of the slides. For some of the topics, we also provide papers and/or lecture notes. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | A basic understanding of fundamental cryptographic concepts (as taught for example in the course Information Security) is useful, but not required. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-2812-00L | Program Verification | W | 5 credits | 3G + 1A | P. Müller, M. Eilers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | A hands-on introduction to the theory and construction of deductive program verifiers, covering both powerful techniques for formal program reasoning, and a perspective over the tool stack making up modern verification tools. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Students will gain the necessary skills for designing, developing, and applying deductive verification tools that enable the modular verification of complex software, including features that are challenging to reason about, such as heap-based mutable data and concurrency. Students will learn both a variety of fundamental reasoning principles and how these reasoning ideas can be made practical via automatic tools. By the end of the course, students should have a good working understanding of the decisions involved in designing and building practical verification tools, including the underlying theory. They will also be able to apply such tools to develop formally verified programs. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The course will cover verification techniques and ways to automate them by introducing a verifier for a small core language and then progressively enriching the language with advanced features such as a mutable heap and concurrency. For each language extension, the course will explain the necessary reasoning principles, specification techniques, and tool support. In particular, it will introduce SMT solvers to prove logical formulas, intermediate verification languages to encode verification problems, and source code verifiers to handle feature-rich languages. The course will intermix technical content with hands-on experience. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
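The flavor of deductive verification can be conveyed in a few lines: for an assignment statement, the weakest precondition wp(x := e, Q) is Q with e substituted for x, and the triple {P} x := e {Q} is valid iff P implies wp. The sketch below checks {x > 0} x := x + 1 {x > 1} by brute force over a small range; an illustration only, not how an SMT-backed verifier actually discharges the obligation.

```python
# Hoare triple {P} x := x + 1 {Q}, checked via the weakest precondition.
pre = lambda x: x > 0        # P
post = lambda x: x > 1       # Q
wp = lambda x: post(x + 1)   # wp(x := x + 1, Q): substitute x + 1 into Q

# The triple is valid iff P implies wp; here checked over a finite range
# (a real verifier would hand this implication to an SMT solver).
valid = all(wp(x) for x in range(-100, 100) if pre(x))
```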
Lecture notes | The slides will be available online. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Will be announced in the lecture. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | A basic familiarity with propositional and first-order logic will be assumed. Courses with an emphasis on formal reasoning about programs (such as Formal Methods and Functional Programming) are advantageous background, but are not a requirement. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-4600-00L | Formal Methods for Information Security Does not take place this semester. | W | 5 credits | 2V + 1U + 1A | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | The course focuses on formal methods for the modeling and analysis of security protocols for critical systems, ranging from authentication protocols for network security to electronic voting protocols and online banking. In addition, we will also introduce the notions of non-interference and runtime monitoring. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The students will learn the key ideas and theoretical foundations of formal modeling and analysis of security protocols. The students will complement their theoretical knowledge by solving practical exercises, completing a small project, and using state-of-the-art tools. The students also learn the fundamentals of non-interference and runtime monitoring. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The course treats formal methods mainly for the modeling and analysis of security protocols. Cryptographic protocols (such as SSL/TLS, SSH, Kerberos, SAML single-sign on, and IPSec) form the basis for secure communication and business processes. Numerous attacks on published protocols show that the design of cryptographic protocols is extremely error-prone. A rigorous analysis of these protocols is therefore indispensable, and manual analysis is insufficient. The lectures cover the theoretical basis for the (tool-supported) formal modeling and analysis of such protocols. Specifically, we discuss their operational semantics, the formalization of security properties, and techniques and algorithms for their verification. The second part of this course will cover a selection of advanced topics in security protocols such as abstraction techniques for efficient verification, secure communication with humans, the link between symbolic protocol models and cryptographic models as well as RFID protocols (a staple of the Internet of Things) and electronic voting protocols, including the relevant privacy properties. Moreover, we will give an introduction to two additional topics: non-interference as a general notion of secure systems, both from a semantic and a programming language perspective (type system), and runtime verification/monitoring to detect violations of security policies expressed as trace properties. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-4656-00L | Digital Signatures | W | 5 credits | 2V + 2A | D. Hofheinz | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Digital signatures as one central cryptographic building block. Different security goals and security definitions for digital signatures, followed by a variety of popular and fundamental signature schemes with their security analyses. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The student knows a variety of techniques to construct and analyze the security of digital signature schemes. This includes modularity as a central tool of constructing secure schemes, and reductions as a central tool to proving the security of schemes. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | We will start with several definitions of security for signature schemes, and investigate the relations among them. We will proceed to generic (but inefficient) constructions of secure signatures, and then move on to a number of efficient schemes based on concrete computational hardness assumptions. On the way, we will get to know paradigms such as hash-then-sign, one-time signatures, and chameleon hashing as central tools to construct secure signatures. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
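One-time signatures, one of the paradigms listed above, can be sketched as a Lamport scheme: the secret key holds a pair of random preimages per message-hash bit, and the signature reveals one preimage per bit. Parameters are illustrative, not production-grade.

```python
import hashlib
import secrets

def H(data):
    return hashlib.sha256(data).digest()

def keygen(bits=256):
    # Two random preimages per bit of the message hash.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(bits)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def sign(sk, message):
    digest = H(message)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(len(sk))]
    return [sk[i][b] for i, b in enumerate(bits)]  # reveal one preimage per bit

def verify(pk, message, signature):
    digest = H(message)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(len(pk))]
    return all(H(s) == pk[i][b] for i, (s, b) in enumerate(zip(signature, bits)))

sk, pk = keygen()
sig = sign(sk, b"hello")
# A key pair must never sign two different messages: two signatures leak
# enough preimages to forge.
```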
Literature | Jonathan Katz, "Digital Signatures." | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Ideally, students will have taken the D-INFK Bachelor's course "Information Security" or an equivalent course at Bachelor's level. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Major in Theoretical Computer Science | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Core Courses | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Number | Title | Type | ECTS | Hours | Lecturers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
261-5110-00L | Optimization for Data Science | W | 10 credits | 3V + 2U + 4A | B. Gärtner, N. He | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course provides an in-depth theoretical treatment of optimization methods that are relevant in data science. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Understanding the guarantees and limits of relevant optimization methods used in data science. Learning theoretical paradigms and techniques to deal with optimization problems arising in data science. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | This course provides an in-depth theoretical treatment of classical and modern optimization methods that are relevant in data science. After a general discussion about the role that optimization has in the process of learning from data, we give an introduction to the theory of (convex) optimization. Based on this, we present and analyze algorithms in the following four categories: first-order methods (gradient and coordinate descent, Frank-Wolfe, subgradient and mirror descent, stochastic and incremental gradient methods); second-order methods (Newton and quasi Newton methods); non-convexity (local convergence, provable global convergence, cone programming, convex relaxations); min-max optimization (extragradient methods). The emphasis is on the motivations and design principles behind the algorithms, on provable performance bounds, and on the mathematical tools and techniques to prove them. The goal is to equip students with a fundamental understanding about why optimization algorithms work, and what their limits are. This understanding will be of help in selecting suitable algorithms in a given application, but providing concrete practical guidance is not our focus. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
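The first of the first-order methods above, gradient descent, can be captured in a few lines on a smooth, strongly convex quadratic. The objective is invented for the sketch; the step size 1/L follows the standard analysis for an L-smooth function.

```python
# Gradient descent on f(x, y) = (x - 3)**2 + 10 * (y + 1)**2.
def grad(x, y):
    return (2 * (x - 3), 20 * (y + 1))

x, y = 0.0, 0.0
step = 1.0 / 20  # 1/L for smoothness constant L = 20
for _ in range(500):
    gx, gy = grad(x, y)
    x, y = x - step * gx, y - step * gy
# (x, y) converges to the minimizer (3, -1) at a linear rate,
# governed by the condition number L/mu = 20/2 = 10.
```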
Prerequisites / Notice | A solid background in analysis and linear algebra; some background in theoretical computer science (computational complexity, analysis of algorithms); the ability to understand and write mathematical proofs. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-4400-00L | Advanced Graph Algorithms and Optimization | W | 10 credits | 3V + 3U + 3A | R. Kyng, M. Probst | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course will cover a number of advanced topics in optimization and graph algorithms. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The course will take students on a deep dive into modern approaches to graph algorithms using convex optimization techniques. By studying convex optimization through the lens of graph algorithms, students should develop a deeper understanding of fundamental phenomena in optimization. The course will cover some traditional discrete approaches to various graph problems, especially flow problems, and then contrast these approaches with modern, asymptotically faster methods based on combining convex optimization with spectral and combinatorial graph theory. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Students should leave the course understanding key concepts in optimization such as first and second-order optimization, convex duality, multiplicative weights and dual-based methods, acceleration, preconditioning, and non-Euclidean optimization. Students will also be familiarized with central techniques in the development of graph algorithms in the past 15 years, including graph decomposition techniques, sparsification, oblivious routing, and spectral and combinatorial preconditioning. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | This course is targeted toward masters and doctoral students with an interest in theoretical computer science. Students should be comfortable with design and analysis of algorithms, probability, and linear algebra. Having passed the course Algorithms, Probability, and Computing (APC) is highly recommended, but not formally required. If you are not sure whether you're ready for this class or not, please consult the instructor. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-4508-00L | Algorithmic Foundations of Data Science | W | 10 credits | 3V + 2U + 4A | D. Steurer | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course provides rigorous theoretical foundations for the design and mathematical analysis of efficient algorithms that can solve fundamental tasks relevant to data science. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | We consider various statistical models for basic data-analytical tasks, e.g., (sparse) linear regression, principal component analysis, matrix completion, community detection, and clustering. Our goal is to design efficient (polynomial-time) algorithms that achieve the strongest possible (statistical) guarantees for these models. Toward this goal we learn about a wide range of mathematical techniques from convex optimization, linear algebra (especially, spectral theory and tensors), and high-dimensional statistics. We also incorporate adversarial (worst-case) components into our models as a way to reason about robustness guarantees for the algorithms we design. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Strengths and limitations of efficient algorithms in (robust) statistical models for the following (tentative) list of data analysis tasks: - (sparse) linear regression - principal component analysis and matrix completion - clustering and Gaussian mixture models - community detection | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | To be provided during the semester | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | High-Dimensional Statistics A Non-Asymptotic Viewpoint by Martin J. Wainwright | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Mathematical and algorithmic maturity at least at the level of the course "Algorithms, Probability, and Computing". Important: This course was created after a reorganization of the course "Optimization for Data Science" (ODS), and a significant portion of its material was previously taught as part of ODS. Consequently, it is not possible to earn credit points for both this course and ODS as offered in 2018--2021. This restriction does not apply to ODS as offered in 2022 or later; in that case you can earn credit points for both courses. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Elective Courses | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Number | Title | Type | ECTS | Hours | Lecturers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
252-0408-00L | Cryptographic Protocols | W | 6 credits | 2V + 2U + 1A | M. Hirt | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | In a cryptographic protocol, a set of parties wants to achieve some common goal, while some of the parties are dishonest. The most prominent example of a cryptographic protocol is multi-party computation, where the parties compute an arbitrary (but fixed) function of their inputs, while maintaining the secrecy of the inputs and the correctness of the outputs even if some of the parties try to cheat. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | To know and understand a selection of cryptographic protocols and to be able to analyze and prove their security and efficiency. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The selection of considered protocols varies. Currently, we consider multi-party computation, secret-sharing, broadcast and Byzantine agreement. We look at both the synchronous and the asynchronous communication model, and focus on simple protocols as well as on highly-efficient protocols. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | We provide handouts of the slides. For some of the topics, we also provide papers and/or lecture notes. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | A basic understanding of fundamental cryptographic concepts (as taught for example in the course Information Security) is useful, but not required. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
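To make the "secret-sharing" item in the course content concrete, the following is a hedged, stdlib-only Python sketch of Shamir's (t, n) secret sharing over a prime field; the prime and all function names are illustrative, and the course's protocols and security proofs go well beyond this toy:

```python
import random

P = 2_147_483_647  # a Mersenne prime; all arithmetic is in the field GF(P)

def share(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it.
    Demo only: `random` is fine here, a real system would use `secrets`."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    # share i is the degree-(t-1) polynomial evaluated at x = i
    return [(i, sum(c * pow(i, k, P) for k, c in enumerate(coeffs)) % P)
            for i in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P-2, P) is the modular inverse, since P is prime
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = share(12345, t=3, n=5)
recovered = reconstruct(shares[:3])
```

Any t shares determine the polynomial, while t-1 shares reveal nothing about the secret; this is the information-theoretic core that multi-party computation protocols build on.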
252-1424-00L | Models of Computation | W | 6 credits | 2V + 2U + 1A | M. Cook | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course surveys many different models of computation: Turing Machines, Cellular Automata, Finite State Machines, Graph Automata, Circuits, Tilings, Lambda Calculus, Fractran, Chemical Reaction Networks, Hopfield Networks, String Rewriting Systems, Tag Systems, Diophantine Equations, Register Machines, Primitive Recursive Functions, and more. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The goal of this course is to become acquainted with a wide variety of models of computation, to understand how models help us to understand the modeled systems, and to be able to develop and analyze models appropriate for new systems. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | This course surveys many different models of computation: Turing Machines, Cellular Automata, Finite State Machines, Graph Automata, Circuits, Tilings, Lambda Calculus, Fractran, Chemical Reaction Networks, Hopfield Networks, String Rewriting Systems, Tag Systems, Diophantine Equations, Register Machines, Primitive Recursive Functions, and more. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
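As a small taste of one model from the list above (cellular automata), here is an illustrative stdlib-only Python sketch of an elementary cellular automaton update, using Wolfram's rule numbering; all names are illustrative, not from the course:

```python
def step(cells, rule=110):
    """One synchronous update of an elementary cellular automaton.
    `cells` is a list of 0/1 values; the boundary wraps around."""
    n = len(cells)
    out = []
    for i in range(n):
        # read the neighborhood (left, self, right) as a 3-bit number
        idx = cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n]
        # bit `idx` of the rule number is the new cell state
        out.append((rule >> idx) & 1)
    return out

# a single live cell, evolved a few steps under rule 110
row = [0] * 16
row[8] = 1
for _ in range(5):
    row = step(row)
```

Rule 110 is a fitting example here because, despite this tiny definition, it is known to be Turing-complete, which is exactly the kind of equivalence between models the course explores.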
263-4509-00L | Complex Network Models | W | 5 credits | 2V + 2A | J. Lengler | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Complex network models are random graphs that feature one or several properties observed in real-world networks (e.g., social networks, internet graph, www). Depending on the application, different properties are relevant, and different complex network models are useful. This course gives an overview over some relevant models and the properties they do and do not cover. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Students become familiar with a portfolio of network models and know their features and shortcomings. For a given application, they can identify the properties relevant to that application and select an appropriate network model. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Network models: Erdős-Rényi random graphs, Chung-Lu graphs, configuration model, Kleinberg model, geometric inhomogeneous random graphs. Properties: degree distribution, structure of giant and smaller components, clustering coefficient, small-world properties, community structures, weak ties | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | The script is available on Moodle or at https://as.inf.ethz.ch/people/members/lenglerj/CompNetScript.pdf and will be updated during the semester. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Latora, Nicosia, Russo: "Complex Networks: Principles, Methods and Applications"; van der Hofstad: "Random Graphs and Complex Networks. Volume 1" | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | The students must be familiar with the basics of graph theory and of probability theory (e.g. linearity of expectation, inequalities of Markov, Chebyshev, Chernoff). The course "Randomized Algorithms and Probabilistic Methods" is helpful, but not required. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
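The first model in the list above can be sketched in a few lines; this is an illustrative stdlib-only Python example (names are mine, not the course's) sampling an Erdős-Rényi graph G(n, p) and checking its mean degree against the expectation (n - 1)p:

```python
import random

def erdos_renyi(n, p, seed=0):
    """Sample G(n, p): each of the n-choose-2 possible edges
    appears independently with probability p."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].add(v)
                adj[v].add(u)
    return adj

g = erdos_renyi(1000, 0.01)
mean_degree = sum(len(nb) for nb in g.values()) / len(g)
# expected mean degree is (n - 1) * p = 999 * 0.01, i.e. about 10
```

By linearity of expectation the mean degree concentrates around (n - 1)p, one of the basic properties (degree distribution) the course compares across models.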
263-4510-00L | Introduction to Topological Data Analysis | W | 8 credits | 3V + 2U + 2A | P. Schnider | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Topological Data Analysis (TDA) is a relatively new subfield of computer science, which uses techniques from algebraic topology and computational geometry and topology to analyze and quantify the shape of data. This course will introduce the theoretical foundations of TDA. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The goal is to make students familiar with the fundamental concepts, techniques and results in TDA. At the end of the course, students should be able to read and understand current research papers and have the necessary background knowledge to apply methods from TDA to other projects. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Mathematical background (Topology, Simplicial complexes, Homology), Persistent Homology, Complexes on point clouds (Čech complexes, Vietoris-Rips complexes, Delaunay complexes, Witness complexes), the TDA pipeline, Reeb Graphs, Mapper | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Main reference: Tamal K. Dey, Yusu Wang: Computational Topology for Data Analysis, 2021, https://www.cs.purdue.edu/homes/tamaldey/book/CTDAbook/CTDAbook.html Other references: Herbert Edelsbrunner, John Harer: Computational Topology: An Introduction, American Mathematical Society, 2010, https://bookstore.ams.org/mbk-69 Gunnar Carlsson, Mikael Vejdemo-Johansson: Topological Data Analysis with Applications, Cambridge University Press, 2021 Robert Ghrist: Elementary Applied Topology, 2014, https://www2.math.upenn.edu/~ghrist/notes.html Allen Hatcher: Algebraic Topology, Cambridge University Press, 2002, https://pi.math.cornell.edu/~hatcher/AT/ATpage.html | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | The course assumes knowledge of discrete mathematics, algorithms and data structures and linear algebra, as supplied in the first semesters of Bachelor Studies at ETH. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
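One building block from the content list, the Vietoris-Rips complex, is simple enough to sketch directly; the following is an illustrative, stdlib-only Python example (a naive enumeration, not an efficient TDA implementation, and all names are mine):

```python
from itertools import combinations
import math

def vietoris_rips(points, r, max_dim=2):
    """Vietoris-Rips complex at scale r: a simplex is included whenever
    all pairwise distances among its vertices are at most r."""
    n = len(points)
    close = {(i, j) for i, j in combinations(range(n), 2)
             if math.dist(points[i], points[j]) <= r}
    simplices = [(i,) for i in range(n)]       # all vertices
    for k in range(2, max_dim + 2):            # simplices on k vertices
        for s in combinations(range(n), k):
            if all(pair in close for pair in combinations(s, 2)):
                simplices.append(s)
    return simplices

# four corners of a unit square: at r = 1 the four sides appear,
# but not the diagonals (length sqrt(2)), so a 1-dimensional hole remains
pts = [(0, 0), (1, 0), (1, 1), (0, 1)]
complex_r1 = vietoris_rips(pts, 1.0)
```

Varying r and tracking when such holes appear and disappear is exactly what persistent homology, the core topic of the course, formalizes.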
263-4656-00L | Digital Signatures | W | 5 credits | 2V + 2A | D. Hofheinz | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Digital signatures as one central cryptographic building block. Different security goals and security definitions for digital signatures, followed by a variety of popular and fundamental signature schemes with their security analyses. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The student knows a variety of techniques to construct and analyze the security of digital signature schemes. This includes modularity as a central tool of constructing secure schemes, and reductions as a central tool to proving the security of schemes. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | We will start with several definitions of security for signature schemes, and investigate the relations among them. We will proceed to generic (but inefficient) constructions of secure signatures, and then move on to a number of efficient schemes based on concrete computational hardness assumptions. On the way, we will get to know paradigms such as hash-then-sign, one-time signatures, and chameleon hashing as central tools to construct secure signatures. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Jonathan Katz, "Digital Signatures." | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Ideally, students will have taken the D-INFK Bachelors course "Information Security" or an equivalent course at Bachelors level. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
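The course content names one-time signatures as a central tool; a minimal, stdlib-only Python sketch of Lamport's one-time signature scheme is given below (the 16-bit message length is a toy parameter I chose for illustration; real schemes sign a full hash digest):

```python
import hashlib
import secrets

def H(b):
    return hashlib.sha256(b).digest()

BITS = 16  # toy parameter; a real scheme signs e.g. 256 bits

def keygen():
    """Lamport one-time keys: two secret preimages per message bit,
    with their hashes as the public key."""
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32))
          for _ in range(BITS)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def sign(sk, msg_bits):
    # reveal one preimage per bit; the key must never be reused
    return [sk[i][bit] for i, bit in enumerate(msg_bits)]

def verify(pk, msg_bits, sig):
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(msg_bits))

sk, pk = keygen()
bits = [1, 0] * (BITS // 2)
sig = sign(sk, bits)
```

Security rests only on the one-wayness of the hash function, which is why such schemes serve as a generic (if inefficient) starting point before the efficient constructions treated in the course.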
272-0300-00L | Algorithmics for Hard Problems This course does not include the Mentored Work Specialised Courses with an Educational Focus in Computer Science A. | W | 5 credits | 2V + 1U + 1A | H.‑J. Böckenhauer, D. Komm | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course unit looks into algorithmic approaches to solving hard problems, in particular moderately exponential-time algorithms and parameterized algorithms. The course is accompanied by a comprehensive reflection on the significance of the presented approaches for computer science teaching at high schools. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | To systematically acquire an overview of the methods for solving hard problems. To get deeper knowledge of exact and parameterized algorithms. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | First, the concept of hardness of computation is introduced (repeated for the computer science students). Then some methods for solving hard problems are treated in a systematic way. For each algorithm design method, it is discussed what guarantees it can give and how we pay for the improved efficiency. A special focus lies on moderately exponential-time algorithms and parameterized algorithms. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | Course materials and slides will be provided. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | J. Hromkovic: Algorithmics for Hard Problems, Springer 2004. R. Niedermeier: Invitation to Fixed-Parameter Algorithms, 2006. M. Cygan et al.: Parameterized Algorithms, 2015. F. Fomin et al.: Kernelization, 2019. F. Fomin, D. Kratsch: Exact Exponential Algorithms, 2010. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
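A classic example of the parameterized algorithms mentioned above is the bounded-search-tree algorithm for Vertex Cover; the following stdlib-only Python sketch (all names illustrative, not from the course) decides covers of size at most k in O(2^k * |E|) time:

```python
def vertex_cover(edges, k):
    """Bounded search tree: does a vertex cover of size <= k exist?
    Branch on an uncovered edge; one of its endpoints must be chosen."""
    if not edges:
        return True      # every edge is covered
    if k == 0:
        return False     # edges remain but no budget left
    u, v = edges[0]
    # branch: put u in the cover, or put v in the cover
    return (vertex_cover([e for e in edges if u not in e], k - 1)
            or vertex_cover([e for e in edges if v not in e], k - 1))

# a triangle needs two vertices; a star is covered by its center alone
triangle = [(0, 1), (1, 2), (0, 2)]
star = [(0, i) for i in range(1, 6)]
```

The search tree has depth at most k and branching factor 2, so the exponential cost is confined to the parameter k rather than the input size, the defining idea of fixed-parameter tractability.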
272-0302-00L | Approximation and Online Algorithms Does not take place this semester. | W | 5 credits | 2V + 1U + 1A | D. Komm | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This lecture deals with approximative algorithms for hard optimization problems and algorithmic approaches for solving online problems as well as the limits of these approaches. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Get a systematic overview of different methods for designing approximative algorithms for hard optimization problems and online problems. Get to know methods for showing the limitations of these approaches. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Approximation algorithms are one of the most successful techniques for attacking hard optimization problems. Here, we study the so-called approximation ratio, i.e., the ratio of the cost of the computed approximate solution to that of an optimal one (which is not efficiently computable). For an online problem, the whole instance is not known in advance; it arrives piecewise, and for every such piece a corresponding part of the definite output must be produced. The quality of an algorithm for such an online problem is measured by the competitive ratio, i.e., the ratio of the cost of the computed solution to the cost of an optimal solution that could be given if the whole input were known in advance. The contents of this lecture are - the classification of optimization problems by the reachable approximation ratio, - systematic methods to design approximation algorithms (e.g., greedy strategies, dynamic programming, linear programming relaxation), - methods to show non-approximability, - classic online problems such as paging or scheduling and corresponding algorithms, - randomized online algorithms, - design and analysis principles for online algorithms, and - limits of the competitive ratio and advice complexity as a way to analyze the complexity of online problems more deeply. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | The lecture is based on the following books: J. Hromkovic: Algorithmics for Hard Problems, Springer, 2004 D. Komm: An Introduction to Online Computation: Determinism, Randomization, Advice, Springer, 2016 Additional literature: A. Borodin, R. El-Yaniv: Online Computation and Competitive Analysis, Cambridge University Press, 1998 | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
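Paging, named above as a classic online problem, makes the competitive-ratio idea tangible; here is an illustrative stdlib-only Python sketch (names are mine) of the online LRU rule, together with its worst-case input of cyclic access to k + 1 pages:

```python
def lru_faults(requests, k):
    """Serve a page-request sequence with a cache of size k under the
    online LRU rule; return the number of page faults."""
    cache = []            # most recently used page kept at the end
    faults = 0
    for p in requests:
        if p in cache:
            cache.remove(p)         # hit: refresh recency
        else:
            faults += 1             # fault: load the page
            if len(cache) == k:
                cache.pop(0)        # evict the least recently used page
        cache.append(p)
    return faults

# cyclic access to k + 1 = 3 pages with k = 2 is LRU's worst case:
# every single request faults, while an optimal offline algorithm
# with the same cache would fault far less often
seq = [0, 1, 2, 0, 1, 2, 0, 1, 2]
```

Comparing the faults LRU incurs on such sequences with the offline optimum is precisely how the competitive ratio of a paging algorithm is derived in the lecture's framework.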
401-3052-10L | Graph Theory | W | 10 credits | 4V + 1U | B. Sudakov | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Basics, trees, Cayley's formula, matrix tree theorem, connectivity, theorems of Mader and Menger, Eulerian graphs, Hamilton cycles, theorems of Dirac, Ore, Erdős-Chvátal, matchings, theorems of Hall, König, Tutte, planar graphs, Euler's formula, Kuratowski's theorem, graph colorings, Brooks' theorem, 5-colorings of planar graphs, list colorings, Vizing's theorem, Ramsey theory, Turán's theorem | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The students will get an overview over the most fundamental questions concerning graph theory. We expect them to understand the proof techniques and to use them autonomously on related problems. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | Lectures will be given on the blackboard only. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | West, D.: "Introduction to Graph Theory" Diestel, R.: "Graph Theory" Further literature links will be provided in the lecture. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Students are expected to have a mathematical background and should be able to write rigorous proofs. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
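Two theorems from the abstract, the matrix tree theorem and Cayley's formula, can be checked numerically; the sketch below (stdlib-only Python, exact rational arithmetic, all names illustrative) counts spanning trees as a determinant of a reduced Laplacian:

```python
from fractions import Fraction

def det(M):
    """Determinant via Gaussian elimination over exact rationals."""
    M = [[Fraction(x) for x in row] for row in M]
    n, sign, d = len(M), 1, Fraction(1)
    for c in range(n):
        piv = next((r for r in range(c, n) if M[r][c] != 0), None)
        if piv is None:
            return 0
        if piv != c:
            M[c], M[piv] = M[piv], M[c]
            sign = -sign                      # row swap flips the sign
        d *= M[c][c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            M[r] = [a - f * b for a, b in zip(M[r], M[c])]
    return sign * d

def spanning_trees(n, edges):
    """Matrix tree theorem: the number of spanning trees equals the
    determinant of the Laplacian with one row and column deleted."""
    L = [[0] * n for _ in range(n)]
    for u, v in edges:
        L[u][u] += 1; L[v][v] += 1
        L[u][v] -= 1; L[v][u] -= 1
    minor = [row[1:] for row in L[1:]]
    return int(det(minor))

# complete graph K4: Cayley's formula predicts 4^(4-2) = 16 labeled trees
k4 = [(u, v) for u in range(4) for v in range(u + 1, 4)]
```

Running it on K_n reproduces Cayley's n^(n-2) count, while a tree input returns 1, matching the characterization of trees covered early in the course.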
401-3902-21L | Network & Integer Optimization: From Theory to Application | W | 6 credits | 3G | R. Zenklusen | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course covers various topics in Network and (Mixed-)Integer Optimization. It starts with a rigorous study of algorithmic techniques for some network optimization problems (with a focus on matching problems) and moves to key aspects of how to attack various optimization settings through well-designed (Mixed-)Integer Programming formulations. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Our goal is for students to both get a good foundational understanding of some key network algorithms and also to learn how to effectively employ (Mixed-)Integer Programming formulations, techniques, and solvers, to tackle a wide range of discrete optimization problems. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Key topics include: - Matching problems; - Integer Programming techniques and models; - Extended formulations and strong problem formulations; - Solver techniques for (Mixed-)Integer Programs; - Decomposition approaches. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | - Bernhard Korte, Jens Vygen: Combinatorial Optimization. 6th edition, Springer, 2018. - Alexander Schrijver: Combinatorial Optimization: Polyhedra and Efficiency. Springer, 2003. This work has 3 volumes. - François Vanderbeck, Laurence Wolsey: Reformulations and Decomposition of Integer Programs. Chapter 13 in: 50 Years of Integer Programming 1958-2008. Springer, 2010. - Alexander Schrijver: Theory of Linear and Integer Programming. John Wiley, 1986. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Solid background in linear algebra. Preliminary knowledge of Linear Programming is ideal but not a strict requirement. Prior attendance of the course Linear & Combinatorial Optimization is a plus. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
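Matching problems head the topic list above; the following illustrative stdlib-only Python sketch (names are mine, and the course treats far more general matching algorithms) finds a maximum matching in a bipartite graph by repeatedly searching for augmenting paths:

```python
def max_bipartite_matching(adj, n_right):
    """Maximum matching in a bipartite graph via augmenting paths
    (Kuhn's algorithm). adj[u] lists the right-side neighbors of left
    vertex u; returns the size of a maximum matching."""
    match_right = [-1] * n_right   # right vertex -> matched left vertex

    def try_augment(u, seen):
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                # v is free, or its current partner can be re-matched
                if match_right[v] == -1 or try_augment(match_right[v], seen):
                    match_right[v] = u
                    return True
        return False

    return sum(try_augment(u, set()) for u in range(len(adj)))

# 3 workers, 3 tasks: a perfect matching exists here
adj = [[0, 1], [0], [1, 2]]
size = max_bipartite_matching(adj, 3)
```

Each augmentation increases the matching by one, and Berge's theorem guarantees that a matching with no augmenting path is maximum, the kind of optimality certificate the course develops rigorously.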
402-0448-01L | Quantum Information Processing I: Concepts This theory part QIP I together with the experimental part 402-0448-02L QIP II (both offered in the Spring Semester) combine to the core course in experimental physics "Quantum Information Processing" (totally 10 ECTS credits). This applies to the Master's degree programme in Physics. | W | 5 credits | 2V + 1U | J. Home | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | The course covers the key concepts of quantum information processing, including quantum algorithms which give the quantum computer the power to compute problems outside the reach of any classical supercomputer. Key concepts such as quantum error correction are discussed in detail. They provide fundamental insights into the nature of quantum states and measurements. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | By the end of the course, students are able to explain the basic mathematical formalism of quantum mechanics and apply it to quantum information processing problems. They are able to adapt and apply these concepts and methods to analyse and discuss quantum algorithms and other quantum information-processing protocols. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The topics covered in the course will include quantum circuits, gate decomposition and universal sets of gates, efficiency of quantum circuits, quantum algorithms (Shor, Grover, Deutsch-Jozsa, ...), quantum error correction, fault-tolerant designs, and quantum simulation. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | Will be provided. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Quantum Computation and Quantum Information, by Michael Nielsen and Isaac Chuang, Cambridge University Press | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | A good understanding of finite dimensional linear algebra is recommended. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
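The simplest algorithm in the family above is Deutsch's algorithm; the stdlib-only Python sketch below simulates it with the oracle in its phase form U_f|x> = (-1)^f(x)|x> (a standard simplification of the two-qubit oracle; all names are illustrative, not course notation):

```python
import math

# the Hadamard gate as a 2x2 real matrix
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply(gate, state):
    return [sum(gate[i][j] * state[j] for j in range(2)) for i in range(2)]

def deutsch(f):
    """Deutsch's algorithm with a phase oracle: a single query to f
    decides whether f: {0,1} -> {0,1} is constant or balanced."""
    state = [1.0, 0.0]                   # start in |0>
    state = apply(H, state)              # uniform superposition
    state = [(-1) ** f(x) * a for x, a in enumerate(state)]  # one oracle call
    state = apply(H, state)              # interfere the two branches
    # probability of measuring |0>: 1 if f is constant, 0 if balanced
    return abs(state[0]) ** 2

p_const = deutsch(lambda x: 0)
p_bal = deutsch(lambda x: x)
```

Classically two evaluations of f are needed; the single-query quantum speedup here is the seed of the Deutsch-Jozsa algorithm covered in the lectures.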
Major in Visual and Interactive Computing | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Core Courses | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Number | Title | Type | ECTS | Hours | Lecturers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
252-0538-00L | Shape Modeling and Geometry Processing | W | 8 credits | 2V + 1U + 4A | O. Sorkine Hornung | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course covers the fundamentals and developments in geometric modeling and geometry processing. Topics include surface modeling based on point clouds and polygonal meshes, mesh generation, surface reconstruction, mesh fairing and parameterization, discrete differential geometry, interactive shape editing, topics in digital shape fabrication. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The students will learn how to design, program and analyze algorithms and systems for interactive 3D shape modeling and geometry processing. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Recent advances in 3D geometry processing have created a plenitude of novel concepts for the mathematical representation and interactive manipulation of geometric models. This course covers the fundamentals and some of the developments in geometric modeling and geometry processing. Topics include surface modeling based on point clouds and triangle meshes, mesh generation, surface reconstruction, mesh fairing and parameterization, discrete differential geometry, interactive shape editing and digital shape fabrication. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | Slides and course notes | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Prerequisites: Visual Computing, Computer Graphics or an equivalent class. Experience with C++ programming. Solid background in linear algebra and analysis. Some knowledge of differential geometry, computational geometry and numerical methods is helpful but not a strict requirement. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
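Mesh fairing, listed in the course content, has a classic entry point in umbrella-operator (uniform Laplacian) smoothing; the sketch below is an illustrative stdlib-only Python toy on a closed polygon (a real implementation would work on triangle meshes with cotangent weights):

```python
import math

def laplacian_smooth(verts, neighbors, lam=0.5, iters=10):
    """Umbrella-operator fairing: each vertex moves a fraction lam
    toward the centroid of its 1-ring neighbors, damping high-frequency
    noise (and, without correction, slowly shrinking the shape)."""
    verts = [list(v) for v in verts]
    for _ in range(iters):
        new = []
        for i, v in enumerate(verts):
            nb = neighbors[i]
            cx = sum(verts[j][0] for j in nb) / len(nb)
            cy = sum(verts[j][1] for j in nb) / len(nb)
            new.append([v[0] + lam * (cx - v[0]),
                        v[1] + lam * (cy - v[1])])
        verts = new
    return verts

# a "noisy" closed polygon: radii alternate between 0.7 and 1.3
n = 12
ring = [(math.cos(2 * math.pi * i / n) * (1.3 if i % 2 else 0.7),
         math.sin(2 * math.pi * i / n) * (1.3 if i % 2 else 0.7))
        for i in range(n)]
nbrs = [[(i - 1) % n, (i + 1) % n] for i in range(n)]
smoothed = laplacian_smooth(ring, nbrs)
```

The alternating-radius pattern is a high-frequency mode of the graph Laplacian and is damped fastest, which is the discrete-differential-geometry view of fairing taken in the course.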
263-3710-00L | Machine Perception | W | 8 credits | 3V + 2U + 2A | O. Hilliges, J. Song | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Recent developments in neural networks have drastically advanced the performance of machine perception systems in a variety of areas including computer vision, robotics, and human shape modeling. This course is a deep dive into deep learning algorithms and architectures with applications to a variety of perceptual and generative tasks. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Students will learn about fundamental aspects of modern deep learning approaches for perception and generation. Students will learn to implement, train and debug their own neural networks and gain a detailed understanding of cutting-edge research in learning-based computer vision, robotics, and shape modeling. The optional final project assignment will involve training a complex neural network architecture and applying it to a real-world dataset. The core competency acquired through this course is a solid foundation in deep-learning algorithms to process and interpret human-centric signals. In particular, students should be able to develop systems that deal with the problem of recognizing people in images, detecting and describing body parts, inferring their spatial configuration, performing action/gesture recognition from still images or image sequences, also considering multi-modal data, among others. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | We will focus on teaching: how to set up the problem of machine perception, the learning algorithms, network architectures, and advanced deep learning concepts in particular probabilistic deep learning models. The course covers the following main areas: I) Foundations of deep learning. II) Advanced topics like probabilistic generative modeling of data (latent variable models, generative adversarial networks, auto-regressive models, invertible neural networks, diffusion models). III) Deep learning in computer vision, human-computer interaction, and robotics. Specific topics include: I) Introduction to Deep Learning: a) Neural Networks and training (i.e., backpropagation) b) Feedforward Networks c) Timeseries modelling (RNN, GRU, LSTM) d) Convolutional Neural Networks II) Advanced topics: a) Latent variable models (VAEs) b) Generative adversarial networks (GANs) c) Autoregressive models (PixelCNN, PixelRNN, TCN, Transformer) d) Invertible Neural Networks / Normalizing Flows e) Coordinate-based networks (neural implicit surfaces, NeRF) f) Diffusion models III) Applications in machine perception and computer vision: a) Fully Convolutional architectures for dense per-pixel tasks (i.e., instance segmentation) b) Pose estimation and other tasks involving human activity c) Neural shape modeling (implicit surfaces, neural radiance fields) d) Deep Reinforcement Learning and Applications in Physics-Based Behavior Modeling | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Deep Learning, by Ian Goodfellow, Yoshua Bengio, and Aaron Courville | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | This is an advanced grad-level course that requires a background in machine learning. Students are expected to have a solid mathematical foundation, in particular in linear algebra, multivariate calculus, and probability. The course will focus on state-of-the-art research in deep learning and will not repeat the basics of machine learning. Please take note of the following conditions: 1) Students must have taken the exam in Machine Learning (252-0535-00) or have acquired equivalent knowledge. 2) All practical exercises will require basic knowledge of Python and will use libraries such as Pytorch, scikit-learn, and scikit-image. We will provide introductions to Pytorch and other libraries that are needed but will not provide introductions to basic programming or Python. The following courses are strongly recommended as prerequisites: * "Visual Computing" or "Computer Vision" The course will be assessed by a final written examination in English. No course materials or electronic devices can be used during the examination. Note that the examination will be based on the contents of the lectures, the associated reading materials, and the exercises. The exam will be a 3-hour end-of-term exam and will take place at the end of the teaching period. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
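Backpropagation, the first item under "Neural Networks and training" above, reduces in the single-neuron case to the gradient of the cross-entropy loss; this stdlib-only Python sketch (a deliberately tiny stand-in for the Pytorch-based exercises, with illustrative names) trains one sigmoid neuron to compute logical AND:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# the four input/label pairs of logical AND (linearly separable)
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w, b = [0.0, 0.0], 0.0
lr = 1.0
for _ in range(2000):                      # plain SGD epochs
    for (x1, x2), y in data:
        p = sigmoid(w[0] * x1 + w[1] * x2 + b)
        # for sigmoid + cross-entropy, the gradient w.r.t. the
        # pre-activation is simply (prediction - label)
        g = p - y
        w[0] -= lr * g * x1
        w[1] -= lr * g * x2
        b -= lr * g

preds = [round(sigmoid(w[0] * x1 + w[1] * x2 + b))
         for (x1, x2), _ in data]
```

Stacking such units in layers and applying the chain rule to propagate the per-unit gradients backwards is exactly the backpropagation algorithm the course introduces in part I.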
263-5806-00L | Digital Humans (previously Computational Models of Motion and Virtual Humans) | W | 8 credits | 3V + 2U + 2A | S. Coros, S. Tang | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course covers the core technologies required to model and simulate motions for digital humans and robotic characters. Topics include kinematic modeling, physics-based simulation, trajectory optimization, reinforcement learning, feedback control for motor skills, motion capture, data-driven motion synthesis, and ML-based generative models. They will be richly illustrated with examples. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Students will learn how to estimate human pose, shape, and motion from videos and create basic human avatars from various visual inputs. Students will also learn how to represent and algorithmically generate motions for digital characters and their real-life robotic counterparts. The lectures are accompanied by four programming assignments (written in Python or C++) and a capstone project. The deadline to cancel/deregister from the course is May 1st. Deregistration after the deadline will lead to a failing grade. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | - Basic concepts of 3D representations - Human body/hand models - Human motion capture - Non-rigid surface tracking and reconstruction - Neural rendering - Optimal control and trajectory optimization - Physics-based modeling for multibody systems - Forward and inverse kinematics - Rigging and keyframing - Reinforcement learning for locomotion | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Experience with Python and C++ programming, numerical linear algebra, multivariate calculus and probability theory. Some background in deep learning, computer vision, physics-based modeling, kinematics, and dynamics is preferred. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
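Forward kinematics, one of the content items above, amounts to accumulating joint rotations along a chain; here is an illustrative stdlib-only Python sketch for a planar articulated chain (names are mine, and real character rigs use full 3D transforms):

```python
import math

def forward_kinematics(lengths, angles):
    """Joint positions of a planar kinematic chain rooted at the origin;
    each joint angle is measured relative to the previous link."""
    x = y = theta = 0.0
    joints = [(x, y)]
    for L, a in zip(lengths, angles):
        theta += a                  # accumulate the rotation
        x += L * math.cos(theta)    # advance along the rotated link
        y += L * math.sin(theta)
        joints.append((x, y))
    return joints

# two unit links: fully extended, then with the elbow bent 90 degrees
straight = forward_kinematics([1.0, 1.0], [0.0, 0.0])
bent = forward_kinematics([1.0, 1.0], [0.0, math.pi / 2])
```

Inverse kinematics, also on the list, is the harder inverse problem of recovering the angles from a desired end-effector position, typically solved numerically via the Jacobian of this map.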
Elective Courses | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Number | Title | Type | ECTS | Hours | Lecturers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
252-0312-00L | Mobile Health and Activity Monitoring | W | 6 credits | 2V + 3A | C. Holz | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Health and activity monitoring has become a key purpose of mobile & wearable devices, e.g., phones, watches, and rings. We will cover the phenomena they capture, i.e., user behavior, actions, and human physiology, as well as the sensors, signals, and methods for processing and analysis. For the exercise, students will receive a wristband to stream and analyze activity and health signals. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The course will combine high-level concepts with low-level technical methods needed to sense, detect, and understand them. High-level: – sensing modalities for interactive systems – "activities" and "events" (exercises and other mechanical activities such as movements and resulting vibrations) – health monitoring (basic cardiovascular physiology) – affective computing (emotions, mood, personality) Lower-level: – sampling and filtering, time and frequency domains – cross-modal sensor systems, signal synchronization and correlation – event detection, classification, prediction using basic signal processing as well as learning-based methods – sensor types: optical, mechanical/acoustic, electromagnetic | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Health and activity monitoring has become a key purpose of mobile and wearable devices, including phones, (smart) watches, (smart) rings, (smart) belts, and other trackers (e.g., shoe clips, pendants). In this course, we will cover the fundamental aspects that these devices observe, i.e., user behavior, actions, and physiological dynamics of the human body, as well as the sensors, signals, and methods to capture, process, and analyze them. We will then cover methods for pattern extraction and classification on such data. The course will therefore touch on aspects of human activities, cardiovascular and pulmonary physiology, affective computing (recognizing, interpreting, and processing emotions), corresponding lower-level sensing systems (e.g., inertial sensing, optical sensing, photoplethysmography, electrodermal activity, electrocardiograms) and higher-level computer vision-based sensing (facial expressions, motions, gestures), as well as processing methods for these types of data. The course will be accompanied by a group exercise project, in which students will apply the concepts and methods taught in class. Students will receive a wearable wristband device that streams IMU data to a mobile phone (code will be provided for receiving, storing, and visualizing on the phone). Throughout the course and exercises, we will collect data of various human activities from the band, annotate them, analyze, classify, and interpret them. For this, existing and novel processing methods will be developed (plenty of related work exists), based on the collected data as well as existing datasets. We will also combine the band with signals obtained from the mobile phone to holistically capture and analyze health and activity data. Full details: https://teaching.siplab.org/mobile_health_activity_monitoring/2023/ Note: All lectures will be streamed live and recorded for later replay. Hybrid participation will be possible.
| ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
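As an illustration of the lower-level methods listed above (time-domain filtering and event detection on inertial data), a minimal step-detection sketch follows. It is not part of the course material; the function name, thresholds, and synthetic signal are all invented for illustration:

```python
import numpy as np

def detect_steps(accel, fs=50.0, smooth_win=0.25, threshold=1.5):
    """Count step-like events in a 3-axis accelerometer trace.

    accel: (N, 3) array in m/s^2; fs: sampling rate in Hz.
    Returns the indices of detected peaks in the smoothed magnitude.
    """
    # Magnitude removes orientation dependence; subtract the gravity baseline.
    mag = np.linalg.norm(accel, axis=1) - 9.81
    # Moving-average low-pass filter (simple time-domain smoothing).
    k = max(1, int(smooth_win * fs))
    smoothed = np.convolve(mag, np.ones(k) / k, mode="same")
    # A step candidate is a local maximum above the threshold.
    return [i for i in range(1, len(smoothed) - 1)
            if smoothed[i] > threshold
            and smoothed[i] >= smoothed[i - 1]
            and smoothed[i] > smoothed[i + 1]]

# Synthetic trace: 4 s at 50 Hz with two impact-like bursts on the z-axis.
t = np.arange(0, 4, 1 / 50.0)
z = 9.81 + 5.0 * np.exp(-(t - 1.0) ** 2 / 0.005) + 5.0 * np.exp(-(t - 2.5) ** 2 / 0.005)
accel = np.stack([np.zeros_like(t), np.zeros_like(t), z], axis=1)
print(len(detect_steps(accel)))  # the two bursts are detected as two events
```

In the exercise project, a comparable pipeline would run on IMU data streamed from the wristband, with learned classifiers replacing the fixed threshold for richer activities.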
Lecture notes | Copies of slides will be made available. Lectures will be streamed live as well as recorded and made available online. More information on the course site: https://teaching.siplab.org/mobile_health_activity_monitoring/2023/ | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Will be provided in the lecture | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
252-0579-00L | 3D Vision | W | 5 credits | 3G + 1A | M. Pollefeys, D. B. Baráth | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | The course covers camera models and calibration, feature tracking and matching, camera motion estimation via simultaneous localization and mapping (SLAM) and visual odometry (VO), epipolar and multi-view geometry, structure-from-motion, (multi-view) stereo, augmented reality, and image-based (re-)localization. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | After attending this course, students will: 1. understand the core concepts for recovering the 3D shape of objects and scenes from images and video. 2. be able to implement basic systems for vision-based robotics and simple virtual/augmented reality applications. 3. have a good overview of the current state of the art in 3D vision. 4. be able to critically analyze and assess current research in this area. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The goal of this course is to teach the core techniques required for robotic and augmented reality applications: how to determine the motion of a camera and how to estimate the absolute position and orientation of a camera in the real world. This course will introduce the basic concepts of 3D Vision in the form of short lectures, followed by student presentations discussing the current state of the art. The main focus of this course is student projects on 3D Vision topics, with an emphasis on robotic vision and virtual and augmented reality applications. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
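As a minimal illustration of the geometric machinery underlying camera pose estimation and structure-from-motion, the pinhole forward model x ~ K[R | t]X can be written in a few lines. This is a sketch, not course material; the intrinsics and pose below are toy values:

```python
import numpy as np

# Toy intrinsics: focal length 500 px, principal point (320, 240).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                      # camera aligned with the world axes
t = np.array([0.0, 0.0, 2.0])      # world origin 2 m in front of the camera

def project(X, K, R, t):
    """Project world points X (N, 3) to pixel coordinates (N, 2)."""
    Xc = X @ R.T + t               # world -> camera coordinates
    x = Xc @ K.T                   # apply the intrinsic matrix
    return x[:, :2] / x[:, 2:3]    # perspective divide

pts = np.array([[0.0, 0.0, 0.0],   # world origin -> principal point
                [0.5, 0.0, 0.0]])
print(project(pts, K, R, t))       # [[320. 240.] [445. 240.]]
```

Calibration, SLAM, and structure-from-motion each estimate some subset of K, R, t, and X from observed pixel coordinates by inverting this forward model.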
252-5706-00L | Mathematical Foundations of Computer Graphics and Vision | W | 5 credits | 2V + 1U + 1A | T. Aydin, A. Djelouah | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course presents the fundamental mathematical tools and concepts used in computer graphics and vision. Each theoretical topic is introduced in the context of practical vision or graphic problems, showcasing its importance in real-world applications. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The main goal is to equip the students with the key mathematical tools necessary to understand state-of-the-art algorithms in vision and graphics. In addition to the theoretical part, the students will learn how to use these mathematical tools to solve a wide range of practical problems in visual computing. After successfully completing this course, the students will be able to apply these mathematical concepts and tools to practical industrial and academic projects in visual computing. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The theory behind various mathematical concepts and tools will be introduced, and their practical utility will be showcased in diverse applications in computer graphics and vision. The course will cover topics in sampling, reconstruction, approximation, optimization, robust fitting, differentiation, quadrature and spectral methods. Applications will include 3D surface reconstruction, camera pose estimation, image editing, data projection, character animation, structure-aware geometry processing, and rendering. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
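As one concrete example of the robust-fitting topic listed above, the following RANSAC sketch fits a line to data contaminated by gross outliers. It is illustrative only; the iteration count and inlier tolerance are arbitrary choices, not taken from the course:

```python
import numpy as np

rng = np.random.default_rng(0)

def ransac_line(pts, iters=200, tol=0.1):
    """Fit y = a*x + b robustly: repeatedly fit to a random 2-point
    sample and keep the model that explains the most inliers."""
    best = np.zeros(len(pts), dtype=bool)
    for _ in range(iters):
        i, j = rng.choice(len(pts), size=2, replace=False)
        (x1, y1), (x2, y2) = pts[i], pts[j]
        if abs(x2 - x1) < 1e-12:
            continue                      # degenerate (vertical) sample
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = np.abs(pts[:, 1] - (a * pts[:, 0] + b)) < tol
        if inliers.sum() > best.sum():
            best = inliers
    # Refine with a least-squares fit on the inliers of the best model.
    return np.polyfit(pts[best, 0], pts[best, 1], 1)

# Points on y = 2x + 1 with small noise, plus gross outliers.
x = np.linspace(0, 1, 50)
y = 2 * x + 1 + rng.normal(0, 0.01, 50)
y[::3] += rng.uniform(2, 5, 17)           # corrupt every third point
a, b = ransac_line(np.stack([x, y], axis=1))
print(round(a, 1), round(b, 1))           # recovers roughly 2.0 and 1.0
```

The same sample-score-refine pattern generalizes to the vision applications named above, e.g., camera pose estimation from noisy correspondences.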
263-5052-00L | Interactive Machine Learning: Visualization & Explainability | W | 5 credits | 3G + 1A | M. El-Assady | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Visual Analytics supports the design of human-in-the-loop interfaces that enable human-machine collaboration. In this course, we will go through the fundamentals of designing interactive visualizations and later apply them to explain and interact with machine learning models. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The goal of the course is to introduce techniques for interactive information visualization and to apply these on understanding, diagnosing, and refining machine learning models. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Interactive, mixed-initiative machine learning promises to combine the efficiency of automation with the effectiveness of humans for a collaborative decision-making and problem-solving process. This can be facilitated through co-adaptive visual interfaces. This course will first introduce the foundations of information visualization design based on data characteristics, e.g., high-dimensional, geo-spatial, relational, temporal, and textual data. Second, we will discuss interaction techniques and explanation strategies to enable explainable machine learning with the tasks of understanding, diagnosing, and refining machine learning models. Tentative list of topics: 1. Visualization and Perception 2. Interaction and Explanation 3. Systems Overview | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
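For the high-dimensional data mentioned above, a common first step before designing an interactive view is to project the data to two dimensions. A minimal PCA sketch (illustrative, with synthetic data; not a course artifact):

```python
import numpy as np

def pca_project(X, k=2):
    """Project data onto its top-k principal components."""
    Xc = X - X.mean(axis=0)                  # center each feature
    cov = Xc.T @ Xc / (len(Xc) - 1)          # sample covariance matrix
    vals, vecs = np.linalg.eigh(cov)         # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:k]       # pick the top-k directions
    return Xc @ vecs[:, order]

# 200 samples in 10-D whose variance lies almost entirely in a 2-D subspace.
rng = np.random.default_rng(1)
basis = rng.normal(size=(2, 10))
X = rng.normal(size=(200, 2)) @ basis + 0.01 * rng.normal(size=(200, 10))
Y = pca_project(X, k=2)
print(Y.shape)  # (200, 2): ready for a 2-D scatter-plot view
```

An interactive visualization would then let the analyst brush, link, and re-project such embeddings rather than inspect them statically.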
Lecture notes | Course material will be provided in form of slides. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Will be provided during the course. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Basic understanding of machine learning as taught at the Bachelor's level. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-5701-00L | Scientific Visualization | W | 5 credits | 2V + 1U + 1A | M. Gross, T. Günther | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This lecture provides an introduction to the visualization of scientific and abstract data. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | This lecture provides an introduction to the visualization of scientific and abstract data. The lecture introduces the two main branches of visualization: scientific visualization and information visualization. The focus is on scientific data, demonstrating the usefulness and necessity of computer graphics in fields other than the entertainment industry. The exercises contain theoretical tasks on the mathematical foundations, such as numerical integration, differential vector calculus, and flow field analysis, while programming exercises familiarize students with the Visualization Toolkit (VTK). In a course project, the learned methods are applied to visualize a real scientific data set. The provided data sets contain measurements of volcanic eruptions, galaxy simulations, fluid simulations, meteorological cloud simulations, and asteroid impact simulations. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | This lecture opens with human cognition basics, and scalar and vector calculus. Afterwards, this is applied to the visualization of air and fluid flows, including geometry-based, topology-based and feature-based methods. Further, the direct and indirect visualization of volume data is discussed. The lecture ends with the visualization of abstract, non-spatial and multi-dimensional data by means of information visualization. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
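As a small taste of the geometry-based flow-visualization methods mentioned above, a streamline can be traced by numerically integrating the velocity field. The sketch below (not from the lecture; the field and step size are illustrative) uses classic fourth-order Runge-Kutta:

```python
import numpy as np

def rk4_streamline(v, x0, h=0.05, steps=100):
    """Trace a streamline of the velocity field v from seed point x0
    using classic fourth-order Runge-Kutta integration."""
    pts = [np.asarray(x0, dtype=float)]
    for _ in range(steps):
        x = pts[-1]
        k1 = v(x)
        k2 = v(x + 0.5 * h * k1)
        k3 = v(x + 0.5 * h * k2)
        k4 = v(x + h * k3)
        pts.append(x + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4))
    return np.array(pts)

# Rigid rotation about the origin: streamlines are circles, so the
# distance of every traced point from the origin should stay constant.
v = lambda x: np.array([-x[1], x[0]])
line = rk4_streamline(v, x0=[1.0, 0.0])
radii = np.linalg.norm(line, axis=1)
print(radii.min().round(6), radii.max().round(6))  # both stay very close to 1.0
```

In the exercises, VTK offers comparable stream-tracing functionality operating on real simulation grids rather than an analytic field.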
Prerequisites / Notice | Fundamentals of differential calculus. Knowledge on numerical mathematics, computer algebra systems, as well as ordinary and partial differential equations is an asset, but not required. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Seminar | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Number | Title | Type | ECTS | Hours | Lecturers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
252-2603-00L | Seminar on Systems Security The deadline for deregistering expires at the end of the second week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar. | W | 2 credits | 2S | S. Shinde | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | The seminar focuses on critical thinking and critique of fundamental as well as recent advances in systems security. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The learning objective is to analyze selected research papers published at top systems+security venues and then identify open problems in this space. The seminar will achieve this via several components: reading papers, technical presentations, writing analysis and critique summaries, class discussions, and exploring potential research topics. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Each student will pick one paper from the selected list, present it in the class, and lead the discussion for that paper. During the semester, all students will select, read, and submit critique summaries for at least 8 research papers from the list. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Students who are either interested in security research or are exploring thesis topics are highly encouraged to take this course. Students with systems/architecture/verification/PL expertise and basic security understanding are welcome. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
252-4102-00L | Seminar on Randomized Algorithms and Probabilistic Methods The deadline for deregistering expires at the end of the second week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar. | W | 2 credits | 2S | A. Steger | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | The aim of the seminar is to study papers which bring the students to the forefront of today's research topics. This semester we will study selected papers of the conference Symposium on Discrete Algorithms (SODA22). | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Read papers from the forefront of today's research; learn how to give a scientific talk. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | The seminar is open for both students from mathematics and students from computer science. As prerequisite we require that you passed the course Randomized Algorithms and Probabilistic Methods (or equivalent, if you come from abroad). | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
252-5704-00L | Advanced Methods in Computer Graphics The deadline for deregistering expires at the end of the second week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar. | W | 2 credits | 2S | M. Gross, O. Sorkine Hornung | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This seminar covers advanced topics in computer graphics with a focus on the latest research results. Topics include modeling, rendering, visualization, animation, physical simulation, computational photography, and others. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The goal is to obtain an in-depth understanding of actual problems and research topics in the field of computer graphics as well as improve presentation and critical analysis skills. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
261-5113-00L | Computational Challenges in Medical Genomics | W | 2 credits | 2S | A. Kahles | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This seminar discusses recent relevant contributions to the fields of computational genomics, algorithmic bioinformatics, statistical genetics and related areas. Each participant will hold a presentation and lead the subsequent discussion. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Preparing and holding a scientific presentation in front of peers is a central part of working in the scientific domain. In this seminar, the participants will learn how to efficiently summarize the relevant parts of a scientific publication, critically reflect on its contents, and present them to an audience. The skills needed to successfully present the key points of existing research work are the same as those needed to communicate one's own research ideas. In addition to holding a presentation, each student will both contribute to and lead a discussion section on the topics presented in the class. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The topics covered in the seminar are related to recent computational challenges that arise from the fields of genomics and biomedicine, including but not limited to genomic variant interpretation, genomic sequence analysis, compressive genomics tasks, single-cell approaches, privacy considerations, statistical frameworks, etc. Both recently published works contributing novel ideas to the areas mentioned above as well as seminal contributions from the past are amongst the list of selected papers. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Knowledge of algorithms and data structures and interest in applications in genomics and computational biomedicine. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-2100-00L | Research Topics in Software Engineering The deadline for deregistering expires at the end of the second week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar. | W | 2 credits | 2S | Z. Su, M. Vechev, R. Jung | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This seminar is an opportunity to become familiar with current research in software engineering and more generally with the methods and challenges of scientific research. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Each student will be asked to study some papers from the recent software engineering literature and review them. This is an exercise in critical review and analysis. Active participation is required (a presentation of a paper as well as participation in discussions). | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The aim of this seminar is to introduce students to recent research results in the area of programming languages and software engineering. To accomplish that, students will study and present research papers in the area as well as participate in paper discussions. The papers will span topics in both theory and practice, including papers on program verification, program analysis, testing, programming language design, and development tools. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | The publications to be presented will be announced on the seminar home page at least one week before the first session. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Papers will be distributed during the first lecture. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-2926-00L | Deep Learning for Big Code The deadline for deregistering expires at the end of the second week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar. | W | 2 credits | 2S | V. Raychev | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | The seminar covers some of the latest and most exciting developments (industrial and research) in the field of Deep Learning for Code, including new methods and latest systems, as well as open challenges and opportunities. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The objective of the seminar is to: - Introduce students to the field of Deep Learning for Big Code. - Learn how machine learning models can be used to solve practical challenges in software engineering and programming beyond traditional methods. - Highlight the latest research and work opportunities in industry and academia available on this topic. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The last 5 years have seen increased interest in applying advanced machine learning techniques such as deep learning to a new kind of data: program code. As the size of open source code increases dramatically (over 980 billion lines of code written by humans), so does the opportunity for new kinds of deep probabilistic methods and commercial systems that leverage this data to revolutionize software creation and address hard problems not previously possible. Examples include: machines writing code, program de-obfuscation for security, code search, and many more. Interestingly, this new type of data, unlike natural language and images, introduces technical challenges not typically encountered when working with standard datasets (e.g., images, videos, natural language), for instance, finding the right representation over which deep learning operates. This in turn has the potential to drive new kinds of machine learning models with broad applicability. Because of this, there has been substantial interest over the last few years in industry (e.g., companies such as Facebook, and various start-ups in the space such as http://deepcode.ai), academia (e.g., http://plml.ethz.ch), and government agencies (e.g., DARPA) in using machine learning to automate various programming tasks. In this seminar, we will cover some of the latest and most exciting developments in the field of Deep Learning for Code, including new methods and latest systems, as well as open challenges and opportunities. The seminar is carried out as a set of presentations chosen from a list of available papers. The grade is determined as a function of the presentation, handling questions and answers, and participation. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | The seminar is ideally suited for M.Sc. students in Computer Science. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-3600-00L | Heterogeneous Systems Seminar The deadline for deregistering expires at the end of the second week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar. | W | 2 credits | 2S | M. J. Giardino | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | The seminar covers heterogeneous systems, those that make use of different types of computing (GPUs, FPGA, ASICs, etc.) and/or memory (NVM/SCM). Our focus will be the systems and architectures that use these devices. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The objective of this course is to familiarize students with important topics in heterogeneous systems, past, present, and future: the devices, the architectures, and their uses. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The seminar consists of student presentations based upon a list of papers provided at the beginning of the course. Presentations will be done in teams. Students will be allotted a 45 minute time slot consisting of a 30 minute presentation and 15 minutes for questions. Grading is based upon the quality of the presentation, the coverage of the paper including necessary background and follow-on work, and the ability to understand and critique the paper and technology. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-3712-00L | Advanced Seminar on Computational Haptics The deadline for deregistering expires at the end of the second week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar. | W | 2 credits | 2S | J. J. Zarate | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Haptic rendering technologies stimulate the user’s senses of touch and motion just as felt when interacting with physical objects. Actuation techniques need to address three questions: 1) What to actuate, 2) How to actuate it and 3) When to actuate it. We will approach each of these questions from a heavily technical perspective, with a focus on optimization and machine learning to find answers. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The goal of the seminar is to familiarize students with exciting new research topics in this important area, but also to teach basic scientific writing and oral presentation skills. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Haptic rendering is the use of technology that stimulates the senses of touch and motion as they would be felt by a user interacting directly with physical objects. This usually involves hardware capable of delivering these sensations. Three questions arise here: 1) What to actuate, 2) How to actuate it, and 3) When to actuate it. We will approach these questions from a heavily technical perspective, usually with an optimization or machine learning focus. Papers from scientific venues such as CHI, UIST, and SIGGRAPH that (partially) answer these questions will be examined in depth. Students present and discuss the papers to extract techniques and insights that can be applied to software and hardware projects. Topics revolve around computational design, sensor placement, user state inference (through machine learning), and actuation as an optimization problem. The seminar will have a different structure from regular seminars to encourage more discussion and a deeper learning experience. We will use a case-study format where all students read the same paper each week but fulfill different roles and hence prepare with different viewpoints in mind ("presenter", "historian", "PhD", and "journalist"). The final deliverables are: a 20-minute presentation as the presenter, a 5-minute presentation as the historian, a one-page (A4) research proposal as the PhD, and a one-page (A4) summary of the discussion as the journalist. 
Example papers are:
– Tactile Rendering Based on Skin Stress Optimization (http://mslab.es/projects/TactileRenderingSkinStress/), SIGGRAPH 2020
– SimuLearn: Fast and Accurate Simulator to Support Morphing Materials Design and Workflows (https://dl.acm.org/doi/10.1145/3379337.3415867), UIST 2020
– Fabrication-in-the-Loop Co-Optimization of Surfaces and Styli for Drawing Haptics (https://www.pdf.inf.usi.ch/projects/SurfaceStylusCoOpt/index.html), SIGGRAPH 2020
For each topic, a paper will be chosen that represents the state of the art of research or seminal work that inspired and fostered future work. Students will learn how to incorporate computational methods into systems that involve software, hardware, and, very importantly, users. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Computational Interaction, edited by Antti Oulasvirta, Per Ola Kristensson, Xiaojun Bi, and Andrew Howes, 2018. Freely available as a PDF through the ETH network. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-4203-00L | Geometry: Combinatorics and Algorithms The deadline for deregistering expires at the end of the second week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar. | W | 2 credits | 2S | B. Gärtner, M. Hoffmann, E. Welzl, P. Schnider | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This seminar complements the course Geometry: Combinatorics & Algorithms. Students of the seminar will present original research papers, some classic and some of them very recent. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Each student is expected to read, understand, and elaborate on a selected research paper. To this end, (s)he should give a 45-min. presentation about the paper. The process includes * getting an overview of the related literature; * understanding and working out the background/motivation: why and where are the questions addressed relevant? * understanding the contents of the paper in all details; * selecting parts suitable for the presentation; * presenting the selected parts in such a way that an audience with some basic background in geometry and graph theory can easily understand and appreciate it. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | This seminar is held once a year and complements the course Geometry: Combinatorics & Algorithms. Students of the seminar will present original research papers, some classic and some of them very recent. The seminar is a good preparation for a master, diploma, or semester thesis in the area. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Prerequisite: Successful participation in the course "Geometry: Combinatorics & Algorithms" (takes place every HS) is required. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-4651-00L | Current Topics in Cryptography The deadline for deregistering expires at the end of the second week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar. | W | 2 credits | 2S | D. Hofheinz, U. Maurer, K. Paterson | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | In this seminar course, students present and discuss a variety of recent research papers in Cryptography. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Independent study of scientific literature and assessment of its contributions as well as learning and practicing presentation techniques. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The course lecturers will provide a list of papers from which students will select. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | The reading list will be published on the course website. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Ideally, students will have taken the D-INFK Bachelor's course "Information Security" or an equivalent course at Bachelor's level. Ideally, they will have attended or will attend in parallel the Master's course "Applied Cryptography". | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-5002-00L | Generative Visual Models The deadline for deregistering expires at the end of the second week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar. | W | 4 credits | 2S + 2A | T. Hofmann | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This seminar investigates generative models for image synthesis, which can be controlled via language prompts and visual seeding. The relevant methods will be explained in a few initial classes. Participants will study the research literature and develop project ideas in small groups, which will then be implemented. Presentation of research papers, project ideas, and results is a key component. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The goal of this class is for participants to find, read, understand, and critically assess research literature in order to reach the current state of knowledge in the field. Moreover, the project work aims to enrich these readings with hands-on experience and allows students to develop creative ideas of their own. This is meant to provide a holistic research experience in small teams. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Phase 1: Introduction & Background
During the first weeks of the semester, lectures will provide the technical background to understand visual generative models. This includes a historic overview as well as technical deep dives into specialized topics such as stable diffusion and contrastive learning. There will also be a tutorial on a suitable software framework to explore and fine-tune such models. Each participant will complete a graded pen & paper exercise to check on progress. (20% of the grade: correctness of answers.)
Phase 2: Reading & Planning
In the second phase, participants will split up into teams (ideal size 3) and perform independent reading and planning towards a project idea. Paper suggestions and project sketches will be distributed to provide guidance and inspiration. During this time, participants are also expected to familiarize themselves with the experimental setup (we will locally host models on our GPU servers) and perform some simple warm-up or proof-of-concept experiments to inform the project definition. Each group will give a 15+5 min project pitch and will give/receive feedback from other teams. (30% of the grade: creativity of the idea, clarity of project articulation, recognition of existing work.)
Phase 3: Project Execution & Presentation
In the third phase, teams will implement their project and run the designed experiments to answer the articulated research questions or goals. Participants will have (limited) access to local GPU servers. Each group will produce a written project report and deliver a presentation. (50% of the grade: success of the project, quality of the experiments, quality of the slides/presentation.) | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | This hybrid course unit is open to Master's students enrolled in the Computer Science or Data Science Master program. Enrollment is limited to 20 students. A sufficient background in machine learning (e.g. 252-0220-00L Intro ML, 252-0535-00L Advanced ML) is assumed. The workload during Phases 1-2 will be moderate, but during Phase 3, we expect more intense team work. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-5225-00L | Advanced Topics in Machine Learning and Data Science The deadline for deregistering expires at the end of the fourth week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar. | W | 2 credits | 2S | F. Perez Cruz | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | In this seminar, recent papers of the machine learning and data science literature are presented and discussed. Possible topics cover statistical models, machine learning algorithms, and their applications. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The seminar “Advanced Topics in Machine Learning and Data Science” familiarizes students with recent developments in machine learning and data science. Recently published articles, as well as influential papers, have to be presented and critically reviewed. The students will learn how to structure a scientific presentation, which covers the motivation, key ideas and main results of a scientific paper. An important goal of the seminar presentation is to summarize the essential ideas of the paper in sufficient depth for the audience to be able to follow its main conclusion, especially why the article is (or is not) worth attention. The presentation style will play an important role and should reach the level of professional scientific presentations. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The seminar will cover a number of recent papers which have emerged as important contributions to the machine learning and data science literature. The topics will vary from year to year, but they are centered on methodological issues in machine learning and its application, not only to text or images, but also to other scientific domains such as medicine, climate science, or physics. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | The papers will be presented in the first session of the seminar. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-5904-00L | Deep Learning for Computer Vision: Seminal Work The deadline for deregistering expires at the end of the first week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar. | W | 2 credits | 2S | I. Armeni, H. Blum | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This seminar covers seminal papers on the topic of deep learning for computer vision. The students will present and discuss the papers and gain an understanding of the most influential research in this area - both past and present. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The objectives of this seminar are two-fold. Firstly, the aim is to provide a solid understanding of key contributions to the field of deep learning for vision (including a historical perspective as well as recent work). Secondly, the students will learn to critically read and analyse original research papers and judge their impact, as well as how to give a scientific presentation and lead a discussion on their topic. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The seminar will start with introductory lectures to provide (1) a compact overview of challenges and relevant machine learning and deep learning research, and (2) a tutorial on critical analysis and presentation of research papers. Each student then chooses one paper from the provided collection to present during the remainder of the seminar. The students will be supported in the preparation of their presentation by the seminar assistants. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | The selection of research papers will be presented at the beginning of the semester. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | The course "Machine Learning" is recommended. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
227-0559-00L | Seminar in Deep Neural Networks Number of participants limited to 25. | W | 2 credits | 2S | R. Wattenhofer | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | In this seminar participating students present and discuss recent research papers in the area of deep neural networks. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | We aim to give the students an in-depth view of the current advances in the area by discussing recent papers as well as current issues and difficulties surrounding deep neural networks. The students will learn to read, evaluate and challenge research papers, prepare coherent scientific presentations and lead a discussion on their topic. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The seminar will cover a range of research directions, with a focus on Graph Neural Networks, Algorithmic Learning, Reinforcement Learning and Natural Language Processing. It will be structured in blocks with each focus area being briefly introduced before presenting and discussing recent research papers. Papers will be allocated to the students based on their preferences. For more information see www.disco.ethz.ch/courses.html. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | Slides of presentations will be made available. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | The paper selection can be found on www.disco.ethz.ch/courses.html. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | It is expected that students have prior knowledge and interest in machine and deep learning, for instance by having attended appropriate courses. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
227-0559-10L | Seminar in Sustainable Networking | W | 2 credits | 2S | L. Vanbever, R. Jacob | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | In this seminar, students review, present, and discuss recent research papers on computer networks, with a focus on sustainable networking. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | By the end of the seminar, students will be able to: 1. Efficiently read and critically assess scientific papers; 2. Discuss technical topics with an audience of peers; 3. Discuss the challenges of sustainable computing and networking. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The seminar will start with one introductory session. From the third week on, participating students will review, present, and discuss research papers. Two papers will be discussed every two weeks. Each student must choose a paper from a given list, prepare and give a (short) presentation on the paper's topic, and lead the follow-up discussion. In addition, all students submit one (short) review for the two papers presented every week in class. Students will be evaluated based on their reviews, their presentation, and their leadership of and participation in the paper discussions. The exact course content varies over time. For details, refer to the course website: https://seminar-net.ethz.ch/ | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | The slides of each presentation will be made available on the website. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | The paper selection will be made available on the course website. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Communication Networks (227-0120-00L) or equivalents. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
401-3900-16L | Advanced Topics in Discrete Optimization Number of participants limited to 12. | W | 4 credits | 2S | R. Zenklusen, D. E. K. Hershkowitz, R. Santiago Torres | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | In this seminar we will discuss selected topics in discrete optimization. The main focus is on primarily recent research papers in the field of Combinatorial Optimization. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The goal of the seminar is twofold. First, we aim at improving students' presentation and communication skills. In particular, students are to present a research paper to their peers and the instructors in a clear and understandable way. Second, students learn a selection of recent cutting-edge approaches in the field of Combinatorial Optimization by attending the other students' talks. Very active participation in the seminar helps students build up the necessary skills for parsing and digesting advanced technical texts on a significantly higher complexity level than usual textbooks. A key goal is that students prepare their presentations in a concise and accessible way to make sure that other participants get a clear idea of the presented results and techniques. Students intending to do a project in optimization are strongly encouraged to participate. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The selected topics will cover various classical and modern results in Combinatorial Optimization. Contrary to prior years, a very significant component of the seminar will be interactive discussions where active participation of the students is required. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | The learning material will be in the form of scientific papers. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Requirements: We expect students to have a thorough understanding of topics covered in the course "Linear & Combinatorial Optimization". | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
851-0740-00L | Big Data, Law, and Policy | W | 3 credits | 2S | S. Bechtold | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course introduces students to societal perspectives on the big data revolution. Discussing important contributions from machine learning and data science, the course explores their legal, economic, ethical, and political implications in the past, present, and future. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | This course is intended both for students of machine learning and data science who want to reflect on the societal implications of their field, and for students from other disciplines who want to explore the societal impact of data sciences. The course will first discuss some of the methodological foundations of machine learning, followed by a discussion of research papers and real-world applications where big data and societal values may clash. Potential topics include the implications of big data for privacy, liability, insurance, health systems, voting, and democratic institutions, as well as the use of predictive algorithms for price discrimination and the criminal justice system. Guest speakers, weekly readings and reaction papers ensure a lively debate among participants from various backgrounds. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Practical Work | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Number | Title | Type | ECTS | Hours | Lecturers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
252-0570-00L | Game Programming Laboratory | W | 10 credits | 9P | B. Sumner | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | The goal of this course is the in-depth understanding of the technology and programming underlying computer games. Students gradually design and develop a computer game in small groups and get acquainted with the art of game programming. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The goal of this new course is to acquaint students with the technology and art of programming modern three-dimensional computer games. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | This course addresses modern three-dimensional computer game technology. During the course, small groups of students will design and develop a computer game. Focus will be put on technical aspects of game development, such as rendering, cinematography, interaction, physics, animation, and AI. In addition, we will cultivate creative thinking for advanced gameplay and visual effects. The "laboratory" format involves a practical, hands-on approach with traditional lectures. We will meet once a week to discuss technical issues and to track progress. For development we use MonoGame, which is a collection of libraries and tools that facilitate game development. While development will take place on PCs, we will ultimately deploy our games on the Xbox One console. At the end of the course we will present our results to the public. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | Game Design Workshop: A Playcentric Approach to Creating Innovative Games by Tracy Fullerton | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | The number of participants is limited. Prerequisites include: - Good programming skills (Java, C++, C#, etc.) - CG experience: Students should have taken, at a minimum, Visual Computing. Higher level courses are recommended, such as Introduction to Computer Graphics, Surface Representations and Geometric Modeling, and Physically-based Simulation in Computer Graphics. Last cancellation/deregistration date for this ungraded semester performance: second Friday in March! Please note that after that date no deregistration will be accepted and the course will be considered as "fail". | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
252-0817-00L | Distributed Systems Laboratory | W | 10 credits | 9P | G. Alonso, T. Hoefler, A. Klimovic, T. Roscoe, R. Wattenhofer, C. Zhang | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course involves the participation in a substantial development and/or evaluation project involving distributed systems technology. There are projects available in a wide range of areas: from web services to ubiquitous computing, including wireless networks, ad-hoc networks, and distributed applications on mobile phones. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Students acquire practical knowledge about technologies from the area of distributed systems. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | This course involves the participation in a substantial development and/or evaluation project involving distributed systems technology. There are projects available in a wide range of areas: from web services to ubiquitous computing, including wireless networks, ad-hoc networks, and distributed applications on mobile phones. The objective of the project is for the students to gain hands-on experience with real products and the latest technology in distributed systems. There is no lecture associated with the course. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-4630-00L | Computer-Aided Modelling and Reasoning | W | 8 credits | 7P | C. Sprenger | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | The "computer-aided modelling and reasoning" lab is a hands-on course about using an interactive theorem prover to construct formal models of algorithms, protocols, and programming languages and to reason about their properties. The lab has two parts: The first introduces various modelling and proof techniques. The second part consists of a project in which the students apply these techniques. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The students learn to effectively use a theorem prover to create unambiguous models and rigorously analyse them. They learn how to write precise and concise specifications, to exploit the theorem prover as a tool for checking and analysing such models and for taming their complexity, and to extract certified executable implementations from such specifications. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The "computer-aided modelling and reasoning" lab is a hands-on course about using an interactive theorem prover to construct formal models of algorithms, protocols, and programming languages and to reason about their properties. The focus is on applying logical methods to concrete problems supported by a theorem prover. The course will demonstrate the challenges of formal rigor, but also the benefits of machine support in modelling, proving and validating. The lab will have two parts: The first part introduces basic and advanced modelling techniques (functional programs, inductive definitions, modules), the associated proof techniques (term rewriting, resolution, induction, proof automation), and compilation of the models to certified executable code. In the second part, the students work in teams of two on a project assignment in which they apply these techniques: they build a formal model and prove its desired properties. The project lies in the area of programming languages, model checking, or information security. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Textbook: Tobias Nipkow, Gerwin Klein. Concrete Semantics, part 1 (www.concrete-semantics.org) | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Last cancellation/deregistration date for this ungraded semester performance: 31 March 2023! Please note that after that date no deregistration will be accepted and the course will be considered as "fail". | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-0650-00L | Practical Work | W | 8 credits | 17A | Supervisors | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Practical work shall foster the student's ability to solve technological scientific problems by applying acquired knowledge and social competencies. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | see above | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Practical work refers either to a semester project or a lab course, which is conducted under the supervision of a professor of the department of computer science. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Minors | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Minor in Computer Graphics | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Number | Title | Type | ECTS | Hours | Lecturers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
252-0538-00L | Shape Modeling and Geometry Processing | W | 8 credits | 2V + 1U + 4A | O. Sorkine Hornung | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course covers the fundamentals and developments in geometric modeling and geometry processing. Topics include surface modeling based on point clouds and polygonal meshes, mesh generation, surface reconstruction, mesh fairing and parameterization, discrete differential geometry, interactive shape editing, and topics in digital shape fabrication. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The students will learn how to design, program and analyze algorithms and systems for interactive 3D shape modeling and geometry processing. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Recent advances in 3D geometry processing have created a plenitude of novel concepts for the mathematical representation and interactive manipulation of geometric models. This course covers the fundamentals and some of the developments in geometric modeling and geometry processing. Topics include surface modeling based on point clouds and triangle meshes, mesh generation, surface reconstruction, mesh fairing and parameterization, discrete differential geometry, interactive shape editing and digital shape fabrication. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | Slides and course notes | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Prerequisites: Visual Computing, Computer Graphics or an equivalent class. Experience with C++ programming. Solid background in linear algebra and analysis. Some knowledge of differential geometry, computational geometry and numerical methods is helpful but not a strict requirement. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
252-5706-00L | Mathematical Foundations of Computer Graphics and Vision | W | 5 credits | 2V + 1U + 1A | T. Aydin, A. Djelouah | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course presents the fundamental mathematical tools and concepts used in computer graphics and vision. Each theoretical topic is introduced in the context of practical vision or graphic problems, showcasing its importance in real-world applications. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The main goal is to equip the students with the key mathematical tools necessary to understand state-of-the-art algorithms in vision and graphics. In addition to the theoretical part, the students will learn how to use these mathematical tools to solve a wide range of practical problems in visual computing. After successfully completing this course, the students will be able to apply these mathematical concepts and tools to practical industrial and academic projects in visual computing. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The theory behind various mathematical concepts and tools will be introduced, and their practical utility will be showcased in diverse applications in computer graphics and vision. The course will cover topics in sampling, reconstruction, approximation, optimization, robust fitting, differentiation, quadrature and spectral methods. Applications will include 3D surface reconstruction, camera pose estimation, image editing, data projection, character animation, structure-aware geometry processing, and rendering. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-5701-00L | Scientific Visualization | W | 5 credits | 2V + 1U + 1A | M. Gross, T. Günther | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This lecture provides an introduction to the visualization of scientific and abstract data. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | This lecture provides an introduction to the visualization of scientific and abstract data. The lecture introduces the two main branches of visualization: scientific visualization and information visualization. The focus is on scientific data, demonstrating the usefulness and necessity of computer graphics in fields other than the entertainment industry. The exercises contain theoretical tasks on the mathematical foundations such as numerical integration, differential vector calculus, and flow field analysis, while programming exercises familiarize students with the Visualization Toolkit (VTK). In a course project, the learned methods are applied to visualize one real scientific data set. The provided data sets contain measurements of volcanic eruptions, galaxy simulations, fluid simulations, meteorological cloud simulations and asteroid impact simulations. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | This lecture opens with human cognition basics, and scalar and vector calculus. Afterwards, this is applied to the visualization of air and fluid flows, including geometry-based, topology-based and feature-based methods. Further, the direct and indirect visualization of volume data is discussed. The lecture ends with the visualization of abstract, non-spatial and multi-dimensional data by means of information visualization. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Fundamentals of differential calculus. Knowledge of numerical mathematics, computer algebra systems, as well as ordinary and partial differential equations is an asset, but not required. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-5806-00L | Digital Humans Previously Computational Models of Motion and Virtual Humans | W | 8 credits | 3V + 2U + 2A | S. Coros, S. Tang | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course covers the core technologies required to model and simulate motions for digital humans and robotic characters. Topics include kinematic modeling, physics-based simulation, trajectory optimization, reinforcement learning, feedback control for motor skills, motion capture, data-driven motion synthesis, and ML-based generative models. They will be richly illustrated with examples. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Students will learn how to estimate human pose, shape, and motion from videos and create basic human avatars from various visual inputs. Students will also learn how to represent and algorithmically generate motions for digital characters and their real-life robotic counterparts. The lectures are accompanied by four programming assignments (written in Python or C++) and a capstone project. The deadline to cancel/deregister from the course is May 1st. Deregistration after the deadline will lead to a fail. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | - Basic concepts of 3D representations - Human body/hand models - Human motion capture - Non-rigid surface tracking and reconstruction - Neural rendering - Optimal control and trajectory optimization - Physics-based modeling for multibody systems - Forward and inverse kinematics - Rigging and keyframing - Reinforcement learning for locomotion | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Experience with Python and C++ programming, numerical linear algebra, multivariate calculus and probability theory. Some background in deep learning, computer vision, physics-based modeling, kinematics, and dynamics is preferred. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Minor in Computer Vision | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Number | Title | Type | ECTS | Hours | Lecturers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
252-0579-00L | 3D Vision | W | 5 credits | 3G + 1A | M. Pollefeys, D. B. Baráth | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | The course covers camera models and calibration, feature tracking and matching, camera motion estimation via simultaneous localization and mapping (SLAM) and visual odometry (VO), epipolar and multi-view geometry, structure-from-motion, (multi-view) stereo, augmented reality, and image-based (re-)localization. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | After attending this course, students will: 1. understand the core concepts for recovering 3D shape of objects and scenes from images and video. 2. be able to implement basic systems for vision-based robotics and simple virtual/augmented reality applications. 3. have a good overview of the current state of the art in 3D vision. 4. be able to critically analyze and assess current research in this area. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The goal of this course is to teach the core techniques required for robotic and augmented reality applications: How to determine the motion of a camera and how to estimate the absolute position and orientation of a camera in the real world. This course will introduce the basic concepts of 3D Vision in the form of short lectures, followed by student presentations discussing the current state of the art. The main focus of this course is student projects on 3D Vision topics, with an emphasis on robotic vision and virtual and augmented reality applications. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-3710-00L | Machine Perception | W | 8 credits | 3V + 2U + 2A | O. Hilliges, J. Song | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Recent developments in neural networks have drastically advanced the performance of machine perception systems in a variety of areas including computer vision, robotics, and human shape modeling. This course is a deep dive into deep learning algorithms and architectures with applications to a variety of perceptual and generative tasks. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Students will learn about fundamental aspects of modern deep learning approaches for perception and generation. Students will learn to implement, train and debug their own neural networks and gain a detailed understanding of cutting-edge research in learning-based computer vision, robotics, and shape modeling. The optional final project assignment will involve training a complex neural network architecture and applying it to a real-world dataset. The core competency acquired through this course is a solid foundation in deep-learning algorithms to process and interpret human-centric signals. In particular, students should be able to develop systems that deal with the problem of recognizing people in images, detecting and describing body parts, inferring their spatial configuration, performing action/gesture recognition from still images or image sequences, also considering multi-modal data, among others. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | We will focus on teaching: how to set up the problem of machine perception, the learning algorithms, network architectures, and advanced deep learning concepts in particular probabilistic deep learning models. The course covers the following main areas: I) Foundations of deep learning. II) Advanced topics like probabilistic generative modeling of data (latent variable models, generative adversarial networks, auto-regressive models, invertible neural networks, diffusion models). III) Deep learning in computer vision, human-computer interaction, and robotics. Specific topics include: I) Introduction to Deep Learning: a) Neural Networks and training (i.e., backpropagation) b) Feedforward Networks c) Timeseries modelling (RNN, GRU, LSTM) d) Convolutional Neural Networks II) Advanced topics: a) Latent variable models (VAEs) b) Generative adversarial networks (GANs) c) Autoregressive models (PixelCNN, PixelRNN, TCN, Transformer) d) Invertible Neural Networks / Normalizing Flows e) Coordinate-based networks (neural implicit surfaces, NeRF) f) Diffusion models III) Applications in machine perception and computer vision: a) Fully Convolutional architectures for dense per-pixel tasks (i.e., instance segmentation) b) Pose estimation and other tasks involving human activity c) Neural shape modeling (implicit surfaces, neural radiance fields) d) Deep Reinforcement Learning and Applications in Physics-Based Behavior Modeling | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Deep Learning Book by Ian Goodfellow and Yoshua Bengio | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | This is an advanced grad-level course that requires a background in machine learning. Students are expected to have a solid mathematical foundation, in particular in linear algebra, multivariate calculus, and probability. The course will focus on state-of-the-art research in deep learning and will not repeat the basics of machine learning. Please take note of the following conditions: 1) Students must have taken the exam in Machine Learning (252-0535-00) or have acquired equivalent knowledge 2) All practical exercises will require basic knowledge of Python and will use libraries such as Pytorch, scikit-learn, and scikit-image. We will provide introductions to Pytorch and other libraries that are needed but will not provide introductions to basic programming or Python. The following courses are strongly recommended as prerequisites: * "Visual Computing" or "Computer Vision" The course will be assessed by a final written examination in English. No course materials or electronic devices can be used during the examination. Note that the examination will be based on the contents of the lectures, the associated reading materials, and the exercises. The exam will be a 3-hour end-of-term exam and take place at the end of the teaching period. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-5052-00L | Interactive Machine Learning: Visualization & Explainability | W | 5 credits | 3G + 1A | M. El-Assady | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Visual Analytics supports the design of human-in-the-loop interfaces that enable human-machine collaboration. In this course, we will go through the fundamentals of designing interactive visualizations and later apply them to explain and interact with machine learning models. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The goal of the course is to introduce techniques for interactive information visualization and to apply these on understanding, diagnosing, and refining machine learning models. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Interactive, mixed-initiative machine learning promises to combine the efficiency of automation with the effectiveness of humans for a collaborative decision-making and problem-solving process. This can be facilitated through co-adaptive visual interfaces. This course will first introduce the foundations of information visualization design based on data characteristics, e.g., high-dimensional, geo-spatial, relational, temporal, and textual data. Second, we will discuss interaction techniques and explanation strategies to enable explainable machine learning with the tasks of understanding, diagnosing, and refining machine learning models. Tentative list of topics: 1. Visualization and Perception 2. Interaction and Explanation 3. Systems Overview | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | Course material will be provided in form of slides. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Will be provided during the course. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Basic understanding of machine learning as taught at the Bachelor's level. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Minor in Data Management | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Number | Title | Type | ECTS | Hours | Lecturers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
227-0558-00L | Principles of Distributed Computing | W | 7 credits | 2V + 2U + 2A | R. Wattenhofer | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | We study the fundamental issues underlying the design of distributed systems: communication, coordination, fault-tolerance, locality, parallelism, self-organization, symmetry breaking, synchronization, uncertainty. We explore essential algorithmic ideas and lower bound techniques. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Distributed computing is essential in modern computing and communications systems. Examples are on the one hand large-scale networks such as the Internet, and on the other hand multiprocessors such as your new multi-core laptop. This course introduces the principles of distributed computing, emphasizing the fundamental issues underlying the design of distributed systems and networks: communication, coordination, fault-tolerance, locality, parallelism, self-organization, symmetry breaking, synchronization, uncertainty. We explore essential algorithmic ideas and lower bound techniques, basically the "pearls" of distributed computing. We will cover a fresh topic every week. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Distributed computing models and paradigms, e.g. message passing, shared memory, synchronous vs. asynchronous systems, time and message complexity, peer-to-peer systems, small-world networks, social networks, sorting networks, wireless communication, and self-organizing systems. Distributed algorithms, e.g. leader election, coloring, covering, packing, decomposition, spanning trees, mutual exclusion, store and collect, arrow, ivy, synchronizers, diameter, all-pairs-shortest-path, wake-up, and lower bounds | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
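As an illustration of the algorithmic flavor of the material, here is a hedged sketch of the classic LCR leader-election algorithm on a synchronous unidirectional ring, simulated round by round; all function and variable names are our own, not course material.

```python
def lcr_leader_election(ids):
    """Each node sends its id clockwise; nodes forward only strictly larger
    ids and drop smaller ones. The node that receives its own id wins."""
    n = len(ids)
    in_flight = list(ids)      # message leaving node i this round (or None)
    leader = None
    for _ in range(n):         # the maximum id returns home within n rounds
        out = [None] * n
        for i, m in enumerate(in_flight):
            if m is None:
                continue
            j = (i + 1) % n    # clockwise neighbour receives the message
            if m == ids[j]:
                leader = m     # node j got its own id back: it is the leader
            elif m > ids[j]:
                out[j] = m     # forward strictly larger ids
            # ids smaller than the receiver's own are swallowed
        in_flight = out
    return leader
```

The maximum id survives one full trip around the ring, so the election finishes within n rounds and uses O(n^2) messages in the worst case, which is exactly the kind of complexity argument the course pairs with lower bounds.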
Lecture notes | Available. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Lecture Notes by Roger Wattenhofer. These lecture notes are used in teaching at about a dozen universities throughout the world. Mastering Distributed Algorithms, Roger Wattenhofer. Inverted Forest Publishing, 2020, ISBN 979-8628688267. Distributed Computing: Fundamentals, Simulations and Advanced Topics, Hagit Attiya, Jennifer Welch. McGraw-Hill Publishing, 1998, ISBN 0-07-709352-6. Introduction to Algorithms, Thomas Cormen, Charles Leiserson, Ronald Rivest. The MIT Press, 1998, ISBN 0-262-53091-0 or 0-262-03141-8. Dissemination of Information in Communication Networks, Juraj Hromkovic, Ralf Klasing, Andrzej Pelc, Peter Ruzicka, Walter Unger. Springer-Verlag, Berlin Heidelberg, 2005, ISBN 3-540-00846-2. Introduction to Parallel Algorithms and Architectures: Arrays, Trees, Hypercubes, Frank Thomson Leighton. Morgan Kaufmann Publishers Inc., San Francisco, CA, 1991, ISBN 1-55860-117-1. Distributed Computing: A Locality-Sensitive Approach, David Peleg. Society for Industrial and Applied Mathematics (SIAM), 2000, ISBN 0-89871-464-8. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Course prerequisites: interest in algorithmic problems. (No particular course needed.) | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-3855-00L | Cloud Computing Architecture | W | 9 credits | 3V + 2U + 3A | G. Alonso, A. Klimovic | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Cloud computing hosts a wide variety of online services that we use on a daily basis, including web search, social networks, and video streaming. This course will cover how datacenter hardware, systems software, and application frameworks are designed for the cloud. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | After successful completion of this course, students will be able to: 1) reason about performance, energy efficiency, and availability tradeoffs in the design of cloud system software, 2) describe how datacenter hardware is organized and explain why it is organized as such, 3) implement cloud applications as well as analyze and optimize their performance. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | In this course, we study how datacenter hardware, systems software, and applications are designed at large scale for the cloud. The course covers topics including server design, cluster management, large-scale storage systems, serverless computing, data analytics frameworks, and performance analysis. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
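To make the "performance analysis" topic concrete: cloud services are usually judged by tail latency (e.g., the 99th percentile) rather than the mean, because in a large fan-out the slowest sub-request dominates the user-visible response time. A minimal sketch, with illustrative made-up numbers:

```python
def percentile(samples, p):
    """Nearest-rank percentile for p in (0, 100]."""
    s = sorted(samples)
    rank = max(1, int(round(p / 100.0 * len(s))))  # 1-based nearest rank
    return s[rank - 1]

# Hypothetical per-request latencies (milliseconds) with one straggler.
latencies_ms = [1, 2, 2, 3, 3, 3, 4, 5, 20, 250]
p50 = percentile(latencies_ms, 50)   # median: 3
p99 = percentile(latencies_ms, 99)   # dominated by the straggler: 250
mean = sum(latencies_ms) / len(latencies_ms)
```

Note how a single slow request leaves the median untouched but blows up both the mean and the p99, which is why datacenter systems optimize explicitly for the tail.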
Lecture notes | Lecture slides will be available on the course website. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Undergraduate courses in 1) computer architecture and 2) operating systems, distributed systems, and/or database systems are strongly recommended. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-5354-00L | Large Language Models | W | 8 credits | 3V + 2U + 2A | R. Cotterell, M. Sachan, F. Tramèr, C. Zhang | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Large language models have become one of the most commonly deployed NLP inventions. In the past half-decade, their integration into core natural language processing tools has dramatically increased the performance of such tools, and they have entered the public discourse surrounding artificial intelligence. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | To understand the mathematical foundations of large language models as well as how to implement them. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | We start with the probabilistic foundations of language models, i.e., covering what constitutes a language model from a formal, theoretical perspective. We then discuss how to construct and curate training corpora, and introduce many of the neural-network architectures often used to instantiate language models at scale. The course covers aspects of systems programming, discussion of privacy and harms, as well as applications of language models in NLP and beyond. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
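As a concrete instance of "what constitutes a language model from a formal, theoretical perspective": a language model is a distribution over strings. The toy bigram model below (vocabulary and probabilities invented for illustration) multiplies conditional probabilities and includes an explicit end-of-sequence symbol:

```python
BOS, EOS = "<s>", "</s>"

# Hypothetical bigram conditionals P(next | prev); unseen bigrams get 0.
probs = {
    (BOS, "the"): 1.0,
    ("the", "cat"): 0.5, ("the", "dog"): 0.5,
    ("cat", EOS): 1.0, ("dog", EOS): 1.0,
}

def sequence_probability(tokens):
    """P(w_1..w_n) = prod_i P(w_i | w_{i-1}), including the EOS transition."""
    p = 1.0
    for prev, nxt in zip([BOS] + tokens, tokens + [EOS]):
        p *= probs.get((prev, nxt), 0.0)
    return p
```

Including the EOS transition is what makes the probabilities of all finite strings sum to one; dropping it yields a deficient model, one of the formal points the course makes early on.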
Literature | The lecture notes will be supplemented with various readings from the literature. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Minor in Information Security | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Number | Title | Type | ECTS | Hours | Lecturers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
252-0408-00L | Cryptographic Protocols | W | 6 credits | 2V + 2U + 1A | M. Hirt | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | In a cryptographic protocol, a set of parties wants to achieve some common goal, while some of the parties are dishonest. The most prominent example of a cryptographic protocol is multi-party computation, where the parties compute an arbitrary (but fixed) function of their inputs, while maintaining the secrecy of the inputs and the correctness of the outputs even if some of the parties try to cheat. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | To know and understand a selection of cryptographic protocols and to be able to analyze and prove their security and efficiency. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The selection of considered protocols varies. Currently, we consider multi-party computation, secret-sharing, broadcast and Byzantine agreement. We look at both the synchronous and the asynchronous communication model, and focus on simple protocols as well as on highly-efficient protocols. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
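To make the "secret-sharing" item concrete, here is a minimal sketch of (t, n) Shamir secret sharing over a prime field. The small prime and helper names are our own choices; this is an illustration, not production cryptography.

```python
import random

P = 2**31 - 1  # small Mersenne prime for illustration; real protocols use larger fields

def share(secret, t, n):
    """Split `secret` into n shares such that any t of them reconstruct it."""
    coeffs = [secret % P] + [random.randrange(P) for _ in range(t - 1)]
    poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange-interpolate the degree-(t-1) polynomial at x = 0."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P  # den^-1 via Fermat
    return secret
```

Fewer than t shares reveal nothing about the secret in an information-theoretic sense, which is precisely the property the multi-party computation protocols in the course build on.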
Lecture notes | We provide handouts of the slides. For some of the topics, we also provide papers and/or lecture notes. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | A basic understanding of fundamental cryptographic concepts (as taught for example in the course Information Security) is useful, but not required. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-2925-00L | Program Analysis for System Security and Reliability Does not take place this semester. | W | 7 credits | 2V + 1U + 3A | M. Vechev | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Security issues in modern systems (blockchains, datacenters, deep learning, etc.) result in billions of losses due to hacks and system downtime. This course introduces fundamental techniques (ranging over automated analysis, machine learning, synthesis, zero-knowledge, differential privacy, and their combinations) that can be applied in practice to build more secure and reliable modern systems. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | * Understand the fundamental techniques used to create modern security and reliability analysis engines that are used worldwide. * Understand how symbolic techniques are combined with machine learning (e.g., deep learning, reinforcement learning) to create new kinds of learning-based analyzers. * Understand how to quantify and fix security and reliability issues in modern deep learning models. * Understand open research questions from both theoretical and practical perspectives. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Please see: https://www.sri.inf.ethz.ch/teaching/pass2022 for detailed course content. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-4600-00L | Formal Methods for Information Security Does not take place this semester. | W | 5 credits | 2V + 1U + 1A | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | The course focuses on formal methods for the modeling and analysis of security protocols for critical systems, ranging from authentication protocols for network security to electronic voting protocols and online banking. In addition, we will also introduce the notions of non-interference and runtime monitoring. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The students will learn the key ideas and theoretical foundations of formal modeling and analysis of security protocols. The students will complement their theoretical knowledge by solving practical exercises, completing a small project, and using state-of-the-art tools. The students also learn the fundamentals of non-interference and runtime monitoring. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The course treats formal methods mainly for the modeling and analysis of security protocols. Cryptographic protocols (such as SSL/TLS, SSH, Kerberos, SAML single-sign on, and IPSec) form the basis for secure communication and business processes. Numerous attacks on published protocols show that the design of cryptographic protocols is extremely error-prone. A rigorous analysis of these protocols is therefore indispensable, and manual analysis is insufficient. The lectures cover the theoretical basis for the (tool-supported) formal modeling and analysis of such protocols. Specifically, we discuss their operational semantics, the formalization of security properties, and techniques and algorithms for their verification. The second part of this course will cover a selection of advanced topics in security protocols such as abstraction techniques for efficient verification, secure communication with humans, the link between symbolic protocol models and cryptographic models as well as RFID protocols (a staple of the Internet of Things) and electronic voting protocols, including the relevant privacy properties. Moreover, we will give an introduction to two additional topics: non-interference as a general notion of secure systems, both from a semantic and a programming language perspective (type system), and runtime verification/monitoring to detect violations of security policies expressed as trace properties. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
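To illustrate the runtime-verification topic introduced at the end of the course, the sketch below (event names are hypothetical, not from the course) monitors a simple trace property over finite traces: every `close` must have a matching prior `open`, and nothing may remain open when the trace ends.

```python
def monitor(trace):
    """Return 'violation' as soon as a 'close' occurs without a matching
    'open', or if any session is still open when the finite trace ends."""
    open_sessions = 0
    for event in trace:
        if event == "open":
            open_sessions += 1
        elif event == "close":
            if open_sessions == 0:
                return "violation"   # safety violation, detectable immediately
            open_sessions -= 1
    # the "eventually closed" part is only checkable because the trace is finite
    return "ok" if open_sessions == 0 else "violation"
```

The first check is a safety property (a violation is witnessed by a finite prefix), while the closing requirement illustrates why liveness-style properties are harder for online monitors.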
263-4656-00L | Digital Signatures | W | 5 credits | 2V + 2A | D. Hofheinz | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Digital signatures are one central cryptographic building block. The course covers different security goals and security definitions for digital signatures, followed by a variety of popular and fundamental signature schemes with their security analyses. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The student knows a variety of techniques to construct and analyze the security of digital signature schemes. This includes modularity as a central tool of constructing secure schemes, and reductions as a central tool to proving the security of schemes. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | We will start with several definitions of security for signature schemes, and investigate the relations among them. We will proceed to generic (but inefficient) constructions of secure signatures, and then move on to a number of efficient schemes based on concrete computational hardness assumptions. On the way, we will get to know paradigms such as hash-then-sign, one-time signatures, and chameleon hashing as central tools to construct secure signatures. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
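As a hedged sketch of the "one-time signatures" paradigm mentioned above, here is a Lamport signature over SHA-256. It is an illustration only, and each key pair must sign at most one message.

```python
import hashlib, os

H = lambda data: hashlib.sha256(data).digest()

def keygen(bits=256):
    """Secret key: two random preimages per message bit; public key: their hashes."""
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(bits)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def message_bits(msg, bits=256):
    digest = int.from_bytes(H(msg), "big")   # hash-then-sign: sign H(msg)
    return [(digest >> i) & 1 for i in range(bits)]

def sign(sk, msg):
    """Reveal, for each bit of H(msg), one of the two secret preimages."""
    return [pair[b] for pair, b in zip(sk, message_bits(msg))]

def verify(pk, msg, sig):
    return all(H(s) == pair[b] for s, pair, b in zip(sig, pk, message_bits(msg)))
```

Revealing one preimage per bit is exactly why the key is one-time: two signatures under the same key leak enough preimages to forge, which motivates the modular constructions (trees of one-time keys) treated later in the course.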
Literature | Jonathan Katz, "Digital Signatures." | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Ideally, students will have taken the D-INFK Bachelors course "Information Security" or an equivalent course at Bachelors level. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-4660-00L | Applied Cryptography | W | 8 credits | 3V + 2U + 2P | K. Paterson, F. Günther | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course will introduce the basic primitives of cryptography, using rigorous syntax and game-based security definitions. The course will show how these primitives can be combined to build cryptographic protocols and systems. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The goal of the course is to put students' understanding of cryptography on sound foundations, to enable them to start to build well-designed cryptographic systems, and to expose them to some of the pitfalls that arise when doing so. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Basic symmetric primitives (block ciphers, modes, hash functions); generic composition; AEAD; basic secure channels; basic public key primitives (encryption, signature, DH key exchange); ECC; randomness; applications. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
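As an illustration of the "generic composition" item, the sketch below implements Encrypt-then-MAC with a toy hash-based keystream. The keystream construction is our own simplification for illustration; a real system would use a vetted AEAD such as AES-GCM.

```python
import hashlib, hmac

def keystream(key, nonce, length):
    """Toy counter-mode keystream derived from SHA-256 (illustration only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def seal(enc_key, mac_key, nonce, plaintext):
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(enc_key, nonce, len(plaintext))))
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()  # MAC the ciphertext
    return ct, tag

def open_sealed(enc_key, mac_key, nonce, ct, tag):
    expected = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        return None  # verify before decrypting: the point of Encrypt-then-MAC
    return bytes(c ^ k for c, k in zip(ct, keystream(enc_key, nonce, len(ct))))
```

Rejecting a forged ciphertext before any decryption happens is what makes Encrypt-then-MAC the preferred generic composition, a point the course develops with game-based definitions.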
Literature | Textbook: Boneh and Shoup, “A Graduate Course in Applied Cryptography”, http://toc.cryptobook.us/book.pdf. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Students should have taken the D-INFK Bachelor's course “Information Security” (252-0211-00) or an alternative first course covering cryptography at a similar level. / In this course, we will use Moodle for content delivery: https://moodle-app2.let.ethz.ch/course/view.php?id=19644. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Minor in Machine Learning | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Number | Title | Type | ECTS | Hours | Lecturers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
252-0526-00L | Statistical Learning Theory | W | 8 credits | 3V + 2U + 2A | J. M. Buhmann | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | The course covers advanced methods of statistical learning: - Variational methods and optimization. - Deterministic annealing. - Clustering for diverse types of data. - Model validation by information theory. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The course surveys recent methods of statistical learning. The fundamentals of machine learning, as presented in the courses "Introduction to Machine Learning" and "Advanced Machine Learning", are expanded from the perspective of statistical learning. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | - Variational methods and optimization. We consider optimization approaches for problems where the optimizer is a probability distribution. We will discuss concepts like maximum entropy, information bottleneck, and deterministic annealing. - Clustering. This is the problem of sorting data into groups without using training samples. We discuss alternative notions of "similarity" between data points and adequate optimization procedures. - Model selection and validation. This refers to the question of how complex the chosen model should be. In particular, we present an information theoretic approach for model validation. - Statistical physics models. We discuss approaches for approximately optimizing large systems, which originate in statistical physics (free energy minimization applied to spin glasses and other models). We also study sampling methods based on these models. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
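As a sketch of the deterministic-annealing idea from the first two topics, the code below implements a simplified "annealed soft k-means" (parameter values, names, and the initialization scheme are our own, not the course's formulation): Gibbs assignments are nearly uniform at high temperature T and harden as T is lowered.

```python
import numpy as np

def deterministic_annealing(X, init_centers, T0=10.0, Tmin=0.01, cooling=0.9):
    """Annealed soft clustering: Gibbs assignments at temperature T,
    centers updated as probability-weighted means, then T is cooled."""
    centers = np.array(init_centers, dtype=float)
    T = T0
    while T > Tmin:
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        logits = -d2 / T
        logits -= logits.max(axis=1, keepdims=True)   # numerical stability
        p = np.exp(logits)
        p /= p.sum(axis=1, keepdims=True)             # soft (Gibbs) assignments
        centers = (p.T @ X) / p.sum(axis=0)[:, None]  # weighted centroid update
        T *= cooling                                  # lower the temperature
    return centers, p
```

At high T the free-energy landscape is smooth and assignments are soft; as T decreases the method approaches plain k-means, which is the optimization-by-annealing view the course formalizes via maximum entropy.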
Lecture notes | A draft of a script will be provided. Lecture slides will be made available. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Hastie, Tibshirani, Friedman: The Elements of Statistical Learning, Springer, 2001. L. Devroye, L. Gyorfi, and G. Lugosi: A probabilistic theory of pattern recognition. Springer, New York, 1996 | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Knowledge of machine learning (introduction to machine learning and/or advanced machine learning) Basic knowledge of statistics. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
261-5120-00L | Machine Learning for Health Care | W | 5 credits | 2V + 2A | V. Boeva, J. Vogt, M. Kuznetsova | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | The course will review the most relevant methods and applications of Machine Learning in Biomedicine, discuss the main challenges they present and their current technical problems. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | During the last years, we have observed rapid growth in the field of Machine Learning (ML), mainly due to improvements in ML algorithms, increased data availability, and reduced computing costs. This growth is having a profound impact on biomedical applications, where the great variety of tasks and data types lets us benefit from ML algorithms in many different ways. In this course we will review the most relevant methods and applications of ML in biomedicine, discuss the main challenges they present and their current technical solutions. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The course will consist of four topic clusters that will cover the most relevant applications of ML in Biomedicine: 1) Structured time series: Temporal time series of structured data often appear in biomedical datasets, presenting challenges such as variables with different periodicities, conditioning on static data, etc. 2) Medical notes: A vast amount of medical observations is stored in the form of free text; we will analyze strategies for extracting knowledge from them. 3) Medical images: Images are a fundamental piece of information in many medical disciplines. We will study how to train ML algorithms with them. 4) Genomics data: ML in genomics is still an emerging subfield, but given that genomics data are arguably the most extensive and complex datasets that can be found in biomedicine, it is expected that many relevant ML applications will arise in the near future. We will review and discuss current applications and challenges. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Data Structures & Algorithms, Introduction to Machine Learning, Statistics/Probability, Programming in Python, Unix Command Line Relation to Course 261-5100-00 Computational Biomedicine: This course is a continuation of the previous course with new topics related to medical data and machine learning. The format of Computational Biomedicine II will also be different. It is helpful but not essential to attend Computational Biomedicine before attending Computational Biomedicine II. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-3710-00L | Machine Perception | W | 8 credits | 3V + 2U + 2A | O. Hilliges, J. Song | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Recent developments in neural networks have drastically advanced the performance of machine perception systems in a variety of areas including computer vision, robotics, and human shape modeling. This course is a deep dive into deep learning algorithms and architectures with applications to a variety of perceptual and generative tasks. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Students will learn about fundamental aspects of modern deep learning approaches for perception and generation. Students will learn to implement, train and debug their own neural networks and gain a detailed understanding of cutting-edge research in learning-based computer vision, robotics, and shape modeling. The optional final project assignment will involve training a complex neural network architecture and applying it to a real-world dataset. The core competency acquired through this course is a solid foundation in deep-learning algorithms to process and interpret human-centric signals. In particular, students should be able to develop systems that deal with the problem of recognizing people in images, detecting and describing body parts, inferring their spatial configuration, performing action/gesture recognition from still images or image sequences, also considering multi-modal data, among others. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | We will focus on teaching: how to set up the problem of machine perception, the learning algorithms, network architectures, and advanced deep learning concepts, in particular probabilistic deep learning models. The course covers the following main areas: I) Foundations of deep learning. II) Advanced topics like probabilistic generative modeling of data (latent variable models, generative adversarial networks, auto-regressive models, invertible neural networks, diffusion models). III) Deep learning in computer vision, human-computer interaction, and robotics. Specific topics include: I) Introduction to Deep Learning: a) Neural Networks and training (i.e., backpropagation) b) Feedforward Networks c) Timeseries modelling (RNN, GRU, LSTM) d) Convolutional Neural Networks II) Advanced topics: a) Latent variable models (VAEs) b) Generative adversarial networks (GANs) c) Autoregressive models (PixelCNN, PixelRNN, TCN, Transformer) d) Invertible Neural Networks / Normalizing Flows e) Coordinate-based networks (neural implicit surfaces, NeRF) f) Diffusion models III) Applications in machine perception and computer vision: a) Fully Convolutional architectures for dense per-pixel tasks (e.g., instance segmentation) b) Pose estimation and other tasks involving human activity c) Neural shape modeling (implicit surfaces, neural radiance fields) d) Deep Reinforcement Learning and Applications in Physics-Based Behavior Modeling | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
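To make item I.a (neural networks and backpropagation) concrete, here is a hedged NumPy sketch that trains a tiny two-layer network on XOR with hand-derived gradients. The architecture, seed, and hyperparameters are our own illustrative choices, not course code.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)    # XOR targets

W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros(8)     # hidden layer of 8 units
W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(2000):
    h = sigmoid(X @ W1 + b1)                       # forward pass
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)            # dLoss/dz2 for squared error
    d_h = (d_out @ W2.T) * h * (1 - h)             # backpropagate through layer 2
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(0)
    W1 -= lr * (X.T @ d_h);  b1 -= lr * d_h.sum(0)

mse = ((out - y) ** 2).mean()
```

The two `d_*` lines are exactly the chain rule applied layer by layer, which is the mechanical core of backpropagation before frameworks like PyTorch automate it.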
Literature | Deep Learning Book by Ian Goodfellow and Yoshua Bengio | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | This is an advanced grad-level course that requires a background in machine learning. Students are expected to have a solid mathematical foundation, in particular in linear algebra, multivariate calculus, and probability. The course will focus on state-of-the-art research in deep learning and will not repeat the basics of machine learning. Please take note of the following conditions: 1) Students must have taken the exam in Machine Learning (252-0535-00) or have acquired equivalent knowledge. 2) All practical exercises will require basic knowledge of Python and will use libraries such as PyTorch, scikit-learn, and scikit-image. We will provide introductions to PyTorch and other libraries that are needed but will not provide introductions to basic programming or Python. The following courses are strongly recommended as prerequisites: * "Visual Computing" or "Computer Vision" The course will be assessed by a final written examination in English. No course materials or electronic devices can be used during the examination. Note that the examination will be based on the contents of the lectures, the associated reading materials, and the exercises. The exam will be a 3-hour end-of-term exam and will take place at the end of the teaching period. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-5000-00L | Computational Semantics for Natural Language Processing | W | 6 credits | 2V + 1U + 2A | M. Sachan | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course presents an introduction to Natural language processing (NLP) with an emphasis on computational semantics i.e. the process of constructing and reasoning with meaning representations of natural language text. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The objective of the course is to learn about various topics in computational semantics and its importance in natural language processing methodology and research. Exercises and the project will be key parts of the course so the students will be able to gain hands-on experience with state-of-the-art techniques in the field. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | We will take a modern view of the topic and focus on various statistical and deep learning approaches to computational semantics. We will also overview the primary areas of research in language processing and discuss how the computational semantics view can help us make advances in NLP. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | Lecture slides will be made available at the course Web site. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | No textbook is required, but there will be regularly assigned readings from research literature, linked to the course website. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | The student should have successfully completed a graduate-level class in machine learning (252-0220-00L), deep learning (263-3210-00L), or natural language processing (252-3005-00L). Similar courses from other universities are acceptable too. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-5052-00L | Interactive Machine Learning: Visualization & Explainability | W | 5 credits | 3G + 1A | M. El-Assady | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Visual Analytics supports the design of human-in-the-loop interfaces that enable human-machine collaboration. In this course, we will go through the fundamentals of designing interactive visualizations and later apply them to explain and interact with machine learning models. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The goal of the course is to introduce techniques for interactive information visualization and to apply these on understanding, diagnosing, and refining machine learning models. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Interactive, mixed-initiative machine learning promises to combine the efficiency of automation with the effectiveness of humans for a collaborative decision-making and problem-solving process. This can be facilitated through co-adaptive visual interfaces. This course will first introduce the foundations of information visualization design based on data characteristics, e.g., high-dimensional, geo-spatial, relational, temporal, and textual data. Second, we will discuss interaction techniques and explanation strategies to enable explainable machine learning with the tasks of understanding, diagnosing, and refining machine learning models. Tentative list of topics: 1. Visualization and Perception 2. Interaction and Explanation 3. Systems Overview | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | Course material will be provided in form of slides. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Will be provided during the course. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Basic understanding of machine learning as taught at the Bachelor's level. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-5255-00L | Foundations of Reinforcement Learning | W | 7 credits | 3V + 3A | N. He | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Reinforcement learning (RL) has been in the limelight of many recent breakthroughs in artificial intelligence. This course focuses on theoretical and algorithmic foundations of reinforcement learning, through the lens of optimization, modern approximation, and learning theory. The course targets M.S. students with strong research interests in reinforcement learning, optimization, and control. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | This course aims to provide students with an advanced introduction to RL theory and algorithms as well as bring them near the frontier of this active research field. By the end of the course, students will be able to: - Identify the strengths and limitations of various reinforcement learning algorithms; - Formulate and solve sequential decision-making problems by applying relevant reinforcement learning tools; - Generalize or discover “new” applications, algorithms, or theories of reinforcement learning towards conducting independent research on the topic. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Basic topics include fundamentals of Markov decision processes, approximate dynamic programming, linear programming and primal-dual perspectives of RL, model-based and model-free RL, policy gradient and actor-critic algorithms, Markov games and multi-agent RL. If time allows, we will also discuss advanced topics such as batch RL, inverse RL, causal RL, etc. The course places a strong emphasis on in-depth understanding of the mathematical modeling and theoretical properties of RL algorithms. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
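A toy illustration of the dynamic-programming fundamentals listed above: value iteration on an invented two-state, two-action Markov decision process (all transition probabilities and rewards below are made up for illustration, not course material):

```python
import numpy as np

# Value iteration on a toy 2-state, 2-action MDP (all numbers invented).
# P[a, s, s'] = transition probability, R[s, a] = expected reward.
P = np.array([[[0.9, 0.1],
               [0.2, 0.8]],   # action 0
              [[0.1, 0.9],
               [0.8, 0.2]]])  # action 1
R = np.array([[1.0, 0.0],
              [0.0, 2.0]])    # R[s, a]
gamma = 0.9

V = np.zeros(2)
for _ in range(1000):
    # Bellman optimality update: Q(s,a) = R(s,a) + gamma * sum_s' P(s'|s,a) V(s')
    Q = R + gamma * np.einsum('ast,t->sa', P, V)
    V_new = Q.max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-12:
        V = V_new
        break
    V = V_new

policy = Q.argmax(axis=1)  # greedy policy w.r.t. the converged values
```

Since the Bellman operator is a gamma-contraction, the loop converges geometrically to the optimal value function.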
Lecture notes | Lecture notes will be posted on Moodle. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Dynamic Programming and Optimal Control, Vol I & II, Dimitri Bertsekas. Reinforcement Learning: An Introduction, Second Edition, Richard Sutton and Andrew Barto. Algorithms for Reinforcement Learning, Csaba Szepesvári. Reinforcement Learning: Theory and Algorithms, Alekh Agarwal, Nan Jiang, Sham M. Kakade. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Students are expected to have strong mathematical background in linear algebra, probability theory, optimization, and machine learning. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-5351-00L | Machine Learning for Genomics The deadline for deregistering expires at the end of the third week of the semester. Students who are still registered after that date, but do not provide project work, do not participate in paper presentation sessions and/or do not show up for the exam, will officially fail the course. | W | 5 credits | 2V + 1U + 1A | V. Boeva | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | The course reviews solutions that machine learning provides to the most challenging questions in human genomics. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Over the last few years, the parallel development of machine learning methods and molecular profiling technologies for human cells, such as sequencing, has created an extremely powerful tool for gaining insights into cellular mechanisms in healthy and diseased contexts. In this course, we will discuss the state-of-the-art machine learning methodology for solving or attempting to solve common problems in human genomics. At the end of the course, you will be familiar with (1) classical and advanced machine learning architectures used in genomics, (2) bioinformatics analysis of human genomic and transcriptomic data, and (3) data types used in this field. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | - Short introduction to major concepts of molecular biology: DNA, genes, genome, central dogma, transcription factors, epigenetic code, DNA methylation, signaling pathways - Prediction of transcription factor binding sites, open chromatin, histone marks, promoters, nucleosome positioning (convolutional neural networks, position weight matrices) - Prediction of variant effects and gene expression (hidden Markov models, topic models) - Deconvolution of mixed signal - DNA, RNA and protein folding (RNN, LSTM, transformers) - Data imputation for single cell RNA-seq data, clustering and annotation (diffusion and methods on graphs) - Batch correction (autoencoders, optimal transport) - Survival analysis (Cox proportional hazard model, regularization penalties, multi-omics, multi-tasking) | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
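As a minimal sketch of one listed topic, predicting transcription factor binding sites with position weight matrices, the following scores a DNA sequence against a toy motif; the motif counts and the sequence are invented for illustration:

```python
from math import log2

# Position weight matrix (PWM) scoring, the classical model for
# transcription factor binding sites. All counts are invented.
BASES = "ACGT"
WIDTH = 4

# counts[b][i] = how often base b was observed at motif position i.
counts = {
    'A': [8, 1, 0, 0],
    'C': [1, 8, 1, 0],
    'G': [1, 1, 8, 1],
    'T': [0, 0, 1, 9],
}

# Log-odds scores against a uniform background, with a 0.25 pseudocount.
totals = [sum(counts[b][i] for b in BASES) + 1 for i in range(WIDTH)]
pwm = {b: [log2(((counts[b][i] + 0.25) / totals[i]) / 0.25)
           for i in range(WIDTH)]
       for b in BASES}

def best_hit(seq):
    """Slide the PWM over seq; return (best_score, offset)."""
    best = None
    for j in range(len(seq) - WIDTH + 1):
        s = sum(pwm[seq[j + i]][i] for i in range(WIDTH))
        if best is None or s > best[0]:
            best = (s, j)
    return best

score, offset = best_hit("GGACGTAA")  # consensus ACGT sits at offset 2
```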
Prerequisites / Notice | Introduction to Machine Learning, Statistics/Probability, Programming in Python, Unix Command Line | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-5352-00L | Advanced Formal Language Theory | W | 6 credits | 4G + 1A | R. Cotterell | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course serves as an introduction to various advanced topics in formal language theory. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The objective of the course is to learn and understand a variety of topics in advanced formal language theory. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | This course serves as an introduction to various advanced topics in formal language theory. The primary focus of the course is on weighted formalisms, which can easily be applied in machine learning. Topics include finite-state machines as well as the algorithms that are commonly used for their manipulation. We will also cover weighted context-free grammars, weighted tree automata, and weighted mildly context-sensitive formalisms. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
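A minimal sketch of the weighted formalisms mentioned above: a tiny weighted finite-state acceptor over the real semiring, where a string's weight is the sum over accepting paths of the product of arc weights (the machine itself is invented, and this is not the course's notation):

```python
# A weighted finite-state acceptor over the real semiring.
# States are integers; each arc is (src, symbol, weight, dst).
arcs = [
    (0, 'a', 0.5, 0),
    (0, 'a', 0.5, 1),
    (1, 'b', 1.0, 1),
]
start = 0
final = {1: 1.0}  # final weight of state 1

def string_weight(s):
    """Sum over all accepting paths of the product of arc weights."""
    weights = {start: 1.0}
    for sym in s:
        new = {}
        for (src, a, w, dst) in arcs:
            if a == sym and src in weights:
                new[dst] = new.get(dst, 0.0) + weights[src] * w
        weights = new
    return sum(weights.get(q, 0.0) * fw for q, fw in final.items())

w = string_weight("ab")
```

Replacing (+, ×) with another semiring (e.g. (min, +) for shortest paths) changes what the same machinery computes, which is one reason weighted formalisms compose so well with machine learning.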
263-5354-00L | Large Language Models | W | 8 credits | 3V + 2U + 2A | R. Cotterell, M. Sachan, F. Tramèr, C. Zhang | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Large language models have become one of the most commonly deployed NLP inventions. In the past half-decade, their integration into core natural language processing tools has dramatically increased the performance of such tools, and they have entered the public discourse surrounding artificial intelligence. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | To understand the mathematical foundations of large language models as well as how to implement them. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | We start with the probabilistic foundations of language models, i.e., covering what constitutes a language model from a formal, theoretical perspective. We then discuss how to construct and curate training corpora, and introduce many of the neural-network architectures often used to instantiate language models at scale. The course covers aspects of systems programming, discussion of privacy and harms, as well as applications of language models in NLP and beyond. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
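As a small illustration of the probabilistic foundations mentioned above, here is a sketch of a bigram language model over an invented toy corpus; in this formal sense, a language model assigns a probability to every string, terminated by an end-of-sequence symbol:

```python
from collections import Counter, defaultdict

# A minimal unsmoothed bigram language model (toy corpus, invented).
BOS, EOS = "<s>", "</s>"
corpus = [["the", "cat", "sat"], ["the", "dog", "sat"], ["the", "cat", "ran"]]

# counts[prev][next] = how often `next` follows `prev` in the corpus.
counts = defaultdict(Counter)
for sent in corpus:
    toks = [BOS] + sent + [EOS]
    for prev, nxt in zip(toks, toks[1:]):
        counts[prev][nxt] += 1

def prob(sentence):
    """Probability of a sentence under the bigram model."""
    p = 1.0
    toks = [BOS] + sentence + [EOS]
    for prev, nxt in zip(toks, toks[1:]):
        total = sum(counts[prev].values())
        p *= counts[prev][nxt] / total if total else 0.0
    return p

p = prob(["the", "cat", "sat"])
```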
Literature | The lecture notes will be supplemented with various readings from the literature. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Minor in Networking | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Number | Title | Type | ECTS | Hours | Lecturers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
227-0558-00L | Principles of Distributed Computing | W | 7 credits | 2V + 2U + 2A | R. Wattenhofer | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | We study the fundamental issues underlying the design of distributed systems: communication, coordination, fault-tolerance, locality, parallelism, self-organization, symmetry breaking, synchronization, uncertainty. We explore essential algorithmic ideas and lower bound techniques. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Distributed computing is essential in modern computing and communications systems. Examples are on the one hand large-scale networks such as the Internet, and on the other hand multiprocessors such as your new multi-core laptop. This course introduces the principles of distributed computing, emphasizing the fundamental issues underlying the design of distributed systems and networks: communication, coordination, fault-tolerance, locality, parallelism, self-organization, symmetry breaking, synchronization, uncertainty. We explore essential algorithmic ideas and lower bound techniques, basically the "pearls" of distributed computing. We will cover a fresh topic every week. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Distributed computing models and paradigms, e.g. message passing, shared memory, synchronous vs. asynchronous systems, time and message complexity, peer-to-peer systems, small-world networks, social networks, sorting networks, wireless communication, and self-organizing systems. Distributed algorithms, e.g. leader election, coloring, covering, packing, decomposition, spanning trees, mutual exclusion, store and collect, arrow, ivy, synchronizers, diameter, all-pairs-shortest-path, wake-up, and lower bounds | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
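One of the listed algorithmic topics, leader election, can be sketched in a few lines; the following simulates a synchronous flooding-style election on a ring in a single process (a toy illustration, not the course's formal model):

```python
# Synchronous leader election on a ring, simulated in one process.
# Each node repeatedly forwards the largest id it has seen; after n
# rounds every node knows the maximum id, which becomes the leader.
def elect_leader(ids):
    n = len(ids)
    seen = list(ids)
    for _ in range(n):
        # One synchronous round: node i receives from its left
        # neighbour (i-1) and keeps the larger id.
        seen = [max(seen[i], seen[(i - 1) % n]) for i in range(n)]
    return max(ids), seen

leader, views = elect_leader([3, 7, 2, 9, 4])
```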
Lecture notes | Available. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Lecture Notes by Roger Wattenhofer. These lecture notes are used at about a dozen universities throughout the world. Mastering Distributed Algorithms, Roger Wattenhofer, Inverted Forest Publishing, 2020, ISBN 979-8628688267. Distributed Computing: Fundamentals, Simulations and Advanced Topics, Hagit Attiya, Jennifer Welch, McGraw-Hill Publishing, 1998, ISBN 0-07-709352-6. Introduction to Algorithms, Thomas Cormen, Charles Leiserson, Ronald Rivest, The MIT Press, 1998, ISBN 0-262-53091-0 or 0-262-03141-8. Dissemination of Information in Communication Networks, Juraj Hromkovic, Ralf Klasing, Andrzej Pelc, Peter Ruzicka, Walter Unger, Springer-Verlag, Berlin Heidelberg, 2005, ISBN 3-540-00846-2. Introduction to Parallel Algorithms and Architectures: Arrays, Trees, Hypercubes, Frank Thomson Leighton, Morgan Kaufmann Publishers Inc., San Francisco, CA, 1991, ISBN 1-55860-117-1. Distributed Computing: A Locality-Sensitive Approach, David Peleg, Society for Industrial and Applied Mathematics (SIAM), 2000, ISBN 0-89871-464-8 | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Course pre-requisites: Interest in algorithmic problems. (No particular course needed.) | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Minor in Programming Languages and Software Engineering | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Number | Title | Type | ECTS | Hours | Lecturers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-2812-00L | Program Verification | W | 5 credits | 3G + 1A | P. Müller, M. Eilers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | A hands-on introduction to the theory and construction of deductive program verifiers, covering both powerful techniques for formal program reasoning, and a perspective over the tool stack making up modern verification tools. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Students will learn the necessary skills for designing, developing, and applying deductive verification tools that enable the modular verification of complex software, including features that are challenging to reason about, such as heap-based mutable data and concurrency. Students will learn both a variety of fundamental reasoning principles and how these reasoning ideas can be made practical via automatic tools. By the end of the course, students should have a good working understanding of the design decisions involved in building practical verification tools, including the underlying theory. They will also be able to apply such tools to develop formally verified programs. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The course will cover verification techniques and ways to automate them by introducing a verifier for a small core language and then progressively enriching the language with advanced features such as a mutable heap and concurrency. For each language extension, the course will explain the necessary reasoning principles, specification techniques, and tool support. In particular, it will introduce SMT solvers to prove logical formulas, intermediate verification languages to encode verification problems, and source code verifiers to handle feature-rich languages. The course will intermix technical content with hands-on experience. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | The slides will be available online. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Will be announced in the lecture. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | A basic familiarity with propositional and first-order logic will be assumed. Courses with an emphasis on formal reasoning about programs (such as Formal Methods and Functional Programming) are advantageous background, but are not a requirement. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-2815-00L | Automated Software Testing Last cancellation/deregistration date for this graded semester performance: 17 March 2023! Please note that after that date no deregistration will be accepted and the course will be considered as "fail". | W | 7 credits | 2V + 1U + 3A | Z. Su | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course introduces students to classic and modern techniques for the automated testing and analysis of software systems for reliability, security, and performance. It covers both techniques and their applications in various domains (e.g., compilers, databases, theorem provers, operating systems, machine/deep learning, and mobile applications), focusing on the latest, important results. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | * Learn fundamental and practical techniques for software testing and analysis * Understand the challenges, open issues and opportunities across a variety of domains (security/systems/compilers/databases/mobile/AI/education) * Understand how latest automated testing and analysis techniques work * Gain conceptual and practical experience in techniques/tools for reliability, security, and performance * Learn how to perform original and impactful research in this area | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The course will be organized into the following components: (1) classic and modern testing and analysis techniques (coverage metrics, mutation testing, metamorphic testing, combinatorial testing, symbolic execution, fuzzing, static analysis, etc.), (2) latest results on techniques and applications from diverse domains, and (3) open challenges and opportunities. A major component of this course is a class project. All students (individually or in two-person teams) are expected to select and complete a course project. Ideally, the project is original research related in a broad sense to automated software testing and analysis. Potential project topics will also be suggested by the teaching staff. Students must select a project and write a one- or two-page proposal describing why the proposed project is interesting and giving a work schedule. Students will also write a final report describing the project and prepare a 20-30 minute presentation at the end of the course. The due dates for the project proposal, final report, and project presentation will be announced. The course will cover results from the Advanced Software Technologies (AST) Lab at ETH as well as notable results elsewhere, providing good opportunities for potential course project topics as well as MSc project/thesis topics. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
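As a small taste of one listed technique, metamorphic testing: instead of requiring an oracle for expected outputs, we check relations that must hold between outputs on related inputs. The system under test below is a stand-in, invented for illustration:

```python
import random

# Metamorphic testing sketch: no output oracle, only relations
# between outputs on related inputs.
def system_under_test(xs):
    return sorted(xs)  # stand-in for the real program being tested

random.seed(1)
failures = 0
for _ in range(100):
    xs = [random.randrange(-50, 50) for _ in range(random.randrange(0, 20))]
    out = system_under_test(xs)

    # Relation 1: permuting the input must not change the output.
    shuffled = xs[:]
    random.shuffle(shuffled)
    if system_under_test(shuffled) != out:
        failures += 1

    # Relation 2: appending an element larger than all others must
    # append exactly that element to the output.
    if xs and system_under_test(xs + [max(xs) + 1]) != out + [max(xs) + 1]:
        failures += 1
```

A violation of either relation reveals a bug without ever specifying the full correct output.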
Lecture notes | Lecture notes/slides and other lecture materials/handouts will be available online. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Reading material and links to tools will be published on the course website. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | The prerequisites for this course are some programming and algorithmic experience. Background and experience in software engineering, programming languages/compilers, and security (as well as operating systems and databases) can be beneficial. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-2925-00L | Program Analysis for System Security and Reliability Does not take place this semester. | W | 7 credits | 2V + 1U + 3A | M. Vechev | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Security issues in modern systems (blockchains, datacenters, deep learning, etc.) result in billions in losses due to hacks and system downtime. This course introduces fundamental techniques (ranging over automated analysis, machine learning, synthesis, zero-knowledge, differential privacy, and their combinations) that can be applied in practice to build more secure and reliable modern systems. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | * Understand the fundamental techniques used to create modern security and reliability analysis engines that are used worldwide. * Understand how symbolic techniques are combined with machine learning (e.g., deep learning, reinforcement learning) to create new kinds of learning-based analyzers. * Understand how to quantify and fix security and reliability issues in modern deep learning models. * Understand open research questions from both theoretical and practical perspectives. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Please see: https://www.sri.inf.ethz.ch/teaching/pass2022 for detailed course content. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-4600-00L | Formal Methods for Information Security Does not take place this semester. | W | 5 credits | 2V + 1U + 1A | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | The course focuses on formal methods for the modeling and analysis of security protocols for critical systems, ranging from authentication protocols for network security to electronic voting protocols and online banking. In addition, we will also introduce the notions of non-interference and runtime monitoring. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The students will learn the key ideas and theoretical foundations of formal modeling and analysis of security protocols. The students will complement their theoretical knowledge by solving practical exercises, completing a small project, and using state-of-the-art tools. The students also learn the fundamentals of non-interference and runtime monitoring. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The course treats formal methods mainly for the modeling and analysis of security protocols. Cryptographic protocols (such as SSL/TLS, SSH, Kerberos, SAML single-sign on, and IPSec) form the basis for secure communication and business processes. Numerous attacks on published protocols show that the design of cryptographic protocols is extremely error-prone. A rigorous analysis of these protocols is therefore indispensable, and manual analysis is insufficient. The lectures cover the theoretical basis for the (tool-supported) formal modeling and analysis of such protocols. Specifically, we discuss their operational semantics, the formalization of security properties, and techniques and algorithms for their verification. The second part of this course will cover a selection of advanced topics in security protocols such as abstraction techniques for efficient verification, secure communication with humans, the link between symbolic protocol models and cryptographic models as well as RFID protocols (a staple of the Internet of Things) and electronic voting protocols, including the relevant privacy properties. Moreover, we will give an introduction to two additional topics: non-interference as a general notion of secure systems, both from a semantic and a programming language perspective (type system), and runtime verification/monitoring to detect violations of security policies expressed as trace properties. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Minor in Systems Software | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Number | Title | Type | ECTS | Hours | Lecturers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
227-0558-00L | Principles of Distributed Computing | W | 7 credits | 2V + 2U + 2A | R. Wattenhofer | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | We study the fundamental issues underlying the design of distributed systems: communication, coordination, fault-tolerance, locality, parallelism, self-organization, symmetry breaking, synchronization, uncertainty. We explore essential algorithmic ideas and lower bound techniques. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Distributed computing is essential in modern computing and communications systems. Examples are on the one hand large-scale networks such as the Internet, and on the other hand multiprocessors such as your new multi-core laptop. This course introduces the principles of distributed computing, emphasizing the fundamental issues underlying the design of distributed systems and networks: communication, coordination, fault-tolerance, locality, parallelism, self-organization, symmetry breaking, synchronization, uncertainty. We explore essential algorithmic ideas and lower bound techniques, basically the "pearls" of distributed computing. We will cover a fresh topic every week. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Distributed computing models and paradigms, e.g. message passing, shared memory, synchronous vs. asynchronous systems, time and message complexity, peer-to-peer systems, small-world networks, social networks, sorting networks, wireless communication, and self-organizing systems. Distributed algorithms, e.g. leader election, coloring, covering, packing, decomposition, spanning trees, mutual exclusion, store and collect, arrow, ivy, synchronizers, diameter, all-pairs-shortest-path, wake-up, and lower bounds | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | Available. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Lecture Notes by Roger Wattenhofer. These lecture notes are used at about a dozen universities throughout the world. Mastering Distributed Algorithms, Roger Wattenhofer, Inverted Forest Publishing, 2020, ISBN 979-8628688267. Distributed Computing: Fundamentals, Simulations and Advanced Topics, Hagit Attiya, Jennifer Welch, McGraw-Hill Publishing, 1998, ISBN 0-07-709352-6. Introduction to Algorithms, Thomas Cormen, Charles Leiserson, Ronald Rivest, The MIT Press, 1998, ISBN 0-262-53091-0 or 0-262-03141-8. Dissemination of Information in Communication Networks, Juraj Hromkovic, Ralf Klasing, Andrzej Pelc, Peter Ruzicka, Walter Unger, Springer-Verlag, Berlin Heidelberg, 2005, ISBN 3-540-00846-2. Introduction to Parallel Algorithms and Architectures: Arrays, Trees, Hypercubes, Frank Thomson Leighton, Morgan Kaufmann Publishers Inc., San Francisco, CA, 1991, ISBN 1-55860-117-1. Distributed Computing: A Locality-Sensitive Approach, David Peleg, Society for Industrial and Applied Mathematics (SIAM), 2000, ISBN 0-89871-464-8 | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Course pre-requisites: Interest in algorithmic problems. (No particular course needed.) | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-2925-00L | Program Analysis for System Security and Reliability Does not take place this semester. | W | 7 credits | 2V + 1U + 3A | M. Vechev | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Security issues in modern systems (blockchains, datacenters, deep learning, etc.) result in billions in losses due to hacks and system downtime. This course introduces fundamental techniques (ranging over automated analysis, machine learning, synthesis, zero-knowledge, differential privacy, and their combinations) that can be applied in practice to build more secure and reliable modern systems. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | * Understand the fundamental techniques used to create modern security and reliability analysis engines that are used worldwide. * Understand how symbolic techniques are combined with machine learning (e.g., deep learning, reinforcement learning) to create new kinds of learning-based analyzers. * Understand how to quantify and fix security and reliability issues in modern deep learning models. * Understand open research questions from both theoretical and practical perspectives. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Please see: https://www.sri.inf.ethz.ch/teaching/pass2022 for detailed course content. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-3800-00L | Advanced Operating Systems | W | 7 credits | 2V + 2U + 2A | D. Cock, T. Roscoe | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course is intended to give students a thorough understanding of design and implementation issues for modern operating systems, with a particular emphasis on the challenges of modern hardware features. We will cover key design issues in implementing an operating system, such as memory management, scheduling, protection, inter-process communication, device drivers, and file systems. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The goals of the course are to give students: 1. A broader perspective on OS design than that provided by knowledge of Unix or Windows, building on the material in a standard undergraduate operating systems class 2. Practical experience in dealing directly with the concurrency, resource management, and abstraction problems confronting OS designers and implementers 3. A glimpse into future directions for the evolution of OS and computer hardware design | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The course is based on practical implementation work, in C and assembly language, and requires solid knowledge of both. The work is mostly carried out in teams of 3-4, using real hardware, and is a mixture of team milestones and individual projects which fit together into a complete system at the end. Emphasis is also placed on a final report which details the complete finished artifact, evaluates its performance, and discusses the choices the team made while building it. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | The course is based around a milestone-oriented project, where students work in small groups to implement major components of a microkernel-based operating system. The final assessment will be a combination grades awarded for milestones during the course of the project, a final written report on the work, and a set of test cases run on the final code. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Minor in Theoretical Computer Science | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Number | Title | Type | ECTS | Hours | Lecturers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
252-0408-00L | Cryptographic Protocols | W | 6 credits | 2V + 2U + 1A | M. Hirt | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | In a cryptographic protocol, a set of parties wants to achieve some common goal, while some of the parties are dishonest. The most prominent example of a cryptographic protocol is multi-party computation, where the parties compute an arbitrary (but fixed) function of their inputs, while maintaining the secrecy of the inputs and the correctness of the outputs even if some of the parties try to cheat. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | To know and understand a selection of cryptographic protocols and to be able to analyze and prove their security and efficiency. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The selection of considered protocols varies. Currently, we consider multi-party computation, secret-sharing, broadcast and Byzantine agreement. We look at both the synchronous and the asynchronous communication model, and focus on simple protocols as well as on highly-efficient protocols. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
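One of the listed topics, secret-sharing, can be sketched concretely; below is a minimal Shamir secret-sharing example over a prime field (toy parameters, not one of the course's protocols):

```python
import random

# Shamir secret sharing over Z_P: the secret is the constant term of a
# random degree-(t-1) polynomial; any t shares determine it, fewer
# reveal nothing.
P = 2**61 - 1  # a Mersenne prime

def share(secret, t, n):
    """Split secret into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        y = 0
        for c in reversed(coeffs):  # Horner's rule
            y = (y * x + c) % P
        return y
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation of the polynomial at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if j != i:
                num = (num * (-xj)) % P
                den = (den * (xi - xj)) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret
```

For example, `share(42, 3, 5)` yields five shares of which any three reconstruct 42 via `reconstruct`.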
Lecture notes | We provide handouts of the slides. For some of the topics, we also provide papers and/or lecture notes. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | A basic understanding of fundamental cryptographic concepts (as taught for example in the course Information Security) is useful, but not required. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
252-1424-00L | Models of Computation | W | 6 credits | 2V + 2U + 1A | M. Cook | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course surveys many different models of computation: Turing Machines, Cellular Automata, Finite State Machines, Graph Automata, Circuits, Tilings, Lambda Calculus, Fractran, Chemical Reaction Networks, Hopfield Networks, String Rewriting Systems, Tag Systems, Diophantine Equations, Register Machines, Primitive Recursive Functions, and more. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The goal of this course is to become acquainted with a wide variety of models of computation, to understand how models help us to understand the modeled systems, and to be able to develop and analyze models appropriate for new systems. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | This course surveys many different models of computation: Turing Machines, Cellular Automata, Finite State Machines, Graph Automata, Circuits, Tilings, Lambda Calculus, Fractran, Chemical Reaction Networks, Hopfield Networks, String Rewriting Systems, Tag Systems, Diophantine Equations, Register Machines, Primitive Recursive Functions, and more. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
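One of the less familiar models listed, Fractran, fits in a few lines; the sketch below interprets a program as a list of fractions and demonstrates Conway's adder [3/2], which rewrites 2^a * 3^b into 3^(a+b):

```python
from fractions import Fraction

# FRACTRAN interpreter: at each step, multiply the state n by the
# first fraction that yields an integer; halt when none applies.
def run_fractran(program, n, max_steps=10_000):
    n = Fraction(n)
    for _ in range(max_steps):
        for f in program:
            m = n * f
            if m.denominator == 1:
                n = m
                break
        else:
            break  # no fraction applied: halt
    return int(n)

# Conway's adder: starting from 2^a * 3^b, the program [3/2]
# halts at 3^(a+b). Here a = 3, b = 2.
result = run_fractran([Fraction(3, 2)], 2**3 * 3**2)
```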
261-5110-00L | Optimization for Data Science | W | 10 credits | 3V + 2U + 4A | B. Gärtner, N. He | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course provides an in-depth theoretical treatment of optimization methods that are relevant in data science. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Understanding the guarantees and limits of relevant optimization methods used in data science. Learning theoretical paradigms and techniques to deal with optimization problems arising in data science. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | This course provides an in-depth theoretical treatment of classical and modern optimization methods that are relevant in data science. After a general discussion about the role that optimization has in the process of learning from data, we give an introduction to the theory of (convex) optimization. Based on this, we present and analyze algorithms in the following four categories: first-order methods (gradient and coordinate descent, Frank-Wolfe, subgradient and mirror descent, stochastic and incremental gradient methods); second-order methods (Newton and quasi Newton methods); non-convexity (local convergence, provable global convergence, cone programming, convex relaxations); min-max optimization (extragradient methods). The emphasis is on the motivations and design principles behind the algorithms, on provable performance bounds, and on the mathematical tools and techniques to prove them. The goal is to equip students with a fundamental understanding about why optimization algorithms work, and what their limits are. This understanding will be of help in selecting suitable algorithms in a given application, but providing concrete practical guidance is not our focus. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
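As a minimal illustration of the first-order methods discussed above, the following runs plain gradient descent with step size 1/L on a random strongly convex quadratic (an illustrative sketch under invented data, not course material):

```python
import numpy as np

# Gradient descent on f(x) = 0.5 x^T A x - b^T x with A positive
# definite; step size 1/L (L = largest eigenvalue) guarantees
# convergence to the unique minimizer A^{-1} b.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + np.eye(5)          # symmetric positive definite
b = rng.standard_normal(5)

L = np.linalg.eigvalsh(A).max()  # smoothness constant
x = np.zeros(5)
for _ in range(2000):
    grad = A @ x - b
    x = x - grad / L

x_star = np.linalg.solve(A, b)   # exact minimizer for comparison
```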
Prerequisites / Notice | A solid background in analysis and linear algebra; some background in theoretical computer science (computational complexity, analysis of algorithms); the ability to understand and write mathematical proofs. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
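The first-order methods listed in the course content can be illustrated by plain gradient descent on a strongly convex quadratic. A minimal sketch, not course material; the matrix, step size, and iteration count are made-up toy choices:

```python
# Gradient descent on f(x) = 0.5 * x^T A x - b^T x, whose unique minimizer
# solves A x = b. Toy 2x2 instance; step size chosen below 2/L for convergence.

def grad_descent(A, b, step, iters):
    """Minimize 0.5*x'Ax - b'x for a symmetric positive definite 2x2 A."""
    x = [0.0, 0.0]
    for _ in range(iters):
        # gradient of f at x is A x - b
        g = [sum(A[i][j] * x[j] for j in range(2)) - b[i] for i in range(2)]
        x = [x[i] - step * g[i] for i in range(2)]
    return x

A = [[3.0, 1.0], [1.0, 2.0]]   # symmetric positive definite
b = [1.0, 1.0]
x = grad_descent(A, b, step=0.25, iters=200)
# x converges to the solution of A x = b, i.e. (0.2, 0.4)
```

With eigenvalues roughly 1.38 and 3.62, the step 0.25 lies below 2/L, so the iterates contract geometrically to the minimizer.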
263-4400-00L | Advanced Graph Algorithms and Optimization | W | 10 credits | 3V + 3U + 3A | R. Kyng, M. Probst | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course will cover a number of advanced topics in optimization and graph algorithms. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The course will take students on a deep dive into modern approaches to graph algorithms using convex optimization techniques. By studying convex optimization through the lens of graph algorithms, students should develop a deeper understanding of fundamental phenomena in optimization. The course will cover some traditional discrete approaches to various graph problems, especially flow problems, and then contrast these approaches with modern, asymptotically faster methods based on combining convex optimization with spectral and combinatorial graph theory. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Students should leave the course understanding key concepts in optimization such as first and second-order optimization, convex duality, multiplicative weights and dual-based methods, acceleration, preconditioning, and non-Euclidean optimization. Students will also be familiarized with central techniques in the development of graph algorithms in the past 15 years, including graph decomposition techniques, sparsification, oblivious routing, and spectral and combinatorial preconditioning. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | This course is targeted toward Master's and doctoral students with an interest in theoretical computer science. Students should be comfortable with the design and analysis of algorithms, probability, and linear algebra. Having passed the course Algorithms, Probability, and Computing (APC) is highly recommended, but not formally required. If you are not sure whether you are ready for this class, please consult the instructor. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
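One of the optimization tools named in the course content, multiplicative weights, can be sketched in a few lines as the hedge algorithm over experts. A toy illustration only; the losses, learning rate, and function name are made up:

```python
# Multiplicative weights update over n experts: each round, play the
# normalized weights, then shrink each expert's weight by its loss.

def mwu(losses, eta=0.5):
    """losses: list of rounds, each a list of per-expert losses in [0, 1]."""
    n = len(losses[0])
    w = [1.0] * n
    total = 0.0
    for round_losses in losses:
        W = sum(w)
        p = [wi / W for wi in w]                    # play normalized weights
        total += sum(pi * li for pi, li in zip(p, round_losses))
        w = [wi * (1 - eta * li) for wi, li in zip(w, round_losses)]
    return total, w

# expert 0 is always right (loss 0), expert 1 always wrong (loss 1)
losses = [[0.0, 1.0]] * 20
total, w = mwu(losses)
# the algorithm's cumulative loss stays bounded, close to the best expert's 0
```

The bad expert's weight decays geometrically, so the algorithm's regret against the best expert stays bounded, which is the phenomenon the general analysis quantifies.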
263-4508-00L | Algorithmic Foundations of Data Science | W | 10 credits | 3V + 2U + 4A | D. Steurer | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course provides rigorous theoretical foundations for the design and mathematical analysis of efficient algorithms that can solve fundamental tasks relevant to data science. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | We consider various statistical models for basic data-analytical tasks, e.g., (sparse) linear regression, principal component analysis, matrix completion, community detection, and clustering. Our goal is to design efficient (polynomial-time) algorithms that achieve the strongest possible (statistical) guarantees for these models. Toward this goal we learn about a wide range of mathematical techniques from convex optimization, linear algebra (especially, spectral theory and tensors), and high-dimensional statistics. We also incorporate adversarial (worst-case) components into our models as a way to reason about robustness guarantees for the algorithms we design. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Strengths and limitations of efficient algorithms in (robust) statistical models for the following (tentative) list of data analysis tasks: - (sparse) linear regression - principal component analysis and matrix completion - clustering and Gaussian mixture models - community detection | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | To be provided during the semester | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | High-Dimensional Statistics: A Non-Asymptotic Viewpoint, by Martin J. Wainwright | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Mathematical and algorithmic maturity at least at the level of the course "Algorithms, Probability, and Computing". Important: This course was created after a reorganization of the course "Optimization for Data Science" (ODS). A significant portion of its material was previously taught as part of ODS. Consequently, it is not possible to earn credit points for both this course and ODS as offered in 2018-2021. This restriction does not apply to ODS as offered in 2022 or later; in that case, credit points can be earned for both courses. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
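The spectral techniques behind principal component analysis, one of the tasks listed in the course content, reduce in the simplest case to power iteration for the top eigenvector. A minimal stdlib sketch on a made-up 2x2 matrix, not course material:

```python
# Power iteration: repeatedly apply a symmetric matrix and renormalize;
# the iterate converges to the leading eigenvector (the top principal
# component when M is a covariance matrix).

import math

def power_iteration(M, iters=100):
    n = len(M)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    # Rayleigh quotient approximates the top eigenvalue
    Mv = [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    lam = sum(v[i] * Mv[i] for i in range(n))
    return lam, v

M = [[2.0, 1.0], [1.0, 2.0]]        # eigenvalues 3 and 1
lam, v = power_iteration(M)
# lam is close to 3, v close to (1/sqrt(2), 1/sqrt(2))
```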
263-4509-00L | Complex Network Models | W | 5 credits | 2V + 2A | J. Lengler | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Complex network models are random graphs that feature one or several properties observed in real-world networks (e.g., social networks, internet graph, www). Depending on the application, different properties are relevant, and different complex network models are useful. This course gives an overview over some relevant models and the properties they do and do not cover. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The students become familiar with a portfolio of network models and know their features and shortcomings. For a given application, they can identify the relevant properties and select an appropriate network model. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Network models: Erdős-Rényi random graphs, Chung-Lu graphs, configuration model, Kleinberg model, geometric inhomogeneous random graphs. Properties: degree distribution, structure of giant and smaller components, clustering coefficient, small-world properties, community structures, weak ties | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | The script is available on Moodle or at https://as.inf.ethz.ch/people/members/lenglerj/CompNetScript.pdf and will be updated during the semester. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Latora, Nicosia, Russo: "Complex Networks: Principles, Methods and Applications" van der Hofstad: "Random Graphs and Complex Networks. Volume 1" | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | The students must be familiar with the basics of graph theory and of probability theory (e.g. linearity of expectation, inequalities of Markov, Chebyshev, Chernoff). The course "Randomized Algorithms and Probabilistic Methods" is helpful, but not required. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
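The first model in the course content, the Erdős-Rényi random graph G(n, p), is simple enough to sample directly: each possible edge appears independently with probability p. A toy sketch with made-up parameters, not course code:

```python
# Sampling G(n, p) and checking that the average degree concentrates
# around its expectation p*(n-1).

import itertools
import random

def gnp(n, p, rng):
    """Return the edge list of one sample of G(n, p)."""
    return [(u, v) for u, v in itertools.combinations(range(n), 2)
            if rng.random() < p]

rng = random.Random(0)
n, p = 1000, 0.01
edges = gnp(n, p, rng)

deg = [0] * n
for u, v in edges:
    deg[u] += 1
    deg[v] += 1

avg = sum(deg) / n
# expected average degree is p*(n-1), here about 10
```

For large n with p = c/n, the degree distribution is approximately Poisson(c), one of the properties (and shortcomings) the course compares against real networks.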
263-4510-00L | Introduction to Topological Data Analysis | W | 8 credits | 3V + 2U + 2A | P. Schnider | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Topological Data Analysis (TDA) is a relatively new subfield of computer sciences, which uses techniques from algebraic topology and computational geometry and topology to analyze and quantify the shape of data. This course will introduce the theoretical foundations of TDA. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The goal is to make students familiar with the fundamental concepts, techniques and results in TDA. At the end of the course, students should be able to read and understand current research papers and have the necessary background knowledge to apply methods from TDA to other projects. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Mathematical background (Topology, Simplicial complexes, Homology), Persistent Homology, Complexes on point clouds (Čech complexes, Vietoris-Rips complexes, Delaunay complexes, Witness complexes), the TDA pipeline, Reeb Graphs, Mapper | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Main reference: Tamal K. Dey, Yusu Wang: Computational Topology for Data Analysis, 2021, https://www.cs.purdue.edu/homes/tamaldey/book/CTDAbook/CTDAbook.html Other references: Herbert Edelsbrunner, John Harer: Computational Topology: An Introduction, American Mathematical Society, 2010, https://bookstore.ams.org/mbk-69 Gunnar Carlsson, Mikael Vejdemo-Johansson: Topological Data Analysis with Applications, Cambridge University Press, 2021 Robert Ghrist: Elementary Applied Topology, 2014, https://www2.math.upenn.edu/~ghrist/notes.html Allen Hatcher: Algebraic Topology, Cambridge University Press, 2002, https://pi.math.cornell.edu/~hatcher/AT/ATpage.html | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | The course assumes knowledge of discrete mathematics, algorithms and data structures and linear algebra, as supplied in the first semesters of Bachelor Studies at ETH. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
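The Vietoris-Rips complexes in the course content have a one-line definition: a simplex is included whenever all pairwise distances of its vertices are at most the scale r. A hypothetical toy sketch (dimension at most 2, made-up point cloud), not course material:

```python
# Building the Vietoris-Rips complex of a small point cloud by brute force.

import itertools
import math

def vietoris_rips(points, r, max_dim=2):
    n = len(points)
    simplices = [(i,) for i in range(n)]            # vertices
    for k in range(2, max_dim + 2):                 # edges, triangles, ...
        for s in itertools.combinations(range(n), k):
            if all(math.dist(points[a], points[b]) <= r
                   for a, b in itertools.combinations(s, 2)):
                simplices.append(s)
    return simplices

pts = [(0, 0), (1, 0), (0, 1), (5, 5)]
cplx = vietoris_rips(pts, r=1.5)
# the first three points span a filled triangle; (5, 5) stays an isolated vertex
```

Varying r and tracking how such complexes gain and lose holes is exactly what persistent homology, the core topic above, formalizes.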
263-4656-00L | Digital Signatures | W | 5 credits | 2V + 2A | D. Hofheinz | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Digital signatures as one central cryptographic building block. Different security goals and security definitions for digital signatures, followed by a variety of popular and fundamental signature schemes with their security analyses. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The student knows a variety of techniques to construct and analyze the security of digital signature schemes. This includes modularity as a central tool of constructing secure schemes, and reductions as a central tool to proving the security of schemes. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | We will start with several definitions of security for signature schemes, and investigate the relations among them. We will proceed to generic (but inefficient) constructions of secure signatures, and then move on to a number of efficient schemes based on concrete computational hardness assumptions. On the way, we will get to know paradigms such as hash-then-sign, one-time signatures, and chameleon hashing as central tools to construct secure signatures. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Jonathan Katz, "Digital Signatures." | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Ideally, students will have taken the D-INFK Bachelors course "Information Security" or an equivalent course at Bachelors level. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
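The one-time signatures and hash-then-sign paradigm mentioned in the course content come together in Lamport's classical construction from a one-way function. A minimal toy sketch (not production code, and not necessarily the scheme treated in the course):

```python
# Lamport one-time signature over SHA-256: the secret key holds two random
# preimages per message bit; signing reveals one preimage per bit of H(msg).

import hashlib
import secrets

def H(b):
    return hashlib.sha256(b).digest()

def keygen():
    sk = [[secrets.token_bytes(32), secrets.token_bytes(32)] for _ in range(256)]
    pk = [[H(s0), H(s1)] for s0, s1 in sk]
    return sk, pk

def bits_of(msg):
    digest = H(msg)                      # hash-then-sign: sign H(msg), not msg
    return [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]

def sign(sk, msg):
    return [sk[i][b] for i, b in enumerate(bits_of(msg))]

def verify(pk, msg, sig):
    return all(H(sig[i]) == pk[i][b] for i, b in enumerate(bits_of(msg)))

sk, pk = keygen()
sig = sign(sk, b"hello")
# verify succeeds for the signed message and fails for any other message
```

Each key pair must sign only one message: two signatures reveal enough preimages to forge, which is why one-time signatures are combined with other tools (e.g. trees of keys) in full schemes.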
272-0300-00L | Algorithmics for Hard Problems This course does not include the Mentored Work Specialised Courses with an Educational Focus in Computer Science A. | W | 5 credits | 2V + 1U + 1A | H.‑J. Böckenhauer, D. Komm | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course unit looks into algorithmic approaches to solving hard problems, particularly moderately exponential-time algorithms and parameterized algorithms. The course is accompanied by a comprehensive reflection upon the significance of the presented approaches for computer science education at high schools. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | To systematically acquire an overview of the methods for solving hard problems. To get deeper knowledge of exact and parameterized algorithms. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | First, the concept of hardness of computation is introduced (repeated for the computer science students). Then some methods for solving hard problems are treated in a systematic way. For each algorithm design method, it is discussed what guarantees it can give and how we pay for the improved efficiency. A special focus lies on moderately exponential-time algorithms and parameterized algorithms. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | Course materials and slides will be provided. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | J. Hromkovic: Algorithmics for Hard Problems, Springer 2004. R. Niedermeier: Invitation to Fixed-Parameter Algorithms, 2006. M. Cygan et al.: Parameterized Algorithms, 2015. F. Fomin et al.: Kernelization, 2019. F. Fomin, D. Kratsch: Exact Exponential Algorithms, 2010. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
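The parameterized algorithms this course covers can be illustrated by the textbook bounded-search-tree algorithm for Vertex Cover: branch on either endpoint of an uncovered edge, giving running time O(2^k · m) for parameter k. A hypothetical sketch, not course code:

```python
# Bounded search tree for Vertex Cover parameterized by solution size k:
# any cover must contain u or v for every edge (u, v), so branch on both.

def vertex_cover(edges, k):
    """Return True iff the graph has a vertex cover of size <= k."""
    if not edges:
        return True
    if k == 0:
        return False                      # edges remain but budget is spent
    u, v = edges[0]
    for w in (u, v):                      # branch: put u, or v, in the cover
        rest = [(a, b) for a, b in edges if a != w and b != w]
        if vertex_cover(rest, k - 1):
            return True
    return False

triangle = [(0, 1), (1, 2), (0, 2)]
# a triangle needs 2 cover vertices: False for k = 1, True for k = 2
```

The search tree has depth at most k and branching factor 2, which is the source of the 2^k bound; the exponential cost is confined to the parameter, not the input size.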
272-0302-00L | Approximation and Online Algorithms Does not take place this semester. | W | 5 credits | 2V + 1U + 1A | D. Komm | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This lecture deals with approximation algorithms for hard optimization problems and algorithmic approaches for solving online problems, as well as the limits of these approaches. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Get a systematic overview of different methods for designing approximation algorithms for hard optimization problems and online problems. Get to know methods for showing the limitations of these approaches. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Approximation algorithms are one of the most successful techniques for attacking hard optimization problems. Here, we study the so-called approximation ratio, i.e., the ratio between the cost of the computed approximate solution and that of an optimal one (which is not efficiently computable). For an online problem, the whole instance is not known in advance; it arrives piecewise, and for every such piece a corresponding part of the definite output must be produced. The quality of an algorithm for such an online problem is measured by the competitive ratio, i.e., the ratio between the cost of the computed solution and the cost of an optimal solution that could be given if the whole input were known in advance. The contents of this lecture are - the classification of optimization problems by the achievable approximation ratio, - systematic methods to design approximation algorithms (e.g., greedy strategies, dynamic programming, linear programming relaxation), - methods to show non-approximability, - classic online problems such as paging or scheduling, and corresponding algorithms, - randomized online algorithms, - design and analysis principles for online algorithms, and - limits of the competitive ratio, and advice complexity as a way to perform a deeper analysis of the complexity of online problems. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | The lecture is based on the following books: J. Hromkovic: Algorithmics for Hard Problems, Springer, 2004 D. Komm: An Introduction to Online Computation: Determinism, Randomization, Advice, Springer, 2016 Additional literature: A. Borodin, R. El-Yaniv: Online Computation and Competitive Analysis, Cambridge University Press, 1998 | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
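A concrete instance of the approximation-ratio notion above is the classic factor-2 algorithm for minimum vertex cover: take all endpoints of a maximal matching. Every matched edge forces at least one endpoint into any cover, so the output is at most twice optimal. A toy sketch with a made-up example graph:

```python
# 2-approximation for minimum vertex cover via a greedily built
# maximal matching: add both endpoints of every edge not yet covered.

def matching_cover(edges):
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.add(u)
            cover.add(v)
    return cover

# star K_{1,4}: the optimum cover is {0} (size 1)
star = [(0, 1), (0, 2), (0, 3), (0, 4)]
cover = matching_cover(star)
# the algorithm returns {0, 1}, size 2, within the factor-2 guarantee
```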
401-3052-10L | Graph Theory | W | 10 credits | 4V + 1U | B. Sudakov | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Basics, trees, Cayley's formula, matrix tree theorem, connectivity, theorems of Mader and Menger, Eulerian graphs, Hamilton cycles, theorems of Dirac, Ore, Erdős-Chvátal, matchings, theorems of Hall, König, Tutte, planar graphs, Euler's formula, Kuratowski's theorem, graph colorings, Brooks' theorem, 5-colorings of planar graphs, list colorings, Vizing's theorem, Ramsey theory, Turán's theorem | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | The students will get an overview over the most fundamental questions concerning graph theory. We expect them to understand the proof techniques and to use them autonomously on related problems. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | The lecture will be given on the blackboard only. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | West, D.: "Introduction to Graph Theory" Diestel, R.: "Graph Theory" Further literature links will be provided in the lecture. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Students are expected to have a mathematical background and should be able to write rigorous proofs. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
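Cayley's formula from the course abstract states that there are n^(n-2) labeled trees on n vertices, and the Prüfer correspondence proves it bijectively. A small stdlib check of the bijection for n = 4, offered as an illustration rather than course material:

```python
# Decode every Prüfer sequence in {0..n-1}^(n-2); the decoded edge sets
# are pairwise distinct, so there are exactly n^(n-2) labeled trees.

import itertools

def prufer_decode(seq, n):
    degree = [1] * n
    for x in seq:
        degree[x] += 1
    edges = []
    for x in seq:
        leaf = min(i for i in range(n) if degree[i] == 1)  # smallest leaf
        edges.append(tuple(sorted((leaf, x))))
        degree[leaf] -= 1
        degree[x] -= 1
    last = [i for i in range(n) if degree[i] == 1]         # two vertices remain
    edges.append(tuple(sorted(last)))
    return frozenset(edges)

n = 4
trees = {prufer_decode(seq, n)
         for seq in itertools.product(range(n), repeat=n - 2)}
# 4**2 == 16 distinct trees, matching Cayley's formula for n = 4
```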
401-3902-21L | Network & Integer Optimization: From Theory to Application | W | 6 credits | 3G | R. Zenklusen | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course covers various topics in Network and (Mixed-)Integer Optimization. It starts with a rigorous study of algorithmic techniques for some network optimization problems (with a focus on matching problems) and moves to key aspects of how to attack various optimization settings through well-designed (Mixed-)Integer Programming formulations. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Our goal is for students both to gain a solid foundational understanding of key network algorithms and to learn how to effectively employ (Mixed-)Integer Programming formulations, techniques, and solvers to tackle a wide range of discrete optimization problems. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Key topics include: - Matching problems; - Integer Programming techniques and models; - Extended formulations and strong problem formulations; - Solver techniques for (Mixed-)Integer Programs; - Decomposition approaches. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | - Bernhard Korte, Jens Vygen: Combinatorial Optimization. 6th edition, Springer, 2018. - Alexander Schrijver: Combinatorial Optimization: Polyhedra and Efficiency. Springer, 2003. This work has 3 volumes. - Vanderbeck François, Wolsey Laurence: Reformulations and Decomposition of Integer Programs. Chapter 13 in: 50 Years of Integer Programming 1958-2008. Springer, 2010. - Alexander Schrijver: Theory of Linear and Integer Programming. John Wiley, 1986. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Solid background in linear algebra. Preliminary knowledge of Linear Programming is ideal but not a strict requirement. Prior attendance of the course Linear & Combinatorial Optimization is a plus. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
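The matching problems listed in the course content rest on the augmenting-path idea, which in the bipartite case fits in a few lines. A hypothetical sketch (Kuhn's algorithm, with a made-up example instance), not course code:

```python
# Maximum bipartite matching via augmenting paths: try to match each left
# vertex, recursively re-matching previously matched left vertices.

def max_bipartite_matching(adj, n_left, n_right):
    """adj[u] lists the right-side neighbours of left vertex u."""
    match_right = [-1] * n_right          # match_right[v] = matched left vertex

    def augment(u, seen):
        for v in adj[u]:
            if v in seen:
                continue
            seen.add(v)
            if match_right[v] == -1 or augment(match_right[v], seen):
                match_right[v] = u        # (re)assign v along the path
                return True
        return False

    size = 0
    for u in range(n_left):
        if augment(u, set()):
            size += 1
    return size, match_right

# left vertices {0,1,2}, right vertices {0,1}: an augmenting step is needed
adj = [[0], [0, 1], [1]]
size, _ = max_bipartite_matching(adj, 3, 2)
# size == 2: the right side is perfectly matched
```

The same structure, phrased as a flow or an integer program with an integral LP relaxation, is one bridge between the network and integer-programming halves of the course.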
402-0448-01L | Quantum Information Processing I: Concepts This theory part QIP I together with the experimental part 402-0448-02L QIP II (both offered in the Spring Semester) combine into the core course in experimental physics "Quantum Information Processing" (10 ECTS credits in total). This applies to the Master's degree programme in Physics. | W | 5 credits | 2V + 1U | J. Home | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | The course covers the key concepts of quantum information processing, including quantum algorithms which give the quantum computer the power to compute problems outside the reach of any classical supercomputer. Key concepts such as quantum error correction are discussed in detail. They provide fundamental insights into the nature of quantum states and measurements. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | By the end of the course, students are able to explain the basic mathematical formalism of quantum mechanics and apply it to quantum information processing problems. They are able to adapt and apply these concepts and methods to analyse and discuss quantum algorithms and other quantum information-processing protocols. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The topics covered in the course will include quantum circuits, gate decomposition and universal sets of gates, efficiency of quantum circuits, quantum algorithms (Shor, Grover, Deutsch-Jozsa, ...), quantum error correction, fault-tolerant designs, and quantum simulation. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Lecture notes | Will be provided. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Literature | Quantum Computation and Quantum Information Michael Nielsen and Isaac Chuang Cambridge University Press | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | A good understanding of finite dimensional linear algebra is recommended. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
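The simplest of the quantum algorithms listed above, Deutsch's algorithm (the n = 1 case of Deutsch-Jozsa), can be simulated directly on a two-qubit state vector. A hypothetical toy sketch, stdlib only, not course material:

```python
# Deutsch's algorithm: one oracle query decides whether f: {0,1} -> {0,1}
# is constant or balanced. We track real amplitudes over basis states |x, y>.

import math

def deutsch(f):
    """Return True iff f is constant, using a single oracle query."""
    s = 1 / math.sqrt(2)
    # start in |0>|1>, then apply H to both qubits
    amp = {(x, y): s * s * (-1 if y else 1) for x in (0, 1) for y in (0, 1)}
    # oracle U_f: |x, y> -> |x, y XOR f(x)>
    amp = {(x, y ^ f(x)): a for (x, y), a in amp.items()}
    # Hadamard on the first qubit: H|x> = (|0> + (-1)^x |1>) / sqrt(2)
    out = {(x, y): 0.0 for x in (0, 1) for y in (0, 1)}
    for (x, y), a in amp.items():
        out[(0, y)] += s * a
        out[(1, y)] += s * a * (-1 if x else 1)
    # measuring the first qubit yields 0 with certainty iff f is constant
    p0 = out[(0, 0)] ** 2 + out[(0, 1)] ** 2
    return p0 > 0.5

# constant f -> True; balanced f -> False, with one query either way
```

Classically, distinguishing the two cases requires evaluating f twice; the interference after the final Hadamard does it in one query.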
Interfocus Courses | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Number | Title | Type | ECTS | Hours | Lecturers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-0007-00L | Advanced Systems Lab Only for Master's students; otherwise, special permission from the study administration of D-INFK is required. | O | 8 credits | 3V + 2U + 2A | M. Püschel | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This course introduces the student to the foundations and state-of-the-art techniques in developing high performance software for mathematical functionality occurring in various fields in computer science. The focus is on optimizing for a single core and includes optimizing for the memory hierarchy, for special instruction sets, and the possible use of automatic performance tuning. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Software performance (i.e., runtime) arises through the complex interaction of algorithm, its implementation, the compiler used, and the microarchitecture the program is run on. The first goal of the course is to provide the student with an understanding of this "vertical" interaction, and hence software performance, for mathematical functionality. The second goal is to teach a systematic strategy how to use this knowledge to write fast software for numerical problems. This strategy will be trained in several homeworks and a semester-long group project. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The fast evolution and increasing complexity of computing platforms pose a major challenge for developers of high performance software for engineering, science, and consumer applications: it becomes increasingly harder to harness the available computing power. Straightforward implementations may lose as much as one or two orders of magnitude in performance. On the other hand, creating optimal implementations requires the developer to have an understanding of algorithms, capabilities and limitations of compilers, and the target platform's architecture and microarchitecture. This interdisciplinary course introduces the student to the foundations and state-of-the-art techniques in high performance mathematical software development using important functionality such as matrix operations, transforms, filters, and others as examples. The course will explain how to optimize for the memory hierarchy, take advantage of special instruction sets, and other details of current processors that require optimization. The concept of automatic performance tuning is introduced. The focus is on optimization for a single core; thus, the course complements others on parallel and distributed computing. Finally a general strategy for performance analysis and optimization is introduced that the students will apply in group projects that accompany the course. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Solid knowledge of the C programming language and matrix algebra. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
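One memory-hierarchy optimization of the kind this course studies is loop blocking (tiling) for matrix multiplication. The sketch below only illustrates the loop structure; in Python there is no real cache benefit, and the block size and test sizes are made-up choices. In the course itself such transformations would be applied in C:

```python
# Blocked matrix multiply: process BxB tiles so the working set of each
# innermost loop nest can stay cache-resident. Result equals the naive product.

import random

def matmul_blocked(X, Y, n, B=4):
    C = [[0.0] * n for _ in range(n)]
    for ii in range(0, n, B):
        for jj in range(0, n, B):
            for kk in range(0, n, B):
                for i in range(ii, min(ii + B, n)):
                    for j in range(jj, min(jj + B, n)):
                        s = C[i][j]
                        for k in range(kk, min(kk + B, n)):
                            s += X[i][k] * Y[k][j]
                        C[i][j] = s
    return C

def matmul_naive(X, Y, n):
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

rng = random.Random(1)
n = 6                                      # exercises partial tiles (6 % 4 != 0)
X = [[rng.random() for _ in range(n)] for _ in range(n)]
Y = [[rng.random() for _ in range(n)] for _ in range(n)]
ok = all(abs(a - b) < 1e-9
         for ra, rb in zip(matmul_blocked(X, Y, n), matmul_naive(X, Y, n))
         for a, b in zip(ra, rb))
```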
263-0008-00L | Computational Intelligence Lab Only for Master's students; otherwise, special permission from the study administration of D-INFK is required. | O | 8 credits | 2V + 2U + 3A | T. Hofmann | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | This laboratory course teaches fundamental concepts in computational science and machine learning with a special emphasis on matrix factorization and representation learning. The class covers techniques like dimension reduction, data clustering, sparse coding, and deep learning as well as a wide spectrum of related use cases and applications. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Students acquire fundamental theoretical concepts and methodologies from machine learning and how to apply these techniques to build intelligent systems that solve real-world problems. They learn to successfully develop solutions to application problems by following the key steps of modeling, algorithm design, implementation and experimental validation. This lab course has a strong focus on practical assignments. Students work in groups of three to four people, to develop solutions to three application problems: 1. Collaborative filtering and recommender systems, 2. Text sentiment classification, and 3. Road segmentation in aerial imagery. For each of these problems, students submit their solutions to an online evaluation and ranking system, and get feedback in terms of numerical accuracy and computational speed. In the final part of the course, students combine and extend one of their previous promising solutions, and write up their findings in an extended abstract in the style of a conference paper. (Disclaimer: The offered projects may be subject to change from year to year.) | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | see course description | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Elective Courses Students can individually choose from the entire Master's course offerings in the area of Computer Science (or a closely related field) at ETH Zurich, EPF Lausanne, the University of Zurich and - but only with the consent of the Director of Studies - all other Swiss universities. | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Number | Title | Type | ECTS | Hours | Lecturers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
252-0820-00L | Information Technology in Practice | W | 5 credits | 2V + 1U + 1A | M. Brandis | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | The course is designed to provide students with an understanding of "real-life" computer science challenges in business settings and teach them how to address these. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Students will learn important considerations of companies when applying information technology in practice, including costs, economic value and risks of information technology use, or impact of information technology on business strategy and vice versa. They will get insight into how companies have used or are using information technology to be successful. Students will also learn how to assess information technology decisions from different viewpoints, including technical experts, IT managers, business users, and business top managers. The course will equip participants to understand the role computer science and information technology plays in different companies and to contribute to respective decisions as they enter into practice. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | The course consists of multiple lectures on economics of information technology, business and IT strategy, and how they are interlinked, and a set of relevant case studies. They address how companies become more successful using information technology, how bad information technology decisions can hurt them, and they look into a number of current challenges companies face regarding their information technology. The cases are taken both from documented international case studies as well as from Swiss companies participating in the course. The learned concepts will be applied in exercises, which form a key component of the course. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | The course builds on the earlier "Case Studies from Practice" course, with a stronger focus on learning key concepts of information technology use in practice and applying them in exercises, and only a limited number of case studies. The course prepares students for participation in the subsequent "Case Studies from Practice Seminar", which provides deeper insights into actual cases and how to solve them. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-0600-00L | Research in Computer Science | W | 5 credits | 11A | Professors | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | Independent project work under the supervision of a Computer Science Professor. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | Project done under supervision of a professor in the Department of Computer Science. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Only students who fulfill one of the following requirements are allowed to begin a research project: a) 1 lab (interfocus course) and 1 core focus course b) 2 core focus courses c) 2 labs (interfocus courses) A task description must be submitted to the Student Administration Office at the beginning of the work. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-5055-00L | Talent Kick: From Student to Entrepreneur | W | 3 credits | 2G | V. Gropengiesser, A. Ilic | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | The transfer of the latest research results into scalable start-ups creates the prerequisite for successful innovation. An entrepreneurial spirit and mindset enable young leaders to navigate complex environments and bring their research into practice. Studies are the best time to develop an entrepreneurial mindset and explore the entrepreneurial career path. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | This seminar helps aspiring student/research entrepreneurs to gain hands-on entrepreneurial experience on the path from research into practice. The examples and cases will be primarily from software, AI, and other deep-tech ventures. The seminar was created with the support of ETH AI Center and University of St. Gallen and received competitive funding from the ETH Board, Fondation Botnar, Gebert Rüf Foundation, as well as support from the ETH Foundation. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | After attending this course, students will be able to: ● Explain the importance of, and tools for, forming successful interdisciplinary teams ● Structure customer calls and sales pitch decks ● Build their first prototypes and MVPs ● Find the right markets and customers to bring their research into practice ● Deal with the complexity of bringing innovative/novel products to market ● Develop a customer-centric business strategy ● Convince first supporters, incl. entrepreneurial mentors, first investors, etc. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | The course is practically oriented and features guest speakers from leading start-ups. The course embraces a unique perspective combining technology and investor thinking. The seminar is structured around ten days. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Science in Perspective | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Note that no more than six credits can be accredited in this category. | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
» Recommended Science in Perspective (Type B) for D-INFK | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
» see Science in Perspective: Language Courses ETH/UZH | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
» see Science in Perspective: Type A: Enhancement of Reflection Capability | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Master's Thesis | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Number | Title | Type | ECTS | Hours | Lecturers | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
263-0800-00L | Master's Thesis Only students who fulfil the following criteria are allowed to begin their Master's thesis: a. successful completion of the Bachelor's programme; b. fulfilment of any additional requirements necessary for admission to the Master's programme; c. "Inter focus courses" (16 credits) completed; d. "Focus courses" (26 credits) completed, of which at least 16 credits must come from the Major core courses; e. "Practical work" (at least 8 credits) completed; f. in total, besides the Master's thesis, no more than 8 credits may be missing. | O | 30 credits | 64D | Professors | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Abstract | The Master's thesis concludes the study programme. The thesis should demonstrate the student's ability to work independently in a structured and scientific manner. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Learning objective | To work independently and to produce a scientifically structured piece of work under the supervision of a Computer Science professor. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Content | Independent project work supervised by a Computer Science professor. The duration of the Master's thesis is 28 weeks (full-time): 26 weeks of actual working time plus 2 weeks to compensate for public holidays, sick days and other short-term absences. | ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Prerequisites / Notice | Supervisor must be a professor at D-INFK or affiliated, see https://inf.ethz.ch/people/faculty.html |