Catalogue data in Spring Semester 2020

Computer Science Master
Focus Courses
Focus Courses in Computational Science
Focus Core Courses Computational Science
Number | Title | Type | ECTS | Hours | Lecturers
401-3632-00L | Computational Statistics | W | 8 credits | 3V + 1U | M. H. Maathuis
Abstract: We discuss modern statistical methods for data analysis, including methods for data exploration, prediction and inference. We pay attention to algorithmic aspects, theoretical properties and practical considerations. The class is hands-on and methods are applied using the statistical programming language R.
Objective: The student obtains an overview of modern statistical methods for data analysis, including their algorithmic aspects and theoretical properties. The methods are applied using the statistical programming language R.
Content: See the class website.
Prerequisites / Notice: At least one semester of (basic) probability and statistics.

Programming experience is helpful but not required.
Focus Elective Courses Computational Science
Number | Title | Type | ECTS | Hours | Lecturers
252-0526-00L | Statistical Learning Theory | W | 7 credits | 3V + 2U + 1A | J. M. Buhmann, C. Cotrini Jimenez
Abstract: The course covers advanced methods of statistical learning:

- Variational methods and optimization.
- Deterministic annealing.
- Clustering for diverse types of data.
- Model validation by information theory.
Objective: The course surveys recent methods of statistical learning. The fundamentals of machine learning, as presented in the courses "Introduction to Machine Learning" and "Advanced Machine Learning", are expanded from the perspective of statistical learning.
Content:
- Variational methods and optimization. We consider optimization approaches for problems where the optimizer is a probability distribution. We will discuss concepts like maximum entropy, information bottleneck, and deterministic annealing.

- Clustering. This is the problem of sorting data into groups without using training samples. We discuss alternative notions of "similarity" between data points and adequate optimization procedures.

- Model selection and validation. This refers to the question of how complex the chosen model should be. In particular, we present an information theoretic approach for model validation.

- Statistical physics models. We discuss approaches for approximately optimizing large systems, which originate in statistical physics (free energy minimization applied to spin glasses and other models). We also study sampling methods based on these models.
Lecture notes: A draft of the script will be provided. Lecture slides will be made available.
Literature: Hastie, Tibshirani, Friedman: The Elements of Statistical Learning. Springer, 2001.

L. Devroye, L. Györfi, and G. Lugosi: A Probabilistic Theory of Pattern Recognition. Springer, New York, 1996.
Prerequisites / Notice: Knowledge of machine learning ("Introduction to Machine Learning" and/or "Advanced Machine Learning").
Basic knowledge of statistics.
261-5120-00L | Machine Learning for Health Care | W | 5 credits | 3P + 1A | G. Rätsch, J. Vogt, V. Boeva
Number of participants limited to 150.
Abstract: The course will review the most relevant methods and applications of Machine Learning in Biomedicine, discuss the main challenges they present and their current technical solutions.
Objective: In recent years, we have observed rapid growth in the field of Machine Learning (ML), mainly due to improvements in ML algorithms, increased data availability and reduced computing costs. This growth is having a profound impact on biomedical applications, where the great variety of tasks and data types enables us to benefit from ML algorithms in many different ways. In this course we will review the most relevant methods and applications of ML in biomedicine, discuss the main challenges they present and their current technical solutions.
Content: The course will consist of four topic clusters that cover the most relevant applications of ML in Biomedicine:
1) Structured time series: Time series of structured data often appear in biomedical datasets, presenting challenges such as variables with different periodicities or conditioning on static data.
2) Medical notes: A vast amount of medical observations is stored in the form of free text; we will analyze strategies for extracting knowledge from it.
3) Medical images: Images are a fundamental piece of information in many medical disciplines. We will study how to train ML algorithms with them.
4) Genomics data: ML in genomics is still an emerging subfield, but given that genomics data are arguably the most extensive and complex datasets that can be found in biomedicine, it is expected that many relevant ML applications will arise in the near future. We will review and discuss current applications and challenges.
Prerequisites / Notice: Data Structures & Algorithms, Introduction to Machine Learning, Statistics/Probability, Programming in Python, Unix Command Line.

Relation to Course 261-5100-00 Computational Biomedicine: This course is a continuation of the previous course with new topics related to medical data and machine learning. The format of Computational Biomedicine II will also be different. It is helpful but not essential to attend Computational Biomedicine before attending Computational Biomedicine II.
263-5300-00L | Guarantees for Machine Learning | W | 5 credits | 2V + 2A | F. Yang
Abstract: This course teaches classical and recent methods in statistics and optimization commonly used to prove theoretical guarantees for machine learning algorithms. The knowledge is then applied in project work that focuses on understanding phenomena in modern machine learning.
Objective: This course is aimed at advanced Master's and doctoral students who want to understand and/or conduct independent research on theory for modern machine learning. For this purpose, students will learn common mathematical techniques from statistical learning theory. In independent project work, they then apply their knowledge and go through the process of critically questioning recently published work, finding relevant research questions and learning how to effectively present research ideas to a professional audience.
Content: This course teaches some classical and recent methods in statistical learning theory aimed at proving theoretical guarantees for machine learning algorithms, including topics in

- concentration bounds, uniform convergence
- high-dimensional statistics (e.g. Lasso)
- prediction error bounds for non-parametric statistics (e.g. in kernel spaces)
- minimax lower bounds
- regularization via optimization

The project work focuses on active theoretical ML research that aims to understand modern phenomena in machine learning, including but not limited to

- how overparameterization could help generalization (interpolating models, linearized NN)
- how overparameterization could help optimization (non-convex optimization, loss landscape)
- complexity measures and approximation-theoretic properties of randomly initialized and trained NN
- generalization of robust learning (adversarial robustness, standard and robust error tradeoff)
- prediction with calibrated confidence (conformal prediction, calibration)
Prerequisites / Notice: It is absolutely necessary for students to have a strong mathematical background (basic real analysis, probability theory, linear algebra) and good knowledge of core concepts in machine learning taught in courses such as "Introduction to Machine Learning" and "Regression"/"Statistical Modelling". It is also helpful to have taken an optimization or approximation theory course. In addition to these prerequisites, this class requires a certain degree of mathematical maturity, including abstract thinking and the ability to understand and write proofs.
Seminar in Computational Science
Number | Title | Type | ECTS | Hours | Lecturers
252-5704-00L | Advanced Methods in Computer Graphics | W | 2 credits | 2S | O. Sorkine Hornung
Number of participants limited to 24.
The deadline for deregistering expires at the end of the second week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar.
Abstract: This seminar covers advanced topics in computer graphics with a focus on the latest research results. Topics include modeling, rendering, visualization, animation, physical simulation, computational photography, and others.
Objective: The goal is to obtain an in-depth understanding of current problems and research topics in the field of computer graphics, as well as to improve presentation and critical analysis skills.
261-5113-00L | Computational Challenges in Medical Genomics | W | 2 credits | 2S | A. Kahles, G. Rätsch
Number of participants limited to 20.
Abstract: This seminar discusses recent relevant contributions to the fields of computational genomics, algorithmic bioinformatics, statistical genetics and related areas. Each participant will hold a presentation and lead the subsequent discussion.
Objective: Preparing and holding a scientific presentation in front of peers is a central part of working in the scientific domain. In this seminar, the participants will learn how to efficiently summarize the relevant parts of a scientific publication, critically reflect on its contents, and present it to an audience. The skills necessary to successfully present the key points of existing research work are the same as those needed to communicate one's own research ideas.
In addition to holding a presentation, each student will both contribute to as well as lead a discussion section on the topics presented in the class.
Content: The topics covered in the seminar are related to recent computational challenges that arise from the fields of genomics and biomedicine, including but not limited to genomic variant interpretation, genomic sequence analysis, compressive genomics tasks, single-cell approaches, privacy considerations, statistical frameworks, etc.
Both recently published works contributing novel ideas to the areas mentioned above as well as seminal contributions from the past are amongst the list of selected papers.
Prerequisites / Notice: Knowledge of algorithms and data structures and interest in applications in genomics and computational biomedicine.
Focus Courses in Distributed Systems
Focus Core Courses Distributed Systems
Number | Title | Type | ECTS | Hours | Lecturers
227-0558-00L | Principles of Distributed Computing | W | 7 credits | 2V + 2U + 2A | R. Wattenhofer, M. Ghaffari
Abstract: We study the fundamental issues underlying the design of distributed systems: communication, coordination, fault-tolerance, locality, parallelism, self-organization, symmetry breaking, synchronization, uncertainty. We explore essential algorithmic ideas and lower bound techniques.
Objective: Distributed computing is essential in modern computing and communications systems. Examples are on the one hand large-scale networks such as the Internet, and on the other hand multiprocessors such as your new multi-core laptop. This course introduces the principles of distributed computing, emphasizing the fundamental issues underlying the design of distributed systems and networks: communication, coordination, fault-tolerance, locality, parallelism, self-organization, symmetry breaking, synchronization, uncertainty. We explore essential algorithmic ideas and lower bound techniques, basically the "pearls" of distributed computing. We will cover a fresh topic every week.
Content: Distributed computing models and paradigms, e.g. message passing, shared memory, synchronous vs. asynchronous systems, time and message complexity, peer-to-peer systems, small-world networks, social networks, sorting networks, wireless communication, and self-organizing systems.

Distributed algorithms, e.g. leader election, coloring, covering, packing, decomposition, spanning trees, mutual exclusion, store and collect, arrow, ivy, synchronizers, diameter, all-pairs-shortest-path, wake-up, and lower bounds.
Lecture notes: Available. Our course script is used at dozens of other universities around the world.
Literature: Lecture notes by Roger Wattenhofer. These lecture notes are used at about a dozen different universities around the world.

Distributed Computing: Fundamentals, Simulations and Advanced Topics
Hagit Attiya, Jennifer Welch.
McGraw-Hill Publishing, 1998, ISBN 0-07-709352-6

Introduction to Algorithms
Thomas Cormen, Charles Leiserson, Ronald Rivest.
The MIT Press, 1998, ISBN 0-262-53091-0 or 0-262-03141-8

Dissemination of Information in Communication Networks
Juraj Hromkovic, Ralf Klasing, Andrzej Pelc, Peter Ruzicka, Walter Unger.
Springer-Verlag, Berlin Heidelberg, 2005, ISBN 3-540-00846-2

Introduction to Parallel Algorithms and Architectures: Arrays, Trees, Hypercubes
Frank Thomson Leighton.
Morgan Kaufmann Publishers Inc., San Francisco, CA, 1991, ISBN 1-55860-117-1

Distributed Computing: A Locality-Sensitive Approach
David Peleg.
Society for Industrial and Applied Mathematics (SIAM), 2000, ISBN 0-89871-464-8
Prerequisites / Notice: Interest in algorithmic problems (no particular course needed).
263-3800-00L | Advanced Operating Systems | W | 7 credits | 2V + 2U + 2A | D. Cock, T. Roscoe
Abstract: This course is intended to give students a thorough understanding of design and implementation issues for modern operating systems, with a particular emphasis on the challenges of modern hardware features. We will cover key design issues in implementing an operating system, such as memory management, scheduling, protection, inter-process communication, device drivers, and file systems.
Objective: The goals of the course are to give students:

1. A broader perspective on OS design than that provided by knowledge of Unix or Windows, building on the material in a standard undergraduate operating systems class

2. Practical experience in dealing directly with the concurrency, resource management, and abstraction problems confronting OS designers and implementers

3. A glimpse into future directions for the evolution of OS and computer hardware design
Content: The course is based on practical implementation work, in C and assembly language, and requires solid knowledge of both. The work is mostly carried out in teams of 3-4, using real hardware, and is a mixture of team milestones and individual projects which fit together into a complete system at the end. Emphasis is also placed on a final report which details the complete finished artifact, evaluates its performance, and discusses the choices the team made while building it.
Prerequisites / Notice: The course is based around a milestone-oriented project, where students work in small groups to implement major components of a microkernel-based operating system. The final assessment will be a combination of grades awarded for milestones during the course of the project, a final written report on the work, and a set of test cases run on the final code.
Focus Elective Courses Distributed Systems
Number | Title | Type | ECTS | Hours | Lecturers
252-0312-00L | Ubiquitous Computing | W | 4 credits | 2V + 1A | C. Holz, F. Mattern, S. Mayer
Abstract: Unlike desktop computing, ubiquitous computing occurs anytime and everywhere, using any device, in any location, and in any format. Computers exist in different forms, from watches and phones to refrigerators or pairs of glasses.
Main topics: Smart environments, IoT, mobiles & wearables, context & location, sensing & tracking, computer vision on embedded systems, health monitoring, fabrication.
Objective: Unlike desktop computing, ubiquitous computing occurs anytime and everywhere, using any device, in any location, and in any format. Computers exist in different forms, from watches and phones to refrigerators or pairs of glasses.
Main topics: Smart environments, IoT, mobiles & wearables, context & location, sensing & tracking, computer vision on embedded systems, health monitoring, fabrication.
Lecture notes: Copies of slides will be made available.
Literature: Will be provided in the lecture. To put you in the mood:
Mark Weiser: The Computer for the 21st Century. Scientific American, September 1991, pp. 94-104.
252-0437-00L | Distributed Algorithms | W | 5 credits | 3V + 1A | F. Mattern
Abstract: Models of distributed computations, time-space diagrams, virtual time, logical clocks and causality, wave algorithms, parallel and distributed graph traversal, consistent snapshots, mutual exclusion, election and symmetry breaking, distributed termination detection, garbage collection in distributed systems, monitoring distributed systems, global predicates.
Objective: Become acquainted with models and algorithms for distributed systems.
Content: Distributed algorithms are procedures characterized by the fact that several autonomous processes work simultaneously and cooperatively on parts of a common problem, with the necessary information exchange taking place exclusively via messages. Such algorithms are used in distributed systems where no shared memory exists and where message transmission times cannot, in general, be neglected. Since no process ever has an up-to-date, consistent view of the global state, this leads to interesting problems.
Topics covered include: models of distributed computations; space-time diagrams; virtual time; logical clocks and causality; wave algorithms; distributed and parallel graph traversal; computation of consistent snapshots; mutual exclusion; election and symmetry breaking; distributed termination detection; garbage collection in distributed systems; observing distributed systems; computation of global predicates.
Literature:
- F. Mattern: Verteilte Basisalgorithmen. Springer-Verlag.
- G. Tel: Topics in Distributed Algorithms. Cambridge University Press.
- G. Tel: Introduction to Distributed Algorithms. Cambridge University Press, 2nd edition.
- A. D. Kshemkalyani, M. Singhal: Distributed Computing. Cambridge University Press.
- N. Lynch: Distributed Algorithms. Morgan Kaufmann Publishers.
252-0817-00L | Distributed Systems Laboratory | W | 10 credits | 9P | G. Alonso, T. Hoefler, F. Mattern, A. Singla, R. Wattenhofer, C. Zhang
In the Master's programme, a maximum of 10 credits can be accounted for by labs on top of the Interfocus Courses. Additional labs will be listed on the Addendum.
Abstract: This course involves participation in a substantial development and/or evaluation project involving distributed systems technology. Projects are available in a wide range of areas: from web services to ubiquitous computing, including wireless networks, ad-hoc networks, and distributed applications on mobile phones.
Objective: Students acquire practical knowledge about technologies from the area of distributed systems.
Content: This course involves participation in a substantial development and/or evaluation project involving distributed systems technology. Projects are available in a wide range of areas: from web services to ubiquitous computing, including wireless networks, ad-hoc networks, and distributed applications on mobile phones. The objective of the project is for the students to gain hands-on experience with real products and the latest technology in distributed systems. There is no lecture associated with the course.
For information about the course or available projects, please contact Prof. Mattern, Prof. Wattenhofer, Prof. Roscoe or Prof. G. Alonso.
263-3501-00L | Future Internet | W | 6 credits | 1V + 1U + 3A | A. Singla
Abstract: This course will discuss recent advances in networking, focusing on the Internet, with topics ranging from the algorithmic design of applications like video streaming to the likely near-future of satellite-based networking.
Objective: The goals of the course are to build on basic undergraduate-level networking and provide an understanding of the tradeoffs and existing technology in the design of large, complex networked systems, together with concrete experience of the challenges through a series of lab exercises.
Content: The focus of the course is on principles, architectures, protocols, and applications used in modern networked systems. Example topics include:

- How video streaming services like Netflix work, and research on improving their performance.
- How Web browsing could be made faster
- How the Internet's protocols are improving
- Exciting developments in satellite-based networking (à la SpaceX)
- The role of data centers in powering Internet services

A series of programming assignments will form a substantial part of the course grade.
Lecture notes: Lecture slides will be made available at the course Web site: Link
Literature: No textbook is required, but there will be regularly assigned readings from the research literature, linked to the course Web site: Link.
Prerequisites / Notice: An undergraduate class covering the basics of networking, such as Internet routing and TCP. At ETH, Computer Networks (252-0064-00L) and Communication Networks (227-0120-00L) suffice. Similar courses from other universities are acceptable too.
263-3710-00L | Machine Perception | W | 5 credits | 2V + 1U + 1A | O. Hilliges
Number of participants limited to 200.
Abstract: Recent developments in neural networks (aka "deep learning") have drastically advanced the performance of machine perception systems in a variety of areas including computer vision, robotics, and intelligent UIs. This course is a deep dive into deep learning algorithms and architectures with applications to a variety of perceptual tasks.
Objective: Students will learn about fundamental aspects of modern deep learning approaches for perception. Students will learn to implement, train and debug their own neural networks and gain a detailed understanding of cutting-edge research in learning-based computer vision, robotics and HCI. The final project assignment will involve training a complex neural network architecture and applying it on a real-world dataset of human activity.

The core competency acquired through this course is a solid foundation in deep-learning algorithms to process and interpret human input into computing systems. In particular, students should be able to develop systems that deal with the problem of recognizing people in images, detecting and describing body parts, inferring their spatial configuration, performing action/gesture recognition from still images or image sequences, also considering multi-modal data, among others.
Content: We will focus on teaching how to set up the problem of machine perception, learning algorithms, network architectures, and advanced deep learning concepts, in particular probabilistic deep learning models.

The course covers the following main areas:
I) Foundations of deep-learning.
II) Probabilistic deep-learning for generative modelling of data (latent variable models, generative adversarial networks and auto-regressive models).
III) Deep learning in computer vision, human-computer interaction and robotics.

Specific topics include: 
I) Deep learning basics:
a) Neural Networks and training (i.e., backpropagation)
b) Feedforward Networks
c) Timeseries modelling (RNN, GRU, LSTM)
d) Convolutional Neural Networks for classification
II) Probabilistic Deep Learning:
a) Latent variable models (VAEs)
b) Generative adversarial networks (GANs)
c) Autoregressive models (PixelCNN, PixelRNN, TCNs)
III) Deep Learning techniques for machine perception:
a) Fully Convolutional architectures for dense per-pixel tasks (i.e., instance segmentation)
b) Pose estimation and other tasks involving human activity
c) Deep reinforcement learning
IV) Case studies from research in computer vision, HCI, robotics and signal processing
Literature: Deep Learning. Book by Ian Goodfellow, Yoshua Bengio, and Aaron Courville. MIT Press.
Prerequisites / Notice: This is an advanced grad-level course that requires a background in machine learning. Students are expected to have a solid mathematical foundation, in particular in linear algebra, multivariate calculus, and probability. The course will focus on state-of-the-art research in deep learning and will not repeat the basics of machine learning.

Please take note of the following conditions:
1) The number of participants is limited to 200 students (MSc and PhDs).
2) Students must have taken the exam in Machine Learning (252-0535-00) or have acquired equivalent knowledge.
3) All practical exercises will require basic knowledge of Python and will use libraries such as TensorFlow, scikit-learn and scikit-image. We will provide introductions to TensorFlow and other libraries that are needed but will not provide introductions to basic programming or Python.

The following courses are strongly recommended as prerequisite:
* "Visual Computing" or "Computer Vision"

The course will be assessed by a final written examination in English. No course materials or electronic devices can be used during the examination. Note that the examination will be based on the contents of the lectures, the associated reading materials and the exercises.
Seminar in Distributed Systems
Number | Title | Type | ECTS | Hours | Lecturers
263-2211-00L | Seminar in Computer Architecture | W | 2 credits | 2S | O. Mutlu, M. H. K. Alser, J. Gómez Luna
Number of participants limited to 22.
The deadline for deregistering expires at the end of the second week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar.
Abstract: This seminar course covers fundamental and cutting-edge research papers in computer architecture. It has multiple components that are aimed at improving students' (1) technical skills in computer architecture, (2) critical thinking and analysis abilities on computer architecture concepts, as well as (3) technical presentation of concepts and papers in both spoken and written forms.
Objective: The main objective is to learn how to rigorously analyze and present papers and ideas on computer architecture. We will have rigorous presentation and discussion of selected papers during lectures and a written report delivered by each student at the end of the semester.

This course is for those interested in computer architecture. Registered students are expected to attend every meeting, participate in the discussion, and create a synthesis report at the end of the course.
Content: Topics will center around computer architecture. We will, for example, discuss papers on hardware security; accelerators for key applications like machine learning, graph processing and bioinformatics; memory systems; interconnects; processing in memory; various fundamental and emerging paradigms in computer architecture; hardware/software co-design and cooperation; fault tolerance; energy efficiency; heterogeneous and parallel systems; new execution models; predictable computing, etc.
Lecture notes: All materials will be posted on the course website: Link
Past course materials, including the synthesis report assignment, can be found in the Fall 2019 website for the course: Link
Literature: Key papers and articles, on both fundamental and cutting-edge topics in computer architecture, will be provided and discussed. These will be posted on the course website.
Prerequisites / Notice: Design of Digital Circuits.
Students should (1) have done very well in Design of Digital Circuits and (2) show a genuine interest in Computer Architecture.
263-3712-00L | Seminar on Computational Interaction | W | 2 credits | 2S | O. Hilliges
Number of participants limited to 14.
The deadline for deregistering expires at the end of the second week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar.
Abstract: Computational Interaction focuses on the use of algorithms to enhance the interaction with a computing system. Papers from scientific venues such as CHI, UIST & SIGGRAPH will be examined in depth. Students present and discuss the papers to extract techniques and insights that can be applied to software & hardware projects. Topics include user modeling, computational design, and input & output.
Objective: The goal of the seminar is to familiarize students with exciting new research topics in this important area, but also to teach basic scientific writing and oral presentation skills.
Content: The seminar will have a different structure from regular seminars to encourage more discussion and a deeper learning experience. We will use a case-study format where all students read the same paper each week but fulfill different roles and hence prepare with different viewpoints in mind (e.g. "presenter", "historian", "student", etc).

The seminar will cover multiple topics of computational interaction, including:
1) User- and context modeling for UI adaptation
Intent modeling, activity and emotion recognition, and user perception.

2) Computational design
Design mining, design exploration, UI optimization.

3) Computer supported input
Text entry, pointing, gestural input, physiological sensing, eye tracking, and sketching.

4) Computer supported output
Information retrieval, fabrication, mixed reality interfaces, haptics, and gaze contingency.

For each topic, a paper will be chosen that represents the state of the art of research or seminal work that inspired and fostered future work. Students will learn how to incorporate computational methods into systems that involve software, hardware and, very importantly, users.

Seminar website: Link
263-3840-00L | Hardware Architectures for Machine Learning | W | 2 credits | 2S | G. Alonso, T. Hoefler, C. Zhang
The deadline for deregistering expires at the end of the second week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar.
Abstract: The seminar covers recent results in the increasingly important field of hardware acceleration for data science and machine learning, both in dedicated machines and in data centers.
Objective: The seminar is aimed at students interested in the system aspects of machine learning who are willing to bridge the gap across traditional disciplines: machine learning, databases, systems, and computer architecture.
Content: The seminar is intended to cover recent results in the increasingly important field of hardware acceleration for data science and machine learning, both in dedicated machines and in data centers.
Prerequisites / Notice: The seminar should be of special interest to students intending to complete a master's thesis or a doctoral dissertation in related topics.
227-0126-00L | Advanced Topics in Networked Embedded Systems | W | 2 credits | 1S | L. Thiele, J. Beutel
Abstract: The seminar will cover advanced topics in networked embedded systems. A particular focus is on cyber-physical systems, the Internet of Things, and sensor networks in various application domains.
Objective: The goal is to get a deeper understanding of leading-edge technologies in the discipline, of classes of applications, and of current as well as future research directions. In addition, participants will improve their presentation, reading and reviewing skills.
Content: The seminar enables Master's students, PhDs and postdocs to learn about the latest breakthroughs in wireless sensor networks, networked embedded systems and devices, and energy harvesting in several application domains, including environmental monitoring, tracking, smart buildings and control. Participants are requested to actively participate in the organization and preparation of the seminar. In particular, they review all presented papers using a standard scientific reviewing system, they present one of the papers orally and they lead the corresponding discussion session.
227-0559-00L | Seminar in Deep Reinforcement Learning | W | 2 credits | 2S | R. Wattenhofer, O. Richter
Number of participants limited to 25.
Abstract: In this seminar participating students present and discuss recent research papers in the area of deep reinforcement learning. The seminar starts with two introductory lessons introducing the basic concepts. Alongside the seminar a programming challenge is posed in which students can take part to improve their grade.
Objective: Since Google DeepMind presented the Deep Q-Network (DQN) algorithm in 2015, which could play Atari 2600 games at a superhuman level, the field of deep reinforcement learning has gained a lot of traction. It sparked media attention with AlphaGo and AlphaZero and is one of the most prominent research areas. Yet many research papers in the area come from one of two sources: Google DeepMind or OpenAI. In this seminar we aim to give the students an in-depth view of the current advances in the area by discussing recent papers as well as current issues and difficulties surrounding deep reinforcement learning.
Content: Two introductory lectures introduce Q-learning and policy gradient methods. Afterwards, participating students present recent papers. For details see: Link
Lecture notes: Slides of presentations will be made available.
Literature: OpenAI course (Link) plus selected papers.
The paper selection can be found on Link.
Prerequisites / Notice: It is expected that students have prior knowledge of and interest in machine and deep learning, for instance by having attended appropriate courses.
851-0740-00L | Big Data, Law, and Policy | W | 3 credits | 2S | S. Bechtold
Number of participants limited to 35.
Students will be informed by 1.3.2020 at the latest.
Abstract: This course introduces students to societal perspectives on the big data revolution. Discussing important contributions from machine learning and data science, the course explores their legal, economic, ethical, and political implications in the past, present, and future.
Objective: This course is intended both for students of machine learning and data science who want to reflect on the societal implications of their field, and for students from other disciplines who want to explore the societal impact of data sciences. The course will first discuss some of the methodological foundations of machine learning, followed by a discussion of research papers and real-world applications where big data and societal values may clash. Potential topics include the implications of big data for privacy, liability, insurance, health systems, voting, and democratic institutions, as well as the use of predictive algorithms for price discrimination and the criminal justice system. Guest speakers, weekly readings and reaction papers ensure a lively debate among participants from various backgrounds.
227-0559-10L | Seminar in Communication Networks: Learning, Reasoning and Control | W | 2 credits | 2S | L. Vanbever, A. Singla
Does not take place this semester.
Number of participants limited to 24.
Abstract: In this seminar participating students review, present, and discuss (mostly recent) research papers in the area of computer networks. This semester the seminar will focus on topics blending networks with machine learning and control theory.
Objective: The two main goals of this seminar are: 1) learning how to read and review scientific papers; and 2) learning how to present and discuss technical topics with an audience of peers.

Students are required to attend the entire seminar, choose a paper to present from a given list, prepare and give a presentation on that topic, and lead the follow-up discussion. To ensure the talks' quality, each student will be mentored by a teaching assistant. In addition to presenting one paper, every student is also required to submit one (short) review for one of the two papers presented every week in-class (12 reviews in total).

The students will be evaluated based on their submitted reviews, their presentation, their leadership in animating the discussion for their own paper, and their participation in the discussions of other papers.
Content: The seminar will start with two introductory lectures in week 1 and week 2. Starting from week 3, participating students will start reviewing, presenting, and discussing research papers. Each week will see two presentations, for a total of 24 papers.

The course content will vary from semester to semester. This semester, the seminar will focus on topics blending networks with machine learning and control theory. For details, please see: Link
Lecture notes: The slides of each presentation will be made available on the website.
Literature: The paper selection will be made available on the course website: Link
Prerequisites / Notice: Communication Networks (227-0120-00L) or equivalents. It is expected that students have prior knowledge in machine learning and control theory, for instance by having attended appropriate courses.