# Search result: Catalogue data in Spring Semester 2020

Doctoral Department of Computer Science. More information at: https://www.ethz.ch/en/doctorate.html

Doctoral and Post-Doctoral Courses

Each course entry lists: Number | Title | Type | ECTS | Hours | Lecturers

» Course Catalogue of ETH Zurich

252-0926-00L | Advanced Seminar on Distributed Systems | W | 2 credits | 2S | F. Mattern

Abstract: Latest topics in the area of distributed systems will be discussed.

Objective: Learn about current topics in the area of distributed systems.

Prerequisites / Notice: Seminar for PhD students.

252-4202-00L | Seminar in Theoretical Computer Science | W | 2 credits | 2S | E. Welzl, B. Gärtner, M. Ghaffari, M. Hoffmann, J. Lengler, A. Steger, D. Steurer, B. Sudakov

Abstract: Presentation of recent publications in theoretical computer science, including results by diploma, master's, and doctoral candidates.

Objective: To get an overview of current research in the areas covered by the involved research groups. To present results from the literature.

Prerequisites / Notice: This seminar takes place as part of the joint research seminar of several theory groups. Participation is intended for students with excellent performance only. The formal requirement is prior successful participation in a master-level seminar in theoretical computer science.

252-0945-10L | Doctoral Seminar Machine Learning (FS20) | W | 2 credits | 1S | J. M. Buhmann, T. Hofmann, A. Krause, G. Rätsch
Note: Only for Computer Science Ph.D. students. This doctoral seminar is intended for PhD students affiliated with the Institute for Machine Learning. Other PhD students who work on machine learning projects or related topics need approval by at least one of the organizers to register for the seminar.

Abstract: An essential aspect of any research project is the dissemination of the findings arising from it. Here we focus on oral communication, which includes appropriate selection of material, preparation of visual aids (slides and/or posters), and presentation skills.

Objective: Participants should learn how to prepare and deliver scientific talks and how to handle technical questions. Participants are also expected to contribute actively to discussions during others' presentations, thus learning and practicing critical-thinking skills.

Prerequisites / Notice: This doctoral seminar of the Machine Learning Laboratory of ETH is intended for PhD students who work on a machine learning project, i.e., for the PhD students of the ML lab.

263-2100-00L | Research Topics in Software Engineering | W | 2 credits | 2S | Z. Su, P. He, M. Rigger, T. Su
Note: Number of participants limited to 22. The deadline for deregistering expires at the end of the second week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar.

Abstract: This seminar is an opportunity to become familiar with current research in software engineering and, more generally, with the methods and challenges of scientific research.

Objective: Each student will study some papers from the recent software engineering literature and review them. This is an exercise in critical review and analysis. Active participation is required (a presentation of a paper as well as participation in discussions).

Content: The aim of this seminar is to introduce students to recent research results in the area of programming languages and software engineering. To accomplish that, students will study and present research papers in the area and participate in paper discussions. The papers span topics in both theory and practice, including program verification, program analysis, testing, programming language design, and development tools.

Literature: The publications to be presented will be announced on the seminar home page at least one week before the first session.

Prerequisites / Notice: Papers will be distributed during the first lecture.

263-3840-00L | Hardware Architectures for Machine Learning | W | 2 credits | 2S | G. Alonso, T. Hoefler, C. Zhang
Note: The deadline for deregistering expires at the end of the second week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar.

Abstract: The seminar covers recent results in the increasingly important field of hardware acceleration for data science and machine learning, both in dedicated machines and in data centers.

Objective: The seminar is aimed at students interested in the system aspects of machine learning who are willing to bridge the gap across traditional disciplines: machine learning, databases, systems, and computer architecture.

Content: The seminar covers recent results in the increasingly important field of hardware acceleration for data science and machine learning, both in dedicated machines and in data centers.

Prerequisites / Notice: The seminar should be of special interest to students intending to complete a master's thesis or a doctoral dissertation on related topics.

263-4203-00L | Geometry: Combinatorics and Algorithms | W | 2 credits | 2S | B. Gärtner, M. Hoffmann, E. Welzl, M. Wettstein
Note: The deadline for deregistering expires at the end of the second week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar.

Abstract: This seminar complements the course Geometry: Combinatorics & Algorithms. Students of the seminar will present original research papers, some classic and some very recent.

Objective: Each student is expected to read, understand, and elaborate on a selected research paper and to give a 45-minute presentation about it. The process includes:
* getting an overview of the related literature;
* understanding and working out the background/motivation: why and where are the questions addressed relevant?
* understanding the contents of the paper in all details;
* selecting parts suitable for the presentation;
* presenting the selected parts in such a way that an audience with some basic background in geometry and graph theory can easily understand and appreciate them.

Content: This seminar is held once a year and complements the course Geometry: Combinatorics & Algorithms. Students of the seminar will present original research papers, some classic and some very recent. The seminar is good preparation for a master, diploma, or semester thesis in the area.

Prerequisites / Notice: Successful participation in the course "Geometry: Combinatorics & Algorithms" (offered every HS) is required.

264-5800-15L | Doctoral Seminar in Visual Computing (FS20) | W | 1 credit | 1S | M. Pollefeys, O. Sorkine Hornung, S. Tang

Abstract: In this doctoral seminar, current research at the Institute for Visual Computing will be presented and discussed. The goal is to learn about current research projects at our institute, to strengthen our expertise in the field, to provide a platform where research challenges can be discussed, and also to practice scientific presentations.

Objective: In this doctoral seminar, current research at the Institute for Visual Computing will be presented and discussed. The goal is to learn about current research projects at our institute, to strengthen our expertise in the field, to provide a platform where research challenges can be discussed, and also to practice scientific presentations.

Content: Current research at the IVC will be presented and discussed.

Prerequisites / Notice: This course requires solid knowledge of computer graphics and computer vision, as well as familiarity with state-of-the-art research.

264-5812-00L | Writing for Publication in Computer Science (WPCS) | Z | 2 credits | 1G | S. Milligan
Note: Only for D-INFK doctoral students. Number of participants limited to 15.

Abstract: This short course is designed to help junior researchers in Computer Science develop the skills needed to write their first research articles.

Objective: Writing for Publication in Computer Science is a short course (5 x 4-lesson workshops) designed to help doctoral students develop the skills needed to write their first research articles. The course deals with topics such as:
- understanding the needs of different target readerships,
- managing the writing process efficiently,
- structuring texts effectively,
- producing logical flow in sentences and paragraphs,
- editing texts before submission, and
- revising texts in response to colleagues' feedback and reviewers' comments.

Content: Participants will be expected to produce a number of short texts (e.g., a draft of a conference abstract) as homework assignments; they will receive individual feedback on these texts during the course. Wherever feasible, elements of participants' future conference/journal articles can be developed as assignments within the course, so it is likely to be particularly useful for those who have i) their data and are about to begin the writing process, or ii) an MSc thesis they would like to convert for publication.

151-0906-00L | Frontiers in Energy Research | W | 2 credits | 2S | C. Schaffner
Note: This course is only for doctoral students.

Abstract: Doctoral students at ETH Zurich working in the broad area of energy present their research to their colleagues, their advisors, and the scientific community. Each week a different student gives a 50-60 min presentation of their research (a full introduction, background & findings) followed by discussion with the audience.

Objective: The key objectives of the course are: (1) participants will gain knowledge of advanced research in the area of energy; (2) participants will actively participate in discussion after each presentation; (3) participants will gain experience of different presentation styles; (4) participants will build a network within the energy research doctoral community.

Content: Doctoral students at ETH Zurich working in the broad area of energy present their research to their colleagues, to their advisors, and to the scientific community. There will be one presentation a week during the semester, each structured as follows: 20 min introduction to the research topic, 30 min presentation of the results, 30 min discussion with the audience.

Lecture notes: Slides will be available on the Energy Science Center pages (www.esc.ethz.ch/events/frontiers-in-energy-research.html).

263-5300-00L | Guarantees for Machine Learning | W | 5 credits | 2V + 2A | F. Yang

Abstract: This course teaches classical and recent methods in statistics and optimization commonly used to prove theoretical guarantees for machine learning algorithms. The knowledge is then applied in project work that focuses on understanding phenomena in modern machine learning.

Objective: This course is aimed at advanced master's and doctoral students who want to understand and/or conduct independent research on theory for modern machine learning. For this purpose, students will learn common mathematical techniques from statistical learning theory. In independent project work, they then apply their knowledge and go through the process of critically questioning recently published work, finding relevant research questions, and learning how to effectively present research ideas to a professional audience.

Content: This course teaches some classical and recent methods in statistical learning theory aimed at proving theoretical guarantees for machine learning algorithms, including topics in:
- concentration bounds, uniform convergence
- high-dimensional statistics (e.g., Lasso)
- prediction error bounds for non-parametric statistics (e.g., in kernel spaces)
- minimax lower bounds
- regularization via optimization

The project work focuses on active theoretical ML research that aims to understand modern phenomena in machine learning, including but not limited to:
- how overparameterization could help generalization (interpolating models, linearized NNs)
- how overparameterization could help optimization (non-convex optimization, loss landscape)
- complexity measures and approximation-theoretic properties of randomly initialized and trained NNs
- generalization of robust learning (adversarial robustness, standard vs. robust error tradeoff)
- prediction with calibrated confidence (conformal prediction, calibration)

Prerequisites / Notice: A strong mathematical background (basic real analysis, probability theory, linear algebra) is absolutely necessary, as is good knowledge of core concepts in machine learning taught in courses such as "Introduction to Machine Learning" and "Regression"/"Statistical Modelling". It is also helpful to have taken an optimization or approximation theory course. In addition to these prerequisites, this class requires a certain degree of mathematical maturity, including abstract thinking and the ability to understand and write proofs.
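As a hedged illustration of the first content topic listed above (concentration bounds), the small simulation below checks Hoeffding's inequality, P(|mean - p| >= t) <= 2·exp(-2nt²), against the empirical deviation frequency of a Bernoulli sample mean. The function names and parameter values are my own, not part of the course materials.

```python
import math
import random

def hoeffding_bound(n, t):
    # Hoeffding's inequality for n i.i.d. [0,1]-valued samples:
    # P(|empirical mean - expectation| >= t) <= 2 * exp(-2 * n * t^2)
    return 2.0 * math.exp(-2.0 * n * t * t)

def empirical_deviation(n, p, t, trials=2000, seed=0):
    # Fraction of trials in which the mean of n Bernoulli(p) samples
    # deviates from p by at least t (seeded, so deterministic).
    rng = random.Random(seed)
    bad = 0
    for _ in range(trials):
        mean = sum(rng.random() < p for _ in range(n)) / n
        if abs(mean - p) >= t:
            bad += 1
    return bad / trials

n, p, t = 200, 0.5, 0.1
emp = empirical_deviation(n, p, t)
bound = hoeffding_bound(n, t)
# the observed deviation frequency stays below the Hoeffding bound
assert emp <= bound
```

The bound here is 2·exp(-4) ≈ 0.037, while the true binomial tail is far smaller; the gap illustrates that Hoeffding is distribution-free rather than tight.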

263-4507-00L | Advances in Distributed Graph Algorithms | W | 6 credits | 3V + 1U + 1A | M. Ghaffari
Note: Does not take place this semester.

Abstract: How can a network of computers solve the graph problems needed for running that network?

Objective: This course will familiarize the students with the algorithmic tools and techniques in local distributed graph algorithms and overview the recent highlights in the field. It will also prepare the students for independent research at the frontier of this area. This is a special-topics course in algorithm design. It should be accessible to any student with sufficient theoretical/algorithmic background. In particular, it assumes no familiarity with distributed computing. We only expect that the students are comfortable with the basics of algorithm design and analysis, as well as probability theory. It is possible to take this course simultaneously with the course "Principles of Distributed Computing". If you are not sure whether you are ready for this class or not, please consult the instructor.

Content: How can a network of computers solve the graph problems needed for running that network? Answering this and similar questions is the underlying motivation of the area of distributed graph algorithms. The area focuses on the foundational algorithmic aspects of these questions and provides methods for various distributed systems (e.g., the Internet, a wireless network, a multi-processor computer) to solve computational problems that can be abstracted as graph problems. For instance, think about shortest-path computation in routing, or about coloring and independent-set computation in contention resolution.

Over the past decade, we have witnessed a renaissance in the area of distributed graph algorithms, with tremendous progress in many directions and solutions for a number of decades-old central problems. This course overviews the highlights of these results. The course will mainly focus on one half of the field, which revolves around locality and local problems. The other half, which relates to congestion and dealing with limited bandwidth in global problems, will not be addressed in this offering of the course.

The course will cover a sampling of the recent developments (and open questions) at the frontier of research in distributed graph algorithms. The material will be based on a compilation of recent papers in this area, which will be provided throughout the semester. The tentative list of topics includes:
- the shattering technique for local graph problems and its necessity
- Lovász Local Lemma algorithms, their distributed variants, and distributed applications
- distributed derandomization
- distributed lower bounds
- graph coloring
- complexity hierarchy and gaps
- primal-dual techniques

Prerequisites / Notice: The class assumes no knowledge of distributed algorithms/computing. Our only prerequisite is the undergraduate class Algorithms, Probability, and Computing (APC) or an equivalent course. In particular, much of what we will discuss uses randomized algorithms; we will therefore assume that students are familiar with the tools and techniques of randomized algorithms and analysis (to the extent covered in the APC class).
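To give a flavor of the graph-coloring topic mentioned above, the sketch below is a centralized simulation of one of the simplest randomized distributed ideas: a synchronous (Δ+1)-coloring in which, each round, every uncolored node proposes a color consistent with its already-colored neighbors and keeps it only if no uncolored neighbor proposed the same color. This is an illustrative toy under my own naming, not taken from the course materials.

```python
import random

def randomized_coloring(adj, seed=0):
    """Simulate synchronous randomized (Delta+1)-coloring on a graph
    given as an adjacency list. Since each node has at most Delta
    neighbors and the palette has Delta+1 colors, a legal proposal
    always exists."""
    rng = random.Random(seed)
    n = len(adj)
    delta = max(len(nbrs) for nbrs in adj)
    palette = delta + 1
    color = [None] * n
    while any(c is None for c in color):
        # each uncolored node proposes a color unused by colored neighbors
        proposal = {}
        for v in range(n):
            if color[v] is None:
                used = {color[u] for u in adj[v] if color[u] is not None}
                proposal[v] = rng.choice(
                    [c for c in range(palette) if c not in used])
        # a proposal is kept only if no neighbor proposed the same color
        for v, c in proposal.items():
            if all(proposal.get(u) != c for u in adj[v]):
                color[v] = c
    return color

# a 4-cycle: adjacent nodes must end up with different colors
adj = [[1, 3], [0, 2], [1, 3], [0, 2]]
cols = randomized_coloring(adj)
assert all(cols[v] != cols[u] for v in range(4) for u in adj[v])
```

In the real distributed (LOCAL) model each node runs this logic itself, exchanging only proposals with neighbors per round; the simulation just plays all nodes from one process.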

252-0220-10L | Introduction to Machine Learning (only project) | W | 2 credits | 4A | A. Krause
Note: Only for Ph.D. students!

Objective: The course will introduce the foundations of learning and making predictions from data. We will study basic concepts such as trading off goodness of fit and model complexity. We will discuss important machine learning algorithms used in practice and provide hands-on experience in a course project.
