Search result: Catalogue data in Autumn Semester 2024
Science, Technology, and Policy Master
Minor in Natural Sciences and Engineering
Data and Computer Science
Number | Title | Type | ECTS | Hours | Lecturers
---|---|---|---|---|---
263-3210-00L | Deep Learning | W | 8 credits | 3V + 2U + 2A | T. Hofmann
Abstract | Deep learning is an area within machine learning that deals with algorithms and models that automatically induce multi-level data representations.
Learning objective | In recent years, deep learning and deep networks have significantly improved the state of the art in many application domains such as computer vision, speech recognition, and natural language processing. This class will cover the mathematical foundations of deep learning and provide insights into model design, training, and validation. The main objective is a profound understanding of why these methods work and how. There will also be a rich set of hands-on tasks and practical projects to familiarize students with this emerging technology.
Prerequisites / Notice | This is an advanced-level course that requires some basic background in machine learning. More importantly, students are expected to have a very solid mathematical foundation, including linear algebra, multivariate calculus, and probability. The course will make heavy use of mathematics and is not (!) meant to be an extended tutorial on how to train deep networks with tools like Torch or TensorFlow, although that may be a side benefit. Participation in the course is subject to the following condition: students must have taken the exam in Advanced Machine Learning (252-0535-00) or have acquired equivalent knowledge; see the exhaustive list: Advanced Machine Learning (https://ml2.inf.ethz.ch/courses/aml/), Computational Intelligence Lab (http://da.inf.ethz.ch/teaching/2019/CIL/), Introduction to Machine Learning (https://las.inf.ethz.ch/teaching/introml-S19), Statistical Learning Theory (http://ml2.inf.ethz.ch/courses/slt/), Computational Statistics (https://stat.ethz.ch/lectures/ss19/comp-stats.php), Probabilistic Artificial Intelligence (https://las.inf.ethz.ch/teaching/pai-f18).
252-1414-00L | System Security | W | 7 credits | 2V + 2U + 2A | S. Capkun, S. Shinde
Abstract | The first part of the course covers general security concepts and hardware-based support for security. In the second part, the focus is on system design and methodologies for building secure systems.
Learning objective | In this lecture, students learn about the security requirements and capabilities that are expected from modern hardware, operating systems, and other software environments. An overview is given of available technologies, algorithms, and standards with which these requirements can be met.
Content | The first part of the lecture covers hardware-based security concepts. Topics include physical and software-based side-channel attacks on hardware resources, architectural support for security (e.g., memory management and permissions, disk encryption), and trusted execution environments (Intel SGX, ARM TrustZone, AMD SEV, and RISC-V Keystone). In the second part, the focus is on system design and methodologies for building secure systems. Topics include: common software faults (e.g., buffer overflows), bug detection, writing secure software (design, architecture, QA, testing), compiler-supported security (e.g., control-flow integrity), and language-supported security (e.g., memory safety). Alongside the lectures, model cases will be elaborated and evaluated in the exercises.
263-4640-00L | Network Security | W | 8 credits | 2V + 2U + 3A | P. De Vaere, S. Frei, K. Paterson, A. Perrig
Abstract | Some of today's most damaging attacks on computer systems involve exploitation of network infrastructure, either as the target of attack or as a vehicle to attack end systems. This course provides an in-depth study of network attack techniques and methods to defend against them.
Learning objective | - Students are familiar with fundamental network-security concepts. - Students can assess current threats that Internet services and networked devices face, and can evaluate appropriate countermeasures. - Students can identify and assess vulnerabilities in software systems and network protocols. - Students have an in-depth understanding of a range of important state-of-the-art security technologies. - Students can implement network-security protocols based on cryptographic libraries.
Content | The course will cover topics spanning four broad themes with a focus on the first two themes: (1) network defense mechanisms such as public-key infrastructures, TLS, VPNs, anonymous-communication systems, secure routing protocols, secure DNS systems, and network intrusion-detection systems; (2) network attacks such as hijacking, spoofing, denial-of-service (DoS), and distributed denial-of-service (DDoS) attacks; (3) analysis and inference topics such as traffic monitoring and network forensics; and (4) new technologies related to next-generation networks. In addition, several guest lectures will provide in-depth insights into specific current real-world network-security topics.
Prerequisites / Notice | This lecture is intended for students with an interest in securing Internet communication services and network devices. Students are assumed to have knowledge of networking as taught in a communication networks lecture like 252-0064-00L or 227-0120-00L. Basic knowledge of information security or applied cryptography as taught in 252-0211-00L or 263-4660-00L is beneficial, but an overview of the most important cryptographic primitives will be provided at the beginning of the course. The course will involve several graded course projects. Students are expected to be familiar with a general-purpose or network programming language such as C/C++, Go, Python, or Rust.
252-0535-00L | Advanced Machine Learning | W | 10 credits | 3V + 2U + 4A | J. M. Buhmann, C. Cotrini Jimenez
Abstract | Machine learning algorithms provide analytical methods to search data sets for characteristic patterns. Typical tasks include the classification of data, function fitting and clustering, with applications in image and speech analysis, bioinformatics and exploratory data analysis. This course is accompanied by practical machine learning projects.
Learning objective | Students will be familiarized with advanced concepts and algorithms for supervised and unsupervised learning, and will reinforce the statistical knowledge that is indispensable for solving modeling problems under uncertainty. Key concepts are the generalization ability of algorithms and systematic approaches to modeling and regularization. Machine learning projects will provide an opportunity to test the machine learning algorithms on real-world data.
Content | The theory of fundamental machine learning concepts is presented in the lecture and illustrated with relevant applications. Students can deepen their understanding by solving both pen-and-paper and programming exercises, where they implement and apply famous algorithms to real-world data. Topics covered in the lecture include: Fundamentals (what is data?, Bayesian learning, computational learning theory); Supervised learning (ensembles: bagging and boosting, max-margin methods, neural networks); Unsupervised learning (dimensionality reduction techniques, clustering, mixture models, non-parametric density estimation, learning dynamical systems).
Lecture notes | No lecture notes, but slides will be made available on the course webpage.
Literature | C. Bishop. Pattern Recognition and Machine Learning. Springer, 2007. R. Duda, P. Hart, and D. Stork. Pattern Classification. John Wiley & Sons, second edition, 2001. T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, 2001. L. Wasserman. All of Statistics: A Concise Course in Statistical Inference. Springer, 2004.
Prerequisites / Notice | The course requires solid basic knowledge in analysis, statistics and numerical methods for CSE as well as practical programming experience for solving assignments. Students should have followed at least "Introduction to Machine Learning" or an equivalent course offered by another institution. PhD students are required to obtain a passing grade in the course (4.0 or higher, based on project and exam) to gain credit points.
263-2400-00L | Reliable and Trustworthy Artificial Intelligence | W | 6 credits | 2V + 2U + 1A | M. Vechev
Abstract | Creating reliable, secure, robust, and fair machine learning models is a core challenge in artificial intelligence and one of fundamental importance. The goal of the course is to teach both the mathematical foundations of this new and emerging area as well as to introduce students to the latest and most exciting research in the space.
Learning objective | Upon completion of the course, the students should have mastered the underlying methods and be able to apply them to a variety of engineering and research problems. To facilitate deeper understanding, the course includes a group coding project where students will build a system based on the learned material.
Content | The course is split into four parts. (1) Robustness of machine learning: adversarial attacks and defenses on deep learning models; automated certification of deep learning models (major trends: convex relaxations, branch-and-bound, randomized smoothing); certified training of deep neural networks (combining symbolic and continuous methods). (2) Privacy of machine learning: threat models (e.g., stealing data, poisoning, membership inference); attacking federated machine learning (across vision, natural language, and tabular data); differential privacy for defending machine learning; AI regulations and checking model compliance. (3) Fairness of machine learning: introduction to fairness (motivation, definitions); enforcing individual fairness (for both vision and tabular data); enforcing group fairness (e.g., demographic parity, equalized odds). (4) Robustness, privacy, and fairness of foundation models: all previous topics, as well as programmability, in the context of the latest foundation models (e.g., LLMs). More information: https://www.sri.inf.ethz.ch/teaching/rtai24.
Prerequisites / Notice | While not a formal requirement, the course assumes familiarity with the basics of machine learning (especially linear algebra, gradient descent, and neural networks, as well as basic probability theory). These topics are usually covered in "Intro to ML" classes at most institutions (e.g., "Introduction to Machine Learning" at ETH). The coding project will use Python and PyTorch, so some programming experience in Python is expected. Students without prior knowledge of PyTorch are expected to acquire it early in the course by solving exercise sheets.
263-3845-00L | Data Management Systems | W | 8 credits | 3V + 1U + 3A | G. Alonso
Abstract | The course will cover the implementation aspects of data management systems, using relational database engines as a starting point to cover the basic concepts of efficient data processing and then expanding those concepts to modern implementations in data centers and the cloud.
Learning objective | The goal of the course is to convey the fundamental aspects of efficient data management from a systems implementation perspective: storage, access, organization, indexing, consistency, concurrency, transactions, distribution, query compilation vs. interpretation, data representations, etc. Using conventional relational engines as a starting point, the course aims to provide in-depth coverage of the latest technologies used in data centers and the cloud to implement large-scale data processing in various forms.
Content | The course will first cover fundamental concepts in data management: storage, locality, query optimization, declarative interfaces, concurrency control and recovery, buffer managers, and management of the memory hierarchy, presented in a system-independent manner. The course places a special emphasis on understanding these basic principles, as they are key to understanding the problems that existing systems try to address. It will then explore their implementation in modern relational engines supporting SQL, before expanding to the range of systems used in the cloud: key-value stores, geo-replication, query as a service, serverless, large-scale analytics engines, etc.
Literature | The main source of information for the course will be articles and research papers describing the architecture of the systems discussed. The list of papers will be provided at the beginning of the course.
Prerequisites / Notice | The course requires students to have completed the Data Modelling and Databases course at the Bachelor level, as it assumes knowledge of databases and SQL.
263-5902-00L | Computer Vision | W | 8 credits | 3V + 1U + 3A | M. Pollefeys, S. Tang
Abstract | The goal of this course is to provide students with a good understanding of computer vision and image analysis techniques. The main concepts and techniques will be studied in depth, and practical algorithms and approaches will be discussed and explored through the exercises.
Learning objective | The objectives of this course are: 1. To introduce the fundamental problems of computer vision. 2. To introduce the main concepts and techniques used to solve these problems. 3. To enable participants to implement solutions for reasonably complex problems. 4. To enable participants to make sense of the computer vision literature.
Content | Camera models and calibration, invariant features, multiple-view geometry, model fitting, stereo matching, segmentation, 2D shape matching, shape from silhouettes, optical flow, structure from motion, tracking, object recognition, object category recognition.
Prerequisites / Notice | It is recommended that students have taken the Visual Computing lecture or a similar course introducing basic image processing concepts before taking this course.
252-3005-00L | Natural Language Processing | W | 7 credits | 3V + 3U + 1A | R. Cotterell
Abstract | This course presents topics in natural language processing with an emphasis on modern techniques, primarily focusing on statistical and deep learning approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems.
Learning objective | The objective of the course is to learn the basic concepts in the statistical processing of natural languages. The course will be project-oriented so that the students can also gain hands-on experience with state-of-the-art tools and techniques.
Content | This course presents an introduction to general topics and techniques used in natural language processing today, primarily focusing on statistical approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems.
Literature | Lectures will make use of textbooks such as the one by Jurafsky and Martin where appropriate, but will also make use of original research and survey papers.
263-5057-00L | From Publication to the Doctor's Office. The deadline for deregistering expires at the end of the second week of the semester. Students who are still registered after that date, but do not attend the seminar, will officially fail the seminar. | W | 3 credits | 2S + 1A | O. Demler
Abstract | This seminar course is designed to provide students with an opportunity to review and critically evaluate recent publications in the medical field, focusing on examples where a CS method or a bioinformatics/statistical technique has led to an instrument, technique, or drug approved for use in clinical practice.
Learning objective | Throughout the course, students will read and analyze recent publications that demonstrate successful applications, and sometimes failures, in medicine. Promising research applications will also be discussed. The publications will cover a wide range of topics, including drug discovery, image analysis, prognostic models, and learning healthcare systems.
Content | The course will be structured as half lecture content and half seminar content. Lectures will review the state of medical practice prior to the discovery, obstacles in moving the field forward, and the need for improvement. Each lecture will be followed by the seminar part, where students will take turns presenting the assigned publications and leading the discussions. Students' presentations will focus on the main findings and the specific steps taken to translate the findings into clinical practice. Publications will include: examples of specific CS/bioinformatics/statistics applications that have been brought to the "bedside", i.e., approved by the European Medicines Agency / Food and Drug Administration (USA) for clinical use or widely used in medical research; examples of failures, where a discovery did not translate into an end product, and why; and current active research areas. Covered topics will include some of the following: • Drug discovery: computer-aided drug discovery has become an integral part of the drug development process, enabling researchers to design and screen large libraries of molecules in silico (i.e., using computer simulations) before synthesizing and testing them in the lab. This has led to the discovery of new drug candidates for a wide range of diseases, including cancer, Alzheimer's disease, and HIV/AIDS. • Genomics: advances in computational genomics have enabled researchers to analyze and interpret large-scale genomic data, including DNA sequencing data, to identify disease-causing mutations, genetic risk factors, and drug targets; examples include the development of personalized medicine, where treatments are tailored to an individual's genetic makeup, and cases where a drug target identified by genetics has led to an approved treatment. • Imaging: computer vision and image processing techniques have revolutionized medical imaging, enabling researchers to extract quantitative information from medical images that was previously inaccessible. This has led to the development of new diagnostic and prognostic tools for a wide range of diseases, including cancer, cardiovascular disease, and neurological disorders. • Real-world data applications: emulation of clinical trials using electronic health records data. • Large language models: generating clinical trial protocols using large language models; natural language processing for information extraction and interpretation. • Learning healthcare systems: advances in data analytics and information technology have enabled the development of learning healthcare systems, which use real-time data from electronic health records, medical devices, and other sources to improve patient outcomes and reduce healthcare costs. This has the potential to transform the way healthcare is delivered, making it more personalized, efficient, and effective. In addition to the presentations, students will also be required to write critical reviews of the assigned publications throughout the course. The reviews will be evaluated based on the students' ability to identify the strengths and weaknesses of the publications and to provide insightful and constructive feedback.
Prerequisites / Notice | The course is intended for advanced undergraduate and graduate students with a background in computer science, bioinformatics, or a related field and an interest in applying their skills to medical research. This course assumes a working knowledge of R/Python and intermediate statistical analysis, including linear, logistic, and survival regression, or the ability and interest to learn them outside of class.