Search result: Catalogue data in Spring Semester 2020

Cyber Security Master
Supplement
Computational Science
Electives
Number | Title | Type | ECTS | Hours | Lecturers
252-0526-00L | Statistical Learning Theory | W | 7 ECTS | 3V + 2U + 1A | J. M. Buhmann, C. Cotrini Jimenez
Abstract: The course covers advanced methods of statistical learning:

- Variational methods and optimization.
- Deterministic annealing.
- Clustering for diverse types of data.
- Model validation by information theory.
Learning objective: The course surveys recent methods of statistical learning. The fundamentals of machine learning, as presented in the courses "Introduction to Machine Learning" and "Advanced Machine Learning", are expanded from the perspective of statistical learning.
Content:
- Variational methods and optimization. We consider optimization approaches for problems where the optimizer is a probability distribution. We will discuss concepts like maximum entropy, information bottleneck, and deterministic annealing (a toy clustering sketch based on these ideas follows this list).

- Clustering. This is the problem of sorting data into groups without using training samples. We discuss alternative notions of "similarity" between data points and adequate optimization procedures.

- Model selection and validation. This refers to the question of how complex the chosen model should be. In particular, we present an information theoretic approach for model validation.

- Statistical physics models. We discuss approaches for approximately optimizing large systems, which originate in statistical physics (free energy minimization applied to spin glasses and other models). We also study sampling methods based on these models.
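
For orientation, here is a minimal sketch of deterministic-annealing clustering in the spirit of the first two topics: soft maximum-entropy assignments at a temperature T, annealed towards hard clustering. It is an illustration only, not the course's reference implementation; all names and parameter values are arbitrary choices.

```python
# Deterministic annealing for k-means-style clustering: a toy sketch, not course material.
import numpy as np

def da_cluster(X, n_clusters=3, T0=10.0, T_min=1e-3, cooling=0.9, n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Start all centroids near the data mean, slightly perturbed.
    Y = X.mean(axis=0) + 1e-3 * rng.standard_normal((n_clusters, d))
    T = T0
    while T > T_min:
        for _ in range(n_iter):
            # Squared distances between every point and every centroid, shape (n, k).
            D = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
            # Gibbs assignments at temperature T: the maximum-entropy distribution
            # consistent with the expected distortion.
            logits = -D / T
            logits -= logits.max(axis=1, keepdims=True)   # numerical stability
            P = np.exp(logits)
            P /= P.sum(axis=1, keepdims=True)             # soft assignments p(k | x_i)
            # Centroid update: probability-weighted means of the data.
            Y = (P.T @ X) / P.sum(axis=0)[:, None]
        T *= cooling   # lower the temperature, gradually hardening the assignments
    return Y, P

# Usage sketch: two well-separated Gaussian blobs.
# X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5.0])
# centroids, assignments = da_cluster(X, n_clusters=2)
```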
Lecture notes: A draft of the lecture notes will be provided. Lecture slides will be made available.
Literature: T. Hastie, R. Tibshirani, J. Friedman: The Elements of Statistical Learning. Springer, 2001.

L. Devroye, L. Gyorfi, G. Lugosi: A Probabilistic Theory of Pattern Recognition. Springer, New York, 1996.
Prerequisites / Notice: Knowledge of machine learning ("Introduction to Machine Learning" and/or "Advanced Machine Learning").
Basic knowledge of statistics.
261-5120-00L | Machine Learning for Health Care | W | 5 ECTS | 3P + 1A | G. Rätsch, J. Vogt, V. Boeva
Restricted registration: number of participants limited to 150.
Abstract: The course will review the most relevant methods and applications of machine learning in biomedicine, discuss the main challenges they present, and examine their current technical solutions.
Learning objective: In recent years we have observed rapid growth in the field of machine learning (ML), mainly due to improvements in ML algorithms, increased data availability, and falling computing costs. This growth is having a profound impact on biomedical applications, where the great variety of tasks and data types allows us to benefit from ML algorithms in many different ways. In this course we will review the most relevant methods and applications of ML in biomedicine, discuss the main challenges they present, and examine their current technical solutions.
Content: The course consists of four topic clusters that cover the most relevant applications of ML in biomedicine:
1) Structured time series: Time series of structured data appear frequently in biomedical datasets and present challenges such as variables with different periodicities, conditioning on static data, etc. (a toy preprocessing sketch follows this list).
2) Medical notes: A vast amount of medical observations is stored in the form of free text; we will analyze strategies for extracting knowledge from it.
3) Medical images: Images are a fundamental piece of information in many medical disciplines. We will study how to train ML algorithms with them.
4) Genomics data: ML in genomics is still an emerging subfield, but given that genomics data are arguably the most extensive and complex datasets that can be found in biomedicine, it is expected that many relevant ML applications will arise in the near future. We will review and discuss current applications and challenges.
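
As a concrete illustration of the preprocessing challenge in topic 1, the toy sketch below aligns measurements recorded at different frequencies onto a shared hourly grid, keeps an explicit missingness mask, and broadcasts static data onto every row. It is an assumption-laden example; the column names ("heart_rate", "lactate", "age") and the helper name `to_hourly_grid` are hypothetical and not part of the course.

```python
# Toy preprocessing sketch for multi-rate clinical time series (not course material).
import pandas as pd

def to_hourly_grid(measurements: pd.DataFrame, static: pd.Series) -> pd.DataFrame:
    """measurements has one row per observation with columns [time, variable, value]."""
    wide = (measurements
            .pivot_table(index="time", columns="variable", values="value", aggfunc="mean")
            .resample("1H").mean())                            # hourly bins; different periodicities meet here
    mask = wide.notna().astype(int).add_suffix("_observed")    # record what was actually measured
    wide = wide.ffill()                                        # carry the last observation forward
    out = pd.concat([wide, mask], axis=1)
    for name, value in static.items():                         # condition on static data by broadcasting it
        out[name] = value
    return out

# Usage sketch with hypothetical variables:
# obs = pd.DataFrame({
#     "time": pd.to_datetime(["2020-01-01 00:05", "2020-01-01 00:40", "2020-01-01 02:10"]),
#     "variable": ["heart_rate", "heart_rate", "lactate"],
#     "value": [82.0, 90.0, 1.7],
# })
# print(to_hourly_grid(obs, pd.Series({"age": 63})))
```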
Prerequisites / Notice: Data Structures & Algorithms, Introduction to Machine Learning, Statistics/Probability, Programming in Python, Unix Command Line.

Relation to Course 261-5100-00 Computational Biomedicine: This course is a continuation of the previous course with new topics related to medical data and machine learning. The format of Computational Biomedicine II will also be different. It is helpful but not essential to attend Computational Biomedicine before attending Computational Biomedicine II.
263-5300-00L | Guarantees for Machine Learning | W | 5 ECTS | 2V + 2A | F. Yang
Restricted registration.
Abstract: This course teaches classical and recent methods from statistics and optimization that are commonly used to prove theoretical guarantees for machine learning algorithms. This knowledge is then applied in project work focused on understanding phenomena in modern machine learning.
Learning objective: This course is aimed at advanced Master's and doctoral students who want to understand and/or conduct independent research on the theory of modern machine learning. To this end, students will learn common mathematical techniques from statistical learning theory. In independent project work, they then apply this knowledge and go through the process of critically questioning recently published work, finding relevant research questions, and learning how to present research ideas effectively to a professional audience.
Content: This course teaches some classical and recent methods in statistical learning theory aimed at proving theoretical guarantees for machine learning algorithms, including the following topics (standard textbook statements for the first two items are sketched after the list):

- concentration bounds, uniform convergence
- high-dimensional statistics (e.g. Lasso)
- prediction error bounds for non-parametric statistics (e.g. in kernel spaces)
- minimax lower bounds
- regularization via optimization
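
For orientation, the kind of statements behind the first two bullet points, in standard textbook form (not this course's specific notation or results):

```latex
% Hoeffding's inequality: for independent $X_1,\dots,X_n$ with values in $[a,b]$,
\[
  \Pr\!\left( \left| \frac{1}{n}\sum_{i=1}^{n} X_i - \mathbb{E}[X_1] \right| \ge t \right)
  \;\le\; 2\exp\!\left( -\frac{2nt^2}{(b-a)^2} \right).
\]
% Combined with a union bound over a finite hypothesis class $\mathcal{H}$ and 0/1 loss,
% with probability at least $1-\delta$ over $n$ i.i.d. samples,
\[
  \sup_{h \in \mathcal{H}} \left| R(h) - \widehat{R}_n(h) \right|
  \;\le\; \sqrt{\frac{\ln\!\left(2|\mathcal{H}|/\delta\right)}{2n}},
\]
% where $R$ is the population risk and $\widehat{R}_n$ the empirical risk.
```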

The project work focuses on active theoretical ML research that aims to understand modern phenomena in machine learning, including but not limited to

- how overparameterization could help generalization (interpolating models, linearized NN)
- how overparameterization could help optimization (non-convex optimization, loss landscape)
- complexity measures and approximation-theoretic properties of randomly initialized and trained NN
- generalization of robust learning (adversarial robustness, standard and robust error tradeoff)
- prediction with calibrated confidence (conformal prediction, calibration)
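
The last project topic admits a very short illustration: split conformal prediction for regression. The sketch below is the standard recipe under exchangeability, not this course's material; `model` is assumed to be any fitted regressor exposing a `.predict` method.

```python
# Split conformal prediction for regression: a minimal sketch (standard recipe, not course material).
import numpy as np

def split_conformal_interval(model, X_cal, y_cal, X_test, alpha=0.1):
    # Nonconformity scores on a held-out calibration set: absolute residuals.
    scores = np.abs(y_cal - model.predict(X_cal))
    n = len(scores)
    # Finite-sample-corrected (1 - alpha) quantile of the calibration scores.
    q_level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(scores, q_level, method="higher")   # numpy >= 1.22 for the `method` keyword
    preds = model.predict(X_test)
    # The resulting intervals cover the true label with probability >= 1 - alpha,
    # marginally, provided calibration and test points are exchangeable.
    return preds - q, preds + q
```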
Prerequisites / Notice: It is absolutely necessary for students to have a strong mathematical background (basic real analysis, probability theory, linear algebra) and good knowledge of core concepts in machine learning taught in courses such as "Introduction to Machine Learning" and "Regression"/"Statistical Modelling". It is also helpful to have taken a course in optimization or approximation theory. In addition to these prerequisites, this class requires a certain degree of mathematical maturity, including abstract thinking and the ability to understand and write proofs.