Search result: Catalogue data in Spring Semester 2021
Electrical Engineering and Information Technology Master | ||||||
Master Studies (Programme Regulations 2008) | ||||||
Major Courses A total of 42 CP must be achieved from courses during the Master's programme. The individual study plan is subject to the tutor's approval. | ||||||
Signal Processing and Machine Learning | ||||||
Core Subjects | ||||||
Number | Title | Type | ECTS | Hours | Lecturers | |
---|---|---|---|---|---|---|
227-0391-00L | Medical Image Analysis Basic knowledge of computer vision would be helpful. | W | 3 credits | 2G | E. Konukoglu, M. A. Reyes Aguirre | |
Abstract | It is the objective of this lecture to introduce the basic concepts used in Medical Image Analysis. In particular the lecture focuses on shape representation schemes, segmentation techniques, machine learning based predictive models and various image registration methods commonly used in Medical Image Analysis applications. | |||||
Objective | This lecture aims to give an overview of the basic concepts of Medical Image Analysis and its application areas. | |||||
Prerequisites / Notice | Prerequisites: Basic concepts of mathematical analysis and linear algebra. Preferred: basic knowledge of computer vision and machine learning. The course will be held in English. | |||||
227-0427-10L | Advanced Signal Analysis, Modeling, and Machine Learning | W | 6 credits | 4G | H.‑A. Loeliger | |
Abstract | The course develops a selection of topics pivoting around graphical models (factor graphs), state space methods, sparsity, and pertinent algorithms. | |||||
Objective | The course develops a selection of topics pivoting around factor graphs, state space methods, and pertinent algorithms: - factor graphs and message passing algorithms - hidden-Markov models - linear state space models, Kalman filtering, and recursive least squares - Gaussian message passing - Gibbs sampling, particle filter - recursive local polynomial fitting & applications - parameter learning by expectation maximization - sparsity and spikes - binary control and digital-to-analog conversion - duality and factor graph transforms | |||||
Lecture notes | Lecture notes | |||||
Prerequisites / Notice | Solid mathematical foundations (especially in probability, estimation, and linear algebra) as provided by the course "Introduction to Estimation and Machine Learning". | |||||
227-0434-10L | Mathematics of Information | W | 8 credits | 3V + 2U + 2A | H. Bölcskei | |
Abstract | The class focuses on mathematical aspects of 1. Information science: Sampling theorems, frame theory, compressed sensing, sparsity, super-resolution, spectrum-blind sampling, subspace algorithms, dimensionality reduction 2. Learning theory: Approximation theory, greedy algorithms, uniform laws of large numbers, Rademacher complexity, Vapnik-Chervonenkis dimension | |||||
Objective | The aim of the class is to familiarize the students with the most commonly used mathematical theories in data science, high-dimensional data analysis, and learning theory. The class consists of the lecture, exercise sessions with homework problems, and of a research project, which can be carried out either individually or in groups. The research project consists of either 1. software development for the solution of a practical signal processing or machine learning problem or 2. the analysis of a research paper or 3. a theoretical research problem of suitable complexity. Students are welcome to propose their own project at the beginning of the semester. The outcomes of all projects have to be presented to the entire class at the end of the semester. | |||||
Content | Mathematics of Information 1. Signal representations: Frame theory, wavelets, Gabor expansions, sampling theorems, density theorems 2. Sparsity and compressed sensing: Sparse linear models, uncertainty relations in sparse signal recovery, super-resolution, spectrum-blind sampling, subspace algorithms (ESPRIT), estimation in the high-dimensional noisy case, Lasso 3. Dimensionality reduction: Random projections, the Johnson-Lindenstrauss Lemma Mathematics of Learning 4. Approximation theory: Nonlinear approximation theory, best M-term approximation, greedy algorithms, fundamental limits on compressibility of signal classes, Kolmogorov-Tikhomirov epsilon-entropy of signal classes, optimal compression of signal classes 5. Uniform laws of large numbers: Rademacher complexity, Vapnik-Chervonenkis dimension, classes with polynomial discrimination | |||||
Lecture notes | Detailed lecture notes will be provided at the beginning of the semester. | |||||
Prerequisites / Notice | This course is aimed at students with a background in basic linear algebra, analysis, statistics, and probability. We encourage students who are interested in mathematical data science to take both this course and "401-4944-20L Mathematics of Data Science" by Prof. A. Bandeira. The two courses are designed to be complementary. H. Bölcskei and A. Bandeira | |||||
227-0449-00L | Seminar in Biomedical Image Computing | W | 1 credit | 2S | E. Konukoglu, B. Menze, M. A. Reyes Aguirre | |
Abstract | This is a seminar focusing on recent research topics in biomedical image computing and machine learning techniques for interpreting biomedical images and medical data in general. Every week a different topic will be presented and discussed. | |||||
Objective | The goal of this lecture is to provide a glimpse of the current research landscape to graduate students who are interested in working on biomedical image computing and related areas. Different topics will be covered by different speakers every week to provide a broad perspective and highlight current challenges. Every week students will be asked to read a paper, prepare discussion questions and participate in the discussion. Upon completion of this course, students will have a broad overview of the recent developments in biomedical image computing and the ability to critically discuss a scientific article. | |||||
Prerequisites / Notice | Knowledge of computer vision, machine learning, and biomedical image analysis is essential. | |||||
252-0220-00L | Introduction to Machine Learning Limited number of participants. Preference is given to students in programmes in which the course is being offered. All other students will be waitlisted. Please do not contact Prof. Krause for any questions in this regard. If necessary, please contact Link | W | 8 credits | 4V + 2U + 1A | A. Krause, F. Yang | |
Abstract | The course introduces the foundations of learning and making predictions based on data. | |||||
Objective | The course will introduce the foundations of learning and making predictions from data. We will study basic concepts such as trading off goodness of fit against model complexity. We will discuss important machine learning algorithms used in practice, and provide hands-on experience in a course project. | |||||
Content | - Linear regression (overfitting, cross-validation/bootstrap, model selection, regularization, [stochastic] gradient descent) - Linear classification: Logistic regression (feature selection, sparsity, multi-class) - Kernels and the kernel trick (Properties of kernels; applications to linear and logistic regression); k-nearest neighbor - Neural networks (backpropagation, regularization, convolutional neural networks) - Unsupervised learning (k-means, PCA, neural network autoencoders) - The statistical perspective (regularization as prior; loss as likelihood; learning as MAP inference) - Statistical decision theory (decision making based on statistical models and utility functions) - Discriminative vs. generative modeling (benefits and challenges in modeling joint vs. conditional distributions) - Bayes' classifiers (Naive Bayes, Gaussian Bayes; MLE) - Bayesian approaches to unsupervised learning (Gaussian mixtures, EM) | |||||
Literature | Textbook: Kevin Murphy, Machine Learning: A Probabilistic Perspective, MIT Press | |||||
Prerequisites / Notice | Designed to provide a basis for following courses: - Advanced Machine Learning - Deep Learning - Probabilistic Artificial Intelligence - Seminar "Advanced Topics in Machine Learning" | |||||
401-4944-20L | Mathematics of Data Science Does not take place this semester. | W | 8 credits | 4G | A. Bandeira | |
Abstract | Mostly self-contained, but fast-paced, introductory master's-level course on various theoretical aspects of algorithms that aim to extract information from data. | |||||
Objective | Introduction to various mathematical aspects of Data Science. | |||||
Content | These topics lie in overlaps of (Applied) Mathematics with: Computer Science, Electrical Engineering, Statistics, and/or Operations Research. Each lecture will feature a couple of Mathematical Open Problem(s) related to Data Science. The main mathematical tools used will be Probability and Linear Algebra, and a basic familiarity with these subjects is required. There will also be some Graph Theory, Representation Theory, and Applied Harmonic Analysis, among others (although knowledge of these tools is not assumed). The topics treated will include Dimension reduction, Manifold learning, Sparse recovery, Random Matrices, Approximation Algorithms, Community detection in graphs, and several others. | |||||
Lecture notes | Link | |||||
Prerequisites / Notice | The main mathematical tools used will be Probability, Linear Algebra (and real analysis), and a working knowledge of these subjects is required. In addition to these prerequisites, this class requires a certain degree of mathematical maturity, including abstract thinking and the ability to understand and write proofs. We encourage students who are interested in mathematical data science to take both this course and "227-0434-10L Mathematics of Information" taught by Prof. H. Bölcskei. The two courses are designed to be complementary. A. Bandeira and H. Bölcskei |
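As a small illustration of the recursive-estimation topics listed under 227-0427-10L (Kalman filtering and recursive least squares), the following is a minimal sketch, not course material: it estimates a constant from noisy scalar observations with the one-dimensional Kalman update. All names and numbers here are illustrative.

```python
import random

def kalman_constant(observations, r=1.0, prior_mean=0.0, prior_var=1e6):
    """Recursively fuse noisy observations y_k = x + v_k, v_k ~ N(0, r),
    of an unknown constant x, starting from a vague Gaussian prior."""
    mean, var = prior_mean, prior_var
    for y in observations:
        k = var / (var + r)           # Kalman gain
        mean = mean + k * (y - mean)  # posterior mean update
        var = (1.0 - k) * var         # posterior variance shrinks each step
    return mean, var

random.seed(0)
true_x = 3.0
ys = [true_x + random.gauss(0.0, 1.0) for _ in range(500)]
est, var = kalman_constant(ys)
```

For a constant signal this update coincides with recursive least squares: after n observations the posterior variance is roughly r/n, so the estimate concentrates around the true value.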
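The dimensionality-reduction topic shared by 227-0434-10L and 401-4944-20L (random projections and the Johnson-Lindenstrauss lemma) can be demonstrated in a few lines. This is a hedged sketch with arbitrarily chosen dimensions, not an implementation from either course: a random Gaussian matrix scaled by 1/sqrt(k) maps points from dimension d down to k while approximately preserving pairwise distances.

```python
import math
import random

random.seed(1)
d, k, n = 1000, 200, 5  # ambient dim, target dim, number of points (illustrative)

# Random points and a random Gaussian projection scaled by 1/sqrt(k).
points = [[random.gauss(0, 1) for _ in range(d)] for _ in range(n)]
proj = [[random.gauss(0, 1) / math.sqrt(k) for _ in range(d)] for _ in range(k)]

def project(x):
    """Map a d-dimensional point to k dimensions via the random matrix."""
    return [sum(row[i] * x[i] for i in range(d)) for row in proj]

def dist(a, b):
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

low = [project(p) for p in points]
# Ratio of projected to original distance for every pair; all should be near 1.
ratios = [dist(low[i], low[j]) / dist(points[i], points[j])
          for i in range(n) for j in range(i + 1, n)]
```

The lemma guarantees that k on the order of log(n)/eps^2 suffices for distortion 1 ± eps, independent of the ambient dimension d.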
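The regularization and gradient-descent topics listed under 252-0220-00L (Introduction to Machine Learning) can be sketched with ridge-penalized linear regression on synthetic data. The data, step size, and iteration count below are illustrative choices, not taken from the course:

```python
import random

random.seed(2)
# Synthetic data: y = 2x + 1 plus small Gaussian noise.
xs = [i / 10 for i in range(50)]
ys = [2.0 * x + 1.0 + random.gauss(0, 0.1) for x in xs]

def fit_ridge(xs, ys, lam=0.01, lr=0.02, steps=5000):
    """Full-batch gradient descent on mean squared error
    plus an L2 penalty (lam/2)*w**2 on the slope."""
    w, b, n = 0.0, 0.0, len(xs)
    for _ in range(steps):
        gw = sum((w * x + b - y) * x for x, y in zip(xs, ys)) / n + lam * w
        gb = sum((w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * gw
        b -= lr * gb
    return w, b

w, b = fit_ridge(xs, ys)
```

The L2 term shrinks the slope slightly toward zero; raising `lam` trades goodness of fit for lower model complexity, which is exactly the trade-off the course objective describes.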