Search result: Catalogue data in Autumn Semester 2018
|Data Science Master|
|227-0417-00L||Information Theory I||W||6 credits||4G||A. Lapidoth|
|Abstract||This course covers the basic concepts of information theory and of communication theory. Topics covered include the entropy rate of a source, mutual information, typical sequences, the asymptotic equipartition property, Huffman coding, channel capacity, the channel coding theorem, the source-channel separation theorem, and feedback capacity.|
|Learning objective||The fundamentals of information theory, including Shannon's source coding and channel coding theorems.|
|Content||The entropy rate of a source, typical sequences, the asymptotic equipartition property, the source coding theorem, Huffman coding, arithmetic coding, channel capacity, the channel coding theorem, the source-channel separation theorem, feedback capacity.|
|Literature||T.M. Cover and J.A. Thomas, Elements of Information Theory, 2nd edition.|
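Two of the quantities listed in this syllabus, the entropy of a discrete source and the capacity of a binary symmetric channel (C = 1 - H(p)), can be computed directly from their definitions. The sketch below is illustrative only and is not part of the official catalogue entry; the function names are our own.

```python
import math

def entropy(probs):
    """Shannon entropy (in bits) of a discrete distribution given as probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(eps):
    """Capacity of a binary symmetric channel with crossover probability eps:
    C = 1 - H_b(eps), where H_b is the binary entropy function."""
    return 1.0 - entropy([eps, 1.0 - eps])
```

For example, a uniform source over four symbols has entropy 2 bits, and a noiseless binary channel (eps = 0) has capacity 1 bit per use.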
|227-0427-00L||Signal Analysis, Models, and Machine Learning||W||6 credits||4G||H.‑A. Loeliger|
|Abstract||Mathematical methods in signal processing and machine learning. I. Linear signal representation and approximation: Hilbert spaces, LMMSE estimation, regularization and sparsity. II. Learning linear and nonlinear functions and filters: neural networks, kernel methods. III. Structured statistical models: hidden Markov models, factor graphs, Kalman filter, Gaussian models with sparse events.|
|Learning objective||The course is an introduction to basic topics in signal processing and machine learning.|
|Content||Part I - Linear Signal Representation and Approximation: Hilbert spaces, least squares and LMMSE estimation, projection and estimation by linear filtering, learning linear functions and filters, L2 regularization, L1 regularization and sparsity, singular-value decomposition and pseudo-inverse, principal component analysis. Part II - Learning Nonlinear Functions: fundamentals of learning, neural networks, kernel methods. Part III - Structured Statistical Models and Message Passing Algorithms: hidden Markov models, factor graphs, Gaussian message passing, Kalman filter and recursive least squares, Monte Carlo methods, parameter estimation, expectation maximization, linear Gaussian models with sparse events.|
|Prerequisites / Notice||Prerequisites: local Bachelor's students: the course "Discrete-Time and Statistical Signal Processing" (5th semester); all others: a solid background in linear algebra and probability theory.|
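Part I of the course above lists least-squares estimation and L2 regularization. A standard instance of the two combined is ridge regression, whose closed-form solution follows directly from the normal equations. The sketch below is an illustrative aside, not course material, and the function name is our own.

```python
import numpy as np

def ridge(X, y, lam):
    """Closed-form L2-regularized least-squares (ridge) estimate:
    w = (X^T X + lam * I)^(-1) X^T y."""
    n_features = X.shape[1]
    A = X.T @ X + lam * np.eye(n_features)  # regularized Gram matrix
    return np.linalg.solve(A, X.T @ y)
```

With lam = 0 this reduces to ordinary least squares; increasing lam shrinks the norm of the estimate, trading bias for variance.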
|227-0689-00L||System Identification||W||4 credits||2V + 1U||R. Smith|
|Abstract||Theory and techniques for the identification of dynamic models from experimentally obtained system input-output data.|
|Learning objective||To provide a series of practical techniques for the development of dynamical models from experimental data, with the emphasis on models suitable for feedback control design. To provide sufficient theory to enable the practitioner to understand the trade-offs between model accuracy, data quality, and data quantity.|
|Content||Introduction to modeling: black-box and grey-box models; parametric and non-parametric models; ARX, ARMAX (etc.) models. Predictive, open-loop, black-box identification methods. Time- and frequency-domain methods. Subspace identification methods. Optimal experimental design, Cramér-Rao bounds, input signal design. Parametric identification methods; on-line and batch approaches. Closed-loop identification strategies; the trade-off between controller performance and the information available for identification.|
|Literature||L. Ljung, System Identification: Theory for the User, 2nd edition, Prentice Hall, 1999. G.C. Goodwin and R.L. Payne, Dynamic System Identification: Experimental Design and Data Analysis, Academic Press, 1977.|
|Prerequisites / Notice||Control Systems (227-0216-00L) or equivalent.|
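The ARX models and parametric identification methods named in the course content above can be illustrated with the simplest case: fitting a first-order ARX model y[k] = a*y[k-1] + b*u[k-1] by least squares on recorded input-output data. This noise-free sketch is illustrative only and not part of the catalogue entry; the function name is our own.

```python
import numpy as np

def identify_arx1(u, y):
    """Least-squares fit of a first-order ARX model
    y[k] = a*y[k-1] + b*u[k-1] from input u and output y (noise-free sketch)."""
    Phi = np.column_stack([y[:-1], u[:-1]])           # regressor matrix
    theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
    return theta                                      # [a_hat, b_hat]
```

On noise-free data a sufficiently exciting input makes the regressor matrix well-conditioned and the estimates exact; with measurement noise the same least-squares fit becomes the prediction-error starting point treated in the course literature.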