Search result: catalogue data for Autumn Semester 2019
Data Science Master
Number | Title | Type | ECTS | Hours | Lecturers
---|---|---|---|---|---
252-0535-00L | Advanced Machine Learning | W | 8 ECTS | 3V + 2U + 2A | J. M. Buhmann
Abstract | Machine learning algorithms provide analytical methods to search data sets for characteristic patterns. Typical tasks include the classification of data, function fitting and clustering, with applications in image and speech analysis, bioinformatics and exploratory data analysis. This course is accompanied by practical machine learning projects. | | | |
Learning objective | Students will be familiarized with advanced concepts and algorithms for supervised and unsupervised learning, and will reinforce the statistical knowledge that is indispensable for solving modeling problems under uncertainty. Key concepts are the generalization ability of algorithms and systematic approaches to modeling and regularization. Machine learning projects will provide an opportunity to test the algorithms on real-world data. | | | |
Content | The theory of fundamental machine learning concepts is presented in the lecture and illustrated with relevant applications. Students can deepen their understanding by solving both pen-and-paper and programming exercises, in which they implement and apply well-known algorithms to real-world data. Topics covered in the lecture include: Fundamentals (What is data?; Bayesian learning; computational learning theory); Supervised learning (ensembles: bagging and boosting; max-margin methods; neural networks); Unsupervised learning (dimensionality reduction techniques; clustering; mixture models; non-parametric density estimation; learning dynamical systems). A small illustrative sketch of bagging and boosting is given after the table. | | | |
Lecture notes | No lecture notes, but slides will be made available on the course webpage. | | | |
Literature | C. Bishop. Pattern Recognition and Machine Learning. Springer, 2007. R. Duda, P. Hart, and D. Stork. Pattern Classification. John Wiley & Sons, second edition, 2001. T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, 2001. L. Wasserman. All of Statistics: A Concise Course in Statistical Inference. Springer, 2004. | | | |
Prerequisites / Notice | The course requires solid basic knowledge in analysis, statistics and numerical methods for CSE, as well as practical programming experience for solving the assignments. Students should have followed at least "Introduction to Machine Learning" or an equivalent course offered by another institution. PhD students are required to obtain a passing grade in the course (4.0 or higher, based on the project and the exam) to gain credit points. | | | |
227-0423-00L | Neural Network Theory | W | 4 ECTS | 2V + 1U | H. Bölcskei, E. Riegler
Abstract | The class focuses on fundamental mathematical aspects of neural networks with an emphasis on deep networks: universal approximation theorems, capacity of separating surfaces, generalization, reproducing kernel Hilbert spaces, support vector machines, fundamental limits of deep neural network learning, dimension measures, and feature extraction with scattering networks. | | | |
Learning objective | After attending this lecture, participating in the exercise sessions, and working on the homework problem sets, students will have acquired a working knowledge of the mathematical foundations of neural networks. | | | |
Content | 1. Universal approximation with single- and multi-layer networks (an informal statement of the classical theorem is given after the table); 2. Geometry of decision surfaces; 3. Separating capacity of nonlinear decision surfaces; 4. Generalization; 5. Reproducing kernel Hilbert spaces, support vector machines; 6. Deep neural network approximation theory: fundamental limits on compressibility of signal classes, Kolmogorov epsilon-entropy of signal classes, covering numbers, fundamental limits of deep neural network learning; 7. Learning of real-valued functions: pseudo-dimension, fat-shattering dimension, Vapnik-Chervonenkis dimension; 8. Scattering networks. | | | |
Lecture notes | Detailed lecture notes will be provided as we go along. | | | |
Prerequisites / Notice | This course is aimed at students with a strong mathematical background in general, and in linear algebra, analysis, and probability theory in particular. | | | |
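
For the ensemble topics listed under Advanced Machine Learning (bagging and boosting), the following is a minimal illustrative sketch, not part of the course materials: it assumes scikit-learn is available and uses a synthetic toy dataset in place of the real-world data used in the course projects.

```python
# Minimal sketch of the "Ensembles: bagging and boosting" topic; illustrative only,
# not course material. Assumes scikit-learn is installed.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import train_test_split

# Synthetic toy dataset standing in for real-world data.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Bagging: trains many base models on bootstrap resamples and averages them
# (primarily reduces variance).
bagging = BaggingClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

# Boosting (AdaBoost): fits weak learners sequentially, upweighting previously
# misclassified examples (primarily reduces bias).
boosting = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

print("bagging test accuracy: ", bagging.score(X_test, y_test))
print("boosting test accuracy:", boosting.score(X_test, y_test))
```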
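
For topic 1 of Neural Network Theory, the classical single-hidden-layer universal approximation theorem (Cybenko, 1989) can be stated informally as below; the exact formulations and the level of generality treated in the lecture may differ.

```latex
% Informal statement of the classical universal approximation theorem (Cybenko, 1989);
% the lecture may treat more general activations, norms, and multi-layer networks.
\textbf{Theorem (informal).} Let $\sigma$ be a continuous sigmoidal function. For every
$f \in C([0,1]^d)$ and every $\varepsilon > 0$ there exist $N \in \mathbb{N}$,
$\alpha_k, b_k \in \mathbb{R}$, and $w_k \in \mathbb{R}^d$ such that
\[
  \sup_{x \in [0,1]^d} \Bigl| f(x) - \sum_{k=1}^{N} \alpha_k\,
  \sigma\bigl(w_k^{\top} x + b_k\bigr) \Bigr| < \varepsilon .
\]
```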