Catalogue data in Autumn Semester 2018
|Doctoral Department of Information Technology and Electrical Engineering|
More Information at: https://www.ethz.ch/en/doctorate.html
|Doctoral and Post-Doctoral Courses|
A minimum of 12 ECTS credit points must be obtained during doctoral studies.
The courses listed below are only a small selection from the much larger range on offer. Please discuss your course selection with your PhD supervisor.
|151-0906-00L||Frontiers in Energy Research|
Does not take place this semester.
This course is only for doctoral students.
|W||2 credits||2S||D. Poulikakos, R. Boes, V. Hoffmann, G. Hug, M. Mazzotti, A. Patt, A. Schlüter|
|Abstract||Doctoral students at ETH Zurich working in the broad area of energy present their research to their colleagues, their advisors and the scientific community. Each week a different student gives a 50-60 min presentation of their research (a full introduction, background & findings) followed by discussion with the audience.|
|Objective||Knowledge of advanced research in the area of energy.|
|Content||PhD students at ETH Zurich working in the broad area of energy present their research to their colleagues, to their advisors and to the scientific community. Every week there are two presentations, each structured as follows: 15 min introduction to the research topic, 15 min presentation of the results, 15 min discussion with the audience.|
|Lecture notes||Slides will be distributed.|
|227-0225-00L||Linear System Theory||W||6 credits||5G||M. Kamgarpour|
|Abstract||The class is intended to provide a comprehensive overview of the theory of linear dynamical systems, stability analysis, and their use in control and estimation. The focus is on the mathematics behind the physical properties of these systems and on understanding and constructing proofs of properties of linear control systems.|
|Objective||Students should be able to apply the fundamental results in linear system theory to analyze and control linear dynamical systems.|
|Content||- Proof techniques and practices.|
- Linear spaces, normed linear spaces and Hilbert spaces.
- Ordinary differential equations, existence and uniqueness of solutions.
- Continuous and discrete-time, time-varying linear systems. Time domain solutions. Time invariant systems treated as a special case.
- Controllability and observability, duality. Time invariant systems treated as a special case.
- Stability and stabilization, observers, state and output feedback, separation principle.
|Lecture notes||Available on the course Moodle platform.|
|Prerequisites / Notice||Sufficient mathematical maturity with special focus on logic, linear algebra, analysis.|
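The controllability material listed above admits a compact illustration via the Kalman rank test; the double-integrator system below is a generic textbook example, not material from this catalogue entry:

```python
import numpy as np

def controllability_matrix(A, B):
    """Stack [B, AB, A^2 B, ..., A^(n-1) B] column-wise."""
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])
    return np.hstack(blocks)

# Double integrator: x1' = x2, x2' = u
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

C = controllability_matrix(A, B)
print(np.linalg.matrix_rank(C))  # 2 = n, so the pair (A, B) is controllable
```

The rank test is one of several equivalent characterizations of controllability for time-invariant systems; the course treats the time-varying case as well.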
|227-0417-00L||Information Theory I||W||6 credits||4G||A. Lapidoth|
|Abstract||This course covers the basic concepts of information theory and of communication theory. Topics covered include the entropy rate of a source, mutual information, typical sequences, the asymptotic equi-partition property, Huffman coding, channel capacity, the channel coding theorem, the source-channel separation theorem, and feedback capacity.|
|Objective||The fundamentals of Information Theory including Shannon's source coding and channel coding theorems|
|Content||The entropy rate of a source, typical sequences, the asymptotic equi-partition property, the source coding theorem, Huffman coding, arithmetic coding, channel capacity, the channel coding theorem, the source-channel separation theorem, feedback capacity|
|Literature||T.M. Cover and J. Thomas, Elements of Information Theory (second edition)|
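The entropy of a source, the first quantity in the topic list above, can be sketched in a few lines; the distribution below is a made-up dyadic example, not course material:

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# For a dyadic distribution, H equals the expected length of an
# optimal Huffman code (codeword lengths 1, 2, 2 here).
print(entropy([0.5, 0.25, 0.25]))  # 1.5 bits
```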
|227-0427-00L||Signal Analysis, Models, and Machine Learning||W||6 credits||4G||H.‑A. Loeliger|
|Abstract||Mathematical methods in signal processing and machine learning. |
I. Linear signal representation and approximation: Hilbert spaces, LMMSE estimation, regularization and sparsity.
II. Learning linear and nonlinear functions and filters: neural networks, kernel methods.
III. Structured statistical models: hidden Markov models, factor graphs, Kalman filter, Gaussian models with sparse events.
|Objective||The course is an introduction to some basic topics in signal processing and machine learning.|
|Content||Part I - Linear Signal Representation and Approximation: Hilbert spaces, least squares and LMMSE estimation, projection and estimation by linear filtering, learning linear functions and filters, L2 regularization, L1 regularization and sparsity, singular-value decomposition and pseudo-inverse, principal-components analysis.|
Part II - Learning Nonlinear Functions: fundamentals of learning, neural networks, kernel methods.
Part III - Structured Statistical Models and Message Passing Algorithms: hidden Markov models, factor graphs, Gaussian message passing, Kalman filter and recursive least squares, Monte Carlo methods, parameter estimation, expectation maximization, linear Gaussian models with sparse events.
|Lecture notes||Lecture notes.|
|Prerequisites / Notice||Prerequisites: |
- local Bachelor's students: course "Discrete-Time and Statistical Signal Processing" (5th semester)
- others: solid basics in linear algebra and probability theory
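L2 regularization from Part I above has a closed-form solution that fits in a short sketch; the synthetic data and the regularization weight below are illustrative choices, not course code:

```python
import numpy as np

# Ridge (L2-regularized) least squares: w = (X^T X + lam*I)^{-1} X^T y
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.standard_normal(50)

lam = 0.1  # hand-picked regularization weight
w_hat = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)
print(w_hat)  # close to w_true for small noise and small lam
```

Replacing the L2 penalty with an L1 penalty gives sparse solutions but loses the closed form, which is part of the trade-off the course discusses.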
|227-0689-00L||System Identification||W||4 credits||2V + 1U||R. Smith|
|Abstract||Theory and techniques for the identification of dynamic models from experimentally obtained system input-output data.|
|Objective||To provide a series of practical techniques for the development of dynamical models from experimental data, with the emphasis being on the development of models suitable for feedback control design purposes. To provide sufficient theory to enable the practitioner to understand the trade-offs between model accuracy, data quality and data quantity.|
|Content||Introduction to modeling: Black-box and grey-box models; Parametric and non-parametric models; ARX, ARMAX (etc.) models.|
Predictive, open-loop, black-box identification methods. Time and frequency domain methods. Subspace identification methods.
Optimal experimental design, Cramer-Rao bounds, input signal design.
Parametric identification methods. On-line and batch approaches.
Closed-loop identification strategies. Trade-off between controller performance and information available for identification.
|Literature||"System Identification; Theory for the User" Lennart Ljung, Prentice Hall (2nd Ed), 1999.|
"Dynamic system identification: Experimental design and data analysis", GC Goodwin and RL Payne, Academic Press, 1977.
|Prerequisites / Notice||Control systems (227-0216-00L) or equivalent.|
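The ARX models mentioned in the content above can be identified by ordinary least squares; the first-order system and noise level below are made-up illustrations, not course material:

```python
import numpy as np

# Identify y[k] = a*y[k-1] + b*u[k-1] + e[k] from input-output data.
rng = np.random.default_rng(1)
a_true, b_true = 0.8, 0.5
N = 200
u = rng.standard_normal(N)
y = np.zeros(N)
for k in range(1, N):
    y[k] = a_true * y[k-1] + b_true * u[k-1] + 0.01 * rng.standard_normal()

# Regressor rows [y[k-1], u[k-1]]; least-squares estimate of [a, b]
Phi = np.column_stack([y[:-1], u[:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
print(theta)  # approximately [0.8, 0.5]
```

A persistently exciting input (here white noise) is what makes the regressor matrix well conditioned; input signal design is treated in the course.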
|227-0955-00L||Seminar in Electromagnetics, Photonics and Terahertz||W||3 credits||2S||J. Leuthold|
|Abstract||Selected topics of the current research activities at the IEF and closely related institutions are discussed.|
|Objective||Have an overview on the research activities of the IEF institute.|
|227-0974-00L||TNU Colloquium||W||0 credits||2K||K. Stephan|
|Abstract||This colloquium for MSc/PhD students at D-ITET discusses current research in Neuromodeling (the development of mathematical models for diagnostics of brain diseases) and its application to Computational Psychiatry/Psychosomatics. The range of topics is broad, including statistics and computational modeling, experimental paradigms (fMRI, EEG, behaviour), and clinical questions.|
|252-0535-00L||Advanced Machine Learning||W||8 credits||3V + 2U + 2A||J. M. Buhmann|
|Abstract||Machine learning algorithms provide analytical methods to search data sets for characteristic patterns. Typical tasks include the classification of data, function fitting and clustering, with applications in image and speech analysis, bioinformatics and exploratory data analysis. This course is accompanied by practical machine learning projects.|
|Objective||Students will become familiar with advanced concepts and algorithms for supervised and unsupervised learning and will reinforce the statistics knowledge that is indispensable for solving modeling problems under uncertainty. Key concepts are the generalization ability of algorithms and systematic approaches to modeling and regularization. Machine learning projects will provide an opportunity to test the algorithms on real-world data.|
|Content||The theory of fundamental machine learning concepts is presented in the lecture, and illustrated with relevant applications. Students can deepen their understanding by solving both pen-and-paper and programming exercises, where they implement and apply famous algorithms to real-world data.|
Topics covered in the lecture include:
What is data?
Computational learning theory
Ensembles: Bagging and Boosting
Max Margin methods
Dimensionality reduction techniques
Non-parametric density estimation
Learning Dynamical Systems
|Lecture notes||No lecture notes, but slides will be made available on the course webpage.|
|Literature||C. Bishop. Pattern Recognition and Machine Learning. Springer 2007.|
R. Duda, P. Hart, and D. Stork. Pattern Classification. John Wiley & Sons, second edition, 2001.
T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, 2001.
L. Wasserman. All of Statistics: A Concise Course in Statistical Inference. Springer, 2004.
|Prerequisites / Notice||The course requires solid basic knowledge in analysis, statistics and numerical methods for CSE as well as practical programming experience for solving assignments.|
Students should have followed at least "Introduction to Machine Learning" or an equivalent course offered by another institution.
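Of the lecture topics above, non-parametric density estimation lends itself to a compact sketch; the Gaussian-kernel (Parzen window) estimator below is a generic textbook construction with a hand-picked bandwidth, not course code:

```python
import numpy as np

def kde(x_query, samples, h=0.3):
    """Gaussian-kernel density estimate at the query points."""
    z = (x_query - samples[:, None]) / h
    return np.mean(np.exp(-0.5 * z**2), axis=0) / (h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(4)
samples = rng.standard_normal(2000)
print(kde(np.array([0.0]), samples))  # roughly the N(0,1) density at 0
```

The estimate is slightly biased downward at the mode because smoothing with bandwidth h convolves the true density with the kernel; bandwidth selection is the central practical question.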
|252-0417-00L||Randomized Algorithms and Probabilistic Methods||W||8 credits||3V + 2U + 2A||A. Steger|
|Abstract||Las Vegas & Monte Carlo algorithms; inequalities of Markov, Chebyshev, Chernoff; negative correlation; Markov chains: convergence, rapidly mixing; generating functions. Examples include: min cut, median, balls and bins, routing in hypercubes, 3SAT, card shuffling, random walks|
|Objective||After this course students will know fundamental techniques from probabilistic combinatorics for designing randomized algorithms and will be able to apply them to solve typical problems in these areas.|
|Content||Randomized algorithms are algorithms that "flip coins" to take certain decisions. This concept extends the classical model of deterministic algorithms and has become very popular and useful over the last twenty years. In many cases, randomized algorithms are faster, simpler or just more elegant than deterministic ones. In the course, we will discuss basic principles and techniques and derive from them a number of randomized methods for problems in different areas.|
|Literature||- Randomized Algorithms, Rajeev Motwani and Prabhakar Raghavan, Cambridge University Press (1995)|
- Probability and Computing, Michael Mitzenmacher and Eli Upfal, Cambridge University Press (2005)
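Min cut, listed among the examples above, is the classic showcase of a Monte Carlo algorithm (Karger's contraction); the small bridge graph below is a made-up instance, not course material:

```python
import random

random.seed(0)

def contract(edges, n):
    """One random-contraction run; returns the size of the resulting cut."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    remaining = n
    while remaining > 2:
        u, v = random.choice(edges)
        ru, rv = find(u), find(v)
        if ru != rv:          # self-loops are skipped; retry with a fresh edge
            parent[ru] = rv
            remaining -= 1
    return sum(1 for u, v in edges if find(u) != find(v))

# Two triangles joined by one bridge edge (2, 3): the minimum cut has size 1.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
best = min(contract(edges, 6) for _ in range(50))
print(best)  # 1, with overwhelming probability over the 50 repetitions
```

A single contraction run can fail, but repeating it drives the failure probability down exponentially, which is the defining trade-off of a Monte Carlo algorithm.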
|263-4500-00L||Advanced Algorithms||W||6 credits||2V + 2U + 1A||M. Ghaffari, A. Krause|
|Abstract||This is an advanced course on the design and analysis of algorithms, covering a range of topics and techniques not studied in typical introductory courses on algorithms.|
|Objective||This course is intended to familiarize students with (some of) the main tools and techniques developed over the last 15-20 years in algorithm design, which are by now among the key ingredients used in developing efficient algorithms.|
|Content||The lectures will cover a range of topics, including the following: graph sparsification preserving cuts or distances, various approximation algorithm techniques and concepts, metric embeddings and probabilistic tree embeddings, online algorithms, multiplicative weight updates, streaming algorithms, sketching algorithms, and a brief glance at MapReduce algorithms.|
|Prerequisites / Notice||This course is designed for master's and doctoral students; it especially targets those interested in theoretical computer science, but it should also be accessible to final-year bachelor's students.|
Students need sufficient comfort with both (A) algorithm design & analysis and (B) probability & concentration bounds. For example, having passed the course Algorithms, Probability, and Computing (APC) is highly recommended, though not formally required. If you are not sure whether you are ready for this class, please consult the instructor.
|327-2132-00L||Multifunctional Ferroic Materials: Growth, Characterisation, Simulation||W||2 credits||2G||M. Trassin, M. Fiebig|
|Abstract||The course will explore the growth of (multi-) ferroic oxide thin films. The structural characterization and ferroic state investigation by force microscopy and by laser-optical techniques will be addressed.|
Oxide electronics device concepts will be discussed.
|Objective||Oxide films with a thickness of just a few atoms can now be grown with a precision matching that of semiconductors. This opens up a whole world of functional device concepts and fascinating phenomena that would not occur in the expanded bulk crystal. Particularly interesting phenomena occur in films showing magnetic or electric order or, even better, both of these ("multiferroics").|
In this course students will obtain an overarching view of epitaxial oxide thin films and heterostructure design, reaching from their growth by pulsed laser deposition to an understanding of their magnetoelectric functionality through advanced characterization techniques. Students will thus understand how to fabricate and characterize highly oriented films with magnetic and electric properties not found in nature.
|Content||Types of ferroic order, multiferroics, oxide materials, thin-film growth by pulsed laser deposition, molecular beam epitaxy, RF sputtering, structural characterization (reciprocal-space basics, XRD for thin films, RHEED), epitaxial-strain-related effects, scanning probe microscopy techniques, laser-optical characterization, oxide thin-film based devices and examples.|
|401-3054-14L||Probabilistic Methods in Combinatorics||W||6 credits||2V + 1U||B. Sudakov|
|Abstract||This course provides a gentle introduction to the Probabilistic Method, with an emphasis on methodology. We will try to illustrate the main ideas by showing the application of probabilistic reasoning to various combinatorial problems.|
|Content||The topics covered in the class will include (but are not limited to): linearity of expectation, the second moment method, the local lemma, correlation inequalities, martingales, large deviation inequalities, Janson and Talagrand inequalities and pseudo-randomness.|
|Literature||- The Probabilistic Method, by N. Alon and J. H. Spencer, 3rd Edition, Wiley, 2008.|
- Random Graphs, by B. Bollobás, 2nd Edition, Cambridge University Press, 2001.
- Random Graphs, by S. Janson, T. Luczak and A. Rucinski, Wiley, 2000.
- Graph Coloring and the Probabilistic Method, by M. Molloy and B. Reed, Springer, 2002.
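Linearity of expectation, the first topic listed above, can be checked empirically in a few lines; the fixed-point example below is a standard textbook illustration, not course material:

```python
import random

random.seed(0)

def avg_fixed_points(n=10, trials=100_000):
    """Average number of fixed points of a uniformly random permutation.

    Each position i is fixed with probability 1/n, so by linearity of
    expectation the expected total is n * (1/n) = 1, for every n.
    """
    total = 0
    for _ in range(trials):
        p = list(range(n))
        random.shuffle(p)
        total += sum(1 for i, x in enumerate(p) if i == x)
    return total / trials

print(avg_fixed_points())  # close to 1.0
```

Note that the indicator variables here are not independent; linearity of expectation needs no independence, which is exactly why it is so useful.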