Catalogue data in Autumn Semester 2018

Doctoral Department of Information Technology and Electrical Engineering
Doctoral and Post-Doctoral Courses
A minimum of 12 ECTS credit points must be obtained during doctoral studies.

The courses offered below are only a small selection from a much larger number of available courses. Please discuss your course selection with your PhD supervisor.
Number | Title | Type | ECTS | Hours | Lecturers
151-0906-00L | Frontiers in Energy Research
Does not take place this semester.
This course is only for doctoral students.
W | 2 credits | 2S | D. Poulikakos, R. Boes, V. Hoffmann, G. Hug, M. Mazzotti, A. Patt, A. Schlüter
Abstract: Doctoral students at ETH Zurich working in the broad area of energy present their research to their colleagues, their advisors and the scientific community. Each week a different student gives a 50-60 min presentation of their research (a full introduction, background and findings), followed by a discussion with the audience.
Objective: Knowledge of advanced research in the area of energy.
Content: PhD students at ETH Zurich working in the broad area of energy present their research to their colleagues, to their advisors and to the scientific community. Every week there are two presentations, each structured as follows: 15 min introduction to the research topic, 15 min presentation of the results, 15 min discussion with the audience.
Lecture notes: Slides will be distributed.
227-0225-00L | Linear System Theory | W | 6 credits | 5G | M. Kamgarpour
Abstract: The class is intended to provide a comprehensive overview of the theory of linear dynamical systems, stability analysis, and their use in control and estimation. The focus is on the mathematics behind the physical properties of these systems and on understanding and constructing proofs of properties of linear control systems.
Objective: Students should be able to apply the fundamental results in linear system theory to analyze and control linear dynamical systems.
Content:
- Proof techniques and practices.
- Linear spaces, normed linear spaces and Hilbert spaces.
- Ordinary differential equations, existence and uniqueness of solutions.
- Continuous- and discrete-time, time-varying linear systems. Time-domain solutions. Time-invariant systems treated as a special case.
- Controllability and observability, duality. Time-invariant systems treated as a special case.
- Stability and stabilization, observers, state and output feedback, separation principle.
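As a small illustration of the controllability topic above, a minimal Python sketch (function names are my own, not course material) of the Kalman rank test for a time-invariant pair (A, B):

```python
import numpy as np

def controllability_matrix(A, B):
    """Build [B, AB, A^2 B, ..., A^(n-1) B] for x' = Ax + Bu."""
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])
    return np.hstack(blocks)

def is_controllable(A, B):
    """Kalman rank condition: (A, B) is controllable iff the
    controllability matrix has full row rank n."""
    return np.linalg.matrix_rank(controllability_matrix(A, B)) == A.shape[0]

# Double integrator with a force input: controllable.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
print(is_controllable(A, B))  # True
```

By contrast, A = I with B = [[1], [0]] fails the test, since the input never reaches the second state.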
Lecture notes: Available on the course Moodle platform.
Prerequisites / Notice: Sufficient mathematical maturity, with special focus on logic, linear algebra and analysis.
227-0417-00L | Information Theory I | W | 6 credits | 4G | A. Lapidoth
Abstract: This course covers the basic concepts of information theory and of communication theory. Topics covered include the entropy rate of a source, mutual information, typical sequences, the asymptotic equipartition property, Huffman coding, channel capacity, the channel coding theorem, the source-channel separation theorem, and feedback capacity.
Objective: The fundamentals of information theory, including Shannon's source coding and channel coding theorems.
Content: The entropy rate of a source, typical sequences, the asymptotic equipartition property, the source coding theorem, Huffman coding, arithmetic coding, channel capacity, the channel coding theorem, the source-channel separation theorem, feedback capacity.
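As a small illustration of Huffman coding and the source coding theorem listed above, a minimal Python sketch (illustrative only, not course material) that computes Huffman codeword lengths and compares the expected length with the source entropy:

```python
import heapq
from math import log2

def huffman_lengths(probs):
    """Codeword length per symbol from the standard Huffman construction:
    repeatedly merge the two least probable nodes; every symbol below a
    merge gains one bit."""
    heap = [(p, i, [i]) for i, p in enumerate(probs)]  # (prob, tiebreak id, symbols)
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    uid = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, uid, s1 + s2))
        uid += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]           # a dyadic source
lengths = huffman_lengths(probs)            # [1, 2, 3, 3]
entropy = -sum(p * log2(p) for p in probs)  # 1.75 bits
avg_len = sum(p * l for p, l in zip(probs, lengths))
# Source coding theorem: entropy <= avg_len < entropy + 1;
# for a dyadic source, Huffman meets the entropy exactly.
```

For non-dyadic probabilities the expected length exceeds the entropy, but by strictly less than one bit.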
Literature: T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd edition.
227-0427-00L | Signal Analysis, Models, and Machine Learning | W | 6 credits | 4G | H.-A. Loeliger
Abstract: Mathematical methods in signal processing and machine learning.
I. Linear signal representation and approximation: Hilbert spaces, LMMSE estimation, regularization and sparsity.
II. Learning linear and nonlinear functions and filters: neural networks, kernel methods.
III. Structured statistical models: hidden Markov models, factor graphs, Kalman filter, Gaussian models with sparse events.
Objective: The course is an introduction to some basic topics in signal processing and machine learning.
Content: Part I - Linear Signal Representation and Approximation: Hilbert spaces, least squares and LMMSE estimation, projection and estimation by linear filtering, learning linear functions and filters, L2 regularization, L1 regularization and sparsity, singular-value decomposition and pseudo-inverse, principal-components analysis.
Part II - Learning Nonlinear Functions: fundamentals of learning, neural networks, kernel methods.
Part III - Structured Statistical Models and Message-Passing Algorithms: hidden Markov models, factor graphs, Gaussian message passing, Kalman filter and recursive least squares, Monte Carlo methods, parameter estimation, expectation maximization, linear Gaussian models with sparse events.
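As a small illustration of the L2 regularization listed in Part I, a minimal Python sketch (illustrative only, not course material) of ridge regression, i.e. the closed-form minimizer of ||Ax - y||^2 + lam * ||x||^2 via the regularized normal equations:

```python
import numpy as np

def ridge(A, y, lam):
    """L2-regularized least squares:
    argmin_x ||A x - y||^2 + lam * ||x||^2,
    solved via (A^T A + lam I) x = A^T y."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)

# Recover known coefficients from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3))
x_true = np.array([1.0, -2.0, 0.5])
y = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = ridge(A, y, lam=0.1)
```

The regularizer biases the estimate slightly toward zero but keeps the normal equations well conditioned even when A^T A is nearly singular.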
Lecture notes: Lecture notes are provided.
Prerequisites / Notice: Prerequisites:
- local Bachelor students: the course "Discrete-Time and Statistical Signal Processing" (5th semester)
- others: solid basics in linear algebra and probability theory
227-0689-00L | System Identification | W | 4 credits | 2V + 1U | R. Smith
Abstract: Theory and techniques for the identification of dynamic models from experimentally obtained system input-output data.
Objective: To provide a series of practical techniques for the development of dynamical models from experimental data, with the emphasis being on the development of models suitable for feedback control design purposes. To provide sufficient theory to enable the practitioner to understand the trade-offs between model accuracy, data quality and data quantity.
Content: Introduction to modeling: black-box and grey-box models; parametric and non-parametric models; ARX, ARMAX (etc.) models.

Predictive, open-loop, black-box identification methods. Time- and frequency-domain methods. Subspace identification methods.

Optimal experimental design, Cramér-Rao bounds, input signal design.

Parametric identification methods. On-line and batch approaches.

Closed-loop identification strategies. Trade-off between controller performance and information available for identification.
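As a small illustration of the parametric identification methods above, a minimal Python sketch (illustrative only, not course material) that fits a first-order ARX model y[k] = a*y[k-1] + b*u[k-1] + e[k] to simulated input-output data by least squares:

```python
import numpy as np

def fit_arx1(u, y):
    """Least-squares fit of the first-order ARX model
    y[k] = a*y[k-1] + b*u[k-1] + e[k]; returns (a, b)."""
    Phi = np.column_stack([y[:-1], u[:-1]])        # regressor matrix
    theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
    return theta

# Simulate a system with known parameters, then identify them.
rng = np.random.default_rng(1)
a_true, b_true = 0.8, 0.5
u = rng.standard_normal(500)                       # exciting input
y = np.zeros(500)
for k in range(1, 500):
    y[k] = a_true * y[k-1] + b_true * u[k-1] + 0.01 * rng.standard_normal()
a_hat, b_hat = fit_arx1(u, y)
```

With a persistently exciting input and low noise, the estimates land very close to the true parameters; noisier data or a poorly designed input degrades them, which is exactly the accuracy/data-quality trade-off mentioned in the objective.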
Literature: L. Ljung, System Identification: Theory for the User, 2nd edition, Prentice Hall, 1999.

G. C. Goodwin and R. L. Payne, Dynamic System Identification: Experimental Design and Data Analysis, Academic Press, 1977.
Prerequisites / Notice: Control Systems (227-0216-00L) or equivalent.
227-0955-00L | Seminar in Electromagnetics, Photonics and Terahertz | W | 3 credits | 2S | J. Leuthold
Abstract: Selected topics of the current research activities at the IEF and closely related institutions are discussed.
Objective: Gain an overview of the research activities of the IEF institute.
227-0974-00L | TNU Colloquium | Restricted registration | W | 0 credits | 2K | K. Stephan
Abstract: This colloquium for MSc/PhD students at D-ITET discusses current research in neuromodeling (the development of mathematical models for the diagnostics of brain diseases) and its application to computational psychiatry/psychosomatics. The range of topics is broad, including statistics and computational modeling, experimental paradigms (fMRI, EEG, behaviour), and clinical questions.
Objective: See the abstract above.
252-0535-00L | Advanced Machine Learning | W | 8 credits | 3V + 2U + 2A | J. M. Buhmann
Abstract: Machine learning algorithms provide analytical methods to search data sets for characteristic patterns. Typical tasks include the classification of data, function fitting and clustering, with applications in image and speech analysis, bioinformatics and exploratory data analysis. The course is accompanied by practical machine learning projects.
Objective: Students will be familiarized with advanced concepts and algorithms for supervised and unsupervised learning, and will reinforce the statistics knowledge that is indispensable for solving modeling problems under uncertainty. Key concepts are the generalization ability of algorithms and systematic approaches to modeling and regularization. Machine learning projects will provide an opportunity to test the algorithms on real-world data.
Content: The theory of fundamental machine learning concepts is presented in the lecture and illustrated with relevant applications. Students can deepen their understanding by solving both pen-and-paper and programming exercises, in which they implement and apply well-known algorithms to real-world data.

Topics covered in the lecture include:

Fundamentals:
What is data?
Bayesian Learning
Computational learning theory

Supervised learning:
Ensembles: Bagging and Boosting
Max Margin methods
Neural networks

Unsupervised learning:
Dimensionality reduction techniques
Clustering
Mixture Models
Non-parametric density estimation
Learning Dynamical Systems
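As a small illustration of the clustering topic above, a minimal Python sketch of Lloyd's k-means algorithm (illustrative only, not course material):

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Plain Lloyd's algorithm: alternate nearest-center assignment
    and center updates until the centers stop moving."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # distance of every point to every center, shape (n, k)
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return centers, labels

# Two well-separated Gaussian blobs.
rng = np.random.default_rng(2)
X = np.vstack([rng.standard_normal((50, 2)) + [5.0, 5.0],
               rng.standard_normal((50, 2)) - [5.0, 5.0]])
centers, labels = kmeans(X, k=2)
```

Lloyd's algorithm only finds a local optimum of the within-cluster squared distance, which is why practical implementations restart from several random initializations.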
Lecture notes: No lecture notes, but slides will be made available on the course webpage.
Literature: C. Bishop, Pattern Recognition and Machine Learning, Springer, 2007.

R. Duda, P. Hart, and D. Stork, Pattern Classification, 2nd edition, John Wiley & Sons, 2001.

T. Hastie, R. Tibshirani, and J. Friedman, The Elements of Statistical Learning: Data Mining, Inference and Prediction, Springer, 2001.

L. Wasserman, All of Statistics: A Concise Course in Statistical Inference, Springer, 2004.
Prerequisites / Notice: The course requires solid basic knowledge in analysis, statistics and numerical methods for CSE, as well as practical programming experience for solving assignments. Students should have taken at least "Introduction to Machine Learning" or an equivalent course offered by another institution.
252-0417-00L | Randomized Algorithms and Probabilistic Methods | W | 8 credits | 3V + 2U + 2A | A. Steger
Abstract: Las Vegas and Monte Carlo algorithms; inequalities of Markov, Chebyshev and Chernoff; negative correlation; Markov chains: convergence, rapid mixing; generating functions. Examples include: min cut, median, balls and bins, routing in hypercubes, 3SAT, card shuffling, random walks.
Objective: After this course, students will know fundamental techniques from probabilistic combinatorics for designing randomized algorithms and will be able to apply them to solve typical problems in these areas.
Content: Randomized algorithms are algorithms that "flip coins" to take certain decisions. This concept extends the classical model of deterministic algorithms and has become very popular and useful within the last twenty years. In many cases, randomized algorithms are faster, simpler or just more elegant than deterministic ones. In the course, we will discuss basic principles and techniques and derive from them a number of randomized methods for problems in different areas.
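As a small illustration of the "median" example mentioned in the abstract, a minimal Python sketch (illustrative only, not course material) of randomized selection (quickselect), which finds the k-th smallest element in expected linear time:

```python
import random

def quickselect(arr, k):
    """Return the k-th smallest element (0-indexed) of arr.
    Randomized pivoting gives expected O(n) running time."""
    arr = list(arr)
    while True:
        pivot = random.choice(arr)             # the "coin flip"
        lo = [x for x in arr if x < pivot]
        eq_count = arr.count(pivot)
        if k < len(lo):
            arr = lo                           # answer is below the pivot
        elif k < len(lo) + eq_count:
            return pivot                       # pivot is the answer
        else:
            k -= len(lo) + eq_count
            arr = [x for x in arr if x > pivot]

def median(arr):
    """Lower median via quickselect."""
    return quickselect(arr, (len(arr) - 1) // 2)

print(median([7, 1, 5, 3, 9]))  # 5
```

The result is deterministic; only the running time depends on the random pivot choices, and a random pivot avoids the adversarial inputs that make a fixed pivot quadratic.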
Lecture notes: Yes.
Literature:
- R. Motwani and P. Raghavan, Randomized Algorithms, Cambridge University Press, 1995.
- M. Mitzenmacher and E. Upfal, Probability and Computing, Cambridge University Press, 2005.
263-4500-00L | Advanced Algorithms | W | 6 credits | 2V + 2U + 1A | M. Ghaffari, A. Krause
Abstract: This is an advanced course on the design and analysis of algorithms, covering a range of topics and techniques not studied in typical introductory courses on algorithms.
Objective: This course is intended to familiarize students with (some of) the main tools and techniques developed over the last 15-20 years in algorithm design, which are by now among the key ingredients used in developing efficient algorithms.
Content: The lectures will cover a range of topics, including the following: graph sparsification while preserving cuts or distances, various approximation algorithm techniques and concepts, metric embeddings and probabilistic tree embeddings, online algorithms, multiplicative weight updates, streaming algorithms, sketching algorithms, and a brief glance at MapReduce algorithms.
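As a small illustration of the multiplicative weight updates listed above, a minimal Python sketch (illustrative only, not course material) of a weighted-majority forecaster that multiplies the weight of each erring expert by (1 - eta):

```python
def multiplicative_weights(expert_preds, outcomes, eta=0.5):
    """Weighted-majority with multiplicative weight updates.
    expert_preds[t][i] is expert i's 0/1 prediction at step t,
    outcomes[t] the true bit. Each expert that errs has its weight
    multiplied by (1 - eta). Returns (mistakes of the weighted
    vote, final weights)."""
    n = len(expert_preds[0])
    w = [1.0] * n
    mistakes = 0
    for preds, truth in zip(expert_preds, outcomes):
        vote_1 = sum(wi for wi, p in zip(w, preds) if p == 1)
        guess = 1 if vote_1 >= sum(w) / 2 else 0
        mistakes += (guess != truth)
        w = [wi * (1 - eta) if p != truth else wi
             for wi, p in zip(w, preds)]
    return mistakes, w

# Three experts; expert 0 is always right and ends with the largest weight.
preds = [[1, 0, 0], [0, 0, 1], [1, 1, 0], [1, 0, 1]]
truth = [1, 0, 1, 1]
m, w = multiplicative_weights(preds, truth)
```

The standard guarantee is that the weighted majority makes at most O(log n / eta + (1 + eta) * m*) mistakes, where m* is the best expert's mistake count, so the forecaster tracks the best expert without knowing it in advance.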
Prerequisites / Notice: This course is designed for Master's and doctoral students, and it especially targets those interested in theoretical computer science, but it should also be accessible to last-year Bachelor students.

Sufficient comfort with both (A) algorithm design and analysis and (B) probability and concentration inequalities is expected. E.g., having passed the course Algorithms, Probability, and Computing (APC) is highly recommended, though not formally required. If you are not sure whether you are ready for this class, please consult the instructor.
327-2132-00L | Multifunctional Ferroic Materials: Growth, Characterisation, Simulation | W | 2 credits | 2G | M. Trassin, M. Fiebig
Abstract: The course will explore the growth of (multi-)ferroic oxide thin films. The structural characterization and the investigation of the ferroic state by force microscopy and by laser-optical techniques will be addressed. Oxide electronics device concepts will be discussed.
Objective: Oxide films with a thickness of just a few atoms can now be grown with a precision matching that of semiconductors. This opens up a whole world of functional device concepts and fascinating phenomena that would not occur in the expanded bulk crystal. Particularly interesting phenomena occur in films showing magnetic or electric order or, even better, both of these ("multiferroics").

In this course students will obtain an overarching view of the design of epitaxial oxide thin films and heterostructures, reaching from their growth by pulsed laser deposition to an understanding of their magnetoelectric functionality through advanced characterization techniques. Students will therefore understand how to fabricate and characterize highly oriented films with magnetic and electric properties not found in nature.
Content: Types of ferroic order, multiferroics, oxide materials; thin-film growth by pulsed laser deposition, molecular beam epitaxy and RF sputtering; structural characterization (reciprocal-space basics, XRD for thin films, RHEED); epitaxial strain-related effects; scanning probe microscopy techniques; laser-optical characterization; oxide thin-film based devices and examples.
401-3054-14L | Probabilistic Methods in Combinatorics | W | 6 credits | 2V + 1U | B. Sudakov
Abstract: This course provides a gentle introduction to the probabilistic method, with an emphasis on methodology. We will try to illustrate the main ideas by showing the application of probabilistic reasoning to various combinatorial problems.
Objective:
Content: The topics covered in the class will include (but are not limited to): linearity of expectation, the second moment method, the local lemma, correlation inequalities, martingales, large deviation inequalities, Janson and Talagrand inequalities, and pseudo-randomness.
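As a small illustration of linearity of expectation, a minimal Python sketch (illustrative only, not course material): the expected number of fixed points of a uniform random permutation is exactly 1 for every n, because each of the n indicator variables has expectation 1/n, and a seeded simulation confirms it:

```python
import random
from fractions import Fraction

def expected_fixed_points(n):
    """Linearity of expectation: with X_i the indicator that pi(i) = i,
    E[X_i] = 1/n, so E[X_1 + ... + X_n] = n * (1/n) = 1 for every n."""
    return sum(Fraction(1, n) for _ in range(n))  # exactly 1

def simulate(n, trials, seed=0):
    """Empirical average number of fixed points of a uniform permutation."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        p = list(range(n))
        rng.shuffle(p)
        total += sum(1 for i, x in enumerate(p) if i == x)
    return total / trials
```

Note that the indicators are not independent, yet linearity of expectation needs no independence, which is precisely what makes it such a versatile first tool.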
Literature:
- N. Alon and J. H. Spencer, The Probabilistic Method, 3rd edition, Wiley, 2008.
- B. Bollobás, Random Graphs, 2nd edition, Cambridge University Press, 2001.
- S. Janson, T. Luczak and A. Rucinski, Random Graphs, Wiley, 2000.
- M. Molloy and B. Reed, Graph Coloring and the Probabilistic Method, Springer, 2002.
Course Catalogue of ETH Zurich