Amos Lapidoth: Catalogue data in Spring Semester 2019

Award: The Golden Owl
Name: Prof. Dr. Amos Lapidoth
Field: Information Theory
Address
Inst. f. Signal-u.Inf.verarbeitung
ETH Zürich, ETF E 107
Sternwartstrasse 7
8092 Zürich
SWITZERLAND
Telephone: +41 44 632 51 92
E-mail: lapidoth@isi.ee.ethz.ch
Department: Information Technology and Electrical Engineering
Relationship: Full Professor

Number | Title | ECTS | Hours | Lecturers
227-0104-00L | Communication and Detection Theory | 6 credits | 4G | A. Lapidoth
Abstract: This course teaches the foundations of modern digital communications and detection theory. Topics include the geometry of the space of energy-limited signals; the baseband representation of passband signals, spectral efficiency and the Nyquist Criterion; the power and power spectral density of PAM and QAM; hypothesis testing; Gaussian stochastic processes; and detection in white Gaussian noise.
Objective: This is an introductory class to the field of wired and wireless communication. It offers a glimpse of classical analog modulation (AM, FM) but mainly focuses on aspects of modern digital communication, including modulation schemes, spectral efficiency, power budget analysis, block and convolutional codes, receiver design, and multiple-access schemes such as TDMA, FDMA, and spread spectrum.
Content:
- Baseband representation of passband signals.
- Bandwidth and inner products in baseband and passband.
- The geometry of the space of energy-limited signals.
- The Sampling Theorem as an orthonormal expansion.
- Sampling passband signals.
- Pulse Amplitude Modulation (PAM): energy, power, and power spectral density.
- Nyquist Pulses.
- Quadrature Amplitude Modulation (QAM).
- Hypothesis testing.
- The Bhattacharyya Bound.
- The multivariate Gaussian distribution.
- Gaussian stochastic processes.
- Detection in white Gaussian noise.
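(Illustrative sketch, not part of the official catalogue entry: for the last topic, consider binary antipodal signaling with energy E per symbol in additive white Gaussian noise of power spectral density N_0/2. The optimal matched-filter detector then achieves the error probability
\[ P_e \;=\; Q\!\left(\sqrt{\frac{2E}{N_0}}\right), \qquad Q(x) \;=\; \frac{1}{\sqrt{2\pi}}\int_x^{\infty} e^{-t^2/2}\,dt , \]
where E and N_0 are generic symbols introduced here only for illustration.)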
Lecture notes: n/a
Literature: A. Lapidoth, A Foundation in Digital Communication, 2nd edition, Cambridge University Press, 2017.
227-0420-00L | Information Theory II | 6 credits | 2V + 2U | A. Lapidoth, S. M. Moser
Abstract: This course builds on Information Theory I. It introduces additional topics in single-user communication, connections between Information Theory and Statistics, and Network Information Theory.
Objective: The course has two objectives: to introduce students to the key information-theoretic results that underlie the design of communication systems, and to equip them with the tools needed to conduct research in Information Theory.
Content: Differential entropy, maximum entropy, the Gaussian channel and water-filling, the entropy-power inequality, Sanov's Theorem, Fisher information, the broadcast channel, the multiple-access channel, Slepian-Wolf coding, and the Gelfand-Pinsker problem.
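(Illustrative sketch, not part of the official catalogue entry: the water-filling result, as treated for instance in the Cover and Thomas text cited below, states that for k parallel Gaussian channels with noise variances N_1, ..., N_k and total power P, the capacity-achieving power allocation is
\[ P_i = (\nu - N_i)^+ , \qquad \sum_{i=1}^{k} P_i = P , \qquad C = \sum_{i=1}^{k} \tfrac{1}{2}\log\!\left(1 + \frac{P_i}{N_i}\right), \]
where the water level \nu is chosen to meet the power constraint and (x)^+ = \max(x, 0).)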
Lecture notes: n/a
Literature: T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd edition, Wiley, 2006.
401-5680-00L | Foundations of Data Science Seminar | 0 credits | | P. L. Bühlmann, H. Bölcskei, J. M. Buhmann, T. Hofmann, A. Krause, A. Lapidoth, H.-A. Loeliger, M. H. Maathuis, N. Meinshausen, G. Rätsch, S. van de Geer
Abstract: Research colloquium
Objective