Amos Lapidoth: Catalogue data in Spring Semester 2021

Award: The Golden Owl
Name: Prof. Dr. Amos Lapidoth
Field: Information Theory
Address
Inst. f. Signal-u.Inf.verarbeitung
ETH Zürich, ETF E 107
Sternwartstrasse 7
8092 Zürich
SWITZERLAND
Telephone: +41 44 632 51 92
E-mail: lapidoth@isi.ee.ethz.ch
Department: Information Technology and Electrical Engineering
Relationship: Full Professor

Number | Title | ECTS | Hours | Lecturers
227-0104-00L | Communication and Detection Theory | 6 credits | 4G | A. Lapidoth
Abstract: This course teaches the foundations of modern digital communications and detection theory. Topics include the geometry of the space of energy-limited signals; the baseband representation of passband signals; spectral efficiency and the Nyquist Criterion; the power and power spectral density of PAM and QAM; hypothesis testing; Gaussian stochastic processes; and detection in white Gaussian noise.
Objective: This is an introductory class to the field of wired and wireless communication. It offers a glimpse of classical analog modulation (AM, FM), but mainly focuses on aspects of modern digital communication, including modulation schemes, spectral efficiency, power budget analysis, block and convolutional codes, receiver design, and multi-accessing schemes such as TDMA, FDMA and Spread Spectrum.
Content:
- Baseband representation of passband signals.
- Bandwidth and inner products in baseband and passband.
- The geometry of the space of energy-limited signals.
- The Sampling Theorem as an orthonormal expansion.
- Sampling passband signals.
- Pulse Amplitude Modulation (PAM): energy, power, and power spectral density.
- Nyquist Pulses.
- Quadrature Amplitude Modulation (QAM).
- Hypothesis testing.
- The Bhattacharyya Bound (see the illustration after this list).
- The multivariate Gaussian distribution.
- Gaussian stochastic processes.
- Detection in white Gaussian noise.
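As a brief illustration of the Bhattacharyya Bound listed above (the standard statement, not material taken from the course itself): for a binary hypothesis test between densities p_0 and p_1 with prior probabilities \pi_0 and \pi_1, the optimal (MAP) decision rule has error probability
\[
  P(\text{error}) = \int \min\{\pi_0 p_0(y),\, \pi_1 p_1(y)\}\,\mathrm{d}y
  \;\le\; \sqrt{\pi_0 \pi_1} \int \sqrt{p_0(y)\,p_1(y)}\,\mathrm{d}y,
\]
where the inequality follows from \min\{a,b\} \le \sqrt{ab} for nonnegative a and b.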
Lecture notes: n/a
Literature: A. Lapidoth, A Foundation in Digital Communication, Cambridge University Press, 2nd edition (2017).
227-0420-00L | Information Theory II | 6 credits | 4G | A. Lapidoth, S. M. Moser
Abstract: This course builds on Information Theory I. It introduces additional topics in single-user communication, connections between Information Theory and Statistics, and Network Information Theory.
Objective: The course's objective is to introduce the students to additional information measures and to equip them with the tools needed to conduct research in Information Theory as it relates to Communication Networks and to Statistics.
Content: Sanov's Theorem, Rényi entropy and guessing, differential entropy, maximum entropy, the Gaussian channel, the entropy-power inequality, the broadcast channel, the multiple-access channel, Slepian-Wolf coding, the Gelfand-Pinsker problem, and Fisher information.
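As a brief illustration of the entropy-power inequality listed above (the standard statement, as given in the Cover and Thomas reference below, not material specific to this course): for independent random vectors X and Y in R^n with densities and finite differential entropies,
\[
  e^{2h(X+Y)/n} \;\ge\; e^{2h(X)/n} + e^{2h(Y)/n},
\]
with equality if and only if X and Y are Gaussian with proportional covariance matrices.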
Lecture notes: n/a
Literature: T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd edition, Wiley, 2006.
Prerequisites / Notice: Basic introductory course on Information Theory.
401-5680-00L | Foundations of Data Science Seminar | 0 credits | P. L. Bühlmann, A. Bandeira, H. Bölcskei, J. M. Buhmann, T. Hofmann, A. Krause, A. Lapidoth, H.-A. Loeliger, M. H. Maathuis, N. Meinshausen, G. Rätsch, S. van de Geer, F. Yang
Abstract: Research colloquium
Objective