401-3905-68L  Convex Optimization in Machine Learning and Computational Finance

Semester: Autumn Semester 2018
Lecturers: P. Cheridito, M. Baes
Periodicity: non-recurring course
Language of instruction: English


Abstract
Objective
Content
Part 1: Convex Analysis
Lecture 1: General introduction, convex sets and functions
Lecture 2: Semidefinite cone, separation theorems (Application to the Fundamental Theorem of Asset Pricing)
Lecture 3: Analytic properties of convex functions, duality (Application to Support Vector Machines)
Lecture 4: Lagrangian duality, conjugate functions, support functions
Lecture 5: Subgradients and subgradient calculus (Application to Automatic Differentiation and Lexicographic Differentiation)
Lecture 6: Karush-Kuhn-Tucker Conditions (Application to Markowitz portfolio optimization)
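As an illustration of the KKT conditions from Lecture 6, the minimum-variance Markowitz portfolio under the full-investment constraint admits a closed form: stationarity of the Lagrangian of min_w (1/2) w'Σw subject to 1'w = 1 gives Σw = λ1, hence w = Σ⁻¹1 / (1'Σ⁻¹1). A minimal sketch (the covariance matrix below is an arbitrary example, not course data):

```python
import numpy as np

def min_variance_weights(sigma):
    """Minimum-variance portfolio weights for covariance matrix sigma.

    Derived from the KKT conditions of
        min_w (1/2) w' sigma w   s.t.  1'w = 1,
    whose stationarity condition sigma w = lambda * 1 yields
        w = sigma^{-1} 1 / (1' sigma^{-1} 1).
    """
    ones = np.ones(sigma.shape[0])
    x = np.linalg.solve(sigma, ones)  # sigma^{-1} 1 without forming the inverse
    return x / (ones @ x)

# Example: two assets with variances 0.04 and 0.09, covariance 0.01.
sigma = np.array([[0.04, 0.01],
                  [0.01, 0.09]])
w = min_variance_weights(sigma)
```

At the optimum the portfolio variance equals the Lagrange multiplier λ = 1/(1'Σ⁻¹1), which is easy to verify numerically against any other feasible portfolio.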
Part 2: Applications
Lecture 7: Approximation, Lasso optimization, Covariance matrix estimation (Application: a politically optimal splitting of Switzerland)
Lecture 8: Clustering and MaxCut problems, Optimal coalitions and Shapley Value
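The Shapley value from Lecture 8 can be computed exactly for small games by averaging each player's marginal contribution over all n! join orders. A brute-force sketch (the three-player majority game below is a standard textbook example, not taken from the course):

```python
import itertools
import math

def shapley(players, v):
    """Exact Shapley values of the cooperative game (players, v).

    v maps a frozenset of players to a real payoff.  Each player's value
    is its marginal contribution averaged over all join orders.
    """
    n = len(players)
    phi = {p: 0.0 for p in players}
    for order in itertools.permutations(players):
        coalition = set()
        for p in order:
            before = v(frozenset(coalition))
            coalition.add(p)
            phi[p] += v(frozenset(coalition)) - before
    return {p: val / math.factorial(n) for p, val in phi.items()}

# Three-player majority game: a coalition wins (payoff 1) iff it has
# at least two members; by symmetry each player should get 1/3.
players = ["a", "b", "c"]
v = lambda S: 1.0 if len(S) >= 2 else 0.0
phi = shapley(players, v)
```

The n! enumeration is only viable for small n; larger games are typically handled by sampling permutations.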
Part 3: Algorithms
Lecture 9: Intractability of Optimization, Gradient Method for convex optimization, Stochastic Gradient Method (Application to Neural Networks)
Lecture 10: Fundamental flaws of Gradient Methods, Mirror Descent Method (Application to Multiplicative Weight Method and Adaboost)
Lecture 11: Accelerated Gradient Method, Smoothing Technique (Application to large-scale Lasso optimization)
Lecture 12: Newton Method and its fundamental drawbacks, Self-Concordant Functions
Lecture 13: Interior-Point Methods
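The large-scale Lasso problems of Lectures 7 and 11 are often solved by proximal gradient methods, where the smooth least-squares term is handled by a gradient step and the l1 term by soft-thresholding. A minimal ISTA sketch (a standard proximal-gradient scheme, shown here for illustration; the course's own accelerated/smoothed variants are not reproduced):

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (componentwise shrinkage)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(A, b, lam, steps=500):
    """ISTA for  min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    Alternates a gradient step on the smooth term with the
    soft-thresholding prox of the l1 term, using step size 1/L where
    L = ||A||_2^2 is the Lipschitz constant of the gradient.
    """
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Sanity check: with A = I the Lasso solution is soft_threshold(b, lam).
x_hat = ista(np.eye(3), np.array([3.0, 0.5, -2.0]), lam=1.0)
```

Accelerating this scheme with a Nesterov momentum term (FISTA) improves the convergence rate from O(1/k) to O(1/k²), which connects to the accelerated methods of Lecture 11.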