* Introduction: Motivation, applications, outline.
* Bayesian estimation: General Bayesian estimator, MMSE estimator, MAP estimator, ML estimator, application example.
* Bayesian classification: General Bayesian classifier, MAP classifier, ML classifier, application examples.
* Exponential family: Definition and expressions, log-partition function, sufficient statistic, ML estimator, posterior distribution, conjugate prior, MMSE and MAP estimators, examples.
* Bayesian networks: Definition, basic examples, conditional independence, d-separation property, Markov blanket and boundary.
* Variational Bayesian inference: Laplace approximation, evidence lower bound, mean-field approximation, CAVI algorithm, exponential family model, model with global and local parameters, application example: word-topic modeling using latent Dirichlet allocation, stochastic variational inference, expectation propagation.
* Latent variable methods: EM algorithm, MAP-EM algorithm, variational EM algorithm, auto-encoding variational Bayes method, variational autoencoder.
The professor (Hlawatsch) presents the class material verbally, discusses it with the students, and answers their questions. For this he uses a blackboard, on which he writes text and draws simple figures with chalk (using different colors where helpful), erasing the board with a cloth every now and then. He also uses an overhead projector to project more complicated figures and tables onto a screen. The presentation is supported by lecture notes.
This course is an optional part of the "Wahlmodul Advanced Signal Processing."
First class: Thursday, 6 October 2022, 3.15 pm in seminar room 389 (room CG 01 18). The course will be held in person.
Lecture notes can be downloaded -- see the links further below.
Recommended textbook: Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006. Free download:
https://www.microsoft.com/en-us/research/people/cmbishop/prml-book/