* Introduction: Motivation, applications, outline.
* Bayesian estimation: General Bayesian estimator, MMSE estimator, MAP estimator, ML estimator.
* Bayesian classification: General Bayesian classifier, MAP classifier, ML classifier.
* Exponential family: Definition and expressions, log-partition function, sufficient statistic, posterior distribution, conjugate prior, Bayesian estimators, examples.
* Bayesian networks: Definition, basic examples, conditional independence, d-separation property, Markov blanket.
* Elementary distributions and conjugate priors: Gaussian, gamma, inverse gamma, Wishart, inverse Wishart, Bernoulli, beta, multinomial, Dirichlet, Student's t, embedding in the exponential family, elementary inference problems.
* Sampling methods: Rejection sampling, importance sampling, MCMC methods, Metropolis-Hastings algorithm, cycles of MCMC kernels, Gibbs sampler.
* Variational Bayesian methods: Laplace approximation, evidence lower bound, mean-field approximation, CAVI algorithm, stochastic variational inference, variational EM algorithm, expectation-propagation algorithm.
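As a small taste of the sampling-methods topic above, here is a minimal random-walk Metropolis-Hastings sketch (illustrative only — the function name, step size, and target are my own choices, not course material):

```python
import math
import random

def metropolis_hastings(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings: samples from exp(log_target),
    where log_target is known only up to an additive constant."""
    rng = random.Random(seed)
    x, lp = x0, log_target(x0)
    samples = []
    for _ in range(n_samples):
        # Symmetric Gaussian proposal, so the Hastings correction cancels.
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_target(prop)
        # Accept with probability min(1, target(prop) / target(x)).
        if lp_prop - lp >= 0 or rng.random() < math.exp(lp_prop - lp):
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Example target: standard normal density, up to its normalizing constant.
log_target = lambda x: -0.5 * x * x
samples = metropolis_hastings(log_target, x0=0.0, n_samples=20000)
```

Because the proposal is symmetric, the acceptance ratio reduces to a ratio of target densities, so the normalizing constant never needs to be computed — exactly the situation in Bayesian posterior sampling.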
Optional topics (two to be chosen by the students):
* Inference in probabilistic networks: Factor graph, sum-product algorithm, max-sum algorithm, loopy belief propagation.
* Gaussian mixtures: Definition, ML methods, sampling methods, variational Bayesian methods, clustering.
* Gaussian process regression: Gaussian process model, regression, learning the hyperparameters, Gaussian process classification, relevance vector machine.
* Bayesian deep learning: Bayesian neural networks, learning the weights, variational Bayesian methods, dropout.
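Similarly, the ML fitting of Gaussian mixtures listed above can be sketched via EM for a two-component one-dimensional mixture (a toy illustration with my own naming and initialization choices, not the course's reference implementation):

```python
import math
import random

def em_gmm_1d(data, n_iter=50):
    """Fit a two-component 1-D Gaussian mixture by maximum likelihood (EM)."""
    means = [min(data), max(data)]  # crude but well-separated initialization
    variances = [1.0, 1.0]
    weights = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for x in data:
            dens = [weights[k]
                    * math.exp(-(x - means[k]) ** 2 / (2.0 * variances[k]))
                    / math.sqrt(2.0 * math.pi * variances[k])
                    for k in range(2)]
            total = dens[0] + dens[1]
            resp.append([d / total for d in dens])
        # M-step: re-estimate parameters from responsibility-weighted data.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            weights[k] = nk / len(data)
            means[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            variances[k] = max(
                sum(r[k] * (x - means[k]) ** 2 for r, x in zip(resp, data)) / nk,
                1e-6)  # guard against variance collapse
    return weights, means, variances

# Toy data: two well-separated clusters.
rng = random.Random(1)
data = ([rng.gauss(-3.0, 1.0) for _ in range(200)]
        + [rng.gauss(3.0, 1.0) for _ in range(200)])
weights, means, variances = em_gmm_1d(data)
```

The E-step here is exactly the MAP-style posterior computation from the Bayesian classification topic, and replacing the M-step point estimates with posterior updates leads to the sampling and variational treatments listed in the bullet.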