Introduction to Probabilistic Graphical Models and Deep Generative Models
P. LATOUCHE, P.A. MATTEI

Prerequisites

Course on Probability

Course objective

This course provides a unifying introduction to probabilistic modelling through the framework of graphical models, together with their associated learning and inference algorithms. It is one of the few historical courses at the core of the MVA program. Recent developments in deep latent variable models built on deep neural networks, such as variational autoencoders and generative adversarial networks, are now part of the course program.

 

Presentation: here

 

 

Course organization

  • 9 lectures of 3 hours each
  • All lectures and materials will be in English
  • All lectures will be given on Zoom and recorded. The in-person sessions at ENS Paris Saclay will be used to answer questions about the lectures and the project.

Assessment

Report + poster presentation on a research paper

 


Topics covered

  • Maximum likelihood
  • Linear regression
  • Logistic regression
  • K-means
  • EM
  • Gaussian mixtures
  • PPCA
  • Bayesian linear regression
  • Gaussian processes
  • EM revisited
  • Model selection
  • Directed graphical models: theory and examples
  • Undirected graphical models
  • Sum-product algorithm
  • HMM
  • Approximate inference I: variational techniques
  • Stochastic block models + VEM
  • Expectation propagation
  • Approximate inference II: Monte Carlo, MCMC
  • Approximate inference III: amortized variational inference
  • Deep latent variable models, variational auto-encoders
  • Deep generative models beyond VAEs
  • GANs, autoregressive models, normalizing flows

 

This course is important preparation for the "Generative Models for Images" course in the second semester.

 

Instructors

Pierre LATOUCHE

Pierre-Alexandre MATTEI
