Modèles génératifs pour l’image
B. GALERNE, V. DE BORTOLI
Image processing · Modelling

Prerequisites

The fundamentals of a first-year master's (M1) in applied mathematics: probability, statistics, and optimization.

Course objective

The goal of this course is to give an overview of existing methods for generative modelling, with applications to images. The first part of the course focuses on texture synthesis using several methods (Gaussian random fields, optimization with pre-trained CNNs, maximum entropy models…) that do not require training a neural network and can work from a single example image. For more structured images, the second part of the course turns to modern image models: we introduce Variational AutoEncoders (VAEs), Generative Adversarial Networks (GANs), and Normalizing Flows (NFs). The rest of the course is dedicated to the study of a more recent contender in generative modelling, Score-Based Generative Modelling (SGM). We study both the mathematical aspects of these algorithms (time reversal of stochastic processes, links with regularized optimal transport, the Sinkhorn algorithm) and their practical aspects.
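To give a flavor of the Gaussian-field approach to texture synthesis from a single image, the random-phase idea keeps the Fourier modulus of an exemplar and randomizes its phase. The sketch below is a simplified illustration of this principle, not the exact algorithm of the Random Phase Textures paper; the function name and the use of the real part are our own simplifications:

```python
import numpy as np

def random_phase_texture(image, rng=None):
    """Illustrative random-phase synthesis: keep the Fourier modulus
    of a (zero-meaned) exemplar image and draw a fresh uniform phase.
    Simplified sketch; the published algorithm enforces a symmetric
    phase so the inverse transform is exactly real-valued."""
    rng = np.random.default_rng() if rng is None else rng
    mean = image.mean()
    spectrum = np.fft.fft2(image - mean)
    # Uniform random phase in [0, 2*pi); modulus is preserved.
    phase = rng.uniform(0.0, 2.0 * np.pi, size=image.shape)
    synth = np.fft.ifft2(np.abs(spectrum) * np.exp(1j * phase)).real
    return synth + mean  # restore the original mean gray level
```

Because the DC coefficient of the zero-meaned image is (numerically) zero, the output keeps the mean gray level of the input while its spatial arrangement is resampled.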


Classes will be taught in French, but the slides will be written in English.

Course organization

9 sessions of 3 hours each

Assessment

1 mandatory homework assignment at mid-course
1 final project based on the study of a research article

References

Random Phase Textures: Theory and Synthesis, (Bruno Galerne, Yann Gousseau, Jean-Michel Morel)
Texture Synthesis Using Convolutional Neural Networks (Leon A. Gatys, Alexander S. Ecker, Matthias Bethge)
Maximum entropy methods for texture synthesis: Theory and practice (Valentin De Bortoli, Agnès Desolneux, Alain Durmus, Bruno Galerne, Arthur Leclaire)
Generative Adversarial Networks (Ian J. Goodfellow, Jean Pouget-Abadie, Mehdi Mirza, Bing Xu, David Warde-Farley, Sherjil Ozair, Aaron Courville, Yoshua Bengio)
Wasserstein GAN (Martin Arjovsky, Soumith Chintala, Léon Bottou)
Generative Modeling by Estimating Gradients of the Data Distribution (Yang Song, Stefano Ermon)
Score-Based Generative Modeling through Stochastic Differential Equations (Yang Song, Jascha Sohl-Dickstein, Diederik P. Kingma, Abhishek Kumar, Stefano Ermon, Ben Poole)
Denoising Diffusion Probabilistic Models (Jonathan Ho, Ajay Jain, Pieter Abbeel)
Diffusion Schrödinger Bridge with Applications to Score-Based Generative Modeling (Valentin De Bortoli, James Thornton, Jeremy Heng, Arnaud Doucet)


Topics covered

Gaussian random fields
Texture synthesis
Maximum entropy models
Generative Adversarial Networks
Score-based Generative Modelling
Regularized Optimal Transport
Stochastic Processes and their discretizations
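Among the topics above, regularized optimal transport is solved in practice with the Sinkhorn algorithm, which alternately rescales the rows and columns of a Gibbs kernel until both marginal constraints are met. The sketch below is a minimal illustration for discrete histograms; the function name and default parameters are our own choices:

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iter=200):
    """Sinkhorn iterations for entropy-regularized optimal transport.
    a, b: source/target histograms (nonnegative, summing to 1);
    C: cost matrix of shape (len(a), len(b)); eps: regularization strength.
    Returns the approximate optimal transport plan. Illustrative sketch:
    no convergence check or log-domain stabilization for small eps."""
    K = np.exp(-C / eps)                 # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)                # rescale columns to match b
        u = a / (K @ v)                  # rescale rows to match a
    return u[:, None] * K * v[None, :]   # transport plan
```

After convergence the row sums of the returned plan equal `a` and the column sums equal `b`; smaller `eps` gives a plan closer to unregularized optimal transport but needs more iterations (and, in practice, log-domain computations for numerical stability).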


Instructors

Bruno Galerne

(Institut Denis Poisson, Université d'Orléans)

Valentin De Bortoli

(CNRS et DI ENS)
