Computational Statistics
LearningTrack Santé

Course objective

This course will detail computational statistical methods and Bayesian inference.

The course will start with the stochastic gradient algorithm, its theoretical properties, and its numerical bottlenecks. After a brief introduction to Bayesian inference, we will discuss the numerical challenges that arise when computing Bayesian estimators. This will lead us to the EM algorithm and its stochastic versions, and more generally to Majorize-Minimize (MM) algorithms.
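To fix ideas, the stochastic gradient algorithm updates the parameter using an unbiased gradient estimate computed from a single randomly drawn sample. A minimal sketch on an assumed least-squares problem (the model, data sizes, and step-size schedule below are illustrative choices, not part of the course material):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (assumed): linear regression, minimizing the expected
# squared error F(theta) = E[(x^T theta - y)^2] by stochastic gradient descent.
n, d = 1000, 5
X = rng.normal(size=(n, d))
theta_true = rng.normal(size=d)
y = X @ theta_true + 0.1 * rng.normal(size=n)

theta = np.zeros(d)
for k in range(1, 20001):
    i = rng.integers(n)                      # draw one sample uniformly at random
    grad = 2 * (X[i] @ theta - y[i]) * X[i]  # unbiased estimate of the full gradient
    gamma = 1.0 / (100 + k)                  # decreasing step sizes (Robbins-Monro:
    theta -= gamma * grad                    # sum gamma_k = inf, sum gamma_k^2 < inf)

print(np.linalg.norm(theta - theta_true))   # small: iterates approach the minimizer
```

The decreasing step-size schedule is exactly the condition studied later through stochastic approximation theory.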

Stochastic approximation theory will then be detailed to better understand the convergence conditions of the previous algorithms.

The second part of the course will be devoted to random variable simulation methods. Starting from the rejection method, we will then present some classical Markov Chain Monte Carlo (MCMC) samplers (the Metropolis-Hastings and Gibbs algorithms). We will then discuss recent advances in MCMC: adaptive MCMC, adaptive parallel tempering, and Approximate Bayesian Computation (ABC) methods.
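As a concrete preview, the random-walk Metropolis-Hastings sampler only needs the target density up to a normalizing constant. A minimal sketch, assuming a standard normal target (the target, step size, and chain length are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    # Assumed target: standard normal density, known only up to a constant.
    return -0.5 * x**2

def metropolis_hastings(n_iter, step=1.0, x0=0.0):
    """Random-walk Metropolis-Hastings: propose x' ~ N(x, step^2),
    accept with probability min(1, pi(x') / pi(x))."""
    x = x0
    chain = np.empty(n_iter)
    for k in range(n_iter):
        proposal = x + step * rng.normal()
        # Symmetric proposal: the acceptance ratio reduces to pi(x')/pi(x).
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal          # accept the move
        chain[k] = x              # on rejection, the current state is repeated
    return chain

chain = metropolis_hastings(50_000)
print(chain.mean(), chain.std())  # close to the target's mean 0 and std 1
```

Replacing `log_target` with any unnormalized log-density (e.g. a Bayesian posterior) leaves the algorithm unchanged, which is what makes it useful for computing Bayesian estimators.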

Presentation: here

Assessment

This course consists of 20 h of lectures and 20 h of training and lab work sessions.

Grading: 2/3 from homework following each training session (4 assignments); 1/3 from an oral presentation of one scientific paper, done in teams of 2, with no written report.

Master 2 MVA and Data Science

Instructors

Stéphanie Allassonière

Université Paris Cité

see the other first-semester courses