Prerequisites
Basic algebra and statistics. Python / Matlab programming.
Course objective
This course deals with multivariate data analysis, which plays an increasing role in imaging. Multivariate data are of utmost importance in a wide range of scientific fields: in astrophysics in particular, the fast development of new imaging systems and multi-spectral/multi-temporal sensors makes it possible to tackle key problems. Among them, one can cite: the estimation of the cosmic microwave background (Planck mission); the study of the most powerful phenomena in the universe (upcoming Athena telescope); gravitational wave detection (upcoming LISA mission); the study of the composition of supernova remnants (Chandra satellite); exoplanet detection (upcoming Ariel observatory)… However, such multivariate data require the development of adequate mathematical tools. Interestingly, all the previous examples are related through similar mathematical problems, in which matrix factorization, and more precisely Blind Source Separation (BSS), plays a key role.
This course thus aims to present Blind Source Separation methods for multivariate data analysis in astrophysics. We will start by introducing classical models and methods, and continue with the most recent approaches based on deep learning. The class will in particular highlight the algorithmic aspects, which are fundamental for solving the non-convex problems at hand. It will further build on applications to real astrophysical data, with an emphasis on implementation details, model limitations and methods for large-scale data analysis.
Course organization
The 24 hours of the course aim to be interactive, mixing practical sessions and lectures.
Assessment
– Practical sessions;
– Short project, due at the end of the semester.
References
– Models and theory: [1, 2, 3]
– Methods: [5, 4, 6]
– Optimization and algorithms: [7, 8, 9]
– Applications: Planck mission, Chandra satellite and others
[1] Gillis, N. (2020). Nonnegative Matrix Factorization. Society for Industrial and Applied Mathematics.
[2] Kervazo, C., Gillis, N., Dobigeon, N. (2020). Provably robust blind source separation of linear-quadratic near-separable mixtures. arXiv preprint arXiv:2011.11966.
[3] Rapin, J., Bobin, J., Larue, A., Starck, J.-L. (2013). Sparse and non-negative BSS for noisy data. IEEE Transactions on Signal Processing, 61(22), 5620-5632.
[4] Bobin, J., Starck, J.-L., Fadili, J. M., Moudden, Y., Donoho, D. L. (2007). Morphological component analysis: An adaptive thresholding strategy. IEEE Transactions on Image Processing, 16(11), 2675-2681.
[5] Brakel, P., Bengio, Y. (2017). Learning independent features with adversarial nets for non-linear ICA. arXiv preprint arXiv:1710.05050.
[6] Parikh, N., Boyd, S. (2014). Proximal algorithms. Foundations and Trends in Optimization, 1(3), 127-239.
[7] Bolte, J., Sabach, S., Teboulle, M. (2014). Proximal alternating linearized minimization for nonconvex and nonsmooth problems. Mathematical Programming, 146(1), 459-494.
[8] Monga, V., Li, Y., Eldar, Y. C. (2021). Algorithm unrolling: Interpretable, efficient deep learning for signal and image processing. IEEE Signal Processing Magazine, 38(2), 18-44.
[9] Dobigeon, N., Tourneret, J.-Y., Richard, C., Bermudez, J. C. M., McLaughlin, S., Hero, A. O. (2013). Nonlinear unmixing of hyperspectral images: Models and algorithms. IEEE Signal Processing Magazine, 31(1), 82-94.
Course outline
1. Statistical approaches for unsupervised matrix factorization and BSS:
– Classical statistical methods: PCA and ICA, their theoretical foundations, algorithms and practical implementation; non-linear ICA through deep learning and adversarial learning.
– Regularized inverse problems for BSS: from sparsity-based models to learned regularizations (representation learning, manifold learning); hybrid variational/learning methods, plug-and-play approaches.
– Beyond classical statistical models for data analysis in astrophysics: non-stationary models, highly complex noise.
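As a small taste of this first block, the following NumPy sketch (toy signals, not course material) simulates the linear mixing model X = A S and applies PCA via an SVD; it illustrates why PCA identifies the source subspace but, on its own, recovers the sources only up to a rotation, which is where ICA comes in.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear mixing model X = A S: two independent sources, three observed channels.
t = np.linspace(0, 1, 1000)
S = np.vstack([np.sin(2 * np.pi * 5 * t),            # sinusoidal source
               np.sign(np.sin(2 * np.pi * 3 * t))])  # square-wave source
A = rng.normal(size=(3, 2))                          # unknown mixing matrix
X = A @ S

# PCA via SVD of the centered data: principal directions = left singular vectors.
Xc = X - X.mean(axis=1, keepdims=True)
U, sv, Vt = np.linalg.svd(Xc, full_matrices=False)

# With 2 sources mixed into 3 channels the third singular value is numerically
# zero: the observations live on a 2-D subspace, which PCA recovers exactly.
print(sv)
```

PCA here only whitens the data; fixing the remaining rotation requires an additional statistical assumption, such as the independence exploited by ICA.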
2. Algorithmic framework for unsupervised matrix factorization:
– From gradient descent to proximal methods
– Proximal methods for multi-convex problems
– Learning to optimize for matrix factorization: algorithm unfolding/unrolling
3. Nonnegative matrix factorization (NMF):
– NMF models, identifiability and theoretical guarantees
– Algorithms for NMF
– From linear to non-linear models.
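As an illustration of this third block, here is a minimal NumPy sketch (toy data; `nmf_mu` is an illustrative name) of the classical Lee-Seung multiplicative updates for Frobenius-norm NMF, one of the algorithms covered in [1].

```python
import numpy as np

def nmf_mu(X, r, n_iter=500, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates for min ||X - W H||_F^2 with W, H >= 0."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(n_iter):
        # Entrywise updates: ratios of nonnegative terms keep W and H nonnegative.
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Synthetic nonnegative data with exact rank-3 structure.
rng = np.random.default_rng(1)
W0 = rng.random((30, 3))
H0 = rng.random((3, 40))
X = W0 @ H0
W, H = nmf_mu(X, r=3)
rel_err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
print(rel_err)  # small relative reconstruction error
```

Note that even a near-perfect reconstruction does not mean (W, H) equals (W0, H0): NMF is identifiable only up to scaling and permutation, and in general only under extra conditions (e.g. separability), which is precisely the identifiability question studied in the course.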