Foundations of Distributed and Large Scale Computing Optimization
E. CHOUZENOUX
Learning Theory

Course objective

The objective of this course is to introduce the theoretical background needed to develop efficient algorithms for solving large-scale optimization problems, taking advantage of modern multicore or distributed computing architectures. The course mainly focuses on nonlinear optimization tools for convex problems. Proximal tools, splitting techniques, and Majorization-Minimization strategies, which are now very popular for processing massive datasets, will be presented and illustrated on various application examples.
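As a flavor of the proximal tools and splitting techniques mentioned above, the following sketch shows a plain proximal gradient (ISTA) iteration applied to an l1-regularized least-squares problem. The problem instance, step-size choice, and all function and variable names are illustrative assumptions, not material from the course.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximity operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, y, lam, n_iter=200):
    """Proximal gradient (ISTA) for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2               # 1/L, with L the Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)                          # gradient of the smooth data-fidelity term
        x = soft_threshold(x - step * grad, step * lam)   # proximal step on the l1 penalty
    return x

# Small synthetic usage example
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
y = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = ista(A, y, lam=0.1)
```

The splitting at work here is the separation of the objective into a smooth part, handled by a gradient step, and a nonsmooth part, handled through its proximity operator.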

 


Course organization

8 lectures

Assessment

Lab session (TP) reports + exam

References

Bauschke, H. H. and Combettes, P. L.: Convex Analysis and Monotone Operator Theory in Hilbert Spaces. Springer, New York, 2011.

Parikh, N. and Boyd, S.: Proximal Algorithms. Foundations and Trends in Optimization, vol. 1, no. 3, Jan. 2014.

Topics covered

* Large scale optimization

* Proximal algorithms

* Majorization-Minimization methods (see the sketch after this list)

* Parallel optimization

* Distributed optimization strategies
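To give a concrete, minimal illustration of the Majorization-Minimization topic listed above, the sketch below applies an MM scheme to a robust regression cost: at each iteration the cost is replaced by a quadratic majorant tangent at the current iterate, which is then minimized exactly by weighted least squares. The cost function, the smoothing parameter eps, and the function names are assumptions made for this example only.

```python
import numpy as np

def mm_robust_regression(A, y, eps=1e-3, n_iter=50):
    """Majorization-Minimization for min_x sum_i sqrt((a_i^T x - y_i)^2 + eps)."""
    x = np.linalg.lstsq(A, y, rcond=None)[0]       # least-squares initialization
    for _ in range(n_iter):
        r = A @ x - y                              # current residuals
        w = 1.0 / np.sqrt(r ** 2 + eps)            # weights defining the quadratic majorant
        AtWA = A.T @ (w[:, None] * A)
        x = np.linalg.solve(AtWA, A.T @ (w * y))   # exact minimizer of the majorant
    return x

# Small synthetic usage example
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 10))
y = A @ np.ones(10) + rng.standard_normal(200)
x_hat = mm_robust_regression(A, y)
```

Each update decreases the original cost because the majorant touches it at the current iterate and lies above it everywhere else; this monotone decrease is the defining property of MM schemes.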

Instructors

Emilie CHOUZENOUX
