Random monotone operators and applications to stochastic optimization
Axis : DataSense, task 4
Subject : Random monotone operators and applications to stochastic optimization
Directors : Walid HACHEM, Université Paris-Est Marne-la-Vallée, Pascal BIANCHI, Telecom ParisTech, Jérémie JAKUBOWICZ, Telecom SudParis
Institutions : LTCI, SAMOVAR
Administrator laboratory : LTCI
PhD Student : Adil SALIM
Beginning : autumn 2015
Thesis defence : October or November 2018
Scientific production :
The general objective of the thesis is to study the behaviour of optimization algorithms based on the proximal operator in a noisy framework and, more generally, the asymptotic behaviour of evolution equations generated by random monotone operators. The aim is to build methodological tools that bridge the theory of stochastic approximation and the theory of monotone operators. This theoretical connection opens the way to new stochastic algorithms, such as primal-dual algorithms, with high application potential in statistical learning and signal processing.
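To make the object of study concrete, here is a minimal sketch (not the thesis algorithm itself) of a proximal iteration driven by noise: at each step a random cost f(·, ξ_k) is drawn and a proximal step x_{k+1} = prox_{γ_k f(·, ξ_k)}(x_k) is taken. The toy cost f(x, ξ) = (x − ξ)²/2, the Gaussian model for ξ_k, and the step-size choice are all our illustrative assumptions; the mean E[ξ] plays the role of the zero of the averaged operator.

```python
import numpy as np

# Illustrative toy model (our assumption, not from the thesis):
# f(x, xi) = (x - xi)^2 / 2, whose proximal operator with step gamma
# has the closed form prox(x) = (x + gamma * xi) / (1 + gamma).
rng = np.random.default_rng(0)
mean = 3.0          # xi_k ~ N(mean, 1); minimizer of E f(., xi) is `mean`
x = 0.0
for k in range(1, 20001):
    xi = rng.normal(mean, 1.0)
    gamma = 1.0 / k**0.75                  # decreasing steps, sum diverges
    x = (x + gamma * xi) / (1.0 + gamma)   # stochastic proximal point step

print(x)  # hovers near `mean` after many iterations
```

The iteration rewrites as x ← x − γ/(1+γ)·(x − ξ), i.e. a Robbins–Monro scheme, which is exactly the kind of bridge between stochastic approximation and monotone operator theory the thesis formalizes.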
The thesis is part of a research programme on random monotone operators recently initiated by members of the project team. Starting from the algorithm in which the sequence of observed random variables (ξ_k)_{k∈N} satisfies a given statistical model (i.i.d., martingale, or Markov chain), we set ourselves the following objectives :
- Establish the convergence of the sequence of iterates (x_k), or of their empirical means, towards the set of zeros of the Aumann integral of A, in the case where the steps γ_k are decreasing. More generally, show that the trajectories converge towards the solutions of the dynamical system ẋ(t) ∈ -A(x(t)).
- Consider the case where the steps γ are constant. In this context, the convergence proofs are of a different nature from the case where the steps tend towards zero.
- Use the results obtained to establish the convergence of algorithms more complex than the proximal point algorithm, such as ADMM, primal-dual algorithms, and coordinate descent.
- Demonstrate experimentally the good behaviour of the methods through numerical validations on massive data sets.
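The constant-step regime mentioned above can be sketched numerically. In the toy model f(x, ξ) = (x − ξ)²/2 with ξ ~ N(3, 1) (our illustrative choices, not the thesis setting), a fixed step γ makes the iterates oscillate in a neighbourhood of the zero of the mean operator, while their empirical average concentrates much closer to it:

```python
import numpy as np

# Constant-step stochastic proximal point iteration on the toy cost
# f(x, xi) = (x - xi)^2 / 2; all parameter choices are illustrative.
rng = np.random.default_rng(1)
mean, gamma = 3.0, 0.1        # fixed step: iterates do not converge pointwise
x, running_sum = 0.0, 0.0
n = 50000
for k in range(n):
    xi = rng.normal(mean, 1.0)
    x = (x + gamma * xi) / (1.0 + gamma)   # closed-form prox step
    running_sum += x

avg = running_sum / n          # empirical mean of the iterates
print(avg)                     # much tighter around `mean` than x itself
```

With constant γ the last iterate x keeps a residual variance of order γ, whereas the averaged iterate averages this noise out; this is the qualitative difference in the nature of the convergence proofs that the second objective refers to.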
By relying on strong expertise in stochastic approximation, the project will enable the development of efficient "solvers" better adapted to the massive data sets typically encountered in statistical learning problems.