Publications

Preprints

On the irreducibility and convergence of a class of nonsmooth nonlinear state-space models on manifolds, Armand Gissler, Alain Durmus, Anne Auger, 2024.
Abstract:
In this paper, we analyze a large class of general nonlinear state-space models on a state-space $\mathsf X$, defined by the recursion $\phi_{k+1} = F(\phi_k,\alpha(\phi_k,U_{k+1}))$, $k \in\mathbb N$, where $F,\alpha$ are some functions and $\{U_{k+1}\}_{k\in\mathbb N}$ is a sequence of i.i.d. random variables. More precisely, we extend conditions under which this class of Markov chains is irreducible, aperiodic and satisfies important continuity properties, relaxing two key assumptions from prior works. First, the state-space $\mathsf X$ is assumed to be a smooth manifold instead of an open subset of a Euclidean space. Second, we only assume that $F$ is locally Lipschitz continuous. We demonstrate the significance of our results through their application to Markov chains underlying optimization algorithms. These schemes belong to the class of evolution strategies with covariance matrix adaptation and step-size adaptation.
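
As a toy illustration of the recursion $\phi_{k+1} = F(\phi_k,\alpha(\phi_k,U_{k+1}))$ studied in this paper, the sketch below iterates it on $\mathsf X = \mathbb R$ with placeholder choices of $F$ and $\alpha$ (an accept-if-improved map followed by an averaging step). These maps are illustrative assumptions only, not the ones analyzed in the paper.

```python
import random

# Toy instance of the recursion phi_{k+1} = F(phi_k, alpha(phi_k, U_{k+1})).
# F and alpha below are illustrative placeholders.

def alpha(phi, u):
    # selection-like map: move to the perturbed point only if it improves |phi|
    cand = phi + u
    return cand if abs(cand) < abs(phi) else phi

def F(phi, a):
    # contract towards the (possibly unchanged) candidate a
    return 0.5 * phi + 0.5 * a

def simulate(phi0, n_steps, seed=0):
    rng = random.Random(seed)
    phi = phi0
    for _ in range(n_steps):
        u = rng.gauss(0.0, 1.0)  # i.i.d. innovations U_{k+1}
        phi = F(phi, alpha(phi, u))
    return phi
```

With these choices $|\phi_k|$ is non-increasing along the chain, so the iterate drifts towards zero; the actual paper studies much more general maps on manifolds.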


Journal articles

Asymptotic estimations of a perturbed symmetric eigenproblem, Armand Gissler, Anne Auger, Nikolaus Hansen. Applied Mathematics Letters, Volume 150, 2024.
Abstract:
We study ill-conditioned positive definite matrices that are disturbed by the sum of rank-one matrices of a specific form. We provide estimates for the eigenvalues and eigenvectors. When the condition number of the initial matrix tends to infinity, we bound the values of the coordinates of the eigenvectors of the perturbed matrix. Equivalently, in the coordinate system where the initial matrix is diagonal, we bound the rate of convergence of coordinates that tend to zero.
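
The flavor of the result can already be seen on a 2x2 example: using the closed-form eigendecomposition of a symmetric matrix, one can check that when an ill-conditioned diagonal matrix $\mathrm{diag}(\kappa, 1)$ is perturbed by a rank-one term, the small coordinate of the top eigenvector shrinks as the condition number $\kappa$ grows. The specific matrices below are illustrative choices, not those of the paper.

```python
import math

def eig2x2_sym(a, b, d):
    """Eigenpairs of the symmetric matrix [[a, b], [b, d]], eigenvalues descending."""
    mean = 0.5 * (a + d)
    disc = math.hypot(0.5 * (a - d), b)
    lam1, lam2 = mean + disc, mean - disc
    if b != 0.0:
        v1 = (b, lam1 - a)  # solves (A - lam1 I) v1 = 0
    else:
        v1 = (1.0, 0.0) if a >= d else (0.0, 1.0)
    n = math.hypot(*v1)
    v1 = (v1[0] / n, v1[1] / n)
    v2 = (-v1[1], v1[0])  # orthogonal complement in 2D
    return (lam1, lam2), (v1, v2)

# diag(kappa, 1) perturbed by the rank-one matrix sigma * u u^T with u = (1, 1)/sqrt(2)
kappa, sigma = 1e6, 1.0
a, b, d = kappa + sigma / 2, sigma / 2, 1.0 + sigma / 2
lams, vecs = eig2x2_sym(a, b, d)
```

Here the second coordinate of the top eigenvector is of order $b/\kappa$, i.e. it tends to zero as the condition number tends to infinity, mirroring the kind of bound stated in the abstract.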

A note on the K-epigraph, Armand Gissler, Tim Hoheisel. Optimization, 72(9), 2251–2285, 2022.
Abstract:
We study the question as to when the closed convex hull of the graph of a K-convex map equals its K-epigraph. In particular, we shed light onto the smallest cone K such that a given map has convex and closed K-epigraph, respectively. We apply our findings to several examples in matrix space as well as to convex composite functions.

Scaling-invariant functions versus positively homogeneous functions, Cheikh Touré, Armand Gissler, Anne Auger, Nikolaus Hansen. Journal of Optimization Theory and Applications (JOTA), Volume 191, pages 363–383, 2021.
Abstract:
Scaling-invariant functions preserve the order of points when the points are scaled by the same positive scalar (usually with respect to a unique reference point). Composites of strictly monotonic functions with positively homogeneous functions are scaling-invariant with respect to zero. We prove in this paper that also the reverse is true for large classes of scaling-invariant functions. Specifically, we give necessary and sufficient conditions for scaling-invariant functions to be composites of a strictly monotonic function with a positively homogeneous function. We also study sublevel sets of scaling-invariant functions generalizing well-known properties of positively homogeneous functions.
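
A quick numerical check of the forward direction of this statement: composing a strictly increasing function (here atan, an illustrative choice) with a positively homogeneous function yields a function that is scaling-invariant with respect to zero, i.e. scaling both arguments by the same positive factor never flips their order.

```python
import math
import random

def p(x):
    # positively homogeneous of degree 1: p(rho * x) = rho * p(x) for rho > 0
    return abs(x[0]) + 2.0 * abs(x[1])

def f(x):
    # composite of a strictly increasing map with p -> scaling-invariant w.r.t. 0
    return math.atan(p(x))

def order_preserved_under_scaling(x, y, rho):
    # scaling-invariance: f(x) <= f(y)  iff  f(rho*x) <= f(rho*y)
    return (f(x) <= f(y)) == (
        f([rho * xi for xi in x]) <= f([rho * yi for yi in y])
    )

rng = random.Random(1)
checks = all(
    order_preserved_under_scaling(
        [rng.uniform(-5, 5), rng.uniform(-5, 5)],
        [rng.uniform(-5, 5), rng.uniform(-5, 5)],
        rng.uniform(0.1, 10.0),
    )
    for _ in range(1000)
)
```

The paper's contribution is the converse direction: conditions under which every scaling-invariant function must arise as such a composite.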


Conference proceedings

Evaluation of the impact of various modifications to CMA-ES that facilitate its theoretical analysis, Armand Gissler. Proceedings of the Companion Conference on Genetic and Evolutionary Computation (GECCO '23 Companion), Association for Computing Machinery, New York, NY, USA, 1603–1610, 2023.
Abstract:
In this paper we introduce modified versions of CMA-ES with the objective of helping to prove convergence of CMA-ES. To ensure that the modifications do not alter the performance of the algorithm too much, we benchmark the variants derived from them on problems of the bbob test suite. We observe that the main performance losses occur on ill-conditioned problems, which is probably due to the absence of cumulation in the adaptation of the covariance matrix. However, the versions of CMA-ES presented in this paper have overall similar performance to the original CMA-ES.

Learning rate adaptation by line search in evolution strategies with recombination, Armand Gissler, Anne Auger, Nikolaus Hansen. Proceedings of the Genetic and Evolutionary Computation Conference (GECCO '22), Association for Computing Machinery, New York, NY, USA, 630–638, 2022.
Abstract:
In this paper, we investigate the effect of a learning rate for the mean in Evolution Strategies with recombination. We study the effect of a half-line search after the mean shift direction is established, hence the learning rate value is conditioned to the direction. We prove convergence and study convergence rates in different dimensions and for different population sizes on the sphere function with the step-size proportional to the distance to the optimum. We empirically find that a perfect half-line search increases the maximal convergence rate on the sphere function by up to about 70%, assuming the line search imposes no additional costs. The speedup becomes less pronounced with increasing dimension. The line search reduces—however does not eliminate—the dependency of the convergence rate on the step-size. The optimal step-size assumes considerably smaller values with line search, which is consistent with previous results for different learning rate settings. The step-size difference is more pronounced in larger dimension and with larger population size, thereby diminishing an important advantage of a large population.
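
As a rough sketch of the setting described in the abstract (not the authors' implementation), the toy $(\mu/\mu,\lambda)$-ES below minimizes the sphere function with the step-size proportional to the distance to the optimum, and optionally rescales the mean shift by a crude search over a few candidate learning rates along the shift direction; all constants are illustrative assumptions.

```python
import math
import random

def sphere(x):
    return sum(xi * xi for xi in x)

def es_step(m, sigma, lam, mu, rng, line_search=False):
    """One iteration of a toy (mu/mu, lambda)-ES mean update."""
    n = len(m)
    offspring = []
    for _ in range(lam):
        z = [rng.gauss(0.0, 1.0) for _ in range(n)]
        offspring.append((sphere([mi + sigma * zi for mi, zi in zip(m, z)]),
                          [mi + sigma * zi for mi, zi in zip(m, z)]))
    offspring.sort(key=lambda t: t[0])
    # recombination: average of the mu best offspring
    new_m = [sum(x[i] for _, x in offspring[:mu]) / mu for i in range(n)]
    if line_search:
        # crude search over a few learning rates along the mean shift direction
        shift = [nm - mi for nm, mi in zip(new_m, m)]
        best = min([0.5, 1.0, 2.0, 4.0],
                   key=lambda e: sphere([mi + e * si for mi, si in zip(m, shift)]))
        new_m = [mi + best * si for mi, si in zip(m, shift)]
    return new_m

def run(n=5, iters=100, lam=10, mu=5, seed=2, line_search=False):
    rng = random.Random(seed)
    m = [1.0] * n
    for _ in range(iters):
        sigma = 0.5 * math.sqrt(sphere(m)) / n  # step-size ~ distance to optimum
        m = es_step(m, sigma, lam, mu, rng, line_search)
    return sphere(m)
```

Note that the offspring sampling is redundant for clarity here; in this sketch the learning rate 1.0 recovers the plain recombination step, so the line-search variant can only improve over the candidates tried.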

Communications

Convergence analysis of CMA-ES

ISMP 2024, July 2024, Montréal (Canada)

Irreducibility and convergence of nonlinear state-space models

JPS 2023, October 2023, Île d'Oléron (France)

Convergence of CMA-ES

CJC-MA 2023, September 2023, CentraleSupélec (France)

Evaluation of the impact of various modifications to CMA-ES that facilitate its theoretical analysis

GECCO '23, July 2023, Lisbon (Portugal)

Convergence Analysis of CMA-ES via Markov Chain Stability Analysis

SIAM OP23, June 2023, Seattle (US)

Learning Rate Adaptation by Line Search in Evolution Strategies

GECCO '22, July 2022, Boston (US)

Teaching

Bachelor of Science at École polytechnique, Teaching assistant for LAB 102 - How to write mathematics: 2023-2024, 2022-2023, 2021-2022

Misc.

2023: Reviewer for GECCO '23

2022: Reviewer for Set-Valued and Variational Analysis

2022-2024: Member of the Laboratory life commission (CMAP)

2022-2024: Organization of the CMAP-CMLS PhD students seminar

2022: Editorial assistant for Dagstuhl Seminar Theory of Randomized Optimization Heuristics