I was a PhD student in mathematics in the Randopt Inria team, located at CMAP, École polytechnique. My supervisors were Anne Auger and Nikolaus Hansen.
Currently, I work on theoretical aspects of the CMA-ES algorithm (Evolution Strategy with Covariance Matrix Adaptation), specifically on a proof of convergence. CMA-ES is a stochastic derivative-free optimization algorithm with good empirical performance on hard (noisy, multi-modal, ill-conditioned) optimization problems. I am also interested in various theoretical topics in mathematical optimization, probability and statistics.
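As a minimal illustration of what CMA-ES does in practice, the sketch below runs pycma (Nikolaus Hansen's Python implementation of CMA-ES) on a simple ill-conditioned quadratic; the test function, dimension and initial step-size are arbitrary choices for the example, not taken from my work.

import cma  # pycma, Python implementation of CMA-ES

def ellipsoid(x):
    # simple ill-conditioned quadratic test function (example choice)
    return sum(10 ** (6 * i / (len(x) - 1)) * xi ** 2 for i, xi in enumerate(x))

# start from the all-ones vector in dimension 10 with initial step-size 0.5 (arbitrary)
es = cma.CMAEvolutionStrategy(10 * [1.0], 0.5)
while not es.stop():
    solutions = es.ask()                                   # sample candidate solutions
    es.tell(solutions, [ellipsoid(s) for s in solutions])  # rank them and update mean, step-size, covariance
es.result_pretty()                                         # print a short summary of the run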
Publications
Preprints
Irreducibility of nonsmooth state-space models with an application to CMA-ES, Armand Gissler, Shan-Conrad Wolf, Anne Auger, Nikolaus Hansen, 2024
On the irreducibility and convergence of a class of nonsmooth nonlinear state-space models on manifolds, Armand Gissler, Alain Durmus, Anne Auger, 2024
Abstract:
In this paper, we analyze a large class of general nonlinear state-space models on a state-space $\mathsf X$, defined by the recursion $\phi_{k+1} = F(\phi_k,\alpha(\phi_k,U_{k+1}))$, $k \in\mathbb N$, where $F,\alpha$ are some functions and $\{U_{k+1}\}_{k\in\mathbb N}$ is a sequence of i.i.d. random variables. More precisely, we extend conditions under which this class of Markov chains is irreducible, aperiodic and satisfies important continuity properties, relaxing two key assumptions from prior works. First, the state-space $\mathsf X$ is assumed to be a smooth manifold instead of an open subset of a Euclidean space. Second, we only assume that $F$ is locally Lipschitz continuous. We demonstrate the significance of our results through their application to Markov chains underlying optimization algorithms. These schemes belong to the class of evolution strategies with covariance matrix adaptation and step-size adaptation.
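To make the abstract recursion more concrete, here is an illustrative instance (my own sketch, not taken from the paper): for a step-size adaptive evolution strategy minimizing a scaling-invariant function $f$ with optimum $x^*$, one can take $\phi_k = (X_k - x^*)/\sigma_k$, where $X_k$ is the mean and $\sigma_k$ the step-size of the algorithm at iteration $k$. The input $U_{k+1}$ collects the i.i.d. standard Gaussian vectors used to sample the offspring, $\alpha(\phi_k, U_{k+1})$ returns the offspring steps selected after ranking by $f$-value (a ranking that depends on the state only through $\phi_k$, by scaling-invariance of $f$), and $F$ encodes the mean and step-size updates. For CMA-ES the state additionally contains the covariance matrix, which is one reason a manifold-valued state space (including the cone of positive definite matrices) is relevant.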
Journal articles
Asymptotic estimations of a perturbed symmetric eigenproblem, Armand Gissler, Anne Auger, Nikolaus Hansen. Applied Mathematics Letters, Volume 150, 2024.
Abstract:
We study ill-conditioned positive definite matrices that are disturbed by the sum of rank-one matrices of a specific form. We provide estimates for the eigenvalues and eigenvectors. When the condition number of the initial matrix tends to infinity, we bound the values of the coordinates of the eigenvectors of the perturbed matrix. Equivalently, in the coordinate system where the initial matrix is diagonal, we bound the rate of convergence of coordinates that tend to zero.
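As a generic numerical illustration of this setting (a sketch of my own; the paper concerns perturbations of a specific form that is not reproduced here), one can inspect how the eigenvectors of an ill-conditioned diagonal matrix move under a symmetric rank-one update:

import numpy as np

# ill-conditioned positive definite matrix, diagonal in this coordinate system
condition_number = 1e8
A = np.diag([condition_number, 1.0])

# generic symmetric rank-one perturbation (illustrative choice only)
v = np.array([1.0, 1.0])
B = A + np.outer(v, v)

eigenvalues, eigenvectors = np.linalg.eigh(B)  # columns are eigenvectors, eigenvalues ascending
print(eigenvalues)
print(eigenvectors)

# The eigenvector associated with the largest eigenvalue stays close to the first
# coordinate axis: its second coordinate is of order 1/condition_number and tends
# to zero as the condition number grows.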
A note on the K-epigraph, Armand Gissler, Tim Hoheisel. Optimization, 72(9), 2251–2285, 2022.
Abstract:
We study the question of when the closed convex hull of the graph of a K-convex map equals its K-epigraph. In particular, we shed light on the smallest cone K such that a given map has convex and closed K-epigraph, respectively. We apply our findings to several examples in matrix space as well as to convex composite functions.
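For readers unfamiliar with the terminology, the standard definitions (recalled here for convenience, not quoted from the paper) are the following. Given a closed convex cone $K \subseteq \mathbb R^m$ and the induced order $y \preceq_K z \iff z - y \in K$, a map $F:\mathbb R^n \to \mathbb R^m$ is K-convex if $F(\lambda x + (1-\lambda)y) \preceq_K \lambda F(x) + (1-\lambda) F(y)$ for all $x,y$ and $\lambda \in [0,1]$, and its K-epigraph is $\operatorname{epi}_K F = \{(x,y) : F(x) \preceq_K y\} = \{(x,y) : y - F(x) \in K\}$.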
Scaling-invariant functions versus positively homogeneous functions, Cheikh Touré, Armand Gissler, Anne Auger, Nikolaus Hansen. Journal of Optimization Theory and Applications (JOTA), Volume 191, pages 363–383, 2021.
Abstract:
Scaling-invariant functions preserve the order of points when the points are scaled by the same positive scalar (usually with respect to a unique reference point). Composites of strictly monotonic functions with positively homogeneous functions are scaling-invariant with respect to zero. We prove in this paper that the reverse is also true for large classes of scaling-invariant functions. Specifically, we give necessary and sufficient conditions for scaling-invariant functions to be composites of a strictly monotonic function with a positively homogeneous function. We also study sublevel sets of scaling-invariant functions, generalizing well-known properties of positively homogeneous functions.
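As a short illustration of the forward direction stated in the abstract (my own sketch of a standard argument, not quoted from the paper): if $f = g \circ p$ with $g$ strictly increasing and $p$ positively homogeneous of degree $\alpha > 0$, i.e. $p(\rho x) = \rho^\alpha p(x)$ for all $\rho > 0$, then for all $x, y$ and $\rho > 0$ we have $f(\rho x) \le f(\rho y) \iff p(\rho x) \le p(\rho y) \iff \rho^\alpha p(x) \le \rho^\alpha p(y) \iff p(x) \le p(y) \iff f(x) \le f(y)$, which is precisely scaling-invariance with respect to zero. The same chain of equivalences, with the inequalities reversed, works for strictly decreasing $g$.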
Conference proceedings
Evaluation of the impact of various modifications to CMA-ES that facilitate its theoretical analysis, Armand Gissler. Proceedings of the Companion Conference on Genetic and Evolutionary Computation (GECCO '23 Companion), Association for Computing Machinery, New York, NY, USA, 1603–1610, 2023.
Abstract:
In this paper we introduce modified versions of CMA-ES with the objective of facilitating a proof of convergence of CMA-ES. To ensure that the modifications do not alter the performance of the algorithm too much, we benchmark the variants derived from them on problems of the bbob test suite. We observe that the main performance losses occur on ill-conditioned problems, which is probably due to the absence of cumulation in the adaptation of the covariance matrix. Overall, however, the versions of CMA-ES presented in this paper perform similarly to the original CMA-ES.
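For readers unfamiliar with the bbob benchmarking workflow mentioned above, the sketch below shows a typical COCO experiment loop (an illustrative sketch, not the experimental code used for the paper); the budget, initial step-size and folder name are arbitrary example choices.

import cocoex  # COCO experimentation module, provides the bbob test suite
import cma     # optimizer under test (pycma)

suite = cocoex.Suite("bbob", "", "")                            # all bbob problems
observer = cocoex.Observer("bbob", "result_folder: example_experiment")

for problem in suite:
    problem.observe_with(observer)                              # log data for post-processing
    # run the optimizer on this problem with a small example budget
    cma.fmin(problem, problem.initial_solution, 2,
             {'maxfevals': 100 * problem.dimension, 'verbose': -9})
    problem.free()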
Learning rate adaptation by line search in evolution strategies with recombination, Armand Gissler, Anne Auger, Nikolaus Hansen. Proceedings of the Genetic and Evolutionary Computation Conference (GECCO '22), Association for Computing Machinery, New York, NY, USA, 630–638, 2022.
Abstract:
In this paper, we investigate the effect of a learning rate for the mean in evolution strategies with recombination. We study the effect of a half-line search after the mean shift direction is established, hence the learning rate value is conditioned on the direction. We prove convergence and study convergence rates in different dimensions and for different population sizes on the sphere function with the step-size proportional to the distance to the optimum. We empirically find that a perfect half-line search increases the maximal convergence rate on the sphere function by up to about 70%, assuming the line search imposes no additional costs. The speedup becomes less pronounced with increasing dimension. The line search reduces, but does not eliminate, the dependency of the convergence rate on the step-size. The optimal step-size assumes considerably smaller values with line search, which is consistent with previous results for different learning rate settings. The step-size difference is more pronounced in larger dimension and with larger population size, thereby diminishing an important advantage of a large population.
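The setting studied in this paper can be illustrated with a small simulation (my own sketch under simplifying assumptions: an exact half-line search, the sphere function, and step-size proportional to the distance to the optimum; it is not the authors' experimental code):

import numpy as np

def sphere(x):
    return float(np.dot(x, x))

def es_with_half_line_search(dim=10, lam=10, sigma_factor=1.0, iterations=200, seed=1):
    # Illustrative (mu/mu, lambda)-ES on the sphere: the step-size is kept
    # proportional to the distance to the optimum, and the learning rate for
    # the mean is set by an exact half-line search along the mean shift.
    rng = np.random.default_rng(seed)
    mu = lam // 2
    m = np.ones(dim)                                   # current mean
    for _ in range(iterations):
        sigma = sigma_factor * np.linalg.norm(m)       # scale-invariant step-size
        X = m + sigma * rng.standard_normal((lam, dim))
        X = X[np.argsort([sphere(x) for x in X])]      # rank candidates by f-value
        delta = X[:mu].mean(axis=0) - m                # mean shift direction
        denom = float(np.dot(delta, delta))
        # exact half-line search on the sphere: minimize ||m + kappa * delta||^2 over kappa >= 0
        kappa = max(0.0, -np.dot(m, delta) / denom) if denom > 0 else 0.0
        m = m + kappa * delta
    return sphere(m)

print(es_with_half_line_search())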
Communications
Convergence analysis of CMA-ES
ISMP 2024, July 2024, Montréal (Canada)
Convergence proof of CMA-ES - Analysis of underlying Markov chains
Dagstuhl seminar Theory of Randomized Optimization Heuristics, July 2024, Dagstuhl (Germany)
Convergence analysis of evolution strategies with covariance matrix adaptation
Séminaire des doctorants CMAP-CMLS, April 2024, Palaiseau (France)
Irreducibility and convergence of nonlinear state-space models
JPS 2023, October 2023, Île d'Oléron (France)
Convergence of CMA-ES
CJC-MA 2023, September 2023, CentraleSupélec (France)
Evaluation of the impact of various modifications to CMA-ES that facilitate its theoretical analysis
GECCO '23, July 2023, Lisbon (Portugal)
Convergence Analysis of CMA-ES via Markov Chain Stability Analysis
SIAM OP23, June 2023, Seattle (US)
Learning Rate Adaptation by Line Search in Evolution Strategies
GECCO '22, July 2022, Boston (US)
Misc.
2023: Reviewer for GECCO '23
2022: Reviewer for Set-Valued and Variational Analysis
2022-2024: Member of the laboratory life committee (CMAP)
2022-2024: Organization of the CMAP-CMLS PhD students seminar
2022: Editorial assistant for Dagstuhl Seminar Theory of Randomized Optimization Heuristics