I am a postdoctoral fellow at École Polytechnique (Paris), working in the SIMPAS team with El Mahdi El Mhamdi's group on robust learning theory and preference learning methods.
From January 2023 to September 2025, I was a postdoctoral fellow at CTU (Prague) in the optimization group, working with Jakub Mareček on the optimization of tame functions.
Before that, I did my PhD at the Laboratoire Jean Kuntzmann (Grenoble), in the DAO team, under the supervision of Franck Iutzeler and Jérôme Malick. I focused on structured nonsmooth problems, a class that includes many machine learning problems. I developed structure identification procedures and fast (Newton-type) methods for such nonsmooth nonconvex problems. More in my PhD thesis!
Here is a resume and a list of publications.
Interests: Machine learning; Nonsmooth nonconvex optimization; Second-order methods; Riemannian optimization.
April 2026: I will attend ICLR 2026, with contributions on stochastic constrained optimization, outliers-dropout interplay in LLMs, and Byzantine Robustness.
March 2026: I will attend the Journées MODE in Nice, where I'll present recent work on Byzantine Robustness. Slides.
Dec. 2025: I will attend EurIPS in Copenhagen! I'll present Generalization and Monotonicity in Preference Learning (on Wednesday, 3 Dec.), and Benchmarking Algorithms for Fairness-Constrained Training of Deep Neural Networks (on Friday, 5 Dec.) at Le Salon des Refusés. Feel free to reach out if you want to discuss!
Sept. 2025: Our benchmark of solvers for constrained learning, with Andrii, Jana, and Jakub, was accepted at the NeurIPS Constrained Optimization for ML workshop!
Sept. 2025: Our work on generalization and monotonicity in preference learning, with Julien Fageot, Peva Blanchard, and Lê-Nguyên Hoang, was accepted at NeurIPS!
June 2023: I presented a poster on Hybrid Methods for global optimization of large-scale polynomials at the FoCM conference in Paris. Poster.
Dec. 2022: Newton methods for structured nonsmooth optimization, Inria Mind seminar, Saclay (online).
Dec. 2022: I defended my PhD on December 2nd, 2022. I was very happy to have on the jury: Jalal Fadili and Claudia Sagastizábal as rapporteurs, Jean-Charles Gilbert and Mathurin Massias as examiners, and Nadia Brauner as president. Claude Lemaréchal also attended! Manuscript & Slides.
Nov. 2022: Newton methods for structured nonsmooth optimization, Rutgers Optim & ML seminar (online).
Oct. 2022: Newton methods for structured nonsmooth optimization, Inria MLSP seminar, Lyon.
Oct. 2022: Newton methods for nonsmooth composite optimization, Journées MOA, Nice, Slides.
June 2022: Combining Newton and the proximal gradient for nonsmooth optimization (talk in French), CANUM, Evian, Abstract & Slides.
June 2022: Newton methods for nonsmooth composite optimization, Journées MODE, Limoges. Received the "prix Dodu" awarding the three best talks among young researchers. Abstract & Slides
Dec. 2020: Harnessing Structure in Optimization for Large-scale Learning, LJK PhD day, Grenoble.
Sept. 2020: On the Interplay between Acceleration and Identification for the Proximal Gradient algorithm, Journées MODE (virtual). Abstract & Slides.
Feb. 2020: Randomized Progressive Hedging methods for Multi-stage Stochastic Programming, ROADEF, Montpellier, Abstract & Slides.
E-mail: firstname . lastname |a|_ polytechnique.edu