Focus Period Lund 2026
PhD Student
École Polytechnique Fédérale de Lausanne – EPFL (Switzerland)
Andreea-Alexandra Musat is a PhD student at EPFL, supervised by Prof. Nicolas Boumal. She works on nonconvex optimization, including settings with geometric structure such as Riemannian manifolds, and is interested in understanding why simple algorithms often work so well in practice. In her recent work, she uses tools from dynamical systems to study the behavior of optimization trajectories and the kinds of points they can converge to.
Presenting: A non-autonomous center-stable set theorem for saddle avoidance in optimization
Optimization algorithms usually do not converge to strict saddle points. For methods with a fixed update rule, this is well understood through the center-stable manifold theorem. But many algorithms of practical interest are not of that kind: their update rule changes over time, for example because the step size varies and may even go to zero. In this talk, I’ll present a new center-stable set theorem for such non-autonomous systems. The theorem is designed precisely to cover settings that earlier results do not handle, including vanishing step sizes. Using this theorem, we obtain saddle avoidance results for gradient descent in both Euclidean and Riemannian settings. Joint work with Nicolas Boumal.
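The phenomenon the abstract describes can be seen in a toy example (an illustration only, not the machinery of the talk; the quadratic objective and the specific step-size schedule below are assumptions made here for demonstration). Consider gradient descent with a vanishing step size on f(x, y) = (x² − y²)/2, whose only critical point is the strict saddle at the origin: any initialization with a nonzero unstable component y drifts away from the saddle, even as the steps shrink.

```python
# Toy illustration (not from the talk): gradient descent with a
# vanishing step size on f(x, y) = (x^2 - y^2) / 2.
# The only critical point is the strict saddle at the origin.

def grad(x, y):
    """Gradient of f(x, y) = (x^2 - y^2) / 2."""
    return x, -y

def gd_vanishing_step(x, y, steps):
    """Run gradient descent with step sizes alpha_k = 0.5 / (k + 1),
    which vanish but are not summable (so the iterates keep moving)."""
    for k in range(steps):
        alpha = 0.5 / (k + 1)
        gx, gy = grad(x, y)
        # Stable coordinate contracts: x <- (1 - alpha) * x.
        # Unstable coordinate expands: y <- (1 + alpha) * y.
        x, y = x - alpha * gx, y - alpha * gy
    return x, y

# Start very close to the stable manifold (the x-axis), but not on it.
x, y = gd_vanishing_step(1.0, 1e-6, 100_000)
print(x, y)  # x has shrunk toward 0; |y| has grown by orders of magnitude
```

Here the autonomous center-stable manifold theorem does not apply directly, because the update map changes with k; the point of the non-autonomous result is to cover exactly this kind of time-varying, vanishing-step dynamics.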
