
Marylou Gabrié

Assistant professor at École Polytechnique (CMAP)
Previously a postdoc at NYU (Center for Data Science) and the Flatiron Institute (CCM).

Research topics: Statistical mechanics, Machine learning for physics and scientific computing, Monte Carlo methods.



Latest news

  • [October 2022] I am part of the steering committee of the AI-PhyStat semester of the AISSAI center. Join us October 3-4 for the workshop on ML assisted sampling and scientific computing at Collège de France (Paris): register here.

  • [June 2022] The slides from my lecture at the MLxSummer School of the Flatiron Institute are here.

  • [June 2022] The third edition of Youth in High Dimensions took place at ICTP (Trieste) during the last week of June. Videos are here.

  • [May 2022] I co-organized a workshop at the CECAM headquarters in Lausanne on Machine Learning Augmented Sampling for the Molecular Sciences. The videos of the talks by our great speakers are on the website!

  • [Oct 2021] Giorgio Parisi was awarded the Nobel Prize in Physics! His great contributions to the physics of disordered systems can also shed light on learning theory. I tried to write an accessible tutorial for machine learners and physicists from other areas of expertise. It was published in the JPhysA Special Issue on ML and is based on my PhD thesis, completed under the great supervision of Florent Krzakala and Lenka Zdeborová.

Teaching

  • [Spring 2022] Emerging Topics in Machine Learning (MAP 588) for Polytechnique 3rd-year students. Co-taught with Rémi Flamary. Moodle link for students only.

  • [Fall 2021] Optimization and Computational Linear Algebra graduate course, NYU CDS (DS-GA 1014). The course page.

  • [Spring 2021] Machine Learning graduate course, NYU CDS (DS-GA 1003), along with Prof. He He. The course website.

  • [August 2021] A lecture on the Statistical Mechanics of Learning at the summer school Machine Learning in Quantum Physics and Chemistry in Warsaw: slides.

Selected Publications

  • Adaptive Monte Carlo augmented with normalizing flows [pdf]
    Gabrié M., Rotskoff G. M., Vanden-Eijnden E. (PNAS 2022)

  • Adaptation of the Independent Metropolis-Hastings Sampler with Normalizing Flow Proposals. [pdf]
    Brofos J. A., Gabrié M., Brubaker M. A., & Lederman R. R. (AISTATS 2022)

  • Efficient Bayesian Sampling Using Normalizing Flows to Assist Markov Chain Monte Carlo Methods. [pdf]
    Gabrié M., Rotskoff G. M., & Vanden-Eijnden E. Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models (ICML Workshop) (2021) - Accepted for a contributed talk.

  • On the interplay between data structure and loss function in classification problems [pdf]
    d'Ascoli S., Gabrié M., Sagun L., Biroli G. (NeurIPS 2021)

  • Mean-field inference methods for neural networks, [pdf]
    Gabrié, M. Journal of Physics A: Mathematical and Theoretical, 53(22), 1–58. (2020)

  • Entropy and mutual information in models of deep neural networks, [pdf] [spotlight short video]
    Gabrié, M., Manoel, A., Luneau, C., Barbier, J., Macris, N., Krzakala, F., & Zdeborová, L. Advances in Neural Information Processing Systems 31, 1826–1836 (2018)