
Marylou Gabrié

Assistant professor at École Polytechnique (CMAP)
Previously postdoc at NYU (Center for Data Science) and Flatiron Institute (CCM).

Research topics: Statistical mechanics, Machine learning for physics and scientific computing, Monte Carlo methods.



Latest news

  • [June 2023] The 4th edition of the Youth in High Dimensions workshop will take place at ICTP (Trieste) during the last week of May. Register by April 3rd here.

  • [December 2022] Here is a short tutorial on sampling with generative models, accompanying my lecture for Martin Weigt's course on Machine Learning in the Master of Physics of Complex Systems (i-PCS). The slides of the lecture are here.

  • [October 2022] I am part of the steering committee of the AI-PhyStat semester of the AISSAI center. Join us October 3-4 for the workshop on ML-assisted sampling and scientific computing at the Collège de France (Paris): register here.

  • [June 2022] The slides from my lecture at the MLx Summer School of the Flatiron Institute are here.

Teaching

  • [Spring 2023] CAPSTONE projects (M2DS) - projects pairing industrial mentors with Masters students. Reach out if you work in industry and are interested in mentoring a Data Science project for the 2024 edition!

  • [Spring 2023, Spring 2022] Emerging topics in Machine Learning (MAP 588) for Polytechnique 3rd-year students. Co-taught with Rémi Flamary. Moodle link for students only.

  • [Fall 2021] Optimization and Computational Linear Algebra Graduate Course, NYU CDS (DS-GA1014). The course page.

  • [Spring 2021] Machine Learning Graduate Course, NYU CDS (DS-GA1003), co-taught with Prof. He He. The course website.

  • [August 2021] A lecture on the Statistical Mechanics of Learning at the Summer School Machine Learning in Quantum Physics and Chemistry in Warsaw: slides.

Selected Publications

  • Local-Global MCMC kernels: the best of both worlds [pdf]
    Samsonov S., Lagutin E., Gabrié M., Durmus A., Naumov A., Moulines E. (NeurIPS 2022)

  • Adaptive Monte Carlo augmented with normalizing flows [pdf]
    Gabrié M., Rotskoff G. M., Vanden-Eijnden E. (PNAS 2022)

  • Adaptation of the Independent Metropolis-Hastings Sampler with Normalizing Flow Proposals [pdf]
    Brofos J. A., Gabrié M., Brubaker M. A., Lederman R. R. (AISTATS 2022)

  • Efficient Bayesian Sampling Using Normalizing Flows to Assist Markov Chain Monte Carlo Methods [pdf]
    Gabrié M., Rotskoff G. M., Vanden-Eijnden E. Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models (ICML Workshop) (2021) - Accepted as a contributed talk.

  • On the interplay between data structure and loss function in classification problems [pdf]
    d'Ascoli S., Gabrié M., Sagun L., Biroli G. (NeurIPS 2021)

  • Mean-field inference methods for neural networks [pdf]
    Gabrié M. Journal of Physics A: Mathematical and Theoretical, 53(22), 1–58 (2020)

  • Entropy and mutual information in models of deep neural networks [pdf] [spotlight short video]
    Gabrié M., Manoel A., Luneau C., Barbier J., Macris N., Krzakala F., Zdeborová L. Advances in Neural Information Processing Systems 31 (NeurIPS 2018), 1826–1836