Marylou Gabrié

Postdoc jointly at NYU (Center for Data Science) and Flatiron Institute (CCM).

My research lies at the boundary of machine learning and statistical physics. I use statistical physics to understand the successes of machine learning. I am also interested in adapting machine learning techniques for scientific use, in particular in the context of scarce data.

Latest news

  • [Jan 2022] I will start as an Assistant Professor at CMAP of École Polytechnique.

  • [Oct 2021] Giorgio Parisi was awarded the Nobel Prize in Physics! His great contributions to the physics of disordered systems can also shed light on learning theory. I tried to write an accessible tutorial for machine learners and physicists from other areas of expertise. It was published in the JPhysA Special Issue on ML and is based on my PhD thesis, completed under the great supervision of Florent Krzakala and Lenka Zdeborová.

  • [August 2021] I gave a lecture on the Statistical Mechanics of Learning at the Summer School Machine Learning in Quantum Physics and Chemistry in Warsaw. If you are interested in the interface of physics and machine learning, you should apply! The slides are available.

  • [June 2021] You can find the videos of the second (virtual) edition of Youth in High-Dimensions at ICTP here.

  • [Oct 2020] Last fall I gave a tutorial at the Flatiron Wide Algorithms and Mathematics conference on "Inverse Problems, Sparsity and Neural Network Priors". Its recording, along with the other great talks of the conference, can be found on the page linked above.


Teaching

  • This fall, I am teaching the Optimization and Computational Linear Algebra Graduate Course of NYU CDS (DS-GA1014). Here is the course page.

  • Last spring, I taught the Machine Learning Graduate Course of NYU CDS (DS-GA1003) along with Prof. He He. The course website is here.

  • In February 2020, I gave a lecture at the ESI Winter School on Machine Learning in Physics, an introduction to the Statistical Physics of Learning. The slides of the lecture and the notebook of the tutorial are available here.


Publications

  • Adaptation of the Independent Metropolis-Hastings Sampler with Normalizing Flow Proposals [pdf]
    Brofos J. A., Gabrié M., Brubaker M. A., & Lederman R. R. arXiv:2110.13216 (2021)

  • Efficient Bayesian Sampling Using Normalizing Flows to Assist Markov Chain Monte Carlo Methods [pdf]
    Gabrié M., Rotskoff G. M., & Vanden-Eijnden E. Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models (ICML Workshop) (2021) - Accepted for a contributed talk.

  • Adaptive Monte Carlo augmented with normalizing flows [pdf]
    Gabrié M., Rotskoff G. M., & Vanden-Eijnden E. arXiv:2105.12603 (2021)

  • On the interplay between data structure and loss function in classification problems [pdf]
    d'Ascoli S., Gabrié M., Sagun L., Biroli G. arXiv:2103.05524 (2021) - Accepted at NeurIPS 2021

  • Phase Retrieval with Holography and Untrained Priors: Tackling the Challenges of Low-Photon Nanoscale Imaging [pdf]
    Lawrence H., Barmherzig D. A., Li H., Eickenberg M., Gabrié M. Proceedings of Machine Learning Research, 107, 1-31, MSML (2021)

  • Mean-field inference methods for neural networks [pdf]
    Gabrié, M. Journal of Physics A: Mathematical and Theoretical, 53(22), 1–58. (2020)

  • Blind calibration for compressed sensing: State evolution and an online algorithm [pdf]
    Gabrié, M., Barbier, J., Krzakala, F., & Zdeborová, L. Journal of Physics A: Mathematical and Theoretical, 53(33), 334004. (2020)

  • Entropy and mutual information in models of deep neural networks [pdf] [spotlight short video]
    Gabrié, M., Manoel, A., Luneau, C., Barbier, J., Macris, N., Krzakala, F., & Zdeborová, L. Advances in Neural Information Processing Systems 31, 1826-1836 (2018)

  • Deterministic and Generalized Framework for Unsupervised Learning with Restricted Boltzmann Machines [pdf]
    Tramel, E. W., Gabrié, M., Manoel, A., Caltagirone, F., & Krzakala, F. Physical Review X, 8(4), 041006. (2018)

  • Phase transitions in the q-coloring of random hypergraphs [pdf]
    Gabrié, M., Dani, V., Semerjian, G., & Zdeborová, L. Journal of Physics A: Mathematical and Theoretical, 50(50). (2017)

  • Inferring sparsity: Compressed sensing using generalized restricted Boltzmann machines [pdf]
    Tramel, E. W., Manoel, A., Caltagirone, F., Gabrié, M., & Krzakala, F. IEEE Information Theory Workshop (ITW), 265–269. (2016)

  • Training Restricted Boltzmann Machines via the Thouless-Anderson-Palmer Free Energy [pdf]
    Gabrié, M., Tramel, E. W., & Krzakala, F. Advances in Neural Information Processing Systems 28, 640-648. (2015)