Marylou Gabrié

Postdoc jointly at NYU (Center for Data Science) and Flatiron Institute (CCM).

My research lies at the boundary of machine learning and statistical physics. I use statistical physics to understand the successes of machine learning. I am also interested in adapting machine learning techniques for scientific use, in particular in contexts where data are scarce.



Latest news

  • [August 2021] I will give a lecture on the Statistical Mechanics of Learning at the Summer School Machine Learning in Quantum Physics and Chemistry in Warsaw. If you are interested in the interface of physics and machine learning, you should apply! The slides are available.

  • [June 2021] You can find the videos of the second (virtual) edition of Youth in High-Dimensions at ICTP here.

  • [Oct 2020] Last fall I gave a tutorial at the Flatiron Wide Algorithms and Mathematics conference on "Inverse Problems, Sparsity and Neural Network Priors". Its recording, along with the other great talks of the conference, can be found on the page linked above.

  • [Nov 2019] Machine learners intrigued by results obtained with message passing and replicas? Physicists wondering how statistical mechanics can help machine learning theory? Check out my review in the JPhysA Special Issue on ML, based on my PhD thesis written under the great supervision of Florent Krzakala and Lenka Zdeborová.

Teaching

  • This fall, I am teaching the NYU CDS graduate course Optimization and Computational Linear Algebra (DS-GA 1014). Here is the course page.

  • Last spring, I taught the NYU CDS graduate course Machine Learning (DS-GA 1003) along with Prof. He He. The course website is here.

  • In February 2020, I gave an introductory lecture on the Statistical Physics of Learning at the ESI Winter School on Machine Learning in Physics. The slides of the lecture and the notebook of the tutorial are available here.

Publications

  • Adaptive Monte Carlo augmented with normalizing flows [pdf]
    Gabrié, M., Rotskoff, G. M., & Vanden-Eijnden, E. arXiv:2105.12603 (2021)

  • More data or more parameters? Investigating the effect of data structure on generalization [pdf]
    d'Ascoli, S., Gabrié, M., Sagun, L., & Biroli, G. arXiv:2103.05524 (2021)

  • Phase Retrieval with Holography and Untrained Priors: Tackling the Challenges of Low-Photon Nanoscale Imaging [pdf]
    Lawrence, H., Barmherzig, D. A., Li, H., Eickenberg, M., & Gabrié, M. arXiv:2012.07386 (2020). Accepted for publication at MSML 2021.

  • Mean-field inference methods for neural networks [pdf]
    Gabrié, M. Journal of Physics A: Mathematical and Theoretical, 53(22), 1–58 (2020)

  • Blind calibration for compressed sensing: State evolution and an online algorithm [pdf]
    Gabrié, M., Barbier, J., Krzakala, F., & Zdeborová, L. Journal of Physics A: Mathematical and Theoretical, 53(33), 334004 (2020)

  • Entropy and mutual information in models of deep neural networks [pdf] [spotlight short video]
    Gabrié, M., Manoel, A., Luneau, C., Barbier, J., Macris, N., Krzakala, F., & Zdeborová, L. Advances in Neural Information Processing Systems 31, 1826–1836 (2018)

  • Deterministic and Generalized Framework for Unsupervised Learning with Restricted Boltzmann Machines [pdf]
    Tramel, E. W., Gabrié, M., Manoel, A., Caltagirone, F., & Krzakala, F. Physical Review X, 8(4), 041006 (2018)

  • Phase transitions in the q-coloring of random hypergraphs [pdf]
    Gabrié, M., Dani, V., Semerjian, G., & Zdeborová, L. Journal of Physics A: Mathematical and Theoretical, 50(50) (2017)

  • Inferring sparsity: Compressed sensing using generalized restricted Boltzmann machines [pdf]
    Tramel, E. W., Manoel, A., Caltagirone, F., Gabrié, M., & Krzakala, F. IEEE Information Theory Workshop (ITW), 265–269 (2016)

  • Training Restricted Boltzmann Machines via the Thouless-Anderson-Palmer Free Energy [pdf]
    Gabrié, M., Tramel, E. W., & Krzakala, F. Advances in Neural Information Processing Systems 28, 640–648 (2015)