
Marylou Gabrié

Postdoc jointly at NYU (Center for Data Science) and the Flatiron Institute (CCM).

My research lies at the boundary of machine learning and statistical physics. I use statistical physics to understand the successes of machine learning. I am also interested in adapting machine learning techniques for scientific use, in particular in the context of scarce data.



Latest news

  • Next August I will give a lecture on the Statistical Mechanics of Learning at the Summer School "Machine Learning in Quantum Physics and Chemistry" in Warsaw. If you are interested in the interface of Physics and Machine Learning, you should apply!

  • We are organizing the second edition of Youth in High-Dimensions (virtually) at ICTP next June. The list of amazing speakers is here.

  • Machine learners, intrigued by results obtained with message passing and replicas? Physicists, wondering how stat-mech can help ML theory? Check out my review in the JPhysA Special Issue on ML, based on my PhD thesis written under the great supervision of Florent Krzakala and Lenka Zdeborová.

  • I recently gave a tutorial at the Flatiron Wide Algorithms and Mathematics conference on "Inverse Problems, Sparsity and Neural Network Priors". The recording, along with the other great talks of the conference, can be found on the page linked above.

Teaching

  • I am teaching the Machine Learning Graduate Course of NYU CDS (DS-GA 1003) along with Prof. He He this Spring. The course website is here.

  • I will teach the Linear Algebra Graduate Course of NYU CDS (DS-GA 1014) next Fall. In the meantime, you can check the website of the Fall 2020 edition, taught by Léo Miolane.

  • In February 2020, I gave a lecture at the ESI Winter School on Machine Learning in Physics: an introduction to the Statistical Physics of Learning. The slides of the lecture and the notebook of the tutorial are available here.

Publications

  • More data or more parameters? Investigating the effect of data structure on generalization [pdf]
    d'Ascoli, S., Gabrié, M., Sagun, L., & Biroli, G. arXiv:2103.05524 (2021)

  • Phase Retrieval with Holography and Untrained Priors: Tackling the Challenges of Low-Photon Nanoscale Imaging [pdf]
    Lawrence, H., Barmherzig, D. A., Li, H., Eickenberg, M., & Gabrié, M. arXiv:2012.07386 (2020)

  • Mean-field inference methods for neural networks [pdf]
    Gabrié, M. Journal of Physics A: Mathematical and Theoretical, 53(22), 1–58. (2020)

  • Blind calibration for compressed sensing: State evolution and an online algorithm [pdf]
    Gabrié, M., Barbier, J., Krzakala, F., & Zdeborová, L. Journal of Physics A: Mathematical and Theoretical, 53(33), 334004. (2020)

  • Entropy and mutual information in models of deep neural networks [pdf] [short video]
    Gabrié, M., Manoel, A., Luneau, C., Barbier, J., Macris, N., Krzakala, F., & Zdeborová, L. Advances in Neural Information Processing Systems 31, 1826–1836. (2018)

  • Deterministic and Generalized Framework for Unsupervised Learning with Restricted Boltzmann Machines [pdf]
    Tramel, E. W., Gabrié, M., Manoel, A., Caltagirone, F., & Krzakala, F. Physical Review X, 8(4), 041006. (2018)

  • Phase transitions in the q-coloring of random hypergraphs [pdf]
    Gabrié, M., Dani, V., Semerjian, G., & Zdeborová, L. Journal of Physics A: Mathematical and Theoretical, 50(50). (2017)

  • Inferring sparsity: Compressed sensing using generalized restricted Boltzmann machines [pdf]
    Tramel, E. W., Manoel, A., Caltagirone, F., Gabrié, M., & Krzakala, F. IEEE Information Theory Workshop (ITW), 265–269. (2016)

  • Training Restricted Boltzmann Machines via the Thouless-Anderson-Palmer Free Energy [pdf]
    Gabrié, M., Tramel, E. W., & Krzakala, F. Advances in Neural Information Processing Systems 28, 640–648. (2015)