I’m a PhD student at MIT co-advised by Armando Solar-Lezama in EECS and Josh Tenenbaum in BCS. My research combines methods from programming languages (PL) research with machine learning to tackle problems in artificial intelligence.
Most recently, I presented our work on a language called Pluck in the PLDI 2025 paper Stochastic Lazy Knowledge Compilation for Inference in Discrete Probabilistic Programs. Check out the Pluck website for links to the paper as well as the associated library and artifact.
Research Interests
My research interests center on program synthesis, probabilistic programming, and artificial intelligence. I’m particularly interested in neurosymbolic methods that bridge the machine learning and programming languages communities: I believe symbolic methods can augment neural methods to facilitate low-data learning, generalization, transfer learning, interpretability, and other desiderata.
A major focus of mine is abstraction learning, as in Ellis et al.’s DreamCoder. I led follow-up work building a tool called Stitch (paper & code), published at POPL 2023, that achieves a 1,000-10,000x speedup in abstraction learning over DreamCoder. I’m eager to explore new applications of abstraction learning, especially to world modeling through probabilistic programs.
I previously published as Matthew Bowers.
Google Scholar / CV / GitHub / Bluesky / Twitter / Email (mlbowers@csail.mit.edu)
Conference Publications
- Stochastic Lazy Knowledge Compilation for Inference in Discrete Probabilistic Programs (PLDI 2025; preprint coming soon).
  Maddy Bowers*, Alexander K. Lew*, Joshua B. Tenenbaum, Armando Solar-Lezama, Vikash Mansinghka.
- LILO: Learning Interpretable Libraries by Compressing and Documenting Code (ICLR 2024).
  Gabriel Grand, Lionel Wong, Maddy Bowers, Theo X. Olausson, Muxin Liu, Joshua B. Tenenbaum, Jacob Andreas.
- Language Models Can Teach Themselves to Program Better (ICLR 2023).
  Patrick Haluptzok, Matthew Bowers, Adam Tauman Kalai.
- Top-Down Synthesis for Library Learning (POPL 2023; William A. Martin Master’s Thesis Award (2024); Awarded Artifact Reusable; code; tutorial & docs).
  Matthew Bowers, Theo X. Olausson, Lionel Wong, Gabriel Grand, Joshua B. Tenenbaum, Kevin Ellis, Armando Solar-Lezama.
- Representing Partial Programs With Blended Abstract Semantics (ICLR 2021).
  Maxwell Nye, Yewen Pu, Matthew Bowers, Jacob Andreas, Joshua B. Tenenbaum, Armando Solar-Lezama.
- Universal Reshaping of Arrested Colloidal Gels via Active Doping (The Journal of Chemical Physics 2020).
  Stewart Mallory, Matthew Bowers, Angelo Cacciuto.
- Active Sculpting of Colloidal Crystals (The Journal of Chemical Physics 2019).
  Shibananda Das, Matthew Bowers, Clara Bakker, Angelo Cacciuto.
Workshop Publications
- Lazy Knowledge Compilation for Discrete PPLs (Languages for Inference Workshop at POPL 2025).
  Maddy Bowers*, Alexander K. Lew*, Joshua B. Tenenbaum, Vikash Mansinghka, Armando Solar-Lezama.
- MathDSL: A Domain-Specific Language for Concise Mathematical Solutions Via Program Synthesis (MATH-AI Workshop at NeurIPS 2024).
  Sagnik Anupam, Maddy Bowers, Omar Costilla-Reyes, Armando Solar-Lezama.
- Concept Learning as Coarse-to-Fine Probabilistic Program Induction (poster abstract at CogSci 2024; poster).
  Maddy Bowers*, Alexander K. Lew*, Wenhao Qi, Vikash Mansinghka, Joshua Rule, Joshua B. Tenenbaum, Armando Solar-Lezama.
- Toward Probabilistic Coarse-to-Fine Program Synthesis (Languages for Inference Workshop at POPL 2024; paper).
  Maddy L. Bowers*, Alexander K. Lew*, Vikash Mansinghka, Joshua B. Tenenbaum, Armando Solar-Lezama.
- Codeplay: Autotelic Learning through Collaborative Self-Play in Programming Environments (Intrinsically Motivated Open-ended Learning Workshop at NeurIPS 2024).
  Laetitia Teodorescu, Cédric Colas, Matthew Bowers*, Thomas Carta, Pierre-Yves Oudeyer.
Awards
- William A. Martin Master’s Thesis Award (2024; Master’s Thesis)
- NSF Graduate Research Fellowship (2022)
See my CV for earlier awards.