Last Updated: Jan 2021

I’m a PhD candidate at MIT CSAIL, co-advised by Armando Solar-Lezama (EECS) and Josh Tenenbaum (BCS). My research is in program synthesis and artificial intelligence.

A list of publications is available on my Google Scholar, my resume is here, and my GitHub is here. You can contact me by email.

See our latest work, presented at the NeurIPS 2020 Workshop on Computer-Assisted Programming and accepted to ICLR 2021: Representing Partial Programs with Blended Abstract Semantics.


My primary research interest is program synthesis and its applications to artificial intelligence. I’m particularly interested in neurosymbolic methods: learning systems that combine deep learning with synthesis techniques from the programming languages community. I see program induction as a promising approach to designing learning systems that share more features with human cognition (e.g. rule-based reasoning, low-data learning, generalization, interpretability). Some areas I’ve been interested in recently include:

  • Improving how we represent programs in deep learning, for example our Blended Abstract Semantics work.
  • Concept/library learning and compression, for example DreamCoder.
  • Applying neurosymbolic program synthesis to developing interpretable scientific models, for example our NSF Expeditions project.
  • Search mechanisms that use learned symbolic components, for example the predicate learning of BUSTLE and Property Signatures.


In 2020 I graduated from Columbia University with a BA in Computer Science and a BA in Chemistry. At Columbia, I worked with Professor Angelo Cacciuto on chemical simulations of self-assembling colloids and co-authored two publications:

  • Das, S., Lee Bowers, M., Bakker, C., & Cacciuto, A. (2019). Active sculpting of colloidal crystals. The Journal of Chemical Physics, 150 (13), 134505.
  • Mallory, S., Lee Bowers, M., & Cacciuto, A. (2020). Universal reshaping of arrested colloidal gels via active doping. The Journal of Chemical Physics, 153, 084901.

In the summer of 2019, I worked in the Learning Matter Group at MIT under Professor Rafael Gomez-Bombarelli, where I applied graph neural network methods to molecular property prediction. I’m interested in combining these methods with program synthesis techniques in a chemical domain in the future.

Fun Stuff

I wrote Coral, a compiler for gradually typed Python that runs type inference and generates fast, equivalent LLVM IR whenever possible.
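To illustrate the idea behind gradual typing (a hypothetical example for exposition, not Coral’s actual pipeline or API): when a function is fully annotated, its types can be checked statically and the code specialized to a fast native path, while unannotated code falls back to ordinary dynamic dispatch.

```python
# Illustrative sketch of gradual typing (hypothetical example, not Coral's syntax).

def dot(xs: list, ys: list) -> float:
    # All types are declared, so a gradually typed compiler could verify this
    # function statically and emit a tight native loop (e.g. in LLVM IR).
    total = 0.0
    for x, y in zip(xs, ys):
        total += x * y
    return total

def dyn_dot(xs, ys):
    # No annotations: the element types are only known at runtime,
    # so this function stays on the slower, fully dynamic path.
    total = 0
    for x, y in zip(xs, ys):
        total += x * y
    return total

print(dot([1.0, 2.0], [3.0, 4.0]))  # 11.0
```

The appeal of the gradual approach is that both styles coexist in one program: annotations are an opt-in optimization and checking aid, not a requirement.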

I wrote Espresso, a Bash-Python hybrid shell that I used for several years, though I’ve recently migrated to Xonsh.

I like writing my own tools and libraries to speed up development, some of which I’ve packaged into the mlb Python library.