My main research interests lie at the interface of representation theory and machine learning. Specific topics of interest include equivariant neural networks, model compression, parameter space symmetries, and gradient flow dynamics. My background is in geometric representation theory.
Previously, I worked in the research group of Rami Aizenbud at the Weizmann Institute of Science, and before that in the Hausel group at IST Austria. I completed a PhD in mathematics at the University of Texas at Austin in 2016 under the guidance of David Ben-Zvi.
For my papers and preprints, see the research page. This site also contains a collection of notes on representation theory, as well as links to events and outreach activities.
Fall 2022: Deep Learning.
Winter 2020–2021: Algebraic Groups.