Research
My research connects three questions: how brains compute social meaning, how to build tools that make natural behavior measurable, and how vision models process information.
Social Cognition
How do brains compute social meaning during natural behavior?
I study the neural and behavioral basis of social intelligence in primates — both human and non-human — combining wireless neural recordings, quantitative behavioral analysis, and computational modeling to understand how brains process social information under naturalistic conditions. Recordings from the macaque mid-superior temporal sulcus (mid-STS) reveal that this region tracks spatial context, behavioral state, and social prediction errors during natural behavior.
AI for Science
Building measurement tools that make natural behavior scientifically accessible.
I build machine learning tools that enable quantitative study of human and animal behavior at scale — from primate face analysis and pose estimation to clinical video understanding and movement science. These tools serve a dual role: as scientific instruments for behavioral measurement and as benchmarks for evaluating how well AI models generalize.
- Grounding Intelligence in Movement
- PrimateFace: A Machine Learning Resource for Automated Face Analysis in Human and Non-human Primates
- Computational Kinematics of Dance: Distinguishing Hip Hop Genres
- Quantifying cross-species primate facial cues
- Vision-language models for decoding provider attention during neonatal resuscitation
- A large language model-assisted education tool to provide feedback on open-ended responses
Vision & Interpretability
Rigorous methods for understanding what vision models learn.
In a new research direction, I study how large vision models perceive the world, and I develop interpretability methods that make their internal representations understandable.