Understanding how primates move, communicate, and interact in their natural environments is one of the problems I care about most in biology. Since around 2011, researchers have built systems that detect primate faces, reconstruct 3D body pose from dozens of synchronized cameras, classify complex social behaviors, decode vocalizations, and generate realistic 3D avatars. The work now spans 14 topic areas, dozens of species from lemurs to great apes, and methods ranging from detection and pose estimation to facial action coding, hand tracking, species identification, and reinforcement learning.
To help the community navigate this growing literature, we built Awesome Computational Primatology (GitHub, HF) — a curated, open registry of 97+ papers at the intersection of machine learning and primatology, with an AI-powered chat assistant for querying the corpus in natural language.
But the diversity of approaches also shows how far we have to go. No single method, dataset, or species captures the full complexity of primate behavior — and too many models and datasets remain siloed or invisible to researchers working on related problems. That is why resources like this matter: by connecting work across species, modalities, and methods, we can see where the gaps are and where open tools already exist. If you work at this intersection — or want to — we would love your contributions. Add a paper, open-source a model, share a dataset. Solving behavior understanding in primates is not something any one lab will crack alone; it will take a community building bridges across all of these approaches, and I believe this generation of researchers is up for it.