Broadly, my research spans machine learning, artificial intelligence, cognition, and neuroscience.
data science tools for animal vocalizations and bioacoustics
My work in applied machine learning focuses mainly on tools for animal vocalizations and bioacoustics.
Most recently, I collaborated with Yarden Cohen and Tim Gardner to develop a neural network that learns how to annotate birdsong from spectrograms:
neural network toolbox for animal vocalizations and bioacoustics
a tool to work with any format for annotating animal vocalizations
hybrid-vocal-classifier: a Python machine learning library for animal vocalizations and bioacoustics https://hybrid-vocal-classifier.readthedocs.io/en/latest/
extends scikit-learn with functionality for researchers studying animal vocalizations
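To give a flavor of this kind of pipeline, here is a minimal, self-contained sketch: spectrogram features computed with scipy feed a scikit-learn classifier. The data is synthetic and every name here is my own illustration, not hybrid-vocal-classifier's actual API.

```python
# Toy pipeline: spectrogram features from audio clips -> scikit-learn SVM.
# All data is synthetic; this is NOT hybrid-vocal-classifier's API.
import numpy as np
from scipy.signal import spectrogram
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
fs = 32000  # sampling rate (Hz), typical for birdsong recordings

def make_clip(freq):
    """Synthesize a noisy 50 ms tone as a stand-in for a syllable."""
    t = np.arange(int(0.05 * fs)) / fs
    return np.sin(2 * np.pi * freq * t) + 0.1 * rng.standard_normal(t.size)

def features(clip):
    """Mean log power per frequency bin of the spectrogram."""
    _, _, sxx = spectrogram(clip, fs=fs, nperseg=256)
    return np.log(sxx + 1e-10).mean(axis=1)

# two "syllable classes" at different pitches
clips = [make_clip(f) for f in [2000] * 50 + [6000] * 50]
X = np.array([features(c) for c in clips])
y = np.array([0] * 50 + [1] * 50)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC().fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

The two synthetic classes are cleanly separated in frequency, so the classifier should do well; real syllables are far messier.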
functions for working with files created by the EvTAF program and the evsonganaly GUI.
The functions can be used to work with Bengalese finch song in this data repository:
code to accompany the SciPy 2016 Proceedings Paper "Comparison of machine learning methods applied to birdsong element classification"
a measure of similarity between birdsongs (and other things), developed in collaboration with David Mets, as described in:
Mets, David G., and Michael S. Brainard. "An automated approach to the quantitation of vocalizations and vocal learning in the songbird." PLoS Computational Biology 14.8 (2018): e1006437.
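As a toy illustration of what a similarity score between two songs can look like, the sketch below correlates the log spectrograms of two synthetic clips. This is a generic approach chosen for illustration only, not the method from Mets & Brainard (2018).

```python
# Toy similarity score: Pearson correlation between log spectrograms
# of two clips. Generic illustration, NOT Mets & Brainard's method.
import numpy as np
from scipy.signal import spectrogram

fs = 32000  # sampling rate (Hz)

def tone(freq, dur=0.1, seed=0):
    """Synthesize a noisy tone as a stand-in for a song."""
    rng = np.random.default_rng(seed)
    t = np.arange(int(dur * fs)) / fs
    return np.sin(2 * np.pi * freq * t) + 0.05 * rng.standard_normal(t.size)

def spect(x):
    _, _, sxx = spectrogram(x, fs=fs, nperseg=256)
    return np.log(sxx + 1e-10)

def similarity(a, b):
    """Correlation of flattened log spectrograms, in [-1, 1]."""
    sa, sb = spect(a).ravel(), spect(b).ravel()
    sa, sb = sa - sa.mean(), sb - sb.mean()
    return float(sa @ sb / (np.linalg.norm(sa) * np.linalg.norm(sb)))

same = similarity(tone(3000, seed=1), tone(3000, seed=2))  # same pitch
diff = similarity(tone(3000, seed=1), tone(8000, seed=2))  # different pitch
```

Two clips at the same pitch should score higher than clips at different pitches, which is the basic behavior any such measure needs.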
visual search and visual attention
During my post-doctoral fellowship at Emory University in Atlanta, Georgia, I worked with Astrid Prinz in the Biology department on brain-inspired algorithms for continual machine learning, as part of a DARPA program. The Prinz lab provided neuroscience expertise for members of our team working on algorithms for goal-driven perception. My goal for this project was to understand visual search: how does our brain solve the problem of finding an object we're looking for?
experiments to test whether the untangling mechanism proposed for object recognition can also account for behavior measured in visual search tasks, using deep neural network models of the primate ventral visual stream.
library for modeling visual search behavior with neural networks.
Python package to make stimuli like those used in classic visual search experiments.
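As a minimal sketch of the kind of display these stimuli implement, the snippet below places distractor items on a grid with a single target at a random cell. The function and its encoding are my own illustration, not the package's API.

```python
# Sketch of a classic visual search display: distractors on a grid
# with exactly one target. Illustrative only, not searchstims' API.
import numpy as np

def make_display(grid=5, set_size=8, seed=0):
    """Return a grid x grid array: 0 = empty, 1 = distractor, 2 = target."""
    rng = np.random.default_rng(seed)
    # pick `set_size` distinct cells, fill them with distractors
    cells = rng.choice(grid * grid, size=set_size, replace=False)
    display = np.zeros(grid * grid, dtype=int)
    display[cells] = 1
    # promote one occupied cell to the target
    display[rng.choice(cells)] = 2
    return display.reshape(grid, grid)

disp = make_display()
```

Varying `set_size` while measuring response time is the standard manipulation in these experiments: search slope as a function of the number of items.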
Recurrent Models of Visual Attention
replication of "Recurrent models of visual attention", Mnih et al. 2014
aver: active vision models in Nengo
framework for visual search models that incorporate eye movements.
proposal for Nengo summer school 2019.
study of the Cortes et al. 1994 "Learning Curves" paper, using scikit-learn and scipy, for the first Atlanta Jupyter Day.
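In the spirit of that study, here is a small sketch that measures a learning curve with scikit-learn and fits the classic power-law form error(m) ≈ a·m^(-b) + c with scipy. The dataset and model here are arbitrary stand-ins, not the ones from the original study.

```python
# Measure a learning curve and fit a power law to test error.
# Dataset (digits) and model (logistic regression) are stand-ins.
import numpy as np
from scipy.optimize import curve_fit
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import learning_curve

X, y = load_digits(return_X_y=True)
sizes, _, test_scores = learning_curve(
    LogisticRegression(max_iter=1000), X, y,
    train_sizes=np.linspace(0.1, 1.0, 5), cv=3)
test_err = 1.0 - test_scores.mean(axis=1)  # error at each training size

def power_law(m, a, b, c):
    """Classic learning-curve form: error decays as a power of set size."""
    return a * np.power(m, -b) + c

params, _ = curve_fit(power_law, sizes, test_err,
                      p0=[1.0, 0.5, 0.01], maxfev=10000)
```

Test error should fall as the training set grows, which is what the fitted power law summarizes.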
teaching / organizing
Data Science for Scientists ATL
BEST data science short course
short course in scientific computing and data science sponsored by the Atlanta BEST program.
EWIN Coding Bootcamp
coding bootcamp sponsored by Emory Women In Neuroscience