Current Projects

Check out what we're working on

How do neurons see the world?

Typical neuroscience experiments start by assuming we know the set of variables that drive neural activity. But what if neurons are tuned to variables we would never have guessed? What if, as with social interaction, the stimulus set is too complex to be boiled down to a few dimensions? With Jeff Beck, we’re developing models that infer stimulus categories directly from data, allowing us to “tag” images and movies based on neural responses.

Neural responses are sums of sensitivities to binary image "tags."

In an example dataset, the model correctly tagged monkey faces, whole monkeys, and monkey body parts.
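
To make the idea concrete, here is a minimal sketch of the generative picture above, assuming a simple linear-Gaussian form: each neuron’s response is a weighted sum of the binary tags present in an image, and tags can be read back out from responses. All sizes and names, and the crude least-squares decoder, are illustrative only, not the lab’s actual model (which infers tags and weights jointly from data).

```python
import numpy as np

rng = np.random.default_rng(0)
n_images, n_tags, n_neurons = 500, 8, 100   # hypothetical sizes

# Each image either has a tag or it doesn't (e.g., "face", "body part").
tags = rng.integers(0, 2, size=(n_images, n_tags))

# Each neuron has a sensitivity (weight) to each tag.
weights = rng.normal(size=(n_tags, n_neurons))

# Neural responses: sums of per-tag sensitivities, plus noise.
responses = tags @ weights + 0.1 * rng.normal(size=(n_images, n_neurons))

# Inference runs the other way: given responses, recover each image's
# binary tags. Here we cheat by assuming the weights are known and
# using a least-squares projection; the real problem is harder.
tag_estimates = (responses @ np.linalg.pinv(weights) > 0.5).astype(int)
print("tag recovery rate:", (tag_estimates == tags).mean())
```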

Strategic social decision-making

Most neuroscience experiments begin by stripping away as much of the complexity of the real world as they can afford to. But when the phenomena of interest are our social interactions — who we trust, who we fight with, who we love — there’s only so much complexity you can remove. In P[λ]ab, we’re studying the ways in which humans and other primates make strategic social decisions in real time by recording from the brain as pairs of individuals play dynamic games.

The "penalty shot" task. The goalie (red) attempts to block the puck (blue).

By modeling these interactions, we’re able to generate realistic samples of play, as well as characterize players’ strategies.

Real puck trajectories.
Generated puck trajectories.
A "potential energy" function explains player dynamics.

Work in progress: Real-time analysis of neural data

Thanks to advances in microscopy and calcium indicators, it’s now possible to collect terabytes of data in a single experiment. But that increase in data volume comes at the cost of increased processing time. Fortunately, recent work on preprocessing algorithms for imaging data, along with methods for characterizing cell responses and inferring the functional relationships between them, has made it possible to envision a real-time pipeline for neural data analysis.

Together with Eva Naumann’s lab, we’re working to develop a fully integrated online analysis platform that will facilitate closed-loop, all-optical control in the larval zebrafish. This is work in progress, so stay tuned!

A whole-brain zebrafish activity map, showing motion-sensitive neurons color-coded by preferred motion direction.
Concept for the closed-loop pipeline. Neural data from the zebrafish are collected in the form of images, preprocessed, and analyzed in real-time. Targets for optical stimulation are then chosen based on the results of this analysis, creating adaptive experiments that test causal hypotheses.
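
Schematically, one iteration of such a loop might look like the sketch below. Every function is a placeholder standing in for a real acquisition, preprocessing, modeling, or stimulation component; none of the names come from an actual library.

```python
import numpy as np

def acquire_frame():
    """Placeholder: grab the next imaging frame from the microscope."""
    return np.random.rand(512, 512)

def preprocess(frame, cells):
    """Placeholder: motion-correct the frame and extract per-cell
    fluorescence traces for the current set of segmented cells."""
    return np.random.rand(len(cells))

def update_model(traces, model):
    """Placeholder: update running estimates of each cell's tuning
    and the functional couplings between cells."""
    model["responses"].append(traces)
    return model

def choose_targets(model, n_targets=5):
    """Placeholder: pick the cells whose stimulation would be most
    informative for the current causal hypothesis."""
    latest = model["responses"][-1]
    return np.argsort(latest)[-n_targets:]

def stimulate(targets):
    """Placeholder: steer the optical stimulation to the chosen cells."""
    print("stimulating cells:", targets)

cells = list(range(200))          # hypothetical segmentation
model = {"responses": []}
for _ in range(3):                # each pass processes one frame
    traces = preprocess(acquire_frame(), cells)
    model = update_model(traces, model)
    stimulate(choose_targets(model))
```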

Eye tracking unplugged

Where we look speaks volumes about what we’re thinking. For over a century, psychologists and neurobiologists have used the movements of the eyes and measurements of pupil size to study the mind, but the need for experimental control has limited our ability to study eye movements in naturalistic settings. In P[λ]ab, we are pairing new advances in eye tracking technology with methods in computer vision and machine learning to tackle the challenge of studying eye movements in real-world settings, with applications ranging from the treatment of acute fear to the study of how we view art.

Mapping gaze between three and two dimensions.
Gaze mapping during free viewing of art.
Three-dimensional reconstruction of viewer position.
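
A core geometric step in mapping gaze from three dimensions to two is intersecting the viewer’s 3D gaze ray with the plane of the artwork being viewed. Here is a minimal sketch of that step; the function name and all coordinates are hypothetical.

```python
import numpy as np

def gaze_on_plane(eye, gaze_dir, plane_point, plane_normal):
    """Intersect a 3D gaze ray with a plane (e.g., a painting's surface).

    eye          -- 3D position of the eye
    gaze_dir     -- unit vector along the line of sight
    plane_point  -- any point on the plane
    plane_normal -- unit normal of the plane
    Returns the 3D intersection point, or None if the gaze is parallel
    to the plane or the plane lies behind the viewer.
    """
    denom = np.dot(gaze_dir, plane_normal)
    if abs(denom) < 1e-9:
        return None                   # gaze parallel to the plane
    t = np.dot(plane_point - eye, plane_normal) / denom
    if t < 0:
        return None                   # plane is behind the viewer
    return eye + t * gaze_dir

# Hypothetical example: a viewer standing 2 m from a wall-mounted artwork.
eye = np.array([0.0, 1.6, 2.0])
gaze = np.array([0.1, -0.05, -1.0])
gaze /= np.linalg.norm(gaze)
hit = gaze_on_plane(eye, gaze,
                    plane_point=np.array([0.0, 1.5, 0.0]),
                    plane_normal=np.array([0.0, 0.0, 1.0]))
print(hit)   # 3D point on the artwork plane; project to 2D image coords next
```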