Collaboratory for Adaptive Sensory Technologies

Fellows




Thomas Albright

Professor

Tom Albright is the director of the Vision Center Laboratory and the Center for Neurobiology of Vision at the Salk Institute. He studies the neural structures and events underlying the perception of motion, form and color. Albright is one of the leaders of the Academy of Neuroscience for Architecture, for which he served as president in 2013–2014. “Good architects have lots of intuitions, and that’s why good architecture works,” says Albright, explaining his interest in the link between neuroscience and architecture. “Our hope is that we can identify principles backing up those intuitions that are more deeply rooted in knowledge about how the brain works. We’d like to be able to identify, for example, what particular elements would give you a better space for learning.”

Projects

Perceptual scaling for eyewitness identification
Neural mechanisms of adaptation in visual systems
Automated individual displays for persons with dementia


Sergei Gepshtein

Staff Scientist

Sergei Gepshtein is the founding director of the Collaboratory for Adaptive Sensory Technologies at the Salk Institute. He studies visual perception and visually guided behavior using methods of sensorimotor psychophysics and computational neuroscience. One of Gepshtein’s long-standing interests is the interaction between two aspects of visual perception: the entry process called early vision and the constructive process called perceptual organization. Early vision captures visual information and thus determines the boundaries between the visible and invisible. Perceptual organization creates visual meaning; it constructs our visual experience (the visual world) from the information captured by early vision. Gepshtein also directs several translational studies in the design of visual media and built environments.

Projects

Perceptual scaling for eyewitness identification
Vision science for dynamic architecture
Adaptive perception of motion in natural environments
Towards optimal sensory ecology of built environment
Prospective optimization with limited resources
Automated individual displays for persons with dementia
A lightweight platform for visual psychophysics


Satchidananda Panda

Professor

Satchin Panda leads the Regulatory Biology Laboratory at the Salk Institute, where he and his colleagues study the molecular basis of the circadian timekeeping mechanism in mammals. The goal is to understand how light is perceived by specialized light-sensitive ganglion cells in the retina. These cells entrain the master circadian oscillator, which resides in a neural structure called the suprachiasmatic nucleus. This work helps explain the regulatory mechanisms that maintain the nearly 24-hour molecular rhythm in the master oscillator and that generate rhythms in gene function, which ultimately produce overt rhythms in physiology and behavior.

Projects

Towards optimal sensory ecology of built environment


Tatyana Sharpee

Professor

Tatyana Sharpee leads the Computational Neurobiology Laboratory at the Salk Institute. The laboratory works on theoretical principles of how the brain processes information. Sharpee is interested in how sensory processing in the brain is shaped by the need to create parsimonious representations of events in the outside world. She pursues these questions using methods derived from statistical physics, mathematics and information theory. Of particular interest to Sharpee is how we perceive natural stimuli, such as video clips or sound recordings from natural environments: forests, city streets, and so on. This contrasts with much of the previous work in sensory neuroscience, which studied neuronal responses to simple patterns made of dots, lines and luminance gratings.

Projects

Object recognition in natural dynamic environments
Adaptive perception of motion in natural environments
Optimal decision making in dynamic environments