Amplifying people's abilities in and outside of the lab


Our lab finds new ways to enhance people's abilities with training and technology


View of an urban street with cars, trees, and buildings.

Applied Research

Outside of the lab

Building the next generation of assistive technologies

Our goal is to impact life outside of the lab. To do this, we characterize the real-world challenges people with disabilities face (e.g., shopping, navigating, and identifying facial expressions) in order to develop the most effective assistive technologies. We use augmented reality, eye tracking, and voice and sound technologies to build new tools that address these challenges. Most recently, our research has focused on people with low vision.

Basic Science

In the lab

Revealing how the brain learns new perceptual information

We examine the interacting processes of visual perception, attention, and learning, and how they develop across the lifespan. We test whether and how these processes differ between typical populations and people with disabilities. Our methods include psychophysics, eye tracking, and computational modeling, which we use to reveal how the brain learns to process new perceptual information.


Feb 2023

Open PhD and postdoc positions in the lab to study visual plasticity and augmented reality! See info here!

August 2022

Yay! The Israeli National Science Foundation has awarded us five years of funding to study how objects appear after perceptual learning. We are also very fortunate to have received an equipment grant to fund our new lab!

April 2022

We were awarded funding from the Psychobiology Institute to study perceptual learning in audition and vision!

December 2021

Renana presented her project on the perception of facial expression in autism at the ISCOP conference (online, unfortunately).

August 2020

The Promobilia Foundation has awarded us funding to study ways to improve mobility for people with low vision.


If you are interested in joining the team, please send an email to sarit[dot] with your CV and a description of your research interests.