Friday, January 5, 2018
10:30 - 11:45

Alladi Ramakrishnan Hall

Spatiotemporal patterns of eye movements and brain activity underlying the visual perception of complex tool-use

Nikhilesh Natraj

University of California San Francisco

The ability to understand the affordances of tools (i.e., how they are to be grasped and used) is thought to be a defining characteristic of the human species. For example, when we sit down to eat, we immediately piece together the functional significance of all the utensils in our visual space. As simple as it may appear, this is an extremely hard decision problem that robotics has yet to solve. Understanding how the human brain solves it has immediate relevance to robotics and to clinical neurology, where patients with brain injuries sometimes show deficits in understanding affordances. My doctoral dissertation (Neuroscience, Georgia Institute of Technology) addressed this problem by analyzing patterns of eye movements and brain activity as healthy human subjects were presented with images of tools, objects, and various tool-grasps. The collected data are inherently high-dimensional, given the vast combinations of stimuli, brain sensors, decisions, and temporal trajectories of eye movements. To analyze them, a novel statistical model of human attention (a Bayesian-Markov model) was developed, and statistical theories of exchangeability, robustness, and dimensionality reduction were applied in the analysis of brain activity. Briefly, results from this work showed very high temporal sensitivity of brain activity and eye movements when subjects made perceptual decisions about the affordances of tool-grasps, all within a few hundred milliseconds of viewing the image.
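To give a flavor of the kind of model mentioned above: a minimal sketch of a Bayesian first-order Markov model of gaze is shown below, where fixation sequences over regions of a stimulus are summarized by a transition matrix estimated with a Dirichlet prior. The region labels, prior strength, and data here are illustrative assumptions, not the dissertation's actual model or stimuli.

```python
import numpy as np

# Hypothetical areas of interest (AOIs) a fixation can land on;
# these labels are illustrative, not the study's actual stimulus regions.
AOIS = ["handle", "blade", "object", "background"]
IDX = {a: i for i, a in enumerate(AOIS)}

def transition_matrix(scanpaths, n_states=len(AOIS), alpha=1.0):
    """Estimate a first-order Markov transition matrix from fixation
    sequences, using a symmetric Dirichlet(alpha) prior (Laplace-style
    smoothing) so unseen transitions keep nonzero probability."""
    counts = np.full((n_states, n_states), alpha)
    for path in scanpaths:
        for prev, nxt in zip(path, path[1:]):
            counts[IDX[prev], IDX[nxt]] += 1
    # Each row is the posterior-mean distribution over next fixations.
    return counts / counts.sum(axis=1, keepdims=True)

# Toy scanpaths: sequences of fixated AOIs from two hypothetical trials.
paths = [
    ["background", "handle", "blade", "object"],
    ["background", "object", "handle", "handle"],
]
T = transition_matrix(paths)
```

Comparing such transition matrices across conditions (e.g., different tool-grasps) is one standard way eye-movement dynamics are quantified, though the actual dissertation model may differ in structure and inference.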
