
CBMM: The brain’s operating system. Research Update 1

We hope that you will be able to join next week’s research meeting, featuring presentations by Trenton Bricken and Will Xiao of the Kreiman Lab.

CBMM Research Meeting

Title: Module 2 Research presentation

Speakers: Trenton Bricken and Will Xiao, Kreiman Lab

Date/Time: October 19, 2021, 4:00 pm to 5:30 pm ET

RSVP for the post-meeting social

Will Xiao

Will Xiao’s presentation:

Title: What you see is what IT gets: Responses in primate visual cortex during natural viewing

Abstract: How does the brain support our ability to see? Studies of primate vision have typically focused on controlled viewing conditions exemplified by the rapid serial visual presentation (RSVP) task, where the subject must hold fixation while images are flashed briefly in randomized order. In contrast, during natural viewing, the eyes move frequently, guided by subject-initiated saccades, resulting in a sequence of related sensory inputs. Thus, natural viewing departs from traditional assumptions of independent and unpredictable visual inputs, leaving open the question of how visual neurons respond in real life. We recorded responses of inferior temporal (IT) cortex neurons in macaque monkeys freely viewing natural images. We first examined responses of face-selective neurons and found that face neurons responded according to whether individual fixations were near a face, meticulously distinguishing single fixations. Second, we considered repeated fixations at nearly the same location, termed ‘return fixations.’ Responses were more similar during return fixations, and again distinguished individual fixations. Third, computational models could partially explain neuronal responses from an image crop centered on each fixation. These results shed light on how the IT cortex does (and does not) contribute to our daily visual percept: a stable world despite frequent saccades.
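
As a rough illustration of the modeling approach mentioned at the end of the abstract, the sketch below predicts a neuron’s firing rate from features of an image crop centered on each fixation, using a simple linear (ridge) readout. This is only a generic sketch with assumed names, sizes, and toy data (crop_around_fixation, features, and the array shapes are all illustrative), not the lab’s actual analysis pipeline; a real model would replace the flattened-pixel features with activations from a pretrained image model.

```python
import numpy as np

# Hypothetical sketch: predict one neuron's firing rate from a fixation-centered crop.
# All names, sizes, and data here are illustrative only.

def crop_around_fixation(image, fix_xy, size=32):
    """Square crop centered on the fixation point (x, y)."""
    x, y = fix_xy
    half = size // 2
    top, left = max(0, y - half), max(0, x - half)
    return image[top:top + size, left:left + size]

def features(crop):
    """Placeholder encoder: flattened pixels; a real model would use CNN activations."""
    return crop.reshape(-1)

def fit_ridge(X, y, lam=1.0):
    """Ridge regression from crop features X to firing rates y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Toy data: 50 grayscale "images", one fixation and one firing rate per image.
rng = np.random.default_rng(0)
images = rng.random((50, 128, 128))
fixations = rng.integers(16, 112, size=(50, 2))   # (x, y), kept away from image borders
rates = rng.random(50)

X = np.stack([features(crop_around_fixation(im, fx)) for im, fx in zip(images, fixations)])
w = fit_ridge(X, rates)
print("predicted rate for first fixation:", X[0] @ w)
```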

Video presentation

Trenton Bricken

Trenton Bricken’s presentation:

Title: Attention Approximates Sparse Distributed Memory

Abstract: While Attention has come to be an important mechanism in deep learning, it emerged out of a heuristic process of trial and error, providing limited intuition for why it works so well. Here, we show that Transformer Attention closely approximates Sparse Distributed Memory (SDM), a biologically plausible associative memory model, under certain data conditions. We confirm that these conditions are satisfied in pre-trained GPT2 Transformer models. We discuss the implications of the Attention-SDM map and provide new computational and biological interpretations of Attention.
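
To make the Attention-SDM connection in the abstract more concrete, here is a minimal numerical sketch (my own illustration, not code from the paper): with unit-norm queries and keys, softmax attention weights can be rewritten as weights that decay exponentially with the squared Euclidean distance to each key, the same distance-decaying form used to approximate SDM’s circle-intersection read weights. The dimension, number of stored patterns, and the inverse temperature beta below are assumed values.

```python
import numpy as np

# Minimal sketch: with L2-normalized query q and keys K, softmax attention weights
# depend only on the distance between q and each key, i.e. they decay exponentially
# with squared Euclidean distance -- the functional form that approximates SDM's
# circle-intersection read weights.

rng = np.random.default_rng(0)
d, n = 64, 10          # key/query dimension, number of stored patterns (assumed)
beta = 8.0             # softmax inverse temperature (assumed)

K = rng.normal(size=(n, d))
K /= np.linalg.norm(K, axis=1, keepdims=True)    # unit-norm keys
q = rng.normal(size=d)
q /= np.linalg.norm(q)                           # unit-norm query

# Standard softmax attention weights over dot products
scores = beta * (K @ q)
attn = np.exp(scores - scores.max())
attn /= attn.sum()

# The same weights expressed as an exponential of negative squared distance,
# since ||q - k||^2 = 2 - 2 q.k for unit vectors
sq_dist = np.sum((K - q) ** 2, axis=1)
w = np.exp(-0.5 * beta * sq_dist)
w /= w.sum()

print(np.allclose(attn, w))   # True: attention reads memories with distance-decaying weights
```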

Video presentation

The Fall 2021 CBMM Research Meetings will be hosted in a hybrid format. Please see the information below regarding attending the event either in person or remotely via Zoom.