Representational Content of Oscillatory Brain Activity during Object Recognition: Contrasting Cortical and Deep Neural Network Hierarchies

Authors: Leila Reddy (1,2), Radoslaw Martin Cichy (3), Rufin VanRullen (4,2)

Affiliations: 1. ANITI, Université de Toulouse 3, Toulouse 31052, France (leila.reddy@cnrs.fr). 2. Centre National de la Recherche Scientifique, Centre de Recherche Cerveau et Cognition (CerCo), Toulouse 31052, France. 3. Department of Education and Psychology, Freie Universität Berlin, Berlin, Germany. 4. ANITI, Université de Toulouse 3, Toulouse 31052, France.

Published: eNeuro, 26 Apr 2021. doi: 10.1523/ENEURO.0362-20.2021. Word count: 344.

Abstract

Numerous theories propose a key role for brain oscillations in visual perception. Most of these theories postulate that sensory information is encoded in specific oscillatory components (e.g., power or phase) of specific frequency bands. These theories are often tested with whole-brain recording methods of low spatial resolution (EEG or MEG), or with depth recordings that provide a local, incomplete view of the brain. Opportunities to bridge the gap between local neural populations and whole-brain signals are rare. Here, using representational similarity analysis (RSA) in human participants, we explore which MEG oscillatory components (power and phase, across various frequency bands) correspond to low- or high-level visual object representations, using brain representations from fMRI, or layer-wise representations in seven recent deep neural networks (DNNs), as templates for low- and high-level object representations. The results showed that around stimulus onset and offset, most transient oscillatory signals correlated with low-level brain patterns (V1). During stimulus presentation, sustained β (∼20 Hz) and γ (>60 Hz) power best correlated with V1, while oscillatory phase components correlated with IT representations.
Surprisingly, this pattern of results did not always correspond to low-level or high-level DNN layer activity. In particular, sustained β-band oscillatory power reflected high-level DNN layers, suggestive of a feedback component. These results begin to bridge the gap between whole-brain oscillatory signals and object representations supported by local neuronal activations.

Significance Statement

Brain oscillations are thought to play a key role in visual perception. We asked how oscillatory signals relate to visual object representations in localized brain regions, and how these representations evolve over time in terms of their complexity. We used representational similarity analysis (RSA) between MEG oscillations (considering both phase and amplitude) and (1) fMRI signals (to assess local activations along the cortical hierarchy), or (2) feedforward deep neural network (DNN) layers (to probe the complexity of visual representations). Our results reveal a complex picture, with the successive involvement of different oscillatory components (phase, amplitude) in different frequency bands and in different brain regions during visual object recognition.

Keywords: MEG; brain oscillations; deep neural networks; fMRI; object recognition; representational similarity analysis.

PMID: 33903182. DOI: 10.1523/ENEURO.0362-20.2021
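For readers unfamiliar with RSA, the core comparison described in the abstract can be sketched in a few lines: each modality (MEG oscillatory features, fMRI voxel patterns, or DNN layer activations) is summarized as a representational dissimilarity matrix (RDM) over stimulus conditions, and two modalities are compared by rank-correlating their RDMs. This is a minimal, generic sketch using standard NumPy/SciPy calls, not the authors' actual pipeline; the variable names, condition count, and feature dimensions are illustrative assumptions.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rdm(patterns):
    """Condensed RDM: 1 - Pearson correlation between every pair of
    condition patterns (rows of a conditions x features matrix)."""
    return pdist(patterns, metric="correlation")

def rsa_correlation(patterns_a, patterns_b):
    """Spearman rank correlation between two modalities' RDMs,
    computed over the same set of stimulus conditions."""
    rho, _ = spearmanr(rdm(patterns_a), rdm(patterns_b))
    return rho

# Toy stand-ins (hypothetical): 92 stimulus conditions, 50 features each.
rng = np.random.default_rng(0)
meg_like = rng.standard_normal((92, 50))
fmri_like = meg_like + 0.1 * rng.standard_normal((92, 50))  # noisy copy

print(f"RSA correlation: {rsa_correlation(meg_like, fmri_like):.2f}")
```

Because the comparison happens in RDM space, signals with entirely different native formats (sensor-level oscillatory power or phase, voxel activations, unit activations in a DNN layer) become directly comparable, which is what allows MEG components to be matched against either fMRI regions or DNN layers in the study.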