A unified neural representation model for spatial and conceptual computations



Bibliographic Details
Published in: Proceedings of the National Academy of Sciences - PNAS, Vol. 122, no. 11, p. e2413449122
Main Authors: Haga, Tatsuya; Oseki, Yohei; Fukai, Tomoki
Format Journal Article
Language: English
Published: United States: National Academy of Sciences, 18.03.2025
ISSN: 0027-8424, 1091-6490
DOI: 10.1073/pnas.2413449122


Summary:

Significance: The hippocampus and entorhinal cortex exhibit place-specific and grid-like neural activity patterns (place cells and grid cells) when an animal travels through physical space. Neurons in the same brain regions that respond to specific nonspatial semantic concepts are called concept cells. How are these neural representations related? In this paper, we propose a unified computational model of spatial navigation and semantic representation learning, demonstrating that the model produces neural representations resembling place cells, grid cells, and concept cells. Our model suggests a tight theoretical relationship between spatial and semantic neural representations in the brain.

The hippocampus and entorhinal cortex encode spaces through spatially local and hexagonal grid activity patterns (place cells and grid cells), respectively. The same brain regions also harbor neural representations of nonspatial, semantic concepts (concept cells). These observations suggest that the neurocomputational mechanisms for spatial knowledge and semantic concepts are related in the brain, but their exact relationship remains to be understood. Here, we show a mathematical correspondence between a value function for goal-directed spatial navigation and an information measure for word embedding models in natural language processing. Based on this relationship, we integrate spatial and semantic computations into a neural representation model called "disentangled successor information" (DSI). DSI generates biologically plausible neural representations: spatial representations like place cells and grid cells, and concept-specific word representations that resemble concept cells. Furthermore, with DSI representations, we can infer spatial contexts and words within a common computational framework based on simple arithmetic operations. This computation can be biologically interpreted as partial modulation of cell assemblies of nongrid cells and concept cells. Our model offers a theoretical connection between spatial and semantic computations and suggests possible computational roles of hippocampal and entorhinal neural representations.
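The "simple arithmetic operations" mentioned in the summary echo the additive vector arithmetic familiar from word embedding models. A minimal illustrative sketch of that style of inference follows; the toy vectors and the `nearest` helper are our own invention for illustration, not the paper's DSI representations or code:

```python
import numpy as np

# Toy embedding vectors (illustrative only, not learned DSI representations).
emb = {
    "king": np.array([0.9, 0.1, 0.8]),
    "queen": np.array([0.9, 0.9, 0.8]),
    "man": np.array([0.1, 0.1, 0.2]),
    "woman": np.array([0.1, 0.9, 0.2]),
}

def nearest(query, vocab):
    """Return the word whose vector is most cosine-similar to `query`."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(vocab, key=lambda w: cos(vocab[w], query))

# Inference by simple vector arithmetic: "king" - "man" + "woman" ≈ "queen".
query = emb["king"] - emb["man"] + emb["woman"]
print(nearest(query, emb))  # → queen
```

In word2vec-style models, this kind of additive composition answers analogy queries; the paper's claim is that an analogous arithmetic over DSI representations supports inference of both spatial contexts and words.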
Edited by Edvard Moser, Norwegian University of Science and Technology, Trondheim, Norway; received July 10, 2024; accepted January 26, 2025