Neural correlates of human action observation in hearing and deaf subjects

Bibliographic Details
Published in Brain research, Vol. 1152, pp. 111-129
Main Authors Corina, David; Chiu, Yi-Shiuan; Knapp, Heather; Greenwald, Ralf; San Jose-Robertson, Lucia; Braun, Allen
Format Journal Article
Language English
Published London: Elsevier B.V., 04.06.2007
ISSN 0006-8993, 1872-6240
DOI 10.1016/j.brainres.2007.03.054

Summary: Accumulating evidence has suggested the existence of a human action recognition system involving inferior frontal, parietal, and superior temporal regions that may participate in both the perception and execution of actions. However, little is known about the specificity of this system in response to different forms of human action. Here we present data from PET neuroimaging studies of passive viewing of three distinct action types: intransitive self-oriented actions (e.g., stretching, rubbing one's eyes), transitive object-oriented actions (e.g., opening a door, lifting a cup to the lips to drink), and the abstract, symbolic actions (signs) used in American Sign Language. Our results show that these different classes of human actions engage a frontal/parietal/STS human action recognition system in a highly similar fashion. However, the results indicate that this neural consistency across motion classes holds primarily for hearing subjects. Data from deaf signers show a non-uniform response to different classes of human actions. As expected, deaf signers engaged left-hemisphere perisylvian language areas during the perception of signed language signs. Surprisingly, these subjects did not engage the expected frontal/parietal/STS circuitry during passive viewing of non-linguistic actions, but rather reliably activated middle-occipital temporal-ventral regions, which are known to participate in the detection of human bodies, faces, and movements. Comparisons with data from hearing subjects establish statistically significant contributions of middle-occipital temporal-ventral regions during the processing of non-linguistic actions in deaf signers. These results suggest that during human motion processing, deaf individuals may engage specialized neural systems that allow for rapid, online differentiation of meaningful linguistic actions from non-linguistic human movements.