The Avatar’s Gist: How to Transfer Affective Components From Dynamic Walking to Static Body Postures
Published in: Frontiers in Neuroscience, Vol. 16, p. 842433
Main Authors: Presti, Paolo; Ruzzon, Davide; Galasso, Gaia Maria; Avanzini, Pietro; Caruana, Fausto; Vecchiato, Giovanni
Format: Journal Article
Language: English
Published: Switzerland: Frontiers Research Foundation / Frontiers Media S.A., 15.06.2022
ISSN: 1662-4548
EISSN: 1662-453X
DOI: 10.3389/fnins.2022.842433
Abstract: Dynamic virtual representations of the human being can communicate a broad range of affective states through body movements, making them effective tools for studying emotion perception. However, the possibility of modeling static body postures that preserve affective information remains fundamental in a broad spectrum of experimental settings exploring time-locked cognitive processes. We propose a novel automatic method for creating virtual affective body postures starting from kinematic data. Exploiting body features related to postural cues and movement velocity, we transferred the affective components from dynamic walking to static body postures of male and female virtual avatars. Results of two online experiments showed that participants coherently judged different valence and arousal levels in the avatar's body posture, highlighting the reliability of the proposed methodology. In addition, aesthetic and postural cues made female avatars more emotionally expressive than male ones. Overall, we provide a valid methodology for creating affective body postures of virtual avatars, which can be used within different virtual scenarios to better understand the way we perceive the affective state of others.
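The abstract describes transferring affect from walking kinematics to a static pose by exploiting postural cues and movement velocity. The paper's actual pipeline is not reproduced in this record, so the following is only a loose sketch of one ingredient of that idea: scoring each frame of a motion-capture walking sequence by mean joint speed (a common proxy for the arousal component) and picking the most kinematically expressive frame as a static-pose candidate. The function name, array layout, and peak-speed heuristic are assumptions for illustration, not the authors' method.

```python
import numpy as np

def select_static_pose(frames: np.ndarray, dt: float):
    """Pick a static-pose candidate from a walking sequence.

    frames: array of shape (T, J, 3) -- T motion-capture frames of J joint
            positions in 3D.
    dt:     time step between consecutive frames, in seconds.

    Heuristic (illustrative only): score each frame by the mean speed of
    its joints and return the index and joint positions of the fastest
    frame, on the assumption that movement velocity carries arousal.
    """
    # Per-joint displacement between consecutive frames: shape (T-1, J, 3)
    disp = np.diff(frames, axis=0)
    # Per-joint speed (T-1, J), averaged over joints -> one score per step
    frame_speed = np.linalg.norm(disp, axis=2).mean(axis=1) / dt
    idx = int(np.argmax(frame_speed))  # frame at which peak speed begins
    return idx, frames[idx]
```

A real system would combine such a velocity term with postural features (e.g., limb angles or trunk inclination) before mapping the selected pose onto valence and arousal levels.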
Authors: Presti, Paolo; Ruzzon, Davide; Galasso, Gaia Maria; Avanzini, Pietro; Caruana, Fausto; Vecchiato, Giovanni
Author Affiliations: 1. Institute of Neuroscience, National Research Council of Italy, Parma, Italy; 2. Department of Medicine and Surgery, University of Parma, Parma, Italy; 3. TUNED, Lombardini22, Milan, Italy; 4. Dipartimento Culture del Progetto, University IUAV, Venice, Italy
Copyright: Copyright © 2022 Presti, Ruzzon, Galasso, Avanzini, Caruana and Vecchiato. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the "License"). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Discipline: Anatomy & Physiology
PMCID: PMC9240741
Open Access: yes
Peer Reviewed: yes
Keywords: virtual reality; dynamic walking; valence; body posture; arousal
License: Copyright © 2022 Presti, Ruzzon, Galasso, Avanzini, Caruana and Vecchiato. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
Notes: Edited by Anıl Ufuk Batmaz, Kadir Has University, Turkey. Reviewed by Christos Mousas, Purdue University, United States; Dominik M. Endres, University of Marburg, Germany; Christian Graff, Université Grenoble Alpes, France. This article was submitted to Perception Science, a section of the journal Frontiers in Neuroscience.
Open Access Link: http://journals.scholarsportal.info/openUrl.xqy?doi=10.3389/fnins.2022.842433
PMID: 35784850
SourceID | doaj; pubmedcentral; proquest; pubmed; crossref |
SourceType | Open Website; Open Access Repository; Aggregation Database; Index Database; Enrichment Source |
StartPage | 842433 |
SubjectTerms | Affect (Psychology); Arousal; Behavior; body posture; Cognition & reasoning; Cognitive ability; dynamic walking; Emotional behavior; Emotions; Experiments; Kinematics; Motion capture; Neuroscience; Posture; valence; virtual reality
Title | The Avatar’s Gist: How to Transfer Affective Components From Dynamic Walking to Static Body Postures |
URI | https://www.ncbi.nlm.nih.gov/pubmed/35784850 https://www.proquest.com/docview/2676690275 https://www.proquest.com/docview/2685032065 https://pubmed.ncbi.nlm.nih.gov/PMC9240741 https://doaj.org/article/9193bb390c62416093301f096b3356dc |
Volume | 16 |