Computational approaches to apply the String Edit Algorithm to create accurate visual scan paths

Bibliographic Details
Published in Journal of eye movement research Vol. 17; no. 4
Main Authors Palma Fraga, Ricardo; Kang, Ziho
Format Journal Article
Language English
Published Switzerland Bern Open Publishing 01.01.2024
MDPI AG
Subjects
Online Access Get full text
ISSN 1995-8692
EISSN 1995-8692
DOI 10.16910/jemr.17.4.4


Abstract Eye movement detection algorithms (e.g., I-VT) require the selection of thresholds to identify eye fixations and saccadic movements from gaze data. The choice of threshold is important, as thresholds too low or large may fail to accurately identify eye fixations and saccades. An inaccurate threshold might also affect the resulting visual scan path, the time-ordered sequence of eye fixations and saccades, carried out by the participant. Commonly used approaches to evaluate threshold accuracy can be manually laborious, or require information about the expected visual scan paths of participants, which might not be available. To address this issue, we propose two different computational approaches, labeled as “between-participants comparisons” and “within-participants comparisons.” The approaches were evaluated using the open-source Gazebase dataset, which contained a bullseye-target tracking task, where participants were instructed to follow the movements of a bullseye-target. The predetermined path of the bullseye-target enabled us to evaluate our proposed approaches against the expected visual scan path. The approaches identified threshold values (220°/s and 210°/s) that were 83% similar to the expected visual scan path, outperforming a 30°/s benchmark threshold (41.5%). These methods might assist researchers in identifying accurate threshold values for the I-VT algorithm or potentially other eye movement detection algorithms.
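The velocity-threshold classification (I-VT) described in the abstract can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the sample velocities are hypothetical, and the two thresholds shown (30°/s and 220°/s) are the benchmark and proposed values the abstract reports.

```python
# Minimal I-VT sketch: classify gaze samples as fixation ('F') or
# saccade ('S') by comparing angular velocity against a threshold.
def ivt_classify(velocities_deg_s, threshold_deg_s):
    """Label each sample 'F' if its angular velocity (deg/s) is below
    the threshold, else 'S'."""
    return ['F' if v < threshold_deg_s else 'S' for v in velocities_deg_s]

# Hypothetical angular velocities (deg/s): the 100 deg/s sample mimics
# pursuit of a moving target, which a low threshold mislabels as a saccade.
samples = [12.0, 100.0, 310.0, 15.2]
print(ivt_classify(samples, 30.0))   # ['F', 'S', 'S', 'F']
print(ivt_classify(samples, 220.0))  # ['F', 'F', 'S', 'F']
```

Note how the same recording yields different fixation sequences, and hence different visual scan paths, under the two thresholds; this is the sensitivity the paper's comparison approaches are designed to quantify.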
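The String Edit Algorithm named in the title compares scan paths encoded as strings of area-of-interest (AOI) labels, scoring similarity from the edit distance between them. A minimal sketch follows; the AOI labels and example sequences are hypothetical, and the similarity formula (1 minus normalized Levenshtein distance) is one common convention, not necessarily the authors' exact metric.

```python
def levenshtein(a, b):
    """Classic dynamic-programming edit distance between two AOI-label
    strings (unit-cost insert, delete, substitute); O(len(b)) memory."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,        # deletion
                                     dp[j - 1] + 1,    # insertion
                                     prev + (ca != cb))  # substitution
    return dp[len(b)]

def similarity(a, b):
    """Edit-distance similarity in [0, 1]; 1.0 means identical paths."""
    if not a and not b:
        return 1.0
    return 1.0 - levenshtein(a, b) / max(len(a), len(b))

# Hypothetical scan paths: each letter is one fixated area of interest.
expected = "ABCDCBA"  # path implied by the target's predetermined movement
observed = "ABCCBA"   # one fixation (D) lost to an inaccurate threshold
print(round(similarity(expected, observed), 3))  # 0.857
```

A similarity of this kind is what the abstract's "83% similar to the expected visual scan path" refers to: the thresholds are scored by how closely the resulting fixation sequences match the target's known path.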
Author Palma Fraga, Ricardo
Kang, Ziho
Author_xml – sequence: 1
  givenname: Ricardo
  surname: Palma Fraga
  fullname: Palma Fraga, Ricardo
– sequence: 2
  givenname: Ziho
  surname: Kang
  fullname: Kang, Ziho
BackLink https://www.ncbi.nlm.nih.gov/pubmed/39877930 (View this record in MEDLINE/PubMed)
ContentType Journal Article
Copyright Copyright (©) 2024 Ricardo Palma Fraga, Ziho Kang.
Copyright (©) 2024 Ricardo Palma Fraga, Ziho Kang 2024 Palma Fraga, R., & Kang, Z.
Copyright_xml – notice: Copyright (©) 2024 Ricardo Palma Fraga, Ziho Kang.
– notice: Copyright (©) 2024 Ricardo Palma Fraga, Ziho Kang 2024 Palma Fraga, R., & Kang, Z.
DBID AAYXX
CITATION
NPM
7X8
5PM
ADTOC
UNPAY
DOA
DOI 10.16910/jemr.17.4.4
DatabaseName CrossRef
PubMed
MEDLINE - Academic
PubMed Central (Full Participant titles)
Unpaywall for CDI: Periodical Content
Unpaywall
DOAJ Directory of Open Access Journals
DatabaseTitle CrossRef
PubMed
MEDLINE - Academic
DatabaseTitleList CrossRef
MEDLINE - Academic
PubMed


Database_xml – sequence: 1
  dbid: DOA
  name: DOAJ Open Access Full Text
  url: https://www.doaj.org/
  sourceTypes: Open Website
– sequence: 2
  dbid: NPM
  name: PubMed
  url: https://proxy.k.utb.cz/login?url=http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed
  sourceTypes: Index Database
– sequence: 3
  dbid: UNPAY
  name: Unpaywall
  url: https://proxy.k.utb.cz/login?url=https://unpaywall.org/
  sourceTypes: Open Access Repository
DeliveryMethod fulltext_linktorsrc
Discipline Anatomy & Physiology
EISSN 1995-8692
ExternalDocumentID oai_doaj_org_article_51c39187df124757871208c038f07eb0
10.16910/jemr.17.4.4
PMC11714258
39877930
10_16910_jemr_17_4_4
Genre Journal Article
GroupedDBID 53G
AAYXX
ABDBF
ACIOP
ACUHS
ADBBV
AENEX
ALMA_UNASSIGNED_HOLDINGS
BAWUL
BCNDV
CITATION
DIK
GROUPED_DOAJ
M~E
OK1
PGMZT
RPM
H13
MODMG
NPM
7X8
5PM
ADTOC
UNPAY
IEDL.DBID DOA
ISSN 1995-8692
IngestDate Fri Oct 03 12:52:50 EDT 2025
Sun Oct 26 03:42:19 EDT 2025
Thu Aug 21 18:40:26 EDT 2025
Thu Jul 10 22:01:46 EDT 2025
Wed Feb 19 02:12:24 EST 2025
Tue Jul 01 05:18:33 EDT 2025
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 4
Keywords algorithms
thresholds
scan path
gaze
eye movement
eye tracking
Language English
License https://creativecommons.org/licenses/by/4.0
Copyright (©) 2024 Ricardo Palma Fraga, Ziho Kang.
This work is licensed under a Creative Commons Attribution 4.0 International License, ( https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use and redistribution provided that the original author and source are credited.
cc-by
LinkModel DirectLink
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 23
OpenAccessLink https://doaj.org/article/51c39187df124757871208c038f07eb0
PMID 39877930
PQID 3160939152
PQPubID 23479
ParticipantIDs doaj_primary_oai_doaj_org_article_51c39187df124757871208c038f07eb0
unpaywall_primary_10_16910_jemr_17_4_4
pubmedcentral_primary_oai_pubmedcentral_nih_gov_11714258
proquest_miscellaneous_3160939152
pubmed_primary_39877930
crossref_primary_10_16910_jemr_17_4_4
ProviderPackageCode CITATION
AAYXX
PublicationCentury 2000
PublicationDate 2024-01-01
PublicationDateYYYYMMDD 2024-01-01
PublicationDate_xml – month: 01
  year: 2024
  text: 2024-01-01
  day: 01
PublicationDecade 2020
PublicationPlace Switzerland
PublicationPlace_xml – name: Switzerland
– name: Bern, Switzerland
PublicationTitle Journal of eye movement research
PublicationTitleAlternate J Eye Mov Res
PublicationYear 2024
Publisher Bern Open Publishing
MDPI AG
Publisher_xml – name: Bern Open Publishing
– name: MDPI AG
SSID ssj0000627426
Score 2.2685602
SourceID doaj
unpaywall
pubmedcentral
proquest
pubmed
crossref
SourceType Open Website
Open Access Repository
Aggregation Database
Index Database
SubjectTerms algorithms
eye movement
eye tracking
gaze
scan path
thresholds
Title Computational approaches to apply the String Edit Algorithm to create accurate visual scan paths
URI https://www.ncbi.nlm.nih.gov/pubmed/39877930
https://www.proquest.com/docview/3160939152
https://pubmed.ncbi.nlm.nih.gov/PMC11714258
https://doi.org/10.16910/jemr.17.4.4
https://doaj.org/article/51c39187df124757871208c038f07eb0
UnpaywallVersion publishedVersion
Volume 17
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
journalDatabaseRights – providerCode: PRVAON
  databaseName: DOAJ Open Access Full Text
  customDbUrl:
  eissn: 1995-8692
  dateEnd: 99991231
  omitProxy: true
  ssIdentifier: ssj0000627426
  issn: 1995-8692
  databaseCode: DOA
  dateStart: 20070101
  isFulltext: true
  titleUrlDefault: https://www.doaj.org/
  providerName: Directory of Open Access Journals
– providerCode: PRVEBS
  databaseName: EBSCOhost Academic Search Ultimate
  customDbUrl: https://search.ebscohost.com/login.aspx?authtype=ip,shib&custid=s3936755&profile=ehost&defaultdb=asn
  eissn: 1995-8692
  dateEnd: 99991231
  omitProxy: true
  ssIdentifier: ssj0000627426
  issn: 1995-8692
  databaseCode: ABDBF
  dateStart: 20170301
  isFulltext: true
  titleUrlDefault: https://search.ebscohost.com/direct.asp?db=asn
  providerName: EBSCOhost
– providerCode: PRVBFR
  databaseName: Free Medical Journals
  customDbUrl:
  eissn: 1995-8692
  dateEnd: 99991231
  omitProxy: true
  ssIdentifier: ssj0000627426
  issn: 1995-8692
  databaseCode: DIK
  dateStart: 20070101
  isFulltext: true
  titleUrlDefault: http://www.freemedicaljournals.com
  providerName: Flying Publisher
– providerCode: PRVHPJ
  databaseName: ROAD: Directory of Open Access Scholarly Resources
  customDbUrl:
  eissn: 1995-8692
  dateEnd: 99991231
  omitProxy: true
  ssIdentifier: ssj0000627426
  issn: 1995-8692
  databaseCode: M~E
  dateStart: 20070101
  isFulltext: true
  titleUrlDefault: https://road.issn.org
  providerName: ISSN International Centre
– providerCode: PRVAQN
  databaseName: PubMed Central
  customDbUrl:
  eissn: 1995-8692
  dateEnd: 99991231
  omitProxy: true
  ssIdentifier: ssj0000627426
  issn: 1995-8692
  databaseCode: RPM
  dateStart: 20170101
  isFulltext: true
  titleUrlDefault: https://www.ncbi.nlm.nih.gov/pmc/
  providerName: National Library of Medicine