Learning to Decode Cognitive States from Brain Images

Bibliographic Details
Published in Machine Learning, Vol. 57, no. 1-2, pp. 145-175
Main Authors Mitchell, Tom M., Hutchinson, Rebecca, Niculescu, Radu S., Pereira, Francisco, Wang, Xuerui, Just, Marcel, Newman, Sharlene
Format Journal Article
Language English
Published Dordrecht: Springer Nature B.V., 01.10.2004
Subjects Brain; Data mining; Human subjects
Online Access Get full text
ISSN 0885-6125
EISSN 1573-0565
DOI 10.1023/B:MACH.0000035475.85309.1b


Abstract Over the past decade, functional Magnetic Resonance Imaging (fMRI) has emerged as a powerful new instrument to collect vast quantities of data about activity in the human brain. A typical fMRI experiment can produce a three-dimensional image related to the human subject's brain activity every half second, at a spatial resolution of a few millimeters. As in other modern empirical sciences, this new instrumentation has led to a flood of new data, and a corresponding need for new data analysis methods. We describe recent research applying machine learning methods to the problem of classifying the cognitive state of a human subject based on fMRI data observed over a single time interval. In particular, we present case studies in which we have successfully trained classifiers to distinguish cognitive states such as (1) whether the human subject is looking at a picture or a sentence, (2) whether the subject is reading an ambiguous or non-ambiguous sentence, and (3) whether the word the subject is viewing is a word describing food, people, buildings, etc. This learning problem provides an interesting case study of classifier learning from extremely high dimensional (10^5 features), extremely sparse (tens of training examples), noisy data. This paper summarizes the results obtained in these three case studies, as well as lessons learned about how to successfully apply machine learning methods to train classifiers in such settings.
Issue Title: Special Issue: Data Mining Lessons Learned
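The learning setting described in the abstract (tens of labeled trials, on the order of 10^5 voxel features per trial, and a binary cognitive-state label such as picture vs. sentence) can be made concrete with a small sketch. The code below is an illustration only, not the authors' pipeline: it generates synthetic data with roughly those dimensions and uses scikit-learn with univariate feature selection, a Gaussian Naive Bayes classifier, and leave-one-out cross-validation, all of which are generic assumptions for this high-dimensional, few-example regime rather than the methods reported in the paper.

```python
# Minimal sketch of the regime described in the abstract: tens of examples,
# ~10^5 noisy features, binary cognitive-state labels. The synthetic data and
# the classifier/feature-selection choices are illustrative assumptions.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 40, 100_000              # tens of trials, ~10^5 "voxels"
y = rng.integers(0, 2, size=n_trials)         # e.g. 0 = picture, 1 = sentence

# Noise everywhere; a small random subset of features carries a weak signal.
X = rng.normal(size=(n_trials, n_voxels))
informative = rng.choice(n_voxels, size=200, replace=False)
X[:, informative] += 0.8 * y[:, None]

# Rank features with a univariate F-score, keep a few hundred, then fit
# Gaussian Naive Bayes; estimate accuracy with leave-one-out cross-validation,
# a common choice when only tens of trials are available.
clf = Pipeline([
    ("select", SelectKBest(f_classif, k=400)),
    ("gnb", GaussianNB()),
])
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {scores.mean():.2f}")
```

Keeping the feature selection inside the cross-validated pipeline is the important detail: selecting voxels on the full data set before splitting would leak label information into the test folds and inflate the estimated accuracy.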
Author Just, Marcel
Wang, Xuerui
Mitchell, Tom M.
Newman, Sharlene
Pereira, Francisco
Hutchinson, Rebecca
Niculescu, Radu S.
CitedBy_id crossref_primary_10_1016_j_mri_2012_11_009
crossref_primary_10_1186_2196_0089_1_1
crossref_primary_10_1016_j_cub_2006_04_003
crossref_primary_10_1007_s11682_018_9901_5
crossref_primary_10_1016_j_neuroimage_2007_04_009
crossref_primary_10_1038_nn_4450
crossref_primary_10_1073_pnas_2003480117
crossref_primary_10_1016_j_neuropsychologia_2015_04_009
crossref_primary_10_1038_s41598_020_61737_1
crossref_primary_10_1016_j_brainres_2011_05_054
crossref_primary_10_1016_j_neuroimage_2009_01_032
crossref_primary_10_1080_02643294_2016_1182480
crossref_primary_10_1016_j_neuron_2019_09_040
crossref_primary_10_1002_ima_20225
crossref_primary_10_4236_am_2011_21010
crossref_primary_10_1016_j_neuroimage_2017_06_033
crossref_primary_10_1016_j_patcog_2011_04_006
crossref_primary_10_1007_s10586_014_0369_9
crossref_primary_10_1016_j_artint_2012_06_005
crossref_primary_10_1155_2018_2740817
crossref_primary_10_1016_j_mri_2012_07_010
crossref_primary_10_1016_j_jneumeth_2017_05_004
crossref_primary_10_1093_cercor_bhu057
crossref_primary_10_1016_j_mri_2009_12_021
crossref_primary_10_1093_cercor_bhx322
crossref_primary_10_1177_1357034X15623363
crossref_primary_10_1371_journal_pone_0069566
crossref_primary_10_1016_j_neuroimage_2010_05_051
crossref_primary_10_1109_TAMD_2015_2434733
crossref_primary_10_1016_j_neuron_2019_05_026
crossref_primary_10_1371_journal_pone_0079271
crossref_primary_10_1016_j_jneumeth_2015_10_001
crossref_primary_10_1016_j_neuroimage_2008_06_024
crossref_primary_10_1016_j_neunet_2014_01_006
crossref_primary_10_1016_j_cortex_2019_11_021
crossref_primary_10_1016_j_physa_2016_09_058
crossref_primary_10_1109_TBME_2009_2025866
crossref_primary_10_1016_j_neuroimage_2010_09_062
crossref_primary_10_3233_XST_160565
crossref_primary_10_1016_j_neuroimage_2005_06_070
crossref_primary_10_1016_j_neuroimage_2011_03_081
crossref_primary_10_1016_j_neuroimage_2009_01_052
crossref_primary_10_1016_j_neuropsychologia_2019_05_032
crossref_primary_10_1142_S0219622006002283
crossref_primary_10_1016_j_neuroimage_2013_03_041
crossref_primary_10_1109_ACCESS_2018_2884739
crossref_primary_10_1016_j_brainres_2009_05_090
crossref_primary_10_1016_j_neuroimage_2010_05_081
crossref_primary_10_1177_1550059418782093
crossref_primary_10_1016_j_tics_2006_07_005
crossref_primary_10_3389_fnhum_2017_00445
crossref_primary_10_4018_jkdb_2011040102
crossref_primary_10_1016_j_neuroimage_2011_06_053
crossref_primary_10_1109_TOH_2016_2593727
crossref_primary_10_1016_j_neuroimage_2008_06_037
crossref_primary_10_1080_19312458_2018_1520823
crossref_primary_10_1523_JNEUROSCI_3392_15_2016
crossref_primary_10_1523_JNEUROSCI_0371_22_2022
crossref_primary_10_1016_j_dcan_2021_03_004
crossref_primary_10_1016_j_neuroimage_2007_02_045
crossref_primary_10_1371_journal_pone_0014277
crossref_primary_10_1016_j_neuroimage_2008_11_007
crossref_primary_10_1016_j_neuroimage_2013_11_043
crossref_primary_10_1002_hbm_22842
crossref_primary_10_1093_scan_nss063
crossref_primary_10_1016_j_neuroimage_2010_04_271
crossref_primary_10_1016_j_nicl_2016_02_018
crossref_primary_10_3389_fninf_2023_1154835
crossref_primary_10_1016_j_neuroimage_2006_11_040
crossref_primary_10_1016_j_neuron_2009_08_011
crossref_primary_10_1016_j_neuroimage_2011_12_068
crossref_primary_10_1007_s12021_014_9238_1
crossref_primary_10_1109_TMI_2014_2298856
crossref_primary_10_3389_fncom_2014_00131
crossref_primary_10_1038_s41598_019_54280_1
crossref_primary_10_1002_hbm_20735
crossref_primary_10_1002_wics_1282
crossref_primary_10_1007_s10618_010_0198_2
crossref_primary_10_1016_j_compbiomed_2014_02_003
crossref_primary_10_1147_JRD_2017_2648699
crossref_primary_10_1038_s41598_019_55887_0
crossref_primary_10_1371_journal_pone_0183295
crossref_primary_10_1016_j_neuroimage_2016_04_063
crossref_primary_10_1371_journal_pone_0097296
crossref_primary_10_1007_s11517_024_03080_5
crossref_primary_10_1117_1_3606494
crossref_primary_10_1016_j_neuroimage_2011_09_051
crossref_primary_10_3389_fonc_2022_924245
crossref_primary_10_1016_j_neuroimage_2013_07_026
crossref_primary_10_1016_j_jneumeth_2015_09_032
crossref_primary_10_7554_eLife_79370
crossref_primary_10_1016_j_neuroimage_2010_07_044
crossref_primary_10_1093_cercor_bhx243
crossref_primary_10_1016_j_neuroimage_2020_117408
crossref_primary_10_1523_JNEUROSCI_1676_14_2014
crossref_primary_10_4061_2011_280289
crossref_primary_10_1002_hbm_21498
crossref_primary_10_1093_cercor_bhac149
crossref_primary_10_1016_j_neuroimage_2011_12_059
crossref_primary_10_1080_13825585_2021_2019184
crossref_primary_10_1142_S0219635216500035
crossref_primary_10_1016_j_neuroimage_2007_02_020
crossref_primary_10_1016_j_neuroimage_2011_08_011
crossref_primary_10_1016_j_neuroimage_2007_02_022
crossref_primary_10_1109_TPAMI_2018_2815524
crossref_primary_10_5392_IJoC_2012_8_4_056
crossref_primary_10_1016_j_neuroimage_2011_09_025
crossref_primary_10_1016_j_bbr_2016_06_043
crossref_primary_10_1002_hbm_21296
crossref_primary_10_1016_j_neuroimage_2006_06_032
crossref_primary_10_1109_TNSRE_2011_2163145
crossref_primary_10_1371_journal_pone_0017191
crossref_primary_10_1007_s10994_019_05865_4
crossref_primary_10_1109_ACCESS_2020_3006521
crossref_primary_10_1162_neco_2007_09_06_340
crossref_primary_10_3233_IDA_170881
crossref_primary_10_1007_s11571_014_9291_3
crossref_primary_10_3758_s13415_013_0165_7
crossref_primary_10_1016_j_neuron_2007_06_015
crossref_primary_10_3389_fnins_2017_00669
crossref_primary_10_1016_j_neuroimage_2016_04_044
crossref_primary_10_1371_journal_pone_0203520
crossref_primary_10_1016_j_patcog_2011_04_025
crossref_primary_10_1002_hbm_22490
crossref_primary_10_1016_j_neuroimage_2015_06_005
crossref_primary_10_1126_sciadv_adn2776
crossref_primary_10_3389_fnbeh_2018_00297
crossref_primary_10_1002_hbm_23102
crossref_primary_10_1111_j_1749_6632_2009_04420_x
crossref_primary_10_1007_s11042_018_5901_0
crossref_primary_10_1007_s11045_013_0254_3
crossref_primary_10_1016_j_neuroimage_2008_12_074
crossref_primary_10_4018_ijehmc_2014040101
crossref_primary_10_1016_j_neuroimage_2011_01_061
crossref_primary_10_1007_s10548_011_0213_y
crossref_primary_10_1016_j_neuroimage_2009_02_041
crossref_primary_10_1016_j_neuroimage_2010_08_050
crossref_primary_10_1016_j_patcog_2011_04_015
crossref_primary_10_1038_s41598_017_17314_0
crossref_primary_10_1152_jn_01103_2011
crossref_primary_10_1162_jocn_2007_19_11_1735
crossref_primary_10_1162_neco_a_01196
crossref_primary_10_5465_amp_2017_0159
crossref_primary_10_3389_fnins_2017_00434
crossref_primary_10_1016_j_jad_2017_06_061
crossref_primary_10_1016_j_neuroimage_2007_06_017
crossref_primary_10_1016_j_neuroimage_2017_08_049
crossref_primary_10_1016_j_jneumeth_2008_04_008
crossref_primary_10_1371_journal_pone_0189508
crossref_primary_10_1007_s11682_014_9292_1
crossref_primary_10_1016_j_neuroimage_2008_05_050
crossref_primary_10_1016_j_neuroimage_2013_05_072
crossref_primary_10_1016_j_neuroimage_2010_07_073
crossref_primary_10_1016_j_neuroimage_2015_05_078
crossref_primary_10_1016_j_neuroimage_2010_07_074
crossref_primary_10_1073_pnas_0705654104
crossref_primary_10_3390_bioengineering10121341
crossref_primary_10_1142_S0219635209002186
crossref_primary_10_1016_j_csda_2019_02_005
crossref_primary_10_3389_fnhum_2015_00327
crossref_primary_10_1093_scan_nss138
crossref_primary_10_1109_TMI_2012_2206047
crossref_primary_10_1007_s10548_024_01034_6
crossref_primary_10_1109_TAMD_2015_2427341
crossref_primary_10_1126_science_1152876
crossref_primary_10_1016_j_neuroimage_2015_08_051
crossref_primary_10_1177_1745691612469037
crossref_primary_10_1371_journal_pone_0001394
crossref_primary_10_1016_j_nicl_2013_05_001
crossref_primary_10_1155_2011_350838
crossref_primary_10_1049_iet_spr_2012_0315
crossref_primary_10_1523_JNEUROSCI_0400_09_2009
crossref_primary_10_3389_fnhum_2024_1305164
crossref_primary_10_1016_j_neuroimage_2009_03_054
crossref_primary_10_1371_journal_pone_0003690
crossref_primary_10_1016_j_neuroimage_2010_03_057
crossref_primary_10_1038_nn_2303
crossref_primary_10_1016_j_cortex_2018_02_006
crossref_primary_10_1109_TNN_2010_2060353
crossref_primary_10_1007_s10115_010_0348_2
crossref_primary_10_1016_j_neuroimage_2010_03_059
crossref_primary_10_1162_jocn_a_01335
crossref_primary_10_1016_j_neuroimage_2006_08_016
crossref_primary_10_1002_hbm_20243
crossref_primary_10_1109_THMS_2013_2296871
crossref_primary_10_1109_TMI_2014_2374074
crossref_primary_10_1523_JNEUROSCI_3352_13_2014
crossref_primary_10_1016_j_media_2014_01_006
crossref_primary_10_1016_j_cortex_2017_01_011
crossref_primary_10_1016_j_neuroimage_2015_01_036
crossref_primary_10_1109_TITB_2010_2055574
crossref_primary_10_1016_j_media_2012_02_005
crossref_primary_10_1109_TNSRE_2008_926701
crossref_primary_10_1016_j_neuroimage_2018_09_031
crossref_primary_10_1142_S0219635214500241
crossref_primary_10_1088_1741_2552_ad8839
crossref_primary_10_1016_j_neuroimage_2020_116634
crossref_primary_10_1016_j_physbeh_2018_06_017
crossref_primary_10_1038_ncomms2374
crossref_primary_10_1007_s10548_013_0322_x
crossref_primary_10_1016_j_neuroimage_2020_116752
crossref_primary_10_1016_j_ijpsycho_2006_03_016
crossref_primary_10_1002_sam_11152
crossref_primary_10_1371_journal_pone_0035860
crossref_primary_10_1109_ACCESS_2017_2698068
crossref_primary_10_1016_j_neuroimage_2011_04_042
crossref_primary_10_1016_j_mri_2008_01_052
crossref_primary_10_1007_s10994_009_5159_x
crossref_primary_10_3390_e21121228
crossref_primary_10_1016_j_chaos_2019_04_019
crossref_primary_10_1016_j_neuroimage_2007_03_072
crossref_primary_10_1093_scan_nsz097
crossref_primary_10_1109_TPAMI_2008_215
crossref_primary_10_1109_TNSRE_2015_2396495
crossref_primary_10_1016_j_neuroimage_2014_03_074
crossref_primary_10_1088_0031_9155_59_10_2517
crossref_primary_10_1371_journal_pone_0189541
crossref_primary_10_1038_nrn1931
crossref_primary_10_1002_mrm_23106
crossref_primary_10_1162_NECO_a_00024
crossref_primary_10_1016_j_neuroimage_2014_06_022
crossref_primary_10_1016_j_neuroimage_2010_02_040
crossref_primary_10_1016_j_neuroimage_2018_11_022
crossref_primary_10_1016_j_neuropsychologia_2019_107140
crossref_primary_10_1016_j_neuroimage_2012_12_062
crossref_primary_10_1016_j_neuroscience_2020_07_040
crossref_primary_10_1371_journal_pone_0165612
crossref_primary_10_1002_hbm_22087
crossref_primary_10_1186_s40708_020_00120_2
crossref_primary_10_1038_s41582_020_0377_8
crossref_primary_10_1038_s42003_023_04804_3
crossref_primary_10_1016_j_neuroimage_2010_05_026
crossref_primary_10_1371_journal_pone_0104586
crossref_primary_10_1109_TSIPN_2017_2679491
crossref_primary_10_5155_eurjchem_1_1_54_60_2
crossref_primary_10_1016_j_cogsys_2008_08_010
crossref_primary_10_1111_ejn_13770
crossref_primary_10_1016_j_neuroimage_2012_08_005
crossref_primary_10_1007_s13042_011_0030_3
crossref_primary_10_1088_1741_2552_abf2e5
crossref_primary_10_1007_s12021_013_9204_3
crossref_primary_10_1109_TMI_2011_2113378
crossref_primary_10_1093_brain_awx274
crossref_primary_10_1016_j_mri_2014_04_004
crossref_primary_10_1016_j_neucom_2014_12_082
crossref_primary_10_3389_fnins_2016_00619
crossref_primary_10_1016_j_pscychresns_2017_03_003
crossref_primary_10_1016_j_neuroimage_2009_03_016
crossref_primary_10_1038_nmeth_1635
crossref_primary_10_1109_JSTSP_2008_2007788
crossref_primary_10_7554_eLife_21397
crossref_primary_10_1146_annurev_psych_120710_100344
crossref_primary_10_1007_s11055_018_0676_3
crossref_primary_10_1371_journal_pone_0086314
crossref_primary_10_3389_fnbot_2018_00069
crossref_primary_10_1016_j_bspc_2017_08_021
crossref_primary_10_1016_j_jocs_2013_04_003
crossref_primary_10_1007_s11682_013_9238_z
crossref_primary_10_1002_mrm_22159
crossref_primary_10_1186_s40708_019_0094_5
crossref_primary_10_1109_JBHI_2021_3052044
crossref_primary_10_1371_journal_pone_0058632
crossref_primary_10_1371_journal_pone_0015065
crossref_primary_10_1002_brb3_549
crossref_primary_10_1002_hbm_20569
crossref_primary_10_1002_hbm_20326
crossref_primary_10_1016_j_neuron_2014_06_001
crossref_primary_10_1016_j_neuroimage_2009_11_064
crossref_primary_10_1016_j_neuroimage_2017_08_026
crossref_primary_10_1002_hbm_25013
crossref_primary_10_1016_j_neunet_2015_04_009
crossref_primary_10_1016_j_tics_2018_03_003
crossref_primary_10_1016_j_neuroimage_2015_05_057
crossref_primary_10_1109_TMI_2012_2216543
crossref_primary_10_1007_s11682_014_9294_z
crossref_primary_10_1038_srep18893
crossref_primary_10_1002_sam_10141
crossref_primary_10_1016_j_neuroimage_2012_05_057
crossref_primary_10_1109_TMI_2009_2037756
crossref_primary_10_1093_cercor_bhy080
crossref_primary_10_1007_s11042_023_15935_4
crossref_primary_10_1088_1741_2560_6_1_016003
crossref_primary_10_1016_j_neuroimage_2010_06_052
crossref_primary_10_3389_fninf_2016_00027
crossref_primary_10_1088_1741_2560_6_5_058002
crossref_primary_10_1111_cogs_13388
crossref_primary_10_4303_ijbdm_235531
crossref_primary_10_3390_e21100989
crossref_primary_10_1080_23273798_2023_2166679
crossref_primary_10_1016_j_neucom_2015_02_034
crossref_primary_10_1016_j_neuroimage_2014_02_006
crossref_primary_10_1016_j_neuropsychologia_2012_07_007
crossref_primary_10_1093_cercor_bhu155
crossref_primary_10_1371_journal_pone_0066032
crossref_primary_10_1523_JNEUROSCI_1546_16_2016
crossref_primary_10_1002_wcs_1650
crossref_primary_10_1016_j_neuroimage_2009_09_059
crossref_primary_10_1016_j_cmpb_2020_105730
crossref_primary_10_1186_1471_2121_8_S1_S5
crossref_primary_10_1016_j_aiia_2022_08_002
crossref_primary_10_1016_j_jpsychires_2016_03_001
crossref_primary_10_1016_j_nicl_2013_09_003
crossref_primary_10_1523_JNEUROSCI_6319_11_2013
crossref_primary_10_1371_journal_pbio_1002180
crossref_primary_10_1016_j_mri_2009_11_009
crossref_primary_10_1097_j_pain_0000000000002534
crossref_primary_10_3758_s13428_019_01344_9
crossref_primary_10_1016_j_procs_2013_05_186
crossref_primary_10_1080_01621459_2013_852978
crossref_primary_10_1016_j_neuroimage_2007_11_024
crossref_primary_10_1016_j_neuroimage_2013_05_009
crossref_primary_10_1016_j_jneumeth_2011_04_032
ContentType Journal Article
Copyright Kluwer Academic Publishers 2004
DOI 10.1023/B:MACH.0000035475.85309.1b
Discipline Computer Science
EISSN 1573-0565
EndPage 175
Genre Feature
ISSN 0885-6125
1573-0565
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 1-2
Language English
OpenAccessLink https://proxy.k.utb.cz/login?url=https://link.springer.com/content/pdf/10.1023/B:MACH.0000035475.85309.1b.pdf
PageCount 31
PublicationDate 2004-10-01
PublicationPlace Dordrecht
PublicationTitle Machine learning
PublicationYear 2004
Publisher Springer Nature B.V
StartPage 145
SubjectTerms Brain
Data mining
Human subjects
Title Learning to Decode Cognitive States from Brain Images
URI https://www.proquest.com/docview/757010632
https://www.proquest.com/docview/17682824
https://www.proquest.com/docview/1875849087
https://www.proquest.com/docview/28667371
https://link.springer.com/content/pdf/10.1023/B:MACH.0000035475.85309.1b.pdf
UnpaywallVersion publishedVersion
Volume 57