Contrast-enhanced to non-contrast-enhanced image translation to exploit a clinical data warehouse of T1-weighted brain MRI
Published in | BMC Medical Imaging, Vol. 24, No. 1, Article 67 (15 pages) |
Main Authors | Simona Bottani, Elina Thibeau-Sutre, Aurélien Maire, Sebastian Ströer, Didier Dormont, Olivier Colliot, Ninon Burgos |
Format | Journal Article |
Language | English |
Published | London: BioMed Central, 20.03.2024 |
ISSN | 1471-2342 |
DOI | 10.1186/s12880-024-01242-3 |
Abstract
Background
Clinical data warehouses provide access to massive amounts of medical images, but these images are often heterogeneous. They can, for instance, include images acquired with or without the injection of a gadolinium-based contrast agent. Harmonizing such data sets is therefore fundamental to guarantee unbiased results, for example when performing differential diagnosis. Furthermore, classical neuroimaging software tools for feature extraction are typically applied only to images acquired without gadolinium. The objective of this work is to evaluate how image translation can help exploit a highly heterogeneous data set containing both contrast-enhanced and non-contrast-enhanced images from a clinical data warehouse.
Methods
We propose and compare different 3D U-Net and conditional GAN models to convert contrast-enhanced T1-weighted (T1ce) into non-contrast-enhanced (T1nce) brain MRI. These models were trained using 230 image pairs and tested on 77 image pairs from the clinical data warehouse of the Greater Paris area.
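To make the approach concrete, the following is a minimal PyTorch sketch of a 3D U-Net generator trained with a voxel-wise L1 loss to map a T1ce volume to a T1nce volume. It is not the authors' implementation: the network depth, channel counts, volume size and loss are illustrative assumptions, and the conditional GAN variants described in the paper would add a patch discriminator and an adversarial term on top of this generator.

```python
# Minimal sketch (PyTorch) of a 3D U-Net generator for T1ce -> T1nce translation.
# Not the authors' code: depths, channel counts and losses are illustrative assumptions.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3x3 convolutions with instance normalisation, common for 3D MRI volumes.
    return nn.Sequential(
        nn.Conv3d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.InstanceNorm3d(out_ch),
        nn.LeakyReLU(0.2, inplace=True),
        nn.Conv3d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.InstanceNorm3d(out_ch),
        nn.LeakyReLU(0.2, inplace=True),
    )

class UNet3D(nn.Module):
    """Encoder-decoder with skip connections; input and output are 1-channel 3D volumes."""
    def __init__(self, base_ch=16):
        super().__init__()
        self.enc1 = conv_block(1, base_ch)
        self.enc2 = conv_block(base_ch, base_ch * 2)
        self.enc3 = conv_block(base_ch * 2, base_ch * 4)
        self.pool = nn.MaxPool3d(2)
        self.up2 = nn.ConvTranspose3d(base_ch * 4, base_ch * 2, kernel_size=2, stride=2)
        self.dec2 = conv_block(base_ch * 4, base_ch * 2)
        self.up1 = nn.ConvTranspose3d(base_ch * 2, base_ch, kernel_size=2, stride=2)
        self.dec1 = conv_block(base_ch * 2, base_ch)
        self.out = nn.Conv3d(base_ch, 1, kernel_size=1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        e3 = self.enc3(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(e3), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.out(d1)

# Trained as a plain U-Net, the generator minimises a voxel-wise reconstruction loss
# between the synthetic and the real T1nce volume. In the conditional GAN variants,
# a 3D patch discriminator seeing (T1ce, T1nce) pairs adds an adversarial term.
generator = UNet3D()
l1 = nn.L1Loss()
t1ce = torch.randn(1, 1, 64, 64, 64)   # toy volume; real inputs are preprocessed clinical MRI
t1nce = torch.randn(1, 1, 64, 64, 64)
loss = l1(generator(t1ce), t1nce)
loss.backward()
```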
Results
Validation using standard image similarity measures demonstrated that, for all the models compared, the similarity between real and synthetic T1nce images was higher than between real T1nce and T1ce images. The best-performing models were further validated on a segmentation task: tissue volumes extracted from the synthetic T1nce images were closer to those of the real T1nce images than the volumes extracted from the T1ce images.
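As an illustration of how such similarity measures can be computed, the sketch below compares a synthetic T1nce volume with the real T1nce and the original T1ce using MAE, PSNR and SSIM from scikit-image. The exact set of measures and the preprocessing used in the paper are not reproduced here, and the file names are placeholders.

```python
# Hedged sketch: comparing a synthetic T1nce volume against the real T1nce and the
# original T1ce with common similarity measures (MAE, PSNR, SSIM).
import numpy as np
import nibabel as nib
from skimage.metrics import structural_similarity, peak_signal_noise_ratio

def similarity(reference, image):
    """Return MAE, PSNR and SSIM between two intensity-normalised 3D volumes."""
    data_range = reference.max() - reference.min()
    mae = float(np.mean(np.abs(reference - image)))
    psnr = peak_signal_noise_ratio(reference, image, data_range=data_range)
    ssim = structural_similarity(reference, image, data_range=data_range)
    return {"MAE": mae, "PSNR": psnr, "SSIM": ssim}

# File names are placeholders, not paths from the paper.
real_t1nce = nib.load("sub-01_T1nce.nii.gz").get_fdata().astype(np.float32)
real_t1ce = nib.load("sub-01_T1ce.nii.gz").get_fdata().astype(np.float32)
synthetic_t1nce = nib.load("sub-01_T1nce_synthetic.nii.gz").get_fdata().astype(np.float32)

# The expected pattern from the Results: the synthetic T1nce is more similar to the
# real T1nce than the T1ce is.
print("T1ce vs real T1nce:      ", similarity(real_t1nce, real_t1ce))
print("synthetic vs real T1nce: ", similarity(real_t1nce, synthetic_t1nce))
```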
Conclusion
We showed that deep learning models initially developed with research-quality data could synthesize T1nce from T1ce images of clinical quality, and that reliable features could be extracted from the synthetic images, demonstrating the ability of such methods to help exploit a data set coming from a clinical data warehouse.
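The "reliable features" referred to here are tissue volumes obtained by segmenting the images. The sketch below shows, under assumed file names and an assumed segmentation step producing gray-matter probability maps, how such volumes can be computed and how the error of the synthetic T1nce can be compared with that of the T1ce; it is an illustration of the principle, not the paper's pipeline.

```python
# Hedged sketch of the volume comparison behind the segmentation-based validation:
# tissue volume = sum of the probability map times the voxel volume.
import numpy as np
import nibabel as nib

def tissue_volume_ml(prob_map_path):
    """Volume (in millilitres) encoded by a tissue probability map."""
    img = nib.load(prob_map_path)
    voxel_volume_mm3 = float(np.prod(img.header.get_zooms()[:3]))
    return float(img.get_fdata().sum() * voxel_volume_mm3 / 1000.0)

# Probability maps are assumed to come from segmenting each image; names are placeholders.
v_real = tissue_volume_ml("gm_from_real_T1nce.nii.gz")
v_synth = tissue_volume_ml("gm_from_synthetic_T1nce.nii.gz")
v_t1ce = tissue_volume_ml("gm_from_T1ce.nii.gz")

# The claim tested in the Results: the synthetic image yields volumes closer to the
# real T1nce than the contrast-enhanced image does.
print(f"absolute error, synthetic T1nce: {abs(v_synth - v_real):.1f} mL")
print(f"absolute error, T1ce:            {abs(v_t1ce - v_real):.1f} mL")
```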
ArticleNumber | 67 |
Audience | Academic |
Author | 1. Simona Bottani: Sorbonne Université, Institut du Cerveau - Paris Brain Institute - ICM, CNRS, Inria, Inserm, AP-HP, Hôpital de la Pitié-Salpêtrière
2. Elina Thibeau-Sutre: Sorbonne Université, Institut du Cerveau - Paris Brain Institute - ICM, CNRS, Inria, Inserm, AP-HP, Hôpital de la Pitié-Salpêtrière
3. Aurélien Maire: Innovation & Données – Département des Services Numériques, AP-HP
4. Sebastian Ströer: Hôpital Pitié Salpêtrière, Department of Neuroradiology, AP-HP
5. Didier Dormont: Sorbonne Université, Institut du Cerveau - Paris Brain Institute - ICM, CNRS, Inria, Inserm, AP-HP, Hôpital de la Pitié-Salpêtrière, DMU DIAMENT
6. Olivier Colliot: Sorbonne Université, Institut du Cerveau - Paris Brain Institute - ICM, CNRS, Inria, Inserm, AP-HP, Hôpital de la Pitié-Salpêtrière
7. Ninon Burgos (corresponding author, ninon.burgos@cnrs.fr): Sorbonne Université, Institut du Cerveau - Paris Brain Institute - ICM, CNRS, Inria, Inserm, AP-HP, Hôpital de la Pitié-Salpêtrière |
BackLink | View this record in MEDLINE/PubMed: https://www.ncbi.nlm.nih.gov/pubmed/38504179; view this record in HAL: https://hal.science/hal-03497645 |
ContentType | Journal Article |
Copyright | The Author(s) 2024 2024. The Author(s). COPYRIGHT 2024 BioMed Central Ltd. 2024. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. Attribution |
CorporateAuthor | APPRIMAGE Study Group |
DOI | 10.1186/s12880-024-01242-3 |
Discipline | Medicine Computer Science |
EISSN | 1471-2342 |
EndPage | 15 |
ExternalDocumentID | oai_doaj_org_article_8929799d84ce4c1994afcc04cf6409aa PMC10953143 oai_HAL_hal_03497645v2 A786879601 38504179 10_1186_s12880_024_01242_3 |
Genre | Journal Article |
GeographicLocations | France |
GrantInformation | Agence Nationale de la Recherche, grant ANR-10-IAIHU-06 (funder ID: http://dx.doi.org/10.13039/501100001665) |
ISSN | 1471-2342 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 1 |
Keywords | Clinical data warehouse Image translation Brain MRI Deep learning Brain Anatomical MRI Gadolinium injection |
Language | English |
License | 2024. The Author(s). Attribution: http://creativecommons.org/licenses/by Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data. |
Notes | PMCID: PMC10953143 |
ORCID | 0000-0002-9836-654X 0000-0002-4615-0237 0000-0002-4668-2006 0000-0002-6080-1286 |
OpenAccessLink | http://journals.scholarsportal.info/openUrl.xqy?doi=10.1186/s12880-024-01242-3 |
PMID | 38504179 |
PageCount | 15 |
PublicationDate | 2024-03-20 |
PublicationPlace | London |
PublicationTitle | BMC medical imaging |
PublicationTitleAbbrev | BMC Med Imaging |
PublicationTitleAlternate | BMC Med Imaging |
PublicationYear | 2024 |
Publisher | BioMed Central BioMed Central Ltd BMC |
StartPage | 67 |
SubjectTerms | Algorithms Artificial Intelligence Brain Brain MRI Brain research Clinical data warehouse Computer Science Contrast agents Contrast media Data warehouses Datasets Deep learning Differential diagnosis Evaluation Feature extraction Gadolinium Hospitals Image acquisition Image contrast Image enhancement Image quality Image translation Imaging Machine learning Magnetic resonance imaging Medical Imaging Medical imaging equipment Medicine Medicine & Public Health Neuroimaging Quality control Radiology Similarity Software Synthetic data Tomography Translation Volumetric analysis Warehouse stores |
Title | Contrast-enhanced to non-contrast-enhanced image translation to exploit a clinical data warehouse of T1-weighted brain MRI |
URI | https://link.springer.com/article/10.1186/s12880-024-01242-3 https://www.ncbi.nlm.nih.gov/pubmed/38504179 https://www.proquest.com/docview/3037864109 https://www.proquest.com/docview/2972704519 https://hal.science/hal-03497645 https://pubmed.ncbi.nlm.nih.gov/PMC10953143 https://doaj.org/article/8929799d84ce4c1994afcc04cf6409aa |
Volume | 24 |