Deep Networks as Denoising Algorithms: Sample-Efficient Learning of Diffusion Models in High-Dimensional Graphical Models

Bibliographic Details
Published in IEEE Transactions on Information Theory, Vol. 71, No. 4, pp. 2930-2954
Main Authors Mei, Song; Wu, Yuchen
Format Journal Article
Language English
Published IEEE, 01.04.2025
ISSN 0018-9448
EISSN 1557-9654
DOI 10.1109/TIT.2025.3535923


Abstract We investigate the efficiency of deep neural networks for approximating scoring functions in diffusion-based generative modeling. While existing approximation theories leverage the smoothness of score functions, they suffer from the curse of dimensionality for intrinsically high-dimensional data. This limitation is pronounced in graphical models such as Markov random fields, where the approximation efficiency of score functions remains unestablished. To address this, we note score functions can often be well-approximated in graphical models through variational inference denoising algorithms. Furthermore, these algorithms can be efficiently represented by neural networks. We demonstrate this through examples, including Ising models, conditional Ising models, restricted Boltzmann machines, and sparse encoding models. Combined with off-the-shelf discretization error bounds for diffusion-based sampling, we provide an efficient sample complexity bound for diffusion-based generative modeling when the score function is learned by deep neural networks.
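A central point of the abstract is that the score function of the noised data distribution is, via Tweedie's formula, equivalent to a posterior-mean denoiser, and that sampling then reduces to integrating a reverse diffusion with that score plugged in. The following is a minimal illustrative sketch, not the paper's construction: it uses toy 1-D Gaussian data N(mu, s^2), for which the denoiser and hence the score are available in closed form, and samples by Euler discretization of the probability-flow ODE. All names and parameter values here are assumptions chosen for the illustration.

```python
import numpy as np

# Toy sketch (assumptions: 1-D Gaussian data, Ornstein-Uhlenbeck forward
# noising dx = -x dt + sqrt(2) dW, so x_t = a_t x_0 + Gaussian noise with
# a_t = exp(-t)). For Gaussian data the posterior-mean denoiser E[x_0 | x_t]
# is exact, and Tweedie's formula turns it into the score; in the paper's
# setting this denoiser would instead be computed by a variational-inference
# algorithm and represented by a neural network.

mu, s = 2.0, 0.5  # data distribution N(mu, s^2)

def schedule(t):
    """Signal scale a_t and marginal variance v_t of x_t under the OU forward process."""
    a = np.exp(-t)
    v = a**2 * s**2 + (1.0 - a**2)
    return a, v

def score(x, t):
    """Exact score of p_t; for Gaussian data it collapses to -(x - a_t*mu)/v_t."""
    a, v = schedule(t)
    return -(x - a * mu) / v

def sample(n, T=5.0, steps=500, seed=0):
    """Euler integration of the probability-flow ODE dx/dt = -x - score(x, t)
    from t = T down to t = 0, started from the (near-stationary) t = T marginal."""
    rng = np.random.default_rng(seed)
    a, v = schedule(T)
    x = rng.normal(a * mu, np.sqrt(v), size=n)
    dt = T / steps
    for k in range(steps):
        t = T - k * dt
        x = x - dt * (-x - score(x, t))  # one Euler step backward in time
    return x

samples = sample(20000)
print(samples.mean(), samples.std())  # close to mu = 2.0 and s = 0.5
```

With the exact score, the probability-flow ODE transports the reference Gaussian back to the data law up to discretization error; the paper's analysis replaces the closed-form denoiser with a neural-network-represented denoising algorithm and accounts for the resulting approximation and discretization errors.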
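For the graphical-model examples named in the abstract (e.g., Ising models), the claim is that the posterior-mean denoiser can be approximated by a variational-inference iteration. A hedged sketch of that idea, using naive mean field rather than the paper's exact scheme: given a noisy observation y = x + sigma*z of a spin vector x in {-1,+1}^d with symmetric couplings J and external field h, the mean-field magnetizations solve the fixed point m_i = tanh((J m)_i + h_i + y_i / sigma^2), and Tweedie's formula converts the denoiser m into a score estimate (m - y) / sigma^2. The coupling scale and iteration count below are illustrative assumptions chosen so the iteration is a contraction.

```python
import numpy as np

# Naive mean-field denoiser for an Ising prior (illustrative; not the paper's
# exact variational scheme). Weak couplings keep the tanh iteration contractive.

def mean_field_denoise(y, J, h, sigma, iters=200):
    """Iterate m <- tanh(J m + h + y / sigma^2); m approximates E[x | y]."""
    m = np.zeros_like(y)
    for _ in range(iters):
        m = np.tanh(J @ m + h + y / sigma**2)
    return m

def score_estimate(y, J, h, sigma):
    """Tweedie's formula: score of the noisy marginal ~= (E[x|y] - y) / sigma^2."""
    m = mean_field_denoise(y, J, h, sigma)
    return (m - y) / sigma**2

rng = np.random.default_rng(0)
d, sigma = 10, 1.0
J = rng.normal(0, 0.05 / np.sqrt(d), (d, d))
J = (J + J.T) / 2
np.fill_diagonal(J, 0.0)       # symmetric couplings, no self-interaction
h = rng.normal(0, 0.1, d)
y = rng.normal(0, 1.0, d)      # a (synthetic) noisy observation

m = mean_field_denoise(y, J, h, sigma)
# Sanity check: with J = 0 the fixed point is exact, m_i = tanh(h_i + y_i/sigma^2).
m0 = mean_field_denoise(y, np.zeros((d, d)), h, sigma)
```

Because each iteration is a composition of a linear map and a coordinatewise tanh, the whole denoiser unrolls into a residual-network-like architecture, which is the mechanism behind the abstract's claim that such algorithms "can be efficiently represented by neural networks."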
Author_xml – sequence: 1
  givenname: Song
  orcidid: 0000-0003-1713-2408
  surname: Mei
  fullname: Mei, Song
  email: songmei@berkeley.edu
  organization: Department of Statistics, Department of Electrical Engineering and Computer Sciences, University of California at Berkeley, Berkeley, CA, USA
– sequence: 2
  givenname: Yuchen
  orcidid: 0000-0002-9538-4558
  surname: Wu
  fullname: Wu, Yuchen
  email: wuyc14@wharton.upenn.edu
  organization: Department of Statistics and Data Science, Wharton School, University of Pennsylvania, Philadelphia, PA, USA
CODEN IETTAW
Discipline Engineering
Computer Science
EISSN 1557-9654
EndPage 2954
Genre orig-research
GrantInformation_xml – fundername: Office of Naval Research (ONR)
  grantid: N00014-24-S-B001
– fundername: Amazon Research Award
  funderid: 10.13039/100000006
– fundername: NSF
  grantid: DMS-2210827; CCF-2315725
– fundername: Faculty Early Career Development Program (CAREER)
  grantid: DMS-2339904
– fundername: Google Research Scholar Award
ISSN 0018-9448
IsPeerReviewed true
IsScholarly true
Issue 4
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
ORCID 0000-0003-1713-2408
0000-0002-9538-4558
PageCount 25
PublicationDate 2025-04-01
PublicationTitle IEEE transactions on information theory
PublicationTitleAbbrev TIT
PublicationYear 2025
Publisher IEEE
StartPage 2930
SubjectTerms Approximation algorithms
Computational modeling
Diffusion model
Diffusion models
graphical model
Graphical models
Inference algorithms
Mathematical models
Noise reduction
residual neural network
Residual neural networks
Risk minimization
URI https://ieeexplore.ieee.org/document/10857317
Volume 71