Statistical Inference with Non-Normalized Models: Score Matching and Noise Contrastive Estimation

Bibliographic Details
Published in: Journal of the Japan Statistical Society, Japanese Issue, Vol. 54, No. 2, pp. 177-203
Main Author: Matsuda, Takeru
Format: Journal Article
Language: Japanese
Published: Japan Statistical Society, 04.03.2025
ISSN: 0389-5602, 2189-1478
DOI: 10.11329/jjssj.54.177

More Information
Summary: A non-normalized model is a statistical model defined by an unnormalized density, i.e., a density function that does not integrate to one. In machine learning, such models are often referred to as energy-based models. Examples include Markov random fields, distributions on manifolds, and Boltzmann machines. These models allow for flexible data modeling but present challenges for likelihood-based statistical inference due to the presence of an intractable normalization constant. To address this issue, various statistical inference methods that do not require explicit computation of the normalization constant have been developed. In this paper, we introduce two parameter estimation methods for non-normalized models: score matching and noise contrastive estimation. We also discuss recent advancements, such as information criteria and nonlinear independent component analysis, as well as connections to other statistical methods, including shrinkage estimation and bridge sampling.
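As a concrete illustration of the second method named in the summary, the following is a minimal sketch of noise contrastive estimation, not taken from the paper: the unnormalized Gaussian model with a free log-normalizer c, the Gaussian noise distribution, and the BFGS optimizer are all illustrative assumptions. It shows how estimation reduces to logistic regression between data and noise samples, with no normalization constant ever computed analytically.

```python
# Minimal sketch of noise contrastive estimation (NCE); model and noise
# choices are illustrative, not those of the paper.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
x_data = rng.normal(loc=2.0, scale=1.0, size=1000)   # observed data
x_noise = rng.normal(loc=0.0, scale=3.0, size=1000)  # noise samples (same count)

def log_unnormalized_model(x, mu, c):
    # log of the unnormalized density exp(-(x - mu)^2 / 2), plus a learnable
    # constant c that stands in for the unknown negative log-normalizer
    return -0.5 * (x - mu) ** 2 + c

def nce_loss(params):
    mu, c = params
    # log-odds that a point came from the model rather than the noise
    g_data = log_unnormalized_model(x_data, mu, c) - norm.logpdf(x_data, 0.0, 3.0)
    g_noise = log_unnormalized_model(x_noise, mu, c) - norm.logpdf(x_noise, 0.0, 3.0)
    # logistic loss: data labelled 1, noise labelled 0
    return np.mean(np.logaddexp(0.0, -g_data)) + np.mean(np.logaddexp(0.0, g_noise))

result = minimize(nce_loss, x0=np.array([0.0, 0.0]))
mu_hat, c_hat = result.x
print(f"estimated mean {mu_hat:.3f}, estimated log-normalizer {c_hat:.3f}")
# true mean is 2.0; true log-normalizer is -0.5 * log(2 * pi) ≈ -0.919
```

Because the log-normalizer is treated as an extra parameter and recovered by the classification objective itself, the intractable integral that blocks maximum likelihood never appears; this is the feature of NCE (and, via a different route, of score matching) that the summary refers to.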