Information geometry

Bibliographic Details
Main Authors Plastino, Angelo; Srinivasa Rao, Arni S. R.; Rao, Calyampudi Radhakrishna
Format eBook; Book
Language English
Published Amsterdam: North-Holland, an imprint of Elsevier, 2021
Elsevier Science & Technology
North-Holland
Edition 1
Series Handbook of Statistics
Subjects Geometrical models in statistics
ISBN 0323855679
9780323855679


Abstract The subject of information geometry blends several areas of statistics, computer science, physics, and mathematics. The subject evolved from the groundbreaking article published by legendary statistician C.R. Rao in 1945. His works led to the creation of Cramer-Rao bounds, Rao distance, and Rao-Blackwellization.
AbstractList The subject of information geometry blends several areas of statistics, computer science, physics, and mathematics. The subject evolved from the groundbreaking article published by legendary statistician C.R. Rao in 1945. His works led to the creation of Cramer-Rao bounds, Rao distance, and Rao-Blackwellization. Fisher-Rao metrics and Rao distances play a very important role in areas ranging from geodesics and econometric analysis to modern-day business analytics. The chapters of the book are written by experts who have been promoting the field of information geometry and its applications.
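For reference, the quantities named in the abstract have standard textbook definitions; the following summary uses our own notation and is not taken from the book. For a parametric density f(x; θ), the Fisher information and the Cramér-Rao bound for an unbiased estimator θ̂ based on n i.i.d. observations are
\[
I(\theta) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{2}\right],
\qquad
\operatorname{Var}(\hat\theta) \;\ge\; \frac{1}{n\,I(\theta)}.
\]
For a multi-parameter family with θ = (θ^1, …, θ^k), the Fisher-Rao metric equips the parameter space with the line element
\[
ds^{2} = \sum_{i,j} g_{ij}(\theta)\, d\theta^{i}\, d\theta^{j},
\qquad
g_{ij}(\theta) = \mathbb{E}\!\left[\frac{\partial \log f(X;\theta)}{\partial\theta^{i}}\,\frac{\partial \log f(X;\theta)}{\partial\theta^{j}}\right],
\]
and the Rao distance between two distributions of the family is the geodesic distance under this metric.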
Author Srinivasa Rao, Arni S. R.
Rao, Calyampudi Radhakrishna
Plastino, Angelo
Author_xml – sequence: 1
  fullname: Plastino, Angelo
– sequence: 2
  fullname: Srinivasa Rao, Arni S. R.
– sequence: 3
  fullname: Rao, Calyampudi Radhakrishna
BackLink https://cir.nii.ac.jp/crid/1130008276642910219 (View record in CiNii)
ContentType eBook
Book
DBID RYH
DEWEY 519.5
DatabaseName CiNii Complete
DatabaseTitleList

DeliveryMethod fulltext_linktorsrc
Discipline Mathematics
Applied Sciences
Statistics
EISBN 0323855687
9780323855686
Edition 1
ExternalDocumentID 9780323855686
EBC6734951
BC10374854
ISBN 0323855679
9780323855679
IngestDate Fri Nov 08 03:47:13 EST 2024
Wed Oct 01 01:05:26 EDT 2025
Fri Jun 27 00:27:57 EDT 2025
IsPeerReviewed false
IsScholarly false
LCCallNum_Ident QA276.23 .S656 2021
Language English
LinkModel OpenURL
Notes Includes bibliographical references and index
OCLC 1273976820
PQID EBC6734951
PageCount 250
ParticipantIDs askewsholts_vlebooks_9780323855686
proquest_ebookcentral_EBC6734951
nii_cinii_1130008276642910219
PublicationCentury 2000
PublicationDate c2021
2021
2021-09-26
PublicationDateYYYYMMDD 2021-01-01
2021-09-26
PublicationDate_xml – year: 2021
  text: c2021
PublicationDecade 2020
PublicationPlace Amsterdam
PublicationPlace_xml – name: Amsterdam
– name: Chantilly
PublicationSeriesTitle Handbook of Statistics
PublicationYear 2021
Publisher North-Holland, an imprint of Elsevier
Elsevier Science & Technology
North-Holland
Publisher_xml – name: North-Holland, an imprint of Elsevier
– name: Elsevier Science & Technology
– name: North-Holland
SSID ssj0002796352
ssib049017952
Score 2.259609
SourceID askewsholts
proquest
nii
SourceType Aggregation Database
Publisher
SubjectTerms Geometrical models in statistics
TableOfContents Intro -- Information Geometry -- Copyright -- Contents -- Contributors -- Preface -- Section I: Foundations of information geometry -- Chapter 1: Revisiting the connection between Fisher information and entropy's rate of change -- 1. Introduction -- 2. Fisher information and Cramer-Rao inequality -- 3. Fisher information and the rate of change of Boltzmann-Gibbs entropy -- 3.1. Brownian particle with constant drag force -- 3.2. Systems described by an N-dimensional Fokker-Planck equation -- 4. Possible lines for future research -- 5. Conclusions -- References -- Chapter 2: Pythagoras theorem in information geometry and applications to generalized linear models -- 1. Introduction -- 2. Pythagoras theorems in information geometry -- 3. Power entropy and divergence -- 4. Linear regression model -- 5. Generalized linear model -- 6. Discussion -- References -- Further reading -- Chapter 3: Rao distances and conformal mapping -- 1. Introduction -- 2. Manifolds -- 2.1. Conformality between two regions -- 3. Rao distance -- 4. Conformal mapping -- 5. Applications -- Acknowledgments -- References -- Chapter 4: Cramer-Rao inequality for testing the suitability of divergent partition functions -- 1. Introduction -- 2. A first illustrative example -- 2.1. Evaluation of the partition function -- 2.2. Instruction manual for using our procedure -- 2.3. Evaluation of r -- 2.4. Dealing with r2 -- 2.5. Obtaining fisher information measure -- 2.6. The six steps to obtain a finite Fisher's information -- 2.7. Cramer-Rao inequality (CRI) -- 2.8. Numerical example -- 3. A Brownian motion example -- 3.1. The present partition function -- 3.2. Mean values of x-powers -- 3.3. Tackling fisher -- 3.4. The present Cramer-Rao inequality -- 4. The harmonic oscillator (HO) in Tsallis statistics -- 4.1. The HO-Tsallis partition function -- 4.2. HO-Tsallis mean values for r2
4.3. Mean value of r -- 4.4. Variance V -- 4.5. The HO-Tsallis Fisher information measure -- 5. Failure of the Boltzmann-Gibbs (BG) statistics for Newton's gravitation -- 5.1. Tackling Znu -- 5.2. Mean values derived from our partition function (PP) -- 5.2.1. r-Value -- 5.2.2. The r2 instance -- 5.3. Variance Deltar = r2-r2 -- 5.4. Gravitational FIM -- 5.5. Incompatibility between Boltzmann-Gibbs statistics (BGS) and long-range interactions -- 6. Statistics of gravitation in Tsallis statistics -- 6.1. Gravity-Tsallis partition function -- 6.2. Gravity-Tsallis mean values for r and r2 -- 6.3. Tsallis Gravity treatment and Fisher's information measure -- 6.4. Tsallis Gravity treatment and Cramer-Rao inequality (CRI) -- 7. Conclusions -- References -- Chapter 5: Information geometry and classical Cramér-Rao-type inequalities -- 1. Introduction -- 2. I-divergence and Iα-divergence -- 2.1. Extension to infinite X -- 2.2. Bregman vs Csiszár -- 2.3. Classical vs quantum CR inequality -- 3. Information geometry from a divergence function -- 3.1. Information geometry for α-CR inequality -- 3.2. An α-version of Cramér-Rao inequality -- 3.3. Generalized version of Cramér-Rao inequality -- 4. Information geometry for Bayesian CR inequality and Barankin bound -- 5. Information geometry for Bayesian α-CR inequality -- 6. Information geometry for Hybrid CR inequality -- 7. Summary -- Acknowledgments -- Appendix -- A.1. Other generalizations of Cramér-Rao inequality -- References -- Section II: Theoretical applications and physics -- Chapter 6: Principle of minimum loss of Fisher information, arising from the Cramer-Rao inequality: Its role i -- 1. Introduction (Fisher, 1922 -- Frieden 1998, 2004 -- Frieden and Gatenby, 2019) -- 1.1. On learning, energy, sensory messages -- 1.2. On variational approaches -- 1.3. Vital role played by information
2. Overview and comparisons of applications -- 2.1. Classical dynamics (Frieden, 1998, 2004 -- Frieden and Gatenby, 2007) -- 2.2. Quantum physics (Frieden, 1998, 2004) -- 2.3. Biology (Darwin, 1859 -- Fisher, 1922 -- Frieden and Gatenby, 2020 -- Gatenby and Frieden, 2016 -- Hodgkin and Huxley, 1952) -- 2.4. Thermodynamics (Frieden et al., 1999) -- 2.5. Extending use of the principle of natural selection (Popper, 1963) -- 2.6. From biological cell to earth to solar system, galaxy, universe, and multiverse -- 2.7. Creation of a multiverse (Popper, 1963) by requiring its Fisher I to be maximized -- 2.8. Analogy of a cancer "universe" -- 2.9. What ultimately causes a multiverse to form? -- 2.10. Is there empirical evidence for a multiverse having formed? -- 2.11. Details of the process of growing successive universes (Frieden and Gatenby, 2019) -- 2.12. How many universes N might exist in the multiverse? -- 2.13. Annihilation of universes -- 2.14. Growth of a bubble of nothing -- 2.15. Counter-growth of new universes -- 2.16. Possibility of many annihilation waves -- 2.17. How large a number N of universes exist (Linde and Vanchurin, 2010)? -- 2.18. Is the multiverse merely a theoretical construct? -- 2.19. Should the fact that we do not, and have not observed life elsewhere in our universe affect a belief that we exist ... -- 3. Derivation of principle of maximum Fisher information (MFI) -- 3.1. Cramer-Rao (C-R) inequality (Frieden, 1998, 2004 -- Frieden and Gatenby, 2020) -- 3.2. On derivation of the C-R inequality -- 3.3. What do such data values (augmented by knowledge of a single equality obeyed by the system physics) have to say abou ... -- 3.3.1. Dependence of system knowledge on the arbitrary nature of forming the data -- 3.3.2. Dependence on dimensionality -- 3.3.3. Dependence of system complexity (or order) upon Fisher I.
4. Kantian view of Fisher information use to predict a physical law -- 4.1. How principle of maximum information originates with Kant -- 4.2. On significance of the information difference I-J -- 5. Principle of minimum loss of Fisher information -- 5.1. Verifying that minimum loss is actually achieved by the principle -- 5.2. Summary and foundations of the Fisher approach to knowledge acquisition -- 5.3. What is accomplished by use of the Fisher approach -- 6. Commonality of information-based growths of cancer and viral infections -- 6.1. MFI applied to early cancer growth -- 6.2. Later-stage cancer growth -- 6.3. MFI applied to early covid-19 growth -- 6.4. Common biological causes of cancer- and covid-19 growth -- the ACE2 link -- References -- Chapter 7: Quantum metrology and quantum correlations -- 1. Quantum correlations -- 2. Parameter estimation -- 3. Cramer-Rao bound -- 4. Quantum Fisher information -- 5. Quantum correlations in estimation theory -- 5.1. Heisenberg limit -- 5.2. Interferometric power -- 6. Conclusion -- References -- Chapter 8: Information, economics, and the Cramér-Rao bound -- 1. Introduction -- 2. Shannon entropy and Fisher information -- 3. Financial economics -- 3.1. Discount factors and bonds -- 3.2. Derivative securities -- 4. Macroeconomics -- 5. Discussion and summary -- Acknowledgments -- References -- Chapter 9: Zipf's law results from the scaling invariance of the Cramer-Rao inequality -- 1. Introduction -- 2. Our goal -- 3. Fisher's information measure (FIM) and its minimization -- 4. Derivation of Zipf's law -- 5. Zipf plots -- 6. Summary -- References -- Further reading -- Section III: Advanced statistical theory -- Chapter 10: λ-Deformed probability families with subtractive and divisive normalizations -- 1. Introduction -- 1.1. Deformation models -- 1.2. Deformed probability families: General approach
1.3. Chapter outline -- 2. λ-Deformation of exponential and mixture families -- 2.1. λ-Deformation -- 2.2. Deformation: Subtractive approach -- 2.3. Deformation: Divisive approach -- 2.4. Relation between the two normalizations -- 2.5. λ-Exponential and λ-mixture families -- 3. Deforming Legendre duality: λ-Duality -- 3.1. From Bregman divergence to λ-logarithmic divergence -- 3.2. λ-Deformed Legendre duality -- 3.3. Relationship between λ-conjugation and Legendre conjugation -- 3.4. Information geometry of λ-logarithmic divergence -- 4. λ-Deformed entropy and divergence -- 4.1. Relation between potential functions and Rényi entropy -- 4.2. Relation between λ-logarithmic divergence and Rényi divergence -- 4.3. Entropy maximizing property of λ-exponential family -- 5. Example: λ-Deformation of the probability simplex -- 5.1. λ-Exponential representation -- 5.2. λ-Mixture representation -- 6. Summary and conclusion -- References -- Chapter 11: Some remarks on Fisher information, the Cramer-Rao inequality, and their applications to physics -- 1. Introduction -- 2. Diffusion equation -- 3. Connection with Tsallis statistics -- 4. Conclusions -- Appendix -- A.1. The Cramer-Rao bound (Frieden, 1989) -- References -- Index
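As a concrete illustration of the Rao distance studied in Chapter 3, the following is a standard worked example (not reproduced from the book). For univariate normal distributions N(μ, σ²), the Fisher-Rao metric is ds² = (dμ² + 2 dσ²)/σ², a rescaled hyperbolic metric, and the geodesic (Rao) distance between N(μ₁, σ₁²) and N(μ₂, σ₂²) has the closed form
\[
d_R = 2\sqrt{2}\,\tanh^{-1}\delta,
\qquad
\delta = \sqrt{\frac{(\mu_1-\mu_2)^2 + 2(\sigma_1-\sigma_2)^2}{(\mu_1-\mu_2)^2 + 2(\sigma_1+\sigma_2)^2}}.
\]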
Title Information geometry
URI https://cir.nii.ac.jp/crid/1130008276642910219
https://ebookcentral.proquest.com/lib/[SITE_ID]/detail.action?docID=6734951
https://www.vlebooks.com/vleweb/product/openreader?id=none&isbn=9780323855686
Volume 45
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
linkProvider Elsevier