Hidden Markov modeling for maximum probability neuron reconstruction

Bibliographic Details
Published in: Communications Biology, Vol. 5, No. 1, Article 388 (11 pages)
Main Authors: Athey, Thomas L.; Tward, Daniel J.; Mueller, Ulrich; Vogelstein, Joshua T.; Miller, Michael I.
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 25.04.2022
ISSN: 2399-3642
DOI: 10.1038/s42003-022-03320-0

More Information
Summary: Recent advances in brain clearing and imaging have made it possible to image entire mammalian brains at sub-micron resolution. These images offer the potential to assemble brain-wide atlases of neuron morphology, but manual neuron reconstruction remains a bottleneck. Several automatic reconstruction algorithms exist, but most focus on single neuron images. In this paper, we present a probabilistic reconstruction method, ViterBrain, which combines a hidden Markov state process that encodes neuron geometry with a random field appearance model of neuron fluorescence. ViterBrain utilizes dynamic programming to compute the global maximizer of what we call the most probable neuron path. We applied our algorithm to imperfect image segmentations, and showed that it can follow axons in the presence of noise or nearby neurons. We also provide an interactive framework where users can trace neurons by fixing start and endpoints. ViterBrain is available in our open-source Python package brainlit.
ViterBrain is an automated probabilistic reconstruction method that can reconstruct neuronal geometry and processes from microscopy images, with code available in the open-source Python package, brainlit.
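
The summary describes ViterBrain as a hidden Markov state process solved with dynamic programming for the most probable neuron path. As a rough, generic illustration of that class of computation only (not the ViterBrain implementation, whose states are image fragments and whose emission model is a random field over fluorescence), a minimal log-space Viterbi recursion for a discrete HMM is sketched below; the function name, arguments, and toy inputs are assumptions for illustration.

import numpy as np

def viterbi(log_init, log_trans, log_emit):
    """Most probable state sequence of a discrete HMM (log-space Viterbi).

    log_init  : (S,)   log prior over the S states
    log_trans : (S, S) log transition matrix, log_trans[i, j] = log P(state j | state i)
    log_emit  : (T, S) log likelihood of each of the T observations under each state
    """
    T, S = log_emit.shape
    score = np.full((T, S), -np.inf)          # best log score of any path ending in each state
    backptr = np.zeros((T, S), dtype=int)     # best predecessor state, for traceback

    score[0] = log_init + log_emit[0]
    for t in range(1, T):
        # candidate[i, j]: score of being in state i at t-1 and moving to state j at t
        candidate = score[t - 1][:, None] + log_trans + log_emit[t][None, :]
        backptr[t] = np.argmax(candidate, axis=0)
        score[t] = np.max(candidate, axis=0)

    # Trace back the globally optimal path from the best final state
    path = np.zeros(T, dtype=int)
    path[-1] = int(np.argmax(score[-1]))
    for t in range(T - 2, -1, -1):
        path[t] = backptr[t + 1, path[t + 1]]
    return path, float(score[-1, path[-1]])

# Toy usage with random placeholder probabilities (3 states, 5 observations)
rng = np.random.default_rng(0)
S, T = 3, 5
log_init = np.log(np.full(S, 1.0 / S))
log_trans = np.log(rng.dirichlet(np.ones(S), size=S))   # each row sums to 1
log_emit = np.log(rng.dirichlet(np.ones(S), size=T))    # placeholder emission likelihoods
path, log_score = viterbi(log_init, log_trans, log_emit)
print(path, log_score)

Dynamic programming of this kind returns the globally optimal state sequence rather than a greedy one, which is the property the summary highlights when it says ViterBrain computes the global maximizer of the most probable neuron path.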