Accelerating iterative ptychography with an integrated neural network

Bibliographic Details
Published in: Journal of Microscopy (Oxford), Vol. 300, No. 2, pp. 180-190
Main Authors: McCray, Arthur R. C.; Ribet, Stephanie M.; Varnavides, Georgios; Ophus, Colin
Format: Journal Article
Language: English
Published: England: Wiley Subscription Services, Inc., 01.11.2025
ISSN: 0022-2720, 1365-2818
DOI: 10.1111/jmi.13407

Summary: Electron ptychography is a powerful and versatile tool for high‐resolution and dose‐efficient imaging. Iterative reconstruction algorithms are powerful but also computationally expensive due to their relative complexity and the many hyperparameters that must be optimised. Gradient descent‐based iterative ptychography is a popular method, but it may converge slowly when reconstructing low spatial frequencies. In this work, we present a method for accelerating a gradient descent‐based iterative reconstruction algorithm by training a neural network (NN) that is applied in the reconstruction loop. The NN works in Fourier space and selectively boosts low spatial frequencies, thus enabling faster convergence in a manner similar to accelerated gradient descent algorithms. We discuss the difficulties that arise when incorporating a NN into an iterative reconstruction algorithm and show how they can be overcome with iterative training. We apply our method to simulated and experimental data of gold nanoparticles on amorphous carbon and show that we can significantly speed up ptychographic reconstruction of the nanoparticles.
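The core idea in the abstract — filtering a gradient update in Fourier space so that low spatial frequencies are amplified — can be illustrated with a minimal sketch. This is not the authors' trained network; it is an assumed stand-in using a fixed radial filter (the `boost` and `cutoff` parameters are hypothetical) to show where such an operator would sit in a gradient-descent update step:

```python
import numpy as np

def low_freq_boost(grad, boost=4.0, cutoff=0.1):
    """Amplify low spatial frequencies of a gradient in Fourier space.

    A fixed stand-in for the learned network: a smooth radial filter
    equal to `boost` at DC and decaying to 1 beyond `cutoff`
    (expressed as a fraction of the sampling frequency).
    """
    ny, nx = grad.shape
    fy = np.fft.fftfreq(ny)[:, None]   # vertical spatial frequencies
    fx = np.fft.fftfreq(nx)[None, :]   # horizontal spatial frequencies
    r = np.sqrt(fx**2 + fy**2)         # radial frequency of each pixel
    # Gaussian-shaped boost: strongest at DC, ~1 past the cutoff.
    filt = 1.0 + (boost - 1.0) * np.exp(-(r / cutoff) ** 2)
    return np.fft.ifft2(np.fft.fft2(grad) * filt)

# One gradient-descent step with the frequency-boosted gradient,
# in place of the plain update `obj - step * grad`:
obj = np.zeros((64, 64), dtype=complex)
grad = np.random.default_rng(0).standard_normal((64, 64))
obj = obj - 0.1 * low_freq_boost(grad)
```

In the paper's scheme the fixed filter above is replaced by a trained NN applied inside the reconstruction loop, which is what makes iterative training necessary: the network must be trained on the gradients the loop actually produces once the network itself is in the loop.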