On Using CFD and Experimental Data to Train an Artificial Neural Network to Reconstruct ECVT Images: Application for Fluidized Bed Reactors

Bibliographic Details
Published in: Processes, Vol. 12, No. 2, p. 386
Main Authors: Montilla, Carlos; Ansart, Renaud; Majji, Anass; Nadir, Ranem; Cid, Emmanuel; Simoncini, David; Negny, Stephane
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.02.2024
ISSN: 2227-9717
DOI: 10.3390/pr12020386

More Information
Summary: Electrical capacitance volume tomography (ECVT) is an experimental technique capable of reconstructing the 3D solid volume fraction distribution inside a sensing region. This technique has been used in fluidized beds because it gives access to data that are very difficult to obtain with other experimental devices. Recently, artificial neural networks have been proposed as a new type of reconstruction algorithm for ECVT devices. One of the main drawbacks of neural networks is that they need a database of previously reconstructed images to learn from. Previous works have used databases with very simple or limited configurations that may not be well suited to the complex dynamics of fluidized bed configurations. In this work, we study two different approaches: a supervised learning approach that uses simulated data as a training database, and a reinforcement learning approach that relies only on experimental data. Our results show that both techniques can perform as well as the classical algorithms. However, once the neural networks are trained, the reconstruction process is much faster than with the classical algorithms.
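
For readers unfamiliar with this class of method, the sketch below illustrates the general idea of learning-based ECVT reconstruction described in the summary: a neural network maps a vector of capacitance measurements to a voxel grid of solid volume fractions, trained by supervised regression. This is purely an assumption-laden illustration, not the authors' architecture or training procedure; the network sizes, the number of capacitance measurements, the voxel grid, and the random placeholder data (which in the supervised approach would come from CFD simulations) are all hypothetical.

import torch
import torch.nn as nn

n_measurements = 66          # assumed number of independent electrode-pair capacitances
grid_shape = (10, 10, 20)    # assumed voxel grid of the sensing region
n_voxels = 10 * 10 * 20

# Small fully connected regressor: capacitance vector -> flattened volume fraction grid
model = nn.Sequential(
    nn.Linear(n_measurements, 512),
    nn.ReLU(),
    nn.Linear(512, 512),
    nn.ReLU(),
    nn.Linear(512, n_voxels),
    nn.Sigmoid(),            # solid volume fraction is bounded between 0 and 1
)

# Placeholder training set: random tensors stand in for a database of
# (capacitance, solid fraction field) pairs generated from CFD simulations.
capacitances = torch.rand(1000, n_measurements)
volume_fractions = torch.rand(1000, n_voxels)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(100):
    optimizer.zero_grad()
    pred = model(capacitances)
    loss = loss_fn(pred, volume_fractions)
    loss.backward()
    optimizer.step()

# After training, reconstructing a new frame is a single forward pass,
# which is why inference is much faster than iterative classical algorithms.
with torch.no_grad():
    image = model(torch.rand(1, n_measurements)).reshape(grid_shape)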