A Gaussian mixture filter with adaptive refinement for nonlinear state estimation
| Published in | Signal Processing, Vol. 201, p. 108677 |
|---|---|
| Main Authors | , |
| Format | Journal Article |
| Language | English |
| Published | Elsevier B.V., 01.12.2022 |
| ISSN | 0165-1684 |
| DOI | 10.1016/j.sigpro.2022.108677 |
Summary:

- A feedforward neural network is used to approximate the state equation for better tractability of system dynamics.
- Adaptive refinement of Gaussian mixtures in both prior and posterior state PDFs via assessment of nonlinearity.
- The convergence rates of nonlinearity measures and the bound of state PDF estimation errors are both quantified.
State estimation for highly nonlinear dynamic systems is difficult because the probability distribution of their states can be highly non-Gaussian. An adaptive Gaussian mixture filter is developed in this work to address this challenge: the Gaussian mixture models are refined according to the local severity of the system's nonlinearity to attain a high-fidelity estimate of the state distribution. A set of nonlinearity assessment criteria is designed to trigger the splitting of Gaussian components at both the prediction and update stages of Bayesian filtering, and an error bound on the estimated distribution is established. The new filter has been benchmarked against existing methods on two challenging problems; it consistently delivers accuracy among the best of the compared methods at a reasonable computational cost, indicating that it can serve as a reliable state estimator for engineering systems with highly nonlinear dynamics that are subject to large uncertainties.
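The abstract describes the core mechanism at a high level: assess how strongly the dynamics bend each Gaussian component, and split components wherever a single Gaussian (or its linearization) can no longer represent the propagated density. The Python sketch below illustrates that general idea only; it is a minimal sketch under stated assumptions, not the paper's method. The pendulum dynamics `f`, the sigma-point nonlinearity score, the three-way eigenvector split, and the threshold and budget values are all illustrative substitutes for the paper's neural-network dynamics surrogate, nonlinearity assessment criteria, and error bounds.

```python
# Illustrative sketch of adaptive component splitting in the prediction step
# of a Gaussian mixture filter. NOT the paper's algorithm: the dynamics `f`,
# the nonlinearity score, the splitting rule, and all thresholds are
# assumptions chosen for illustration only.
import numpy as np

def f(x, dt=0.1):
    """Hypothetical nonlinear dynamics: an Euler-discretized pendulum,
    standing in for the paper's neural-network surrogate of the state equation."""
    theta, omega = x
    return np.array([theta + dt * omega, omega - dt * 9.81 * np.sin(theta)])

def jacobian_fd(func, x, eps=1e-6):
    """Finite-difference Jacobian of `func` at `x`."""
    fx, n = func(x), x.size
    J = np.zeros((n, n))
    for i in range(n):
        dx = np.zeros(n)
        dx[i] = eps
        J[:, i] = (func(x + dx) - fx) / eps
    return J

def nonlinearity_score(func, mean, cov):
    """Crude local-nonlinearity score (a stand-in for the paper's criteria):
    worst-case mismatch between the nonlinear map of the +/-1-sigma points
    and their linearized images, relative to the scale of the mapped mean."""
    J, fm = jacobian_fd(func, mean), func(mean)
    L = np.linalg.cholesky(cov)  # columns of L are the sigma directions
    worst = 0.0
    for col in L.T:
        for s in (+1.0, -1.0):
            worst = max(worst,
                        np.linalg.norm(func(mean + s * col) - (fm + J @ (s * col))))
    return worst / (np.linalg.norm(fm) + 1e-12)

def split_component(w, mean, cov, spread=1.0):
    """Moment-preserving 3-way split along the dominant eigenvector: the
    mixture's mean and covariance are unchanged, but each child is narrower
    along the split direction. Needs spread < sqrt(3/2) to keep covariances
    positive definite."""
    vals, vecs = np.linalg.eigh(cov)
    lam, v = vals[-1], vecs[:, -1]
    offset = spread * np.sqrt(lam) * v
    cov_new = cov - (2.0 / 3.0) * spread**2 * lam * np.outer(v, v)
    return [(w / 3, mean - offset, cov_new),
            (w / 3, mean.copy(), cov_new),
            (w / 3, mean + offset, cov_new)]

def predict(components, func, Q, threshold=0.05, max_components=64):
    """One mixture prediction step: split components whose local nonlinearity
    exceeds `threshold` (under a crude component budget), then propagate each
    component through an EKF-style first-order linearization of `func`."""
    refined = []
    for w, m, P in components:
        if nonlinearity_score(func, m, P) > threshold and len(refined) + 3 <= max_components:
            refined.extend(split_component(w, m, P))
        else:
            refined.append((w, m, P))
    out = []
    for w, m, P in refined:
        J = jacobian_fd(func, m)
        out.append((w, func(m), J @ P @ J.T + Q))
    return out

# Usage: start from one broad component and run a few prediction steps; the
# mixture grows only where the dynamics are locally far from linear.
comps = [(1.0, np.array([1.0, 0.0]), np.diag([0.5, 0.5]))]
Q = 1e-4 * np.eye(2)
for _ in range(3):
    comps = predict(comps, f, Q)
print(len(comps), "components; total weight =", round(sum(w for w, _, _ in comps), 6))
```

The three-way split used here preserves the mixture's overall mean and covariance while shrinking each child's variance along the most nonlinear direction, which is the basic reason splitting improves fidelity: narrower components stay in regions where a linearized, single-Gaussian approximation remains adequate.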