SmaAt-UNet: Precipitation nowcasting using a small attention-UNet architecture

Bibliographic Details
Published in: Pattern Recognition Letters, Vol. 145, pp. 178-186
Main Authors: Trebing, Kevin; Stańczyk, Tomasz; Mehrkanoon, Siamak
Format: Journal Article
Language: English
Published: Amsterdam: Elsevier B.V., 01.05.2021
ISSN: 0167-8655, 1872-7344
DOI: 10.1016/j.patrec.2021.01.036

Summary:
•A novel SmaAt-UNet model is introduced.
•The core UNet model is equipped with attention mechanisms and depthwise-separable convolutions.
•The model requires a quarter of the UNet model's parameters whilst achieving comparable prediction performance.
•The proposed model is examined on real-life datasets consisting of precipitation maps and cloud cover.
Weather forecasting is dominated by numerical weather prediction, which tries to model the physical properties of the atmosphere accurately. A downside of numerical weather prediction is that it lacks the ability to make short-term forecasts using the latest available information. Using a data-driven neural network approach, we show that it is possible to produce an accurate precipitation nowcast. To this end, we propose SmaAt-UNet, an efficient convolutional neural network based on the well-known UNet architecture and equipped with attention modules and depthwise-separable convolutions. We evaluate our approach on real-life datasets using precipitation maps from the region of the Netherlands and binary images of cloud coverage over France. The experimental results show that, in terms of prediction performance, the proposed model is comparable to other examined models while using only a quarter of the trainable parameters.
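As a concrete illustration of the two ingredients named in the summary, below is a minimal PyTorch sketch of a depthwise-separable convolution and a CBAM-style channel-attention branch. This is an assumption-laden sketch, not the authors' implementation: the framework choice, class names, channel sizes, and the reduction factor are illustrative, and only the channel half of a CBAM-style attention module is shown.

```python
import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    """Depthwise-separable convolution: a per-channel (depthwise) 3x3 convolution
    followed by a 1x1 pointwise convolution, which uses far fewer parameters than
    a standard 3x3 convolution with the same channel counts."""
    def __init__(self, in_channels: int, out_channels: int, kernel_size: int = 3, padding: int = 1):
        super().__init__()
        # groups=in_channels applies one filter per input channel (depthwise step)
        self.depthwise = nn.Conv2d(in_channels, in_channels, kernel_size,
                                   padding=padding, groups=in_channels)
        # 1x1 convolution mixes information across channels (pointwise step)
        self.pointwise = nn.Conv2d(in_channels, out_channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.pointwise(self.depthwise(x))

class ChannelAttention(nn.Module):
    """CBAM-style channel attention: average- and max-pooled channel descriptors
    pass through a shared bottleneck MLP and become per-channel scaling weights."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # (b, c) descriptor from average pooling
        mx = self.mlp(x.amax(dim=(2, 3)))    # (b, c) descriptor from max pooling
        scale = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * scale                     # reweight each channel

# Hypothetical usage: 12 past precipitation maps stacked as input channels
x = torch.randn(1, 12, 288, 288)
block = nn.Sequential(DepthwiseSeparableConv(12, 64), ChannelAttention(64))
print(block(x).shape)  # torch.Size([1, 64, 288, 288])
```

The depthwise and pointwise steps together replace a dense 3x3 convolution, which is the main source of the parameter reduction the summary reports.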