Traffic Flow Imputation Using Parallel Data and Generative Adversarial Networks

Bibliographic Details
Published in: IEEE Transactions on Intelligent Transportation Systems, Vol. 21, No. 4, pp. 1624-1630
Main Authors: Chen, Yuanyuan; Lv, Yisheng; Wang, Fei-Yue
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.04.2020
ISSN: 1524-9050, 1558-0016
DOI: 10.1109/TITS.2019.2910295

Summary: Traffic data imputation is critical for both research on and applications of intelligent transportation systems. To develop traffic data imputation models with high accuracy, traffic data must be large and diverse, which is costly to collect. An alternative is to use synthetic traffic data, which is cheap and easy to access. In this paper, we propose a novel approach that uses parallel data and generative adversarial networks (GANs) to enhance traffic data imputation. Parallel data is a recently proposed framework that uses synthetic and real data jointly for data mining and data-driven processes; within this framework, we apply GANs to generate the synthetic traffic data. Because the standard GAN algorithm struggles to generate time-dependent traffic flow data, we make two modifications: 1) using real or corrupted data, instead of random vectors, as the latent codes fed to the generator within the GAN, and 2) introducing a representation loss that measures the discrepancy between the synthetic data and the real data. Experimental results on a real traffic dataset demonstrate that our method significantly improves the performance of traffic data imputation.
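
To make the two modifications concrete, the following is a minimal sketch of how they could be realized, assuming PyTorch. The network sizes, the mask-based corruption, and the MSE form chosen here for the representation loss are assumptions for illustration, not the paper's exact architecture or loss.

    # Minimal sketch of the two GAN modifications described in the abstract,
    # assuming PyTorch. Network sizes, the mask-based corruption, and the MSE
    # form of the representation loss are illustrative assumptions, not the
    # authors' exact design.
    import torch
    import torch.nn as nn

    DIM = 288  # hypothetical: one day of traffic flow at 5-minute resolution

    class Generator(nn.Module):
        def __init__(self, dim=DIM):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(dim, 256), nn.ReLU(),
                nn.Linear(256, 256), nn.ReLU(),
                nn.Linear(256, dim),
            )

        def forward(self, corrupted):
            # Modification 1: the latent code is a real or corrupted flow
            # profile rather than a random noise vector.
            return self.net(corrupted)

    class Discriminator(nn.Module):
        def __init__(self, dim=DIM):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(dim, 256), nn.ReLU(),
                nn.Linear(256, 1),  # real/fake logit
            )

        def forward(self, x):
            return self.net(x)

    def train_step(G, D, opt_g, opt_d, bce, real, mask, lam=10.0):
        corrupted = real * mask          # zero out "missing" readings
        fake = G(corrupted)
        ones = torch.ones(real.size(0), 1)
        zeros = torch.zeros(real.size(0), 1)

        # Discriminator update: distinguish real from imputed profiles.
        opt_d.zero_grad()
        loss_d = bce(D(real), ones) + bce(D(fake.detach()), zeros)
        loss_d.backward()
        opt_d.step()

        # Generator update: adversarial loss plus, per modification 2, a
        # representation loss penalizing discrepancy between synthetic and
        # real data (here an MSE over observed entries, an assumed form).
        opt_g.zero_grad()
        rep = ((fake - real) * mask).pow(2).mean()
        loss_g = bce(D(fake), ones) + lam * rep
        loss_g.backward()
        opt_g.step()
        return loss_d.item(), loss_g.item()

    # Toy usage with placeholder data.
    G, D = Generator(), Discriminator()
    opt_g = torch.optim.Adam(G.parameters(), lr=1e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=1e-4)
    bce = nn.BCEWithLogitsLoss()
    real = torch.rand(32, DIM)                  # placeholder flow batch
    mask = (torch.rand(32, DIM) > 0.2).float()  # ~20% of entries "missing"
    train_step(G, D, opt_g, opt_d, bce, real, mask)

Feeding the corrupted profile to the generator ties each synthetic sample to an observed one, and the weighted representation term anchors the adversarial objective to the observed entries; the weight lam trades off realism against fidelity.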