One-Step Time-Dependent Future Video Frame Prediction with a Convolutional Encoder-Decoder Neural Network
There is an inherent need for autonomous cars, drones, and other robots to have a notion of how their environment behaves and to anticipate changes in the near future. In this work, we focus on anticipating future appearance given the current frame of a video. Existing work focuses on either predicting the future appearance as the next frame of a video, or predicting future motion as optical flow or motion trajectories starting from a single video frame. This work stretches the ability of CNNs (Convolutional Neural Networks) to predict an anticipation of appearance at an arbitrarily given future time, not necessarily the next video frame. We condition our predicted future appearance on a continuous time variable that allows us to anticipate future frames at a given temporal distance, directly from the input video frame. We show that CNNs can learn an intrinsic representation of typical appearance changes over time and successfully generate realistic predictions at a deliberate time difference in the near future.
| Published in | Image Analysis and Processing - ICIAP 2017, Vol. 10484, pp. 140-151 |
|---|---|
| Format | Book Chapter |
| Language | English |
| Published | Switzerland: Springer International Publishing AG, 2017 |
| Series | Lecture Notes in Computer Science |
| ISBN | 3319685597; 9783319685595 |
| ISSN | 0302-9743; 1611-3349 |
| DOI | 10.1007/978-3-319-68560-1_13 |