Excessive Noise Injection Training of Neural Networks for Markerless Tracking in Obscured and Segmented Environments

Bibliographic Details
Published in: Neural Computation, Vol. 18, No. 9, pp. 2122-2145
Main Authors: Unsworth, C. P.; Coghill, G.
Format: Journal Article
Language: English
Published: MIT Press, Cambridge, MA, USA, 01.09.2006
ISSN: 0899-7667, 1530-888X
DOI: 10.1162/neco.2006.18.9.2122

More Information
Summary: In this letter, we demonstrate that the generalization properties of a neural network (NN) can be extended to encompass objects that obscure or segment the original image in its foreground or background. We achieve this by piloting an extension of the noise injection training technique, which we term excessive noise injection (ENI), on a simple feedforward multilayer perceptron (MLP) network trained with vanilla backward error propagation. Six tests are reported that show the ability of an NN to distinguish six similar states of motion of a simplified human figure that has become obscured by moving vertical and horizontal bars and by random blocks, at different levels of obscuration. Four more extensive tests are then reported to determine the bounds of the technique. The results from the ENI network were compared to results from the same NN trained on clean states only. The results provide strong evidence that it is possible to track a human subject behind objects using this technique, and thus the technique lends itself to a real-time markerless tracking system operating on a single video stream.
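
As a rough illustration of the kind of training the summary describes, the sketch below trains a one-hidden-layer MLP with plain backward error propagation while corrupting every training input with bar- and block-shaped occluders. This is only a minimal sketch of noise-injection-style training: the image size, network sizes, learning rate, occluder statistics, and the random prototype "motion state" images are all assumptions made for illustration, not details taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    H, W = 16, 16                                  # toy image size (assumption)
    N_STATES = 6                                   # six motion states, as in the summary

    # Stand-in data: one random prototype image per motion state. The paper
    # uses rendered simplified human figures; these prototypes are placeholders.
    prototypes = rng.random((N_STATES, H * W))

    def occlude(x):
        # Corrupt a flattened image with vertical/horizontal bars and random
        # blocks, loosely mirroring the obscuring objects in the summary.
        img = x.reshape(H, W).copy()
        if rng.random() < 0.5:
            img[:, rng.integers(W)] = rng.random()     # random vertical bar
        if rng.random() < 0.5:
            img[rng.integers(H), :] = rng.random()     # random horizontal bar
        for _ in range(3):                             # three random 4x4 blocks
            r, c = rng.integers(H - 4), rng.integers(W - 4)
            img[r:r + 4, c:c + 4] = rng.random()
        return img.ravel()

    # One-hidden-layer MLP trained by vanilla backprop on a squared-error loss.
    n_hid, lr = 32, 0.1
    W1 = rng.normal(0.0, 0.1, (n_hid, H * W)); b1 = np.zeros(n_hid)
    W2 = rng.normal(0.0, 0.1, (N_STATES, n_hid)); b2 = np.zeros(N_STATES)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    for epoch in range(500):
        for state in range(N_STATES):
            target = np.eye(N_STATES)[state]
            x = occlude(prototypes[state])         # ENI: corrupt every training input
            h = sigmoid(W1 @ x + b1)               # forward pass
            y = sigmoid(W2 @ h + b2)
            dy = (y - target) * y * (1.0 - y)      # output-layer error signal
            dh = (W2.T @ dy) * h * (1.0 - h)       # backpropagated hidden error
            W2 -= lr * np.outer(dy, h); b2 -= lr * dy
            W1 -= lr * np.outer(dh, x); b1 -= lr * dh

    # At test time the network sees an occlusion pattern it never saw verbatim;
    # the point of ENI-style training is that it should still recover the state.
    x_test = occlude(prototypes[2])
    pred = int(np.argmax(sigmoid(W2 @ sigmoid(W1 @ x_test + b1) + b2)))
    print("true state: 2, predicted:", pred)

Skipping the occlude call in the training loop (training on clean prototypes only) gives the comparison baseline the summary mentions.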