FReLU: Flexible Rectified Linear Units for Improving Convolutional Neural Networks
| Published in | 2018 24th International Conference on Pattern Recognition (ICPR), pp. 1223-1228 |
|---|---|
| Main Authors | , , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 01.08.2018 |
| DOI | 10.1109/ICPR.2018.8546022 |
Summary: Rectified linear unit (ReLU) is a widely used activation function for deep convolutional neural networks. However, because of its zero-hard rectification, ReLU networks lose the benefits of negative values. In this paper, we propose a novel activation function called flexible rectified linear unit (FReLU) to further explore the effects of negative values. By redesigning the rectified point of ReLU as a learnable parameter, FReLU expands the states of the activation output. When a network is successfully trained, FReLU tends to converge to a negative value, which improves expressiveness and thus performance. Furthermore, FReLU is designed to be simple and effective, avoiding exponential functions to keep computation low-cost. To be easily used in various network architectures, FReLU is self-adaptive and does not rely on strict assumptions. We evaluate FReLU on three standard image classification datasets: CIFAR-10, CIFAR-100, and ImageNet. Experimental results show that FReLU achieves fast convergence and competitive performance on both plain and residual networks.
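The summary describes FReLU as a ReLU whose rectified point is a learnable parameter, so the unit can converge to a negative offset during training. Below is a minimal sketch of that idea as a PyTorch-style module; the framework choice, the per-channel granularity of the learnable shift `bias`, and its zero initialization are illustrative assumptions rather than details taken from this record.

```python
import torch
import torch.nn as nn


class FReLU(nn.Module):
    """Sketch of a flexible rectified linear unit: a ReLU whose output is
    shifted by a learnable parameter, allowing negative activation values."""

    def __init__(self, num_channels: int):
        super().__init__()
        # One learnable shift per channel (assumed granularity); initialized
        # at zero so the unit starts out as a plain ReLU and can drift
        # negative as training progresses.
        self.bias = nn.Parameter(torch.zeros(num_channels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x is assumed to have shape (N, C, H, W); reshape the per-channel
        # shift so it broadcasts over batch and spatial dimensions.
        return torch.relu(x) + self.bias.view(1, -1, 1, 1)


if __name__ == "__main__":
    act = FReLU(num_channels=16)
    out = act(torch.randn(2, 16, 8, 8))
    print(out.shape)  # torch.Size([2, 16, 8, 8])
```

Because the shift is applied after the rectification, the extra cost over a plain ReLU is a single broadcast addition, which is consistent with the summary's emphasis on avoiding exponential functions.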