Regularization of Deep Neural Networks with Average Pooling Dropout

El Houssaine HSSAYNI and Mohamed ETTAOUIL

Dropout has been proposed as a simple yet effective regularization technique to combat overfitting in deep neural networks, in particular in deep convolutional neural networks (DCNNs), which have shown impressive performance on several visual recognition problems and have attracted considerable interest in recent years. The pooling process plays a very important role in DCNNs: it reduces the dimensionality of the processed data, decreasing computational cost, while also helping to avoid overfitting and to enhance the generalization capability of the network. For this reason, we focus on the pooling layer and propose a new pooling method, named Average Pooling Dropout, which applies the dropout technique within the pooling region in order to avoid overfitting and to enhance the generalization ability of DCNNs. Experimental results on several image benchmarks show that Average Pooling Dropout outperforms existing pooling strategies in classification performance and is effective at improving the generalization ability of DCNNs. Moreover, we show that Average Pooling Dropout combined with other regularization methods, such as batch normalization, is competitive with existing techniques in classification performance.
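The abstract does not give the exact formulation, but the idea of applying dropout inside each pooling region before averaging can be sketched as follows. This is a hypothetical NumPy illustration, assuming inverted dropout (survivors rescaled by 1/(1-p)) within each window; the function name, the single-channel 2-D input, and the non-overlapping window layout are assumptions, not the authors' implementation.

```python
import numpy as np

def average_pooling_dropout(x, pool=2, p=0.5, training=True, rng=None):
    """Sketch of average pooling with dropout applied inside each
    pooling region (hypothetical formulation; the paper's exact
    scheme may differ).

    x: 2-D feature map whose dimensions are divisible by `pool`.
    p: probability of dropping a unit within a pooling window.
    """
    rng = np.random.default_rng() if rng is None else rng
    h, w = x.shape
    # Reshape so each non-overlapping pooling window gets its own axes.
    windows = x.reshape(h // pool, pool, w // pool, pool)
    if training:
        # Drop units inside each window and rescale the survivors
        # (inverted dropout) so the expected output is unchanged.
        mask = rng.random(windows.shape) >= p
        windows = windows * mask / (1.0 - p)
    # Average over each pooling window.
    return windows.mean(axis=(1, 3))

# At test time no units are dropped, so this reduces to plain
# average pooling over 2x2 windows.
x = np.arange(16, dtype=float).reshape(4, 4)
y = average_pooling_dropout(x, pool=2, p=0.5, training=False)
```

At training time the random mask makes each pooled value a stochastic average over the surviving units of its window, which is the regularizing effect the abstract attributes to combining dropout with the pooling stage.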

Volume 12 | 04-Special Issue

Pages: 1720-1726

DOI: 10.5373/JARDCS/V12SP4/20201654