ROMJIST Volume 25, No. 2, 2022, pp. 150-165
Amit VERMA, Toshanlal MEENPAL, Bibhudendra ACHARYA
Computational Cost Reduction of Convolution Neural Networks by Insignificant Filter Removal
ABSTRACT: Convolutional Neural Networks (CNNs) are widely employed in a range of computer vision applications such as image classification and text recognition. While delivering excellent results across many applications, these high-performing CNNs are computationally intensive because they depend on a huge number of parameters, which limits their usability on lower-end CPUs. To address these limitations, we propose an eigenvalue-based framework (EVF) that reduces computational cost by removing insignificant convolution-layer filters from the network while maintaining similar accuracy. The proposed method is architecture independent and can easily be deployed on existing deep learning platforms. Experiments have been carried out on standard VGG-16 and AlexNet models. The resulting reduced models outperform the original models in terms of computational cost while maintaining similar accuracy. We have also compared the proposed EVF with state-of-the-art methods and achieved comparable accuracy after filter pruning alone, without further retraining.

KEYWORDS: Computational cost reduction, convolutional filter, VGG-16, AlexNet
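
The abstract does not spell out how filter insignificance is scored, so the following is only a minimal illustrative sketch of one plausible eigenvalue-based filter-pruning step, not the paper's actual EVF procedure. The function names (`filter_scores`, `prune_insignificant_filters`) and the `keep_ratio` parameter are assumptions introduced here for illustration; the sketch scores each filter by the dominant eigenvalue of its weight Gram matrix and keeps the top-scoring filters.

```python
# Illustrative sketch only: NOT the paper's exact EVF criterion.
import torch
import torch.nn as nn


def filter_scores(conv: nn.Conv2d) -> torch.Tensor:
    """Score each output filter by the dominant eigenvalue of its Gram matrix."""
    w = conv.weight.data                     # shape: (out_ch, in_ch, kH, kW)
    out_ch = w.shape[0]
    scores = torch.empty(out_ch)
    for f in range(out_ch):
        m = w[f].reshape(w.shape[1], -1)     # (in_ch, kH*kW)
        gram = m @ m.t()                     # symmetric positive semi-definite
        scores[f] = torch.linalg.eigvalsh(gram)[-1]  # largest eigenvalue
    return scores


def prune_insignificant_filters(conv: nn.Conv2d, keep_ratio: float = 0.5) -> nn.Conv2d:
    """Return a new Conv2d keeping only the top `keep_ratio` fraction of filters.

    Note: in a full pipeline the *next* layer's input channels must be sliced
    with the same indices; that bookkeeping is omitted here.
    """
    scores = filter_scores(conv)
    n_keep = max(1, int(keep_ratio * conv.out_channels))
    keep_idx = torch.argsort(scores, descending=True)[:n_keep]

    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    pruned.weight.data = conv.weight.data[keep_idx].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[keep_idx].clone()
    return pruned


if __name__ == "__main__":
    layer = nn.Conv2d(64, 128, kernel_size=3, padding=1)
    smaller = prune_insignificant_filters(layer, keep_ratio=0.5)
    print(layer.out_channels, "->", smaller.out_channels)   # 128 -> 64
```

Because pruning happens directly on the trained weights, no retraining is required for this step, which is consistent with the abstract's claim of comparable accuracy after filter pruning alone.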
