Tuning Convolutional Neural Network Hyperparameters by Bare Bones Fireworks Algorithm

Ira TUBA1, Mladen VEINOVIC1, Eva TUBA1, Romana CAPOR HROSIK2, Milan TUBA1
1 Singidunum University, 32 Danijelova Street, 11000, Belgrade, Serbia (*Corresponding author)
2 University of Dubrovnik, 12 Kneza Damjana Jude Street, 20000, Dubrovnik, Croatia

Abstract: Digital image classification is an important component of various applications. Lately, convolutional neural networks have been widely used as classifiers since they achieve superior results while their application remains relatively simple. In order to achieve the best possible results, tuning of the network's hyperparameters is necessary, but this represents an exponentially hard optimization problem with a computationally very expensive fitness function. Swarm intelligence algorithms have proven effective in solving such exponentially hard optimization problems; however, their application to this particular problem has not been sufficiently studied. In this paper, convolutional neural network hyperparameters were tuned by the bare bones fireworks algorithm. The quality of the proposed method was tested on two standard benchmark datasets, CIFAR-10 and MNIST. The results were compared to those of CIFAR-Net, LeNet-5 and networks optimized by the harmony search algorithm, and the proposed method achieved better classification accuracy. The proposed method for CNN hyperparameter tuning improved the classification accuracy to 99.34% on the MNIST dataset and to 75.51% on the CIFAR-10 dataset, compared to 99.25% and 74.76% reported by another method from the specialized literature.

Keywords: Convolutional neural networks, Hyperparameter tuning, Optimization, Swarm intelligence, Bare bones fireworks algorithm.


Ira TUBA, Mladen VEINOVIC, Eva TUBA, Romana CAPOR HROSIK, Milan TUBA, Tuning Convolutional Neural Network Hyperparameters by Bare Bones Fireworks Algorithm, Studies in Informatics and Control, ISSN 1220-1766, vol. 31(1), pp. 25-35, 2022.
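For context on the optimizer named in the abstract, the bare bones fireworks algorithm maintains a single firework (candidate solution) and a self-adapting explosion amplitude: sparks are sampled uniformly around the firework, the amplitude widens when a spark improves the solution and shrinks otherwise. The following is a minimal illustrative sketch on a toy continuous objective, not the authors' implementation; all parameter names (`n_sparks`, `c_a`, `c_r`) are assumptions chosen for illustration.

```python
import random


def bbfwa(fitness, lower, upper, n_sparks=20, c_a=1.2, c_r=0.9, iters=200):
    """Bare bones fireworks algorithm sketch: one firework, one amplitude.

    fitness      -- function to minimize, maps a list of floats to a float
    lower, upper -- per-dimension search bounds
    c_a > 1      -- amplification factor applied on improvement
    c_r < 1      -- reduction factor applied when no spark improves
    """
    dim = len(lower)
    # Initialize the firework randomly and the amplitude to the full range.
    x = [random.uniform(lower[d], upper[d]) for d in range(dim)]
    fx = fitness(x)
    amp = [upper[d] - lower[d] for d in range(dim)]

    for _ in range(iters):
        best, f_best = x, fx
        # Sample sparks uniformly inside the current explosion amplitude,
        # clipped back into the feasible region.
        for _ in range(n_sparks):
            s = [min(max(x[d] + random.uniform(-amp[d], amp[d]), lower[d]),
                     upper[d]) for d in range(dim)]
            fs = fitness(s)
            if fs < f_best:
                best, f_best = s, fs
        if f_best < fx:
            # A spark improved the firework: move there and widen the search.
            x, fx = best, f_best
            amp = [a * c_a for a in amp]
        else:
            # No improvement: narrow the search around the current firework.
            amp = [a * c_r for a in amp]
    return x, fx


# Toy usage: minimize the 2-D sphere function over [-5, 5]^2.
best, val = bbfwa(lambda v: sum(t * t for t in v), [-5.0] * 2, [5.0] * 2)
```

In the paper's setting the fitness function would instead train and evaluate a CNN for a given hyperparameter vector, which is why the abstract stresses that each evaluation is computationally very expensive.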