Nanni, L., Lumini, A., Ghidoni, S., & Maguolo, G. (2022). Comparisons among different stochastic selections of activation layers for convolutional neural networks for health care. Amsterdam: Elsevier. https://doi.org/10.1016/B978-0-323-85751-2.00003-7
Comparisons among different stochastic selections of activation layers for convolutional neural networks for health care
Nanni, Loris; Lumini, Alessandra; Ghidoni, Stefano; Maguolo, Gianluca
2022
Abstract
The classification of biological images is an important task with crucial applications in many fields, such as cell phenotype recognition, detection of cell organelles, and histopathological classification. Bioimage classification can help in early medical diagnosis by enabling automatic disease classification. In this chapter, the authors classify biomedical images using ensembles of neural networks. They build each ensemble from a ResNet50 architecture, employing a set of different activation functions: ReLU, Leaky ReLU, Parametric ReLU, Exponential Linear Unit, Adaptive Piecewise Linear Unit, S-Shaped ReLU, Swish, Mish, Mexican Linear Unit, Gaussian Linear Unit, Parametric Deformable Linear Unit, and Soft Root Sign, among others. They tested multiple ensembles combined with the sum rule, a method that fuses the results of multiple networks by averaging the output probability vectors of every network. Their tests were organized in two categories:
• They substituted every ReLU in the original network with each of the activations listed above. In this way, they created many different networks, each with a specific activation function different from those employed in the other networks (within a given network, however, every activation layer used the same function);
• They substituted every ReLU in the original network with a randomly selected activation, so every activation layer might contain a different activation function. They repeated this process multiple times, since its stochasticity yields a new network on every run, and experimented with different pools of activation functions.
As a baseline, they used an ensemble of neural networks that only use ReLU activations. They tested their networks on several small and medium-sized biomedical image data sets. The results show that their best ensemble outperforms the naive approaches.
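The second, stochastic strategy can be sketched in a few lines. The chapter's experiments are in MATLAB and use ResNet50; the PyTorch fragment below is only an illustration on a toy model, with a pool restricted to activations that ship with `torch.nn` (`nn.SiLU` is Swish); the names `POOL` and `randomize_activations` are this sketch's own, not the authors'.

```python
# Illustration of the stochastic-activation idea (not the authors' MATLAB
# code): every ReLU layer is swapped for an activation drawn at random
# from a pool, so each generated network can differ from the others.
import random
import torch.nn as nn

# A subset of the chapter's activation pool that exists in torch.nn;
# the chapter also uses less common functions (MeLU, SRS, PDELU, ...).
POOL = [nn.ReLU, nn.LeakyReLU, nn.ELU, nn.SiLU, nn.Mish]

def randomize_activations(model: nn.Module, pool=POOL) -> nn.Module:
    """Replace every nn.ReLU submodule with a randomly drawn activation."""
    for module in model.modules():
        for name, child in module.named_children():
            if isinstance(child, nn.ReLU):
                # setattr on an nn.Module re-registers the submodule,
                # so the new activation takes the old layer's place.
                setattr(module, name, random.choice(pool)())
    return model

# Toy model standing in for ResNet50 (the architecture used in the chapter):
net = randomize_activations(
    nn.Sequential(nn.Linear(8, 8), nn.ReLU(), nn.Linear(8, 4), nn.ReLU())
)
print(net)
```

Calling `randomize_activations` repeatedly on fresh copies of the base network produces the pool of distinct networks that the ensemble is built from.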
In order to encourage the reproducibility of this work, the MATLAB code of all the experiments will be shared at https://github.com/LorisNanni.
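The sum rule described in the abstract amounts to averaging the per-network probability vectors and taking the argmax. A minimal NumPy sketch (the shared code is MATLAB; the probability vectors below are made up for illustration):

```python
# Sum rule: average the output probability vectors of several networks,
# then predict the class with the highest averaged probability.
import numpy as np

def sum_rule(prob_vectors):
    """Combine networks via the sum rule.

    prob_vectors: list of arrays, one per network, each of shape
    (n_samples, n_classes) with rows summing to 1.
    Returns the predicted class index per sample.
    """
    avg = np.mean(prob_vectors, axis=0)  # element-wise average over networks
    return avg.argmax(axis=1)

# Hypothetical outputs of three networks on two samples, three classes:
net_a = np.array([[0.7, 0.2, 0.1], [0.1, 0.6, 0.3]])
net_b = np.array([[0.5, 0.3, 0.2], [0.2, 0.1, 0.7]])
net_c = np.array([[0.6, 0.3, 0.1], [0.1, 0.5, 0.4]])

print(sum_rule([net_a, net_b, net_c]))  # predicted classes: [0 2]
```

Averaging (rather than, say, majority voting) keeps the combination differentiable in each network's confidence, so a network that is very sure about a sample can outweigh two mildly unsure ones.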