Assessing the feasibility of approximating higher-order problem signatures in Artificial Neural Networks with hybrid transfer functions
Problem signatures are patterns that hint at the computational strategy most likely to suit a given problem. One such pattern is the preferred choice of activation and output functions for a given problem in neural networks that implement transfer function optimization. We refer to these patterns as first-order signatures. Higher-order signatures capture information at a higher level, such as the likelihood of neural computation paths (i.e., connections between two or more transfer functions) used by the fittest models for specific problems; they also capture information about the weights of these paths.
In this paper, we show that higher-order problem signatures meet our proposed criteria for problem signatures: specifically, the signatures of the different datasets tested share many neural computation paths, which makes them appear similar at a glance, but after thresholding their differences become more apparent. We also show that the signatures were consistent regardless of the population size (P), the number of runs (R), or the size of the subsample used to approximate the signatures (N). In the case of the subsample size (N), however, this held only when the sample size was fixed during sampling.
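To make the idea concrete, a higher-order signature can be pictured as a matrix of neural computation path likelihoods (transfer function i followed by transfer function j) estimated from the fittest models, with thresholding used to expose where two datasets' signatures diverge. The following is a minimal illustrative sketch, not the paper's actual method; the transfer function set, the counts, and the threshold value are all hypothetical.

```python
import numpy as np

# Hypothetical set of transfer functions available to the hybrid networks.
TRANSFER_FUNCTIONS = ["tanh", "relu", "sigmoid", "gaussian"]

def signature(path_counts: np.ndarray) -> np.ndarray:
    """Normalise raw path counts into path likelihoods (each row sums to 1)."""
    totals = path_counts.sum(axis=1, keepdims=True)
    return np.divide(path_counts, totals,
                     out=np.zeros_like(path_counts, dtype=float),
                     where=totals > 0)

def thresholded_difference(sig_a: np.ndarray, sig_b: np.ndarray,
                           threshold: float = 0.1) -> np.ndarray:
    """Keep only path-likelihood differences exceeding the threshold."""
    diff = np.abs(sig_a - sig_b)
    return np.where(diff > threshold, diff, 0.0)

# Toy counts of computation paths (function i -> function j) observed in the
# fittest models for two datasets; entry [i, j] counts how often function i
# feeds into function j.
counts_a = np.array([[5., 2., 0., 1.],
                     [1., 6., 1., 0.],
                     [0., 2., 4., 2.],
                     [3., 0., 1., 4.]])
counts_b = np.array([[5., 1., 1., 1.],
                     [0., 7., 1., 0.],
                     [0., 1., 6., 1.],
                     [3., 0., 1., 4.]])

sig_a, sig_b = signature(counts_a), signature(counts_b)
diff = thresholded_difference(sig_a, sig_b)
# The two signatures look similar overall; thresholding the element-wise
# difference exposes the few paths where the datasets genuinely diverge.
```

The design choice mirrors the paper's observation: because most paths are shared, the raw signatures look alike, and only the thresholded difference highlights problem-specific structure.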
Keywords: Meta-features, Neural Networks, Optimization, Transfer functions
ABOUT THE AUTHORS
Abdullahi Adamu
Research Interest : Neural Networks, Optimisation, and other Machine learning fields such as Image processing.
Tomas Maul
Interested in Neural Networks, Image Processing, and other Computational Neuroscience topics.
Andrzej Bargiela
Interested in Machine learning, Granular Computing, and Optimization.