Abstract: This paper presents the constrained Hybrid Metaheuristic (cHM) algorithm as a general framework for continuous optimisation. Unlike many existing metaheuristics that are tailored to specific function classes or problem domains, cHM is designed to operate across a broad spectrum of objective functions, including those with unknown, heterogeneous, or complex properties such as non-convexity, non-separability, and varying smoothness. We provide a formal description of the algorithm, highlighting its modular structure and two-phase operation, which facilitates dynamic adaptation to the problem's characteristics. A key feature of cHM is its ability to harness the synergy between candidate solutions and component metaheuristic strategies, allowing the algorithm to apply the most appropriate search behaviour at each stage of the optimisation process and thereby improving convergence and robustness. Our extensive experimental evaluation on 28 benchmark functions demonstrates that cHM consistently matches or outperforms traditional metaheuristics in terms of solution quality and convergence speed. In addition, a practical application of the algorithm is demonstrated on a feature selection problem in the context of data classification. The results underscore its potential as a versatile and effective black-box optimiser suitable for both theoretical research and practical applications.
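The two-phase operation summarised above can be sketched in a few lines. Everything here is an illustrative assumption rather than the paper's actual implementation: the `probe_and_fit` function, the optimiser call signature, the budgets, and plain random search standing in for the component metaheuristics.

```python
import random

def random_search(objective, dim, iters, bounds=(-5.0, 5.0)):
    """Toy component optimiser (a stand-in for the paper's metaheuristics):
    uniform random sampling within a box, keeping the best point seen."""
    best_x = [random.uniform(*bounds) for _ in range(dim)]
    best_f = objective(best_x)
    for _ in range(iters):
        x = [random.uniform(*bounds) for _ in range(dim)]
        f = objective(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

def probe_and_fit(objective, optimisers, dim, probe_iters=20, fit_iters=200):
    """Probing phase: each candidate optimiser gets a small budget and is
    scored by its best objective value. Fitting phase: the best-probing
    optimiser receives the full budget."""
    probe_results = [(opt(objective, dim, probe_iters)[1], i)
                     for i, opt in enumerate(optimisers)]
    best_opt = optimisers[min(probe_results)[1]]
    return best_opt(objective, dim, fit_iters)

# Usage: minimise the sphere function with two stand-in optimisers.
sphere = lambda x: sum(v * v for v in x)
best_x, best_f = probe_and_fit(sphere, [random_search, random_search], dim=2)
```

In the full algorithm this probe-then-fit cycle would repeat, so the framework can switch to whichever search behaviour currently suits the landscape.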




Abstract: This study investigates the potential of hybrid metaheuristic algorithms to enhance the training of Probabilistic Neural Networks (PNNs) by leveraging the complementary strengths of multiple optimisation strategies. Traditional learning methods, such as gradient-based approaches, often struggle in high-dimensional and uncertain environments, while single-method metaheuristics may fail to exploit the solution space fully. To address these challenges, we propose the constrained Hybrid Metaheuristic (cHM) algorithm, a novel approach that combines multiple population-based optimisation techniques into a unified framework. The proposed procedure operates in two phases: an initial probing phase evaluates multiple metaheuristics and identifies the best-performing one based on its error rate, followed by a fitting phase in which the selected metaheuristic refines the PNN to obtain optimal smoothing parameters. This iterative process ensures efficient exploration and convergence, enhancing the network's generalisation and classification accuracy. cHM integrates several popular metaheuristics as internal optimisers, including BAT, Simulated Annealing, the Flower Pollination Algorithm, Bacterial Foraging Optimisation, and Particle Swarm Optimisation. To evaluate the performance of cHM, experiments were conducted on 16 datasets with varying characteristics, including binary and multiclass classification tasks, balanced and imbalanced class distributions, and diverse feature dimensions. The results demonstrate that cHM effectively combines the strengths of individual metaheuristics, leading to faster convergence and more robust learning. By optimising the smoothing parameters of PNNs, the proposed method enhances classification performance across diverse datasets, demonstrating its flexibility and efficiency in practical applications.
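To make concrete the quantity being optimised, the sketch below implements a minimal PNN-style decision rule with a Gaussian (Parzen) kernel and the validation error rate used to score a smoothing parameter. The function names, the toy data, and the coarse grid search (standing in for the metaheuristic) are all illustrative assumptions, not the paper's implementation.

```python
import math

def pnn_classify(x, train, sigma):
    """Minimal PNN decision rule: sum Gaussian kernel activations per class
    and return the class with the largest total activation."""
    def activation(points):
        return sum(math.exp(-sum((a - b) ** 2 for a, b in zip(x, p))
                            / (2.0 * sigma ** 2))
                   for p in points)
    return max(train, key=lambda c: activation(train[c]))

def error_rate(sigma, train, val):
    """Fraction of validation points misclassified for a given sigma --
    the fitness a metaheuristic would minimise when tuning the PNN."""
    wrong = sum(pnn_classify(x, train, sigma) != y for x, y in val)
    return wrong / len(val)

# Toy two-class data; a coarse sigma grid stands in for the metaheuristic.
train = {0: [(0.0, 0.0), (0.5, 0.2), (-0.3, 0.1)],
         1: [(3.0, 3.0), (3.2, 2.8), (2.9, 3.1)]}
val = [((0.1, 0.1), 0), ((3.0, 2.9), 1)]
best_sigma = min([0.1, 0.5, 1.0, 2.0], key=lambda s: error_rate(s, train, val))
```

In cHM, the probing and fitting phases would search this same error-rate landscape over the smoothing parameters, rather than scanning a fixed grid.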