Special Issue: Evolutionary algorithms for training neural networks in engineering applications
Guest Editor
Prof. Danilo Pelusi
Department of Communication Sciences, University of Teramo, Balzarini, 1, 64100 Teramo, Italy
Email: dpelusi@unite.it
Manuscript Topics
Training a Neural Network (NN) is a hard task. Training algorithms such as Gradient Descent, Conjugate Gradient, Quasi-Newton, Resilient Backpropagation, and Levenberg-Marquardt are characterized by high computational complexity. At the same time, the training phase is crucial for the proper functioning of an NN. Exhaustive methods for training an NN do not provide results in a reasonable time. Therefore, the application of low-complexity techniques is necessary.
Evolutionary Computation (EC) is a sub-field of Artificial Intelligence (AI) and is used extensively for complex optimization problems, in particular problems with too many variables for traditional algorithms. EC makes use of nature-inspired algorithms such as Swarm Algorithms (SA). These are population-based algorithms in which the population consists of agents, each representing a candidate solution to the optimization problem. The population evolves generation by generation in an attempt to find the optimal solution, the idea being to exploit the results of the previous generation in the current one.
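As a minimal illustration of this population-based idea, the sketch below evolves the weights of a tiny feed-forward network with a simple (mu + lambda) evolution strategy on the XOR toy problem. It is not a method prescribed by this Special Issue; the network size, hyper-parameters, and function names are hypothetical choices made only for the example.

```python
# Minimal sketch (illustrative only): evolving the weights of a tiny
# feed-forward network with a simple (mu + lambda) evolution strategy.
# All sizes and hyper-parameters below are hypothetical choices.
import numpy as np

rng = np.random.default_rng(0)

# XOR toy problem: 2 inputs -> 2 hidden units -> 1 output.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)
N_WEIGHTS = 2 * 2 + 2 + 2 * 1 + 1  # W1, b1, W2, b2 flattened

def forward(w, x):
    """Decode a flat weight vector and run the network."""
    W1 = w[:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8].reshape(2, 1); b2 = w[8]
    h = np.tanh(x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2).ravel() - b2))

def fitness(w):
    """Negative mean squared error: higher is better."""
    return -np.mean((forward(w, X) - y) ** 2)

# (mu + lambda) loop: each agent is a candidate weight vector.
mu, lam, sigma = 10, 40, 0.3
population = rng.normal(0.0, 1.0, size=(mu, N_WEIGHTS))
for generation in range(200):
    parents = population[rng.integers(0, mu, size=lam)]
    offspring = parents + sigma * rng.normal(size=(lam, N_WEIGHTS))
    pool = np.vstack([population, offspring])
    scores = np.array([fitness(w) for w in pool])
    population = pool[np.argsort(scores)[-mu:]]   # keep the best mu agents

best = population[-1]
print("predictions:", np.round(forward(best, X), 2))
```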
The aim of this Special Issue is to explore applications of evolutionary algorithms for training neural networks in engineering fields. Some of these fields are: Facial Recognition, Stock Market Prediction, Social Media, Aerospace, Healthcare, Signature Verification and Handwriting Analysis, and Data Text Mining, among others.
This Special Issue serves as a forum for facilitating and enhancing information sharing among researchers, with topics of interest including the development of advanced techniques for solving complex problems.
Topics of interest for this Special Issue include but are not limited to:
• Evolutionary training
• Sparse matrix deep learning
• Image recognition
• Data text mining
• Evolutionary Recurrent Neural Network
• Evolutionary Differentiable Neural Architecture Search
• Multi-objective Neural Architecture Search
• Evolutionary Optimization of Deep Learning
• Hybridization of Evolutionary Computation and Neural Networks
• Large-scale Optimization for Evolutionary Deep Learning
• Evolutionary Multi-task Optimization in Deep Learning
• Full-space Neural Architecture Search
• Evolving Neural Networks
• Automatic Design of Neural Architectures
• Classification/Clustering/Regression
• Feature Selection and Extraction
• Multi-task optimization, Multi-task learning, Meta learning
• Learning Based Optimization
• Evolutionary search for activation functions
• Evolutionary weight optimization of neural networks
Keywords: Evolutionary Computation, Machine Learning, Swarm Algorithms, Deep Learning, Evolutionary training
Instructions for authors
https://www.aimspress.com/mbe/news/solo-detail/instructionsforauthors
Please submit your manuscript to the online submission system:
https://aimspress.jams.pub/