New Advances in the Theory and Application of Optimization Algorithms
Chair: Yiying Zhang, Shandong University of Science and Technology, China
Key Words: Newton's Method, Gradient Descent Method, Conjugate Gradient Method, Particle Swarm Optimization, Genetic Algorithm
Information: Optimization algorithms are core enabling technologies in computational mathematics, artificial intelligence, operations research, and control science, and their theory and engineering application remain a focus of sustained academic attention. Classical numerical optimization methods (such as Newton's method, gradient descent, and the conjugate gradient method) and swarm intelligence optimization algorithms (such as particle swarm optimization and genetic algorithms) each have distinct strengths in convergence analysis, complexity characterization, parameter self-adaptation, and handling high-dimensional non-convex problems. The cross-fusion of the two is becoming an effective paradigm for solving complex optimization problems. This special topic focuses on the latest theoretical breakthroughs and practical applications of optimization algorithms. We sincerely invite scholars and researchers worldwide to share their latest research on classical numerical optimization methods and swarm intelligence optimization algorithms, contributing to the continued evolution and interdisciplinary integration of the field.
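To make the contrast concrete, here is a minimal sketch (not part of the call itself) that minimizes the toy function f(x) = (x - 3)^2 with both families of methods: gradient descent as the classical gradient-based approach, and a tiny textbook particle swarm optimizer as the swarm intelligence approach. All parameter values (learning rate, inertia weight, cognitive/social coefficients) are illustrative assumptions, not prescriptions from the session.

```python
import random

def f(x):
    """Toy objective: a convex quadratic with minimum at x = 3."""
    return (x - 3.0) ** 2

def grad_f(x):
    """Analytic gradient of f."""
    return 2.0 * (x - 3.0)

def gradient_descent(x0, lr=0.1, steps=100):
    """Classical method: repeatedly step against the gradient."""
    x = x0
    for _ in range(steps):
        x -= lr * grad_f(x)
    return x

def pso(n_particles=20, steps=100, w=0.5, c1=1.5, c2=1.5, seed=0):
    """Swarm method: gradient-free textbook PSO (illustrative settings)."""
    rng = random.Random(seed)
    xs = [rng.uniform(-10.0, 10.0) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]                      # each particle's best-seen position
    gbest = min(pbest, key=f)          # swarm's best-seen position
    for _ in range(steps):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vs[i] = (w * vs[i]
                     + c1 * r1 * (pbest[i] - xs[i])   # cognitive pull
                     + c2 * r2 * (gbest - xs[i]))     # social pull
            xs[i] += vs[i]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
        gbest = min(pbest, key=f)
    return gbest

print(gradient_descent(10.0))  # converges near 3
print(pso())                   # converges near 3 without any gradient
```

Gradient descent exploits derivative information for fast local convergence, while PSO needs only function evaluations, which is what makes hybrid (cross-fused) schemes attractive on non-smooth or multimodal problems.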
Topics of interest include but are not limited to:
- Theoretical frontiers of classical optimization algorithms
- Theoretical progress of swarm intelligence optimization algorithms
- Optimization methods for high-dimensional non-convex problems
- Cross-fusion of classical and swarm intelligence optimization algorithms
- Optimization algorithms for complex engineering problems
- Applications of optimization algorithms in machine learning
Submission Deadline: April 28, 2026 (first round)
