Gradient methods for large-scale minimization problems
Authors:
Abstract:
Gradient methods with Chebyshev relaxation functions are developed. In contrast to classical gradient procedures, the proposed methods retain convergence and efficiency on non-convex nonlinear programming problems even when the objective functional is highly stiff (ill-conditioned) and the vector of optimized parameters is of high dimension.
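The abstract does not spell out the iteration, so the sketch below should not be read as the authors' method; it is a minimal illustration of the classical Chebyshev semi-iterative (relaxation) scheme, which shows how Chebyshev-derived coefficients accelerate a gradient method on a stiff problem. The function name chebyshev_gradient_descent and the assumed spectral bounds mu and L (whose ratio L/mu is the stiffness, i.e. condition number) are illustrative assumptions, not notation from the paper.

```python
import numpy as np

def chebyshev_gradient_descent(grad, x0, mu, L, n_iter=200):
    """Gradient iteration with Chebyshev relaxation coefficients (a sketch).

    grad : callable returning the gradient of the objective at x
    x0   : starting point (1-D NumPy array)
    mu,L : assumed lower/upper bounds on the Hessian spectrum;
           L/mu plays the role of the stiffness (condition number)
    """
    theta = 0.5 * (L + mu)          # centre of the spectral interval [mu, L]
    delta = 0.5 * (L - mu)          # half-width of the spectral interval
    sigma = theta / delta
    rho_prev = 1.0 / sigma

    x = np.asarray(x0, dtype=float)
    d = -grad(x) / theta            # first step: plain gradient step with step size 1/theta
    x = x + d

    for _ in range(1, n_iter):
        rho = 1.0 / (2.0 * sigma - rho_prev)
        # Chebyshev recurrence: momentum along the previous direction
        # plus a rescaled gradient step
        d = rho * rho_prev * d - (2.0 * rho / delta) * grad(x)
        x = x + d
        rho_prev = rho
    return x

# Usage example on a stiff quadratic f(x) = 0.5 * x^T diag(a) x,
# condition number 1e6, dimension 1000 (hypothetical test problem):
a = np.logspace(0.0, 6.0, 1000)
x_min = chebyshev_gradient_descent(lambda x: a * x, np.ones(1000),
                                   mu=a.min(), L=a.max(), n_iter=500)
```

On a quadratic whose Hessian eigenvalues lie in [mu, L] this recurrence reproduces the optimal Chebyshev polynomial acceleration, improving the error-reduction factor from roughly 1 - mu/L for plain gradient descent to roughly 1 - sqrt(mu/L); how the paper extends such relaxation to non-convex problems is not reflected in this sketch.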