I often hear people reference the conjugate gradient algorithm as though it were a general method for minimizing any continuous function, though perhaps one has to assume the function is Lipschitz or convex.
There are nonlinear conjugate gradient methods that work by computing a descent direction from the gradient at each step. How well-behaved they are depends on the problem and the particular method. Since the objective isn't quadratic, you can't guarantee conjugacy of the search directions or compute exact step lengths in closed form; you have to fall back on a line search.
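A minimal sketch of one such method, Fletcher-Reeves with a backtracking (Armijo) line search standing in for the exact step length of the quadratic case. The function, tolerances, and test problem here are just illustrative choices, not a definitive implementation:

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=10000):
    """Fletcher-Reeves nonlinear CG with a backtracking (Armijo) line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                      # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search stands in for the exact step length
        # that the quadratic (linear CG) case would allow.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        if g_new @ d >= 0:      # safeguard: restart if d is not a descent direction
            d = -g_new
        g = g_new
    return x

# Sanity check on a convex quadratic, where the minimizer is known exactly.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = nonlinear_cg(f, grad, np.zeros(2))
```

On a quadratic this nearly reproduces linear CG; the same code runs unchanged on general smooth objectives, where the restart safeguard and inexact line search are doing the real work.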
I guess the wiki page is a place to start. My course text was Nocedal and Wright.
I got the impression that even relatively simple practical application was hard enough to justify a field of optimization specialists.
Yes