Optimization: finding the minimum
"We present generalizations of Newton's method that incorporate derivatives of an arbitrary order d but maintain a polynomial dependence on dimension in their cost per iteration. At each step, our dth-order method uses semidefinite programming to construct and minimize a sum-of-squares-convex approximation to the dth-order Taylor expansion of the function we wish to minimize. We prove that our dth-order method has local convergence of order d. This results in lower oracle complexity compared to the classical Newton method. We show on numerical examples that basins of attraction around local minima can get larger as d increases."
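The core idea can be illustrated in one dimension, where the machinery simplifies: each step minimizes the dth-order Taylor model of f at the current iterate. The sketch below is a hypothetical 1D toy, not the paper's method; in 1D there is no semidefinite program or sum-of-squares construction to set up, and the Taylor model is minimized directly by finding the root of its derivative nearest the current point at which the model is locally convex. The test function f(x) = exp(x) - 2x (minimizer x* = ln 2) and the starting point are my own choices for illustration.

```python
import math
import numpy as np

def higher_order_step(derivs):
    """One step of a d-th order Newton-type method in 1D (toy sketch).

    derivs = [f(x), f'(x), ..., f^(d)(x)] at the current iterate.
    Minimizes the d-th order Taylor model T(s) by taking the real root
    of T'(s) nearest zero at which T''(s) > 0 (a local minimum of T).
    May fail (empty candidate list) when no such real root exists.
    """
    d = len(derivs) - 1
    # Coefficients of T'(s), lowest degree first: f^(k)(x) / (k-1)!
    tp = [derivs[k] / math.factorial(k - 1) for k in range(1, d + 1)]
    roots = np.roots(tp[::-1])  # np.roots expects highest degree first

    def curvature(s):  # T''(s) at the candidate step
        return sum(derivs[k] * s ** (k - 2) / math.factorial(k - 2)
                   for k in range(2, d + 1))

    candidates = [r.real for r in roots
                  if abs(r.imag) < 1e-10 and curvature(r.real) > 0]
    return min(candidates, key=abs)  # step closest to the current point

# Example: minimize f(x) = exp(x) - 2x with d = 3 (third-order method).
x = 1.0
for _ in range(4):
    e = math.exp(x)
    x += higher_order_step([e - 2 * x, e - 2, e, e])
print(x)  # converges rapidly toward ln 2 ~ 0.6931
```

With d = 2 this reduces to the classical Newton step -f'(x)/f''(x); with d = 3 the error roughly cubes each iteration near the minimizer, which is the "local convergence of order d" the abstract refers to.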