# Line Search (FR method with Armijo type)

## Posted 1724 days ago by jhlee2chn

```python
# Rosenbrock function
def fun(x):
    return 100*(x[0]^2 - x[1])^2 + (x[0] - 1)^2

# its gradient
def gfun(x):
    return vector([400*(x[0]^2 - x[1])*x[0] + 2*x[0] - 2,
                   -200*x[0]^2 + 200*x[1]])

# Armijo-type inexact line search
def armijo(xk, dk):
    gamma = 0.5
    sigma = 0.2
    m = 0
    maxm = 20
    while m <= maxm:
        if fun(xk + gamma^m*dk) <= fun(xk) + sigma*gamma^m*gfun(xk).inner_product(dk):
            break
        m = m + 1
    alpha = gamma^m
    newxk = xk + alpha*dk
    return newxk

# main program (Fletcher-Reeves conjugate gradient method)
tol = 1e-6
k = 0
xk = vector([-1, 2])   # initial guess
gk = gfun(xk)          # initial gradient
dk = -gk
r = []
while dk.norm() > tol:
    xk = armijo(xk, dk)
    gnew = gfun(xk)
    beta = gnew.inner_product(gnew)/gk.inner_product(gk)
    dk = -gnew + beta*dk
    r.append(xk)
    k = k + 1
    gk = gnew
print "optimal solution ", xk
print "iteration number ", k
print "optimal value ", fun(xk)
```
```
optimal solution  (0.999999494651675, 0.999998987362896)
iteration number  333
optimal value  2.55753565037496e-13
```
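For readers without Sage, the same iteration can be sketched in plain Python. This is a transcription of the code above, not new material: `**` replaces Sage's `^`, lists replace Sage vectors, and the `dot`, `fletcher_reeves` helper names are my own. The Armijo rule accepts the first step length $\gamma^m$ with $f(x_k + \gamma^m d_k) \le f(x_k) + \sigma\,\gamma^m\,\nabla f(x_k)^{\top} d_k$, and the Fletcher-Reeves update sets $\beta_k = \|g_{k+1}\|^2 / \|g_k\|^2$.

```python
from math import sqrt

def fun(x):
    # Rosenbrock function
    return 100*(x[0]**2 - x[1])**2 + (x[0] - 1)**2

def gfun(x):
    # gradient of the Rosenbrock function
    return [400*(x[0]**2 - x[1])*x[0] + 2*x[0] - 2,
            -200*x[0]**2 + 200*x[1]]

def dot(a, b):
    return sum(ai*bi for ai, bi in zip(a, b))

def armijo(xk, dk, gamma=0.5, sigma=0.2, maxm=20):
    # backtracking: find the smallest m with
    # f(xk + gamma^m dk) <= f(xk) + sigma*gamma^m * g.d
    gd = dot(gfun(xk), dk)
    m = 0
    while (m < maxm and
           fun([xk[i] + gamma**m*dk[i] for i in range(2)])
           > fun(xk) + sigma*gamma**m*gd):
        m += 1
    alpha = gamma**m
    return [xk[i] + alpha*dk[i] for i in range(2)]

def fletcher_reeves(x0, tol=1e-6, maxit=100000):
    xk = list(x0)
    gk = gfun(xk)
    dk = [-g for g in gk]           # initial direction: steepest descent
    k = 0
    while sqrt(dot(dk, dk)) > tol and k < maxit:
        xk = armijo(xk, dk)
        gnew = gfun(xk)
        beta = dot(gnew, gnew)/dot(gk, gk)   # Fletcher-Reeves beta
        dk = [-gnew[i] + beta*dk[i] for i in range(2)]
        gk = gnew
        k += 1
    return xk, k

xk, k = fletcher_reeves([-1.0, 2.0])
print(xk, k)
```

The iterate should land near the minimizer $(1, 1)$, as in the Sage run above; the exact iteration count may differ slightly from 333 because of floating-point details.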

## 2. Comparison

```python
var('x, y')
f = 100*(x^2 - y)^2 + (x - 1)^2   # Rosenbrock function
p1 = contour_plot(f, (x, -1.5, 3), (y, -1.5, 5),
                  contours=[0, 2, .., 200], cmap='hsv', fill=False)
p2 = line2d(r) + point(r, color='black')   # iterates collected in r above
show(p1 + p2, aspect_ratio=1)
```
```python
minimize(f, [-1.0, 2.0], algorithm='cg')
```
```
Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 31
         Function evaluations: 80
         Gradient evaluations: 80
(0.99999996728, 0.999999934429)
```
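Sage's `minimize(..., algorithm='cg')` hands the problem to SciPy's nonlinear conjugate-gradient routine, so the same comparison can be run outside Sage. A minimal sketch using `scipy.optimize.minimize` with `method='CG'` and the analytic gradient (the Rosenbrock definitions here mirror `fun`/`gfun` above):

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    # Rosenbrock function, same as fun() above
    return 100*(x[0]**2 - x[1])**2 + (x[0] - 1)**2

def grad(x):
    # analytic gradient, same as gfun() above
    return np.array([400*(x[0]**2 - x[1])*x[0] + 2*x[0] - 2,
                     -200*x[0]**2 + 200*x[1]])

res = minimize(f, x0=[-1.0, 2.0], jac=grad, method='CG')
print(res.x, res.nit)
```

SciPy's CG uses a stronger (Wolfe-type) line search than the simple Armijo backtracking above, which is one reason it needs far fewer iterations (31 vs. 333) from the same starting point.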