Bonus Point Project on Non-Linear Optimization
Gradient Descent, Newton and Backtracking Line Search Algorithms
Given the Rosenbrock function

f(x) = 100(x₂ − x₁²)² + (1 − x₁)².
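Both algorithms below need the gradient ∇f and (for Newton's method) the Hessian ∇²f of the Rosenbrock function. A minimal sketch of the analytic derivatives, assuming NumPy; the finite-difference sanity check and the test point (−1.2, 1) are illustrative choices, not part of the assignment:

```python
import numpy as np

def f(x):
    # Rosenbrock function: f(x) = 100*(x2 - x1^2)^2 + (1 - x1)^2
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def grad_f(x):
    # Analytic gradient, obtained by the chain rule
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

def hess_f(x):
    # Analytic Hessian (symmetric 2x2 matrix)
    return np.array([
        [1200.0 * x[0]**2 - 400.0 * x[1] + 2.0, -400.0 * x[0]],
        [-400.0 * x[0], 200.0],
    ])

# Sanity check: compare the analytic gradient against central finite
# differences at an arbitrary point
x = np.array([-1.2, 1.0])
h = 1e-6
fd = np.array([(f(x + h * e) - f(x - h * e)) / (2 * h) for e in np.eye(2)])
err = np.max(np.abs(fd - grad_f(x)))
print(err)  # should be tiny
```

Checking analytic derivatives against finite differences before running either method catches sign and index mistakes early.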
1. Use gradient descent with backtracking line search to find the minimum value of f,
given:
• The initial step size for backtracking line search is t₀ = 1.
• The starting points for gradient descent are x₀ = (1.2, 1.2)ᵀ and x₀ = (−1.2, 1)ᵀ.
• The algorithm terminates when the number of iterations exceeds 1000, or
∥∇f(xₖ)∥ < 10⁻⁴.
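Item 1 can be sketched as follows, assuming NumPy and the termination rules above. The Armijo parameters α = 0.25 and β = 0.5 are typical textbook choices, not fixed by the assignment:

```python
import numpy as np

def f(x):
    # Rosenbrock function
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def grad_f(x):
    # Analytic gradient of the Rosenbrock function
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

def backtracking(x, d, t0=1.0, alpha=0.25, beta=0.5):
    # Shrink t from t0 = 1 until the Armijo sufficient-decrease condition
    # f(x + t d) <= f(x) + alpha * t * grad_f(x)^T d holds
    t = t0
    g = grad_f(x)
    while f(x + t * d) > f(x) + alpha * t * (g @ d):
        t *= beta
    return t

def gradient_descent(x0, tol=1e-4, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:   # stop when the gradient is small
            return x, k
        d = -g                        # steepest-descent direction
        t = backtracking(x, d)
        x = x + t * d
    return x, max_iter                # stop after 1000 iterations

x_star, iters = gradient_descent([1.2, 1.2])
print(x_star, iters)
```

The same call with `gradient_descent([-1.2, 1.0])` covers the second starting point. Expect slow progress: gradient descent zigzags along the curved valley of the Rosenbrock function and may exhaust the 1000-iteration budget before reaching the gradient tolerance.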
2. Use Newton's method with backtracking line search to find the minimum value of f,
using the same settings as above.
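A sketch of item 2 under the same assumptions (NumPy; α = 0.25, β = 0.5 as illustrative Armijo parameters). The steepest-descent fallback when the Newton step is not a descent direction is a common safeguard, added here as an assumption rather than something the assignment prescribes:

```python
import numpy as np

def f(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def grad_f(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

def hess_f(x):
    # Analytic Hessian of the Rosenbrock function
    return np.array([
        [1200.0 * x[0]**2 - 400.0 * x[1] + 2.0, -400.0 * x[0]],
        [-400.0 * x[0], 200.0],
    ])

def newton(x0, tol=1e-4, max_iter=1000, alpha=0.25, beta=0.5):
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            return x, k
        d = np.linalg.solve(hess_f(x), -g)   # Newton direction
        if g @ d >= 0:                       # safeguard: fall back to
            d = -g                           # steepest descent
        t = 1.0                              # initial step size t0 = 1
        while f(x + t * d) > f(x) + alpha * t * (g @ d):
            t *= beta                        # backtracking line search
        x = x + t * d
    return x, max_iter

x_star, iters = newton([-1.2, 1.0])
print(x_star, iters)
```

Solving the 2×2 linear system H d = −g with `np.linalg.solve` is preferable to forming the inverse Hessian explicitly.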
3. Compare the convergence speed of the two algorithms.
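One concrete way to carry out the comparison in item 3 is to count iterations to termination for both methods from both starting points. A sketch assuming NumPy, with the two methods sharing one loop so that only the search direction differs (parameter names and the Armijo constants are illustrative):

```python
import numpy as np

def f(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def grad_f(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

def hess_f(x):
    return np.array([
        [1200.0 * x[0]**2 - 400.0 * x[1] + 2.0, -400.0 * x[0]],
        [-400.0 * x[0], 200.0],
    ])

def minimize(x0, use_newton, tol=1e-4, max_iter=1000, alpha=0.25, beta=0.5):
    # Shared loop: only the choice of search direction differs
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            return k                          # iterations to converge
        d = np.linalg.solve(hess_f(x), -g) if use_newton else -g
        if g @ d >= 0:                        # descent-direction safeguard
            d = -g
        t = 1.0
        while f(x + t * d) > f(x) + alpha * t * (g @ d):
            t *= beta
        x = x + t * d
    return max_iter                           # budget exhausted

for x0 in ([1.2, 1.2], [-1.2, 1.0]):
    print(x0, "GD:", minimize(x0, False), "Newton:", minimize(x0, True))
```

The expected qualitative outcome: gradient descent converges at best linearly and crawls along the valley floor, while Newton's method, once near the minimizer (1, 1), converges quadratically and needs only a few dozen iterations at most.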