## Transcribed Text

Consider the following unconstrained optimization problem

(UP)  minimize $\frac{1}{2}x^T Q x - c^T x$,

where $x \in \mathbb{R}^n$ and $Q$ is an $n \times n$ symmetric positive definite matrix. Let $x^*$ be the minimizer of problem (UP). Let $v \in \mathbb{R}^n$ be an eigenvector of $Q$ corresponding to the eigenvalue $\lambda$. Suppose that the initial point for the steepest descent algorithm is $x^0 = x^* + v$.

(a) (10pts) Show that the gradient of $f$ at $x^0$ is $\nabla f(x^0) = \lambda v$.
(b) (10pts) Prove that if the steepest descent direction is taken, then the step size which minimizes $f$ in this direction is $\alpha^0 = 1/\lambda$.
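A short line-search computation makes the claim in (b) concrete. The sketch below assumes the initial point satisfies $x^0 = x^* + v$, consistent with part (c), where $x^0 - x^*$ is verified to be an eigenvector of the Hessian:

```latex
% Line-search sketch, assuming x^0 = x^* + v.
\begin{align*}
\nabla f(x) &= Qx - c, \qquad Qx^* = c \ \Rightarrow\ \nabla f(x^0) = Qv = \lambda v,\\
x^0 - \alpha \nabla f(x^0) &= x^* + v - \alpha\lambda v \;=\; x^* + (1 - \alpha\lambda)\,v,\\
f(x^* + t v) &= f(x^*) + \tfrac{t^2}{2}\, v^{T} Q v \qquad (\text{since } \nabla f(x^*) = 0).
\end{align*}
```

Because $v^T Q v > 0$, the last expression is minimized over $t$ at $t = 0$; here $t = 1 - \alpha\lambda$, so the minimizing step size is $\alpha^0 = 1/\lambda$.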
(c) (20pts) Confirm the above assertions for the function

$f(x_1, x_2) = 3x_1^2 - 2x_1x_2 + 3x_2^2 + 2x_1 - 6x_2.$

Compute the point $x^*$ obtained by one iteration of the steepest descent algorithm, starting at the point $x^0 = (1, 2)$.
Show that $x^*$ is the unique minimum of $f$.
Verify that $x^0 - x^*$ is an eigenvector of the Hessian of $f$.
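The arithmetic in part (c) can be checked numerically. The following sketch runs one steepest-descent step with the step size $1/\lambda$ from part (b); it uses plain Python, no external libraries:

```python
# Numerical check of part (c): one steepest-descent step on
#   f(x1, x2) = 3*x1**2 - 2*x1*x2 + 3*x2**2 + 2*x1 - 6*x2
# starting at x0 = (1, 2).

def grad(x1, x2):
    # ∇f = (6*x1 - 2*x2 + 2, -2*x1 + 6*x2 - 6)
    return (6 * x1 - 2 * x2 + 2, -2 * x1 + 6 * x2 - 6)

x0 = (1.0, 2.0)
g = grad(*x0)                      # (4.0, 4.0) = 4 * (1, 1)

# The Hessian is constant: [[6, -2], [-2, 6]].  The vector (1, 1) is an
# eigenvector with eigenvalue 4, so the optimal step size is 1/4.
H = ((6.0, -2.0), (-2.0, 6.0))
Hv = (H[0][0] + H[0][1], H[1][0] + H[1][1])   # H @ (1, 1)
assert Hv == (4.0, 4.0)

alpha = 1.0 / 4.0
x1 = (x0[0] - alpha * g[0], x0[1] - alpha * g[1])
print(x1)                          # (0.0, 1.0) -- the minimizer x*

# x* solves ∇f = 0, and x0 - x* = (1, 1) is the eigenvector used above.
assert grad(*x1) == (0.0, 0.0)
```

One iteration lands exactly on $x^* = (0, 1)$, as the theory in parts (a) and (b) predicts.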
Prove the following two parts independently.
(a) (15pts) Consider the problem

(P)  minimize $\frac{1}{2}x^T Q x - c^T x$
     subject to $x \in \mathbb{R}^n$,

where $Q$ is a positive definite matrix. Prove that Newton's method will determine the minimizer of $f(x) = \frac{1}{2}x^T Q x - c^T x$ in one iteration, regardless of the starting point.
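The one-iteration claim can be illustrated numerically: the Newton step from any $x^0$ is $x^1 = x^0 - Q^{-1}(Qx^0 - c) = Q^{-1}c$, the minimizer. For concreteness the sketch below reuses the quadratic from Problem 1(c), which has $Q = [[6,-2],[-2,6]]$ and $c = (-2, 6)$:

```python
# Sketch of part (a): Newton's method on f(x) = (1/2) x^T Q x - c^T x
# reaches the minimizer Q^{-1} c in one step from ANY starting point.

def solve2(Q, r):
    """Solve Q z = r for a 2x2 matrix Q via the explicit inverse."""
    (a, b), (p, d) = Q
    det = a * d - b * p
    return ((d * r[0] - b * r[1]) / det, (-p * r[0] + a * r[1]) / det)

def newton_step(Q, c, x0):
    # Gradient of the quadratic is Q x0 - c; the Hessian is Q itself.
    g = (Q[0][0] * x0[0] + Q[0][1] * x0[1] - c[0],
         Q[1][0] * x0[0] + Q[1][1] * x0[1] - c[1])
    d = solve2(Q, g)               # Newton direction: solve Q d = g
    return (x0[0] - d[0], x0[1] - d[1])

Q = ((6.0, -2.0), (-2.0, 6.0))    # positive definite
c = (-2.0, 6.0)

# Every starting point lands on the minimizer x* = Q^{-1} c = (0, 1):
for x0 in [(1.0, 2.0), (-5.0, 3.0), (100.0, -40.0)]:
    assert newton_step(Q, c, x0) == (0.0, 1.0)
```

This is exactly the algebra the proof needs: the Newton update cancels $x^0$ entirely, leaving $Q^{-1}c$ independent of the starting point.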
(b) Let $f : \mathbb{R}^n \to \mathbb{R}$ be a twice continuously differentiable function. Consider the following unconstrained optimization problem

minimize $f(x)$
subject to $x \in \mathbb{R}^n$,

and a linear transformation of variables $x = Ay + b$, where $A$ is an $n \times n$ nonsingular matrix and $y \in \mathbb{R}^n$.
i. (5pts) Find Newton's direction $d^k$ in the space of the variables $y$.
ii. (5pts) Find the sequence $\{y^k\}$ generated by Newton's method.
iii. (5pts) Show that Newton's method is not affected by a linear scaling of the variables. Hint: transform the sequence obtained in Part (ii) into the sequence $\{x^k\}$ in the space of the variables $x$ by replacing $Ay^k + b$ by $x^k$.
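The affine-invariance claim in part iii can be checked on a small example. With $x = Ay + b$ and $h(y) = f(Ay + b)$, the chain rule gives $\nabla h(y) = A^T \nabla f(x)$ and $\nabla^2 h(y) = A^T \nabla^2 f(x) A$, so a Newton step in $y$, mapped back through $x = Ay + b$, coincides with the Newton step in $x$. The function $f$, the matrix $A$, and the vector $b$ below are illustrative choices, not from the problem statement:

```python
# Sketch of part (b)iii: one Newton step taken in y-space on
# h(y) = f(Ay + b), mapped back via x = Ay + b, equals the Newton
# step taken directly in x-space.  All 2x2 algebra is hand-coded.

def matvec(M, v):
    return (M[0][0]*v[0] + M[0][1]*v[1], M[1][0]*v[0] + M[1][1]*v[1])

def matmul(M, N):
    return ((M[0][0]*N[0][0] + M[0][1]*N[1][0], M[0][0]*N[0][1] + M[0][1]*N[1][1]),
            (M[1][0]*N[0][0] + M[1][1]*N[1][0], M[1][0]*N[0][1] + M[1][1]*N[1][1]))

def transpose(M):
    return ((M[0][0], M[1][0]), (M[0][1], M[1][1]))

def solve2(M, r):
    (a, b_), (p, d) = M
    det = a * d - b_ * p
    return ((d*r[0] - b_*r[1]) / det, (-p*r[0] + a*r[1]) / det)

# An illustrative smooth function with hand-coded gradient and Hessian.
def grad_f(x):
    return (4*x[0]**3 + x[1], x[0] + 2*x[1])

def hess_f(x):
    return ((12*x[0]**2, 1.0), (1.0, 2.0))

A = ((2.0, 1.0), (0.0, 1.0))       # nonsingular change of variables
b = (1.0, -1.0)

x0 = (1.0, 0.5)
y0 = solve2(A, (x0[0] - b[0], x0[1] - b[1]))   # y0 = A^{-1}(x0 - b)

# Newton step in x-space: x1 = x0 - H(x0)^{-1} grad_f(x0)
dx = solve2(hess_f(x0), grad_f(x0))
x1 = (x0[0] - dx[0], x0[1] - dx[1])

# Newton step in y-space on h(y) = f(Ay + b), then map back via x = Ay + b
At = transpose(A)
gy = matvec(At, grad_f(x0))                    # grad h(y0) = A^T grad f(x0)
Hy = matmul(At, matmul(hess_f(x0), A))         # hess h(y0) = A^T H A
dy = solve2(Hy, gy)
y1 = (y0[0] - dy[0], y0[1] - dy[1])
x1_from_y = (A[0][0]*y1[0] + A[0][1]*y1[1] + b[0],
             A[1][0]*y1[0] + A[1][1]*y1[1] + b[1])

# Same point up to rounding: the iterates {x^k} are unchanged by scaling.
assert all(abs(u - v) < 1e-12 for u, v in zip(x1, x1_from_y))
```

Algebraically, $y^{k+1} = y^k - (A^T H A)^{-1} A^T g = y^k - A^{-1}H^{-1}g$, so $Ay^{k+1} + b = x^k - H^{-1}g(x^k)$, which is exactly the x-space Newton update.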
