Assignment 2 Due date: Friday, October 11


1. If A is an n×n positive definite, symmetric matrix and {di, i = 1, n} is a set of vectors in Rn which are conjugate with respect to A, show that the di are linearly independent.
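One standard opening for this argument (a sketch of a possible route, not necessarily the one intended in the notes): assume a vanishing linear combination and use conjugacy to isolate each coefficient,

```latex
\sum_{i=1}^{n} c_i d_i = 0
\;\Longrightarrow\;
d_j^{T} A \Big( \sum_{i=1}^{n} c_i d_i \Big)
  = c_j \, d_j^{T} A d_j = 0 ,
```

since conjugacy kills every cross term dj^T A di with i ≠ j; positive definiteness then gives dj^T A dj > 0, forcing cj = 0 for every j.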


2. Let {vi, i = 1, n} be a set of linearly independent vectors in Rn and let A be an n×n positive definite, symmetric matrix. Show that the sequence of vectors


are conjugate with respect to A.
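Writing the linearly independent vectors as v1, ..., vn, the standard Gram-Schmidt-style construction with the A-inner product (an assumption about which sequence is intended here) reads:

```latex
d_1 = v_1, \qquad
d_k = v_k - \sum_{i=1}^{k-1} \frac{v_k^{T} A d_i}{d_i^{T} A d_i}\, d_i ,
\qquad k = 2, \dots, n .
```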


3. In an example in the notes the minimum of 2x^2 - 2xy + 5y^2 + 4x + 4y + 4 was found by the conjugate gradient method using the gradient of the function for gk and equation (8) for γk. Show that identical results for g1 and γ0 are obtained when calculated from equations (1) - (4) using the explicit form of the matrix A. Take g0 = h0 = -∇f(1,0).
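For reference when checking the arithmetic, with f(x, y) = 2x^2 - 2xy + 5y^2 + 4x + 4y + 4 written as f = (1/2) x^T A x + b^T x + c (an assumption about the convention used in the notes; check whether equations (1) - (4) use the Hessian A or A/2):

```latex
A = \begin{pmatrix} 4 & -2 \\ -2 & 10 \end{pmatrix}, \qquad
\nabla f(x,y) = \begin{pmatrix} 4x - 2y + 4 \\ -2x + 10y + 4 \end{pmatrix}, \qquad
g_0 = -\nabla f(1,0) = \begin{pmatrix} -8 \\ -2 \end{pmatrix}.
```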


4. Write three separate programs to implement each of the following search methods:

i) the method of steepest descent;

ii) the method of conjugate directions and

iii) the conjugate gradient method

for finding the minimum of a function in Rn. Use the IMSL routine DUVMIF or imsl_d_min_uncon to carry out the line searches and the subprogram that you wrote for question 4 of Assignment 1 to evaluate the functions in Rn.


Test these programs by finding a minimum of the following functions:

a) 5x^2 - 10xy + 19y^2 - 6yz + 5z^2 + 8zx - 8x - 44y - 12z

b) -(3x^2 + 2xy + y^2) exp(-3x^2 - 2y^2)

c) cos(2x + 5y - z) exp(-2x^2 - y^2 - yz - 3z^2)


Run each of these programs with three different starting values. Compare the results obtained by the three different methods with regard to the number of iterations required to attain an accuracy of 4 decimal places in the location of the minimum. How does the accuracy of the minimum compare among the three different methods?


Details of algorithms to find the minimum of a function f(x) for x ∈ Rn:


i) Method of Steepest Descents

- at each stage take the search direction u = -∇f(x)
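A minimal sketch of this method in pure Python (assumptions: a golden-section search stands in for IMSL's DUVMIF / imsl_d_min_uncon, the gradient is estimated by central differences, and each line search is restricted to steps t in [0, 1], which suffices for the well-conditioned quadratics here):

```python
def grad(f, x, h=1e-6):
    """Central-difference estimate of the gradient of f at x."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2.0 * h))
    return g

def golden_section(phi, a=0.0, b=1.0, tol=1e-10):
    """Minimize a unimodal phi on [a, b] by golden-section search."""
    r = (5.0 ** 0.5 - 1.0) / 2.0
    c, d = b - r * (b - a), a + r * (b - a)
    fc, fd = phi(c), phi(d)
    while b - a > tol:
        if fc < fd:
            b, d, fd = d, c, fc
            c = b - r * (b - a)
            fc = phi(c)
        else:
            a, c, fc = c, d, fd
            d = a + r * (b - a)
            fd = phi(d)
    return 0.5 * (a + b)

def steepest_descent(f, x0, tol=1e-8, max_iter=500):
    """At each stage search along u = -grad f(x); stop when two
    successive x differ by less than tol in every component."""
    x = list(x0)
    for k in range(max_iter):
        u = [-gi for gi in grad(f, x)]
        phi = lambda t: f([xi + t * ui for xi, ui in zip(x, u)])
        t = golden_section(phi)
        x_new = [xi + t * ui for xi, ui in zip(x, u)]
        if max(abs(p - q) for p, q in zip(x, x_new)) < tol:
            return x_new, k + 1
        x = x_new
    return x, max_iter
```

For the quadratic of question 3, starting from (1, 0), this converges to the minimum at (-4/3, -2/3).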


ii) Conjugate Directions Method

1) initialize the search directions ui, i = 1, n to the co-ordinate directions;

2) do steps 3) to 6) n times;

3) starting from x0 search in each of the directions ui, i = 1, n in turn to reach xn;

4) put ui = ui+1, i = 1, n-1;

5) put un = xn - x0;

6) search in this new direction un to reach a new x0;

7) after n iterations of steps 3) to 6) reinitialize the search directions ui, i = 1, n to the co-ordinate directions and return to step 2).
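Steps 1) to 7) can be sketched as follows (assumption: the 1-D minimizer is a three-point parabolic fit, which is exact when f is quadratic, as for test function a); in a real implementation the IMSL routine would be called here instead):

```python
def line_min(f, x, u):
    """Minimize f along x + t*u by fitting a parabola through
    t = -1, 0, 1 (exact when f is quadratic along the line)."""
    phi = lambda t: f([xi + t * ui for xi, ui in zip(x, u)])
    fm, f0, fp = phi(-1.0), phi(0.0), phi(1.0)
    denom = fm - 2.0 * f0 + fp
    if abs(denom) < 1e-14:          # flat (or zero-length) direction: stay put
        return list(x)
    t = 0.5 * (fm - fp) / denom
    return [xi + t * ui for xi, ui in zip(x, u)]

def conjugate_directions(f, x_start, tol=1e-10, max_cycles=50):
    n = len(x_start)
    x = list(x_start)
    for cycle in range(max_cycles):
        # steps 1) and 7): (re)initialize directions to the co-ordinate axes
        u = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
        x_cycle = list(x)
        for _ in range(n):                       # step 2): n times
            x0 = list(x)
            for ui in u:                         # step 3): search each direction
                x = line_min(f, x, ui)
            u = u[1:]                            # step 4): shift directions down
            un = [a - b for a, b in zip(x, x0)]  # step 5): un = xn - x0
            u.append(un)
            x = line_min(f, x, un)               # step 6): search new direction
        if max(abs(a - b) for a, b in zip(x, x_cycle)) < tol:
            return x
    return x
```

With exact line minimizations, one pass of steps 3) to 6) per direction is in principle enough to minimize a quadratic; the reinitialization in step 7) guards against the directions becoming linearly dependent.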


iii) Conjugate Gradient Method

1) set g0 = h0 = -∇f(x0);

2) do steps 3) to 5) for k = 0, 1, 2 ...

3) search along the direction hk to minimum point xk+1;

4) set gk+1 = -∇f(xk+1);

5) calculate hk+1 from equation (2) and the second form of equation (8) of the notes;
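Steps 1) to 5) can be sketched as follows. Assumptions: γk is taken as the Fletcher-Reeves ratio gk+1·gk+1 / gk·gk (presumably the "second form of equation (8)" of the notes, with equation (2) being hk+1 = gk+1 + γk hk), the caller supplies the gradient, and a three-point parabolic fit (exact for quadratic f) replaces the IMSL line search:

```python
def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def line_min(f, x, u):
    """Minimize f along x + t*u via a parabola through t = -1, 0, 1
    (exact when f is quadratic along the line)."""
    phi = lambda t: f([xi + t * ui for xi, ui in zip(x, u)])
    fm, f0, fp = phi(-1.0), phi(0.0), phi(1.0)
    denom = fm - 2.0 * f0 + fp
    if abs(denom) < 1e-14:
        return list(x)
    t = 0.5 * (fm - fp) / denom
    return [xi + t * ui for xi, ui in zip(x, u)]

def conjugate_gradient(f, grad_f, x0, tol=1e-10, max_iter=200):
    x = list(x0)
    g = [-gi for gi in grad_f(x)]       # step 1): g0 = h0 = -grad f(x0)
    h = list(g)
    for k in range(max_iter):           # step 2)
        x_new = line_min(f, x, h)       # step 3): search along h_k
        if max(abs(p - q) for p, q in zip(x, x_new)) < tol:
            return x_new, k + 1
        g_new = [-gi for gi in grad_f(x_new)]          # step 4)
        gg = dot(g, g)
        if gg == 0.0:                   # already at a stationary point
            return x_new, k + 1
        gamma = dot(g_new, g_new) / gg  # Fletcher-Reeves gamma_k (assumed form)
        h = [gn + gamma * hi for gn, hi in zip(g_new, h)]  # step 5)
        x, g = x_new, g_new
    return x, max_iter
```

On a quadratic in Rn this should need at most n line searches (plus one final iteration that trips the stopping test), which is a useful sanity check when comparing iteration counts across the three methods.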


In all these methods stop when two iterations produce values for x which differ by less than the required tolerance.