**MATH4141/MATH6651/PHYS5070**

**FW02**

**Assignment 2**

**Due date: Friday, October 11**

1. If **A** is an n×n positive definite, symmetric matrix and {**d**_{i}, i = 1, n} is a set of vectors in **R**^{n} which are conjugate with respect to **A**, show that the **d**_{i} are linearly independent.

2. Let

be a set of linearly independent vectors in **R**^{n} and **A** be an n×n positive definite, symmetric matrix. Show that the sequence of vectors

are conjugate with respect to **A**.

3. In an example in the notes the minimum of 2x^{2} - 2xy + 5y^{2} + 4x + 4y + 4 was found by the conjugate gradient method, using the gradient of the function for **g**_{k} and equation (8) for γ_{k}. Show that identical results for **g**_{1} and γ_{0} are obtained when calculated from equations (1) - (4) using the explicit form of the matrix A. Take **g**_{0} = **h**_{0} = -∇f(1,0).
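The prescribed starting vector can be checked with a short Python snippet (a sketch; the partial derivatives below are computed by hand from the quadratic above):

```python
# f(x, y) = 2x^2 - 2xy + 5y^2 + 4x + 4y + 4, the quadratic from question 3.
def grad_f(x, y):
    # hand-computed partial derivatives of f
    return (4*x - 2*y + 4, -2*x + 10*y + 4)

# g_0 = h_0 = -grad f(1, 0)
g0 = tuple(-c for c in grad_f(1.0, 0.0))   # g0 == (-8.0, -2.0)
```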

4. Write three separate programs to implement each of the following search methods:

i) the method of steepest descent;

ii) the method of conjugate directions; and

iii) the conjugate gradient method

for finding the minimum of a function in **R**^{n}. Use the IMSL routine DUVMIF or imsl_d_min_uncon to carry out the line searches and the subprogram that you wrote for question 4 of Assignment 1 to evaluate the functions in **R**^{n}.

Test these programs by finding a minimum of the following functions:

a) 5x^{2} - 10xy + 19y^{2} - 6yz + 5z^{2} + 8zx - 8x - 44y - 12z

b) -(3x^{2} + 2xy + y^{2}) exp(-3x^{2} - 2y^{2})

c) cos(2x + 5y - z) exp(-2x^{2} - y^{2} - yz - 3z^{2})

Run each of these programs with three different starting values. Compare the results obtained by the three different methods with regard to the number of iterations required to attain an accuracy of 4 decimal places in the location of the minimum. How does the accuracy of the minimum compare among the three different methods?

Details of algorithms to find the minimum of a function f(**x**) for **x** ∈ **R**^{n}:

i) Method of Steepest Descents

- at each stage take the search direction **u** = -∇f(**x**)
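As a rough illustration, the steepest-descent loop can be sketched in Python for a quadratic f(**x**) = ½**x**ᵀA**x** - **b**ᵀ**x**, with A and b chosen to match test function (a). An exact quadratic line minimizer stands in for the DUVMIF line search (an assumption for illustration; it is valid only because f is quadratic):

```python
import numpy as np

# Quadratic model f(x) = 1/2 x^T A x - b^T x, so grad f(x) = A x - b.
# A and b are read off from test function (a); the minimum solves A x = b.
A = np.array([[10., -10., 8.], [-10., 38., -6.], [8., -6., 10.]])
b = np.array([8., 44., 12.])

def steepest_descent(x, tol=1e-8, max_iter=5000):
    for k in range(max_iter):
        u = b - A @ x                  # search direction u = -grad f(x)
        if np.linalg.norm(u) < tol:
            break
        t = (u @ u) / (u @ A @ u)      # exact minimizer of f(x + t u)
        x_new = x + t * u
        if np.linalg.norm(x_new - x) < tol:   # stopping criterion from the notes
            return x_new, k + 1
        x = x_new
    return x, k + 1
```

Run from, say, `np.zeros(3)`, this creeps toward the minimum in many short zig-zag steps, which is the behaviour question 4 asks you to compare against the conjugate methods.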

ii) Conjugate Directions Method

1) initialize the search directions **u**_{i}, i = 1, n to the co-ordinate directions;

2) do steps 3) - 6):

3) starting from **x**_{0} search in each of the directions **u**_{i}, i = 1, n in turn to reach **x**_{n};

4) put **u**_{i} = **u**_{i+1}, i = 1, n-1;

5) put **u**_{n} = **x**_{n} - **x**_{0};

6) search in this new direction **u**_{n} to reach new **x**_{0};

7) after n iterations of steps 3) - 6), reset the search directions **u**_{i}, i = 1, n to the co-ordinate directions and return to step 2).
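The steps above can be sketched in Python for the same quadratic model f(**x**) = ½**x**ᵀA**x** - **b**ᵀ**x** used for test function (a). As before, an exact quadratic line minimizer is assumed in place of the DUVMIF search:

```python
import numpy as np

# Quadratic model f(x) = 1/2 x^T A x - b^T x; A and b match test function (a).
A = np.array([[10., -10., 8.], [-10., 38., -6.], [8., -6., 10.]])
b = np.array([8., 44., 12.])

def line_min(x, u):
    # exact minimizer of f(x + t u) along u; valid only for the quadratic f
    denom = u @ A @ u
    if denom < 1e-30:                   # degenerate direction: do not move
        return x
    return x + ((u @ (b - A @ x)) / denom) * u

def conjugate_directions(x0, tol=1e-10, max_cycles=60):
    n = len(x0)
    U = list(np.eye(n))                 # step 1: co-ordinate directions
    for cycle in range(1, max_cycles + 1):
        x_start = x0.copy()
        x = x0
        for u in U:                     # step 3: search each direction in turn
            x = line_min(x, u)
        U = U[1:] + [x - x_start]       # steps 4-5: drop u_1, append x_n - x_0
        x0 = line_min(x, U[-1])         # step 6: search along the new direction
        if np.linalg.norm(x0 - x_start) < tol:   # stopping criterion
            break
        if cycle % n == 0:              # step 7: reset to co-ordinate directions
            U = list(np.eye(n))
    return x0
```

For a quadratic in **R**^{n} this procedure reaches the minimum in about n cycles, provided the new directions stay linearly independent; the periodic reset in step 7 guards against degeneracy.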

iii) Conjugate Gradient Method

1) set **g**_{0} = **h**_{0} = -∇f(**x**_{0});

2) do steps 3) - 5) for k = 0, 1, 2, ...:

3) search along the direction **h**_{k} to the minimum point **x**_{k+1};

4) set **g**_{k+1} = -∇f(**x**_{k+1});

5) calculate **h**_{k+1} from equation (2) and the second form of equation (8) of the notes.

In all these methods, stop when two successive iterations produce values for **x** which differ by less than the required tolerance.
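Steps 1) - 5) can be sketched in Python for the quadratic model matching test function (a). Two assumptions are made for illustration: the Fletcher-Reeves form γ_{k} = (**g**_{k+1}·**g**_{k+1})/(**g**_{k}·**g**_{k}) is taken for "the second form of equation (8)", and an exact quadratic line minimum replaces the DUVMIF search:

```python
import numpy as np

# Quadratic model f(x) = 1/2 x^T A x - b^T x; A and b match test function (a),
# so g = -grad f(x) = b - A x.
A = np.array([[10., -10., 8.], [-10., 38., -6.], [8., -6., 10.]])
b = np.array([8., 44., 12.])

def conjugate_gradient(x, tol=1e-10, max_iter=100):
    g = b - A @ x                      # step 1: g_0 = -grad f(x_0)
    h = g.copy()                       # h_0 = g_0
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:    # stopping criterion
            break
        t = (h @ g) / (h @ A @ h)      # exact minimum of f along h_k
        x = x + t * h                  # step 3: x_{k+1}
        g_new = b - A @ x              # step 4: g_{k+1} = -grad f(x_{k+1})
        gamma = (g_new @ g_new) / (g @ g)   # assumed Fletcher-Reeves gamma_k
        h = g_new + gamma * h          # step 5: h_{k+1} = g_{k+1} + gamma_k h_k
        g = g_new
    return x, k
```

On an n-dimensional quadratic this converges in at most n line searches (up to round-off), which is the sharp contrast with steepest descent that the comparison in question 4 should expose.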

**TERM TEST - FRIDAY, OCT. 18**