% 27/5/99 Note: This is modified from mt 1992 README file for C code.
% Based on paper:
% P. M. Williams, "A Marquardt Algorithm for Choosing the Step-size in
% Backpropagation Learning with Conjugate Gradients",
% University of Sussex, 1991, Cognitive Science Research Paper, CSRP 229.

Conjugate Gradient Descent Method
---------------------------------

This program is an implementation of the conjugate gradient (CG)
method. Conjugate gradient methods have the following property: if a
function f(w) of N parameters w describes an N-dimensional quadratic
surface, then no more than N line searches are required to find the
values of the parameters w that minimise f(w). For functions that are
only approximately quadratic, more than N line searches may be
required.

The method used here is based on a Marquardt algorithm for choosing the
step-size along a given search direction. The direction of each line
search in f is determined by the direction of steepest descent and by
the direction of the previous line search. Each search direction is
chosen so as not to `undo' work done during previous line searches;
that is, each line search direction is conjugate to the previous search
direction.

For each line search, two evaluations of the function's derivative, f',
are used to estimate the parameters of a 1D quadratic, g(), along the
search direction. The extremum of f is then estimated as the extremum
of the quadratic g.

A report describing the algorithm can be obtained from (SUSSEX WEB SITE).
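
For illustration, here is a minimal sketch in C of one such CG step.
The test function grad_f, the trial offset SIGMA and the
Fletcher-Reeves choice of beta are assumptions made for this example
only; the program itself may use a different beta formula, and the
Marquardt safeguards it places on the step-size are omitted here.

/*
 * Sketch of the CG update described above: two evaluations of f' along
 * the search direction d give the slope of the 1D quadratic g(a) at
 * a = 0 and a = SIGMA; the step is taken to the extremum of g, and the
 * next direction is made conjugate to the previous one.
 */
#include <stdio.h>

#define N     2       /* number of parameters w                        */
#define SIGMA 1e-3    /* trial offset for the second evaluation of f'  */

/* Example objective: f(w) = 0.5*(4*w0^2 + w1^2); gradient returned in g */
static void grad_f(const double w[N], double g[N])
{
    g[0] = 4.0 * w[0];
    g[1] = 1.0 * w[1];
}

static double dot(const double a[N], const double b[N])
{
    double s = 0.0;
    for (int i = 0; i < N; i++) s += a[i] * b[i];
    return s;
}

int main(void)
{
    double w[N] = { 1.0, 1.0 };          /* starting parameters        */
    double g[N], g_new[N], d[N], wt[N];

    /* First search direction: steepest descent. */
    grad_f(w, g);
    for (int i = 0; i < N; i++) d[i] = -g[i];

    for (int iter = 0; iter < N; iter++) {
        /* Directional derivative g'(a) = f'(w + a*d) . d at a = 0 and
         * at a = SIGMA: the two evaluations of f' per line search.    */
        double dg0 = dot(g, d);
        for (int i = 0; i < N; i++) wt[i] = w[i] + SIGMA * d[i];
        grad_f(wt, g_new);
        double dgs = dot(g_new, d);

        /* g(a) is quadratic, so g'(a) is linear; its zero (the extremum
         * of g) follows from the two slopes by a secant step.         */
        double alpha = -dg0 * SIGMA / (dgs - dg0);
        for (int i = 0; i < N; i++) w[i] += alpha * d[i];

        /* Next direction: steepest descent combined with the previous
         * direction (Fletcher-Reeves beta, assumed for illustration),
         * making it conjugate to the previous search direction.       */
        grad_f(w, g_new);
        double beta = dot(g_new, g_new) / dot(g, g);
        for (int i = 0; i < N; i++) {
            d[i] = -g_new[i] + beta * d[i];
            g[i] = g_new[i];
        }
        printf("line search %d: w = (%g, %g)\n", iter + 1, w[0], w[1]);
    }
    return 0;
}

On an exactly quadratic surface the secant step above coincides with the
exact line minimiser, so this loop reaches the minimum after N line
searches, in line with the property stated above.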