A New Conjugate Gradient Algorithm for Unconstrained Optimization Using Parallel Processors
Date
2007
Publisher
Neelain University
Abstract
In optimization methods, we try to determine the best solution to a mathematically defined problem:

    minimize f(x), x ∈ Rⁿ
Optimization problems can be classified as constrained or
unconstrained; however, since constrained optimization problems
can be transformed into unconstrained ones, the majority of recent
research, including new techniques, has focused on unconstrained
optimization problems.
Almost all numerical methods developed to minimize f(x) are iterative
in nature, i.e. given an initial point x₀, the method generates a
sequence of points x₁, x₂, ... until some stopping criterion is satisfied.
Iterative methods are usually first developed theoretically to minimize
convex quadratic functions in a finite number of iterations and are
then extended to solve general problems.
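Concretely, such methods take the standard iterative form (generic background, not specific to this thesis):

    x_{k+1} = x_k + α_k d_k,   k = 0, 1, 2, ...

where d_k is a search direction, α_k is a step length chosen by a line search, and the iteration stops when, for example, ||g(x_k)|| < ε for the gradient g and a small tolerance ε.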
These numerical methods can be divided into two classes
according to whether or not derivatives (first or second) are
evaluated. A method that evaluates derivatives is called a gradient
method.
Within this thesis, we first choose one of the well-known
methods, the conjugate gradient (CG) method, which can
iteratively solve both linear and nonlinear problems.
This method is extended to find minima (or maxima) using
two kinds of line search to locate the minimum solution (see the
sketch after this list). These are called:
1- Exact line search, i.e. g_{i+1}'d_i = 0 for i = 1, 2, ... .
2- Inexact line search.
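As a concrete illustration, the following is a minimal Python sketch of a nonlinear CG iteration (the Fletcher-Reeves variant) with a backtracking inexact line search. It is generic background, not the thesis's actual algorithm, and all names are illustrative:

    import numpy as np

    def cg_minimize(f, grad, x0, tol=1e-6, max_iter=1000):
        # Nonlinear conjugate gradient (Fletcher-Reeves) with a
        # backtracking (Armijo) inexact line search.
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g                              # start with steepest descent
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:     # stopping criterion
                break
            if g.dot(d) >= 0:               # safeguard: restart if d is
                d = -g                      # not a descent direction
            alpha, c, rho = 1.0, 1e-4, 0.5
            while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
                alpha *= rho                # shrink step until Armijo holds
            x_new = x + alpha * d
            g_new = grad(x_new)
            beta = g_new.dot(g_new) / g.dot(g)   # Fletcher-Reeves beta
            d = -g_new + beta * d
            x, g = x_new, g_new
        return x

Such a routine can then be run on standard test functions (Rosenbrock, Powell, etc.) from their usual starting points.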
We chose about ten nonlinear test functions and used the program
implementing this method to optimize them, with special starting
points and, for many of them, several different dimensions.
In this thesis, a new algorithm is developed for minimizing
quadratic and extended quadratic functions using an inexact line
search.
The thesis is concerned with the development and testing of
this new algorithm, using line search to solve different standard
test functions.
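The abstract does not specify which inexact line search is used; a standard formulation, stated here as background, accepts any step length α satisfying the Wolfe conditions

    f(x_k + α d_k) ≤ f(x_k) + c₁ α g_k'd_k        (sufficient decrease)
    g(x_k + α d_k)'d_k ≥ c₂ g_k'd_k               (curvature)

with constants 0 < c₁ < c₂ < 1, where g_k = ∇f(x_k). The backtracking loop in the CG sketch above enforces only the first condition.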
We have extended our work to two other methods, the quasi-
Newton and BFGS methods, which, like the CG method, begin the
search along the gradient direction and use gradient information to
build a quadratic model.
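For reference, the standard BFGS update of the inverse-Hessian approximation H (one common way such a quadratic model is built; a generic sketch, not the thesis's code) is:

    import numpy as np

    def bfgs_update(H, s, y):
        # BFGS update of the inverse Hessian approximation H, given the
        # step s = x_{k+1} - x_k and gradient change y = g_{k+1} - g_k.
        rho = 1.0 / y.dot(s)
        I = np.eye(len(s))
        V = I - rho * np.outer(s, y)
        return V @ H @ V.T + rho * np.outer(s, s)

Starting from H = I, each update folds in the most recent curvature information, so the search directions d = -H g approximate Newton directions without computing second derivatives.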
We then studied the parallel solution of these algorithms and
the effect of parallelism on them.
Programs were first written with a sequential design (to be
executed serially). We then used parallel models of these
methods (design and analysis) and parallelized them in different
ways, and further studied the important performance measures
used in parallel computing. We found that parallelism is only
effective for linear functions, and hence for linearization methods
for solving nonlinear functions.
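The abstract does not name these measures; the standard ones, given here as background, are

    speedup    S_p = T_1 / T_p
    efficiency E_p = S_p / p

where T_1 is the run time on one processor and T_p the run time on p processors. Amdahl's law, S_p ≤ 1 / (s + (1 - s) / p) for a sequential fraction s of the work, bounds the speedup any parallelization can achieve.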
Another important measure of the efficiency of these
algorithms is the NOF (number of function evaluations). We have
tried to reduce the NOF by using an inexact line search with
extended conjugate gradient methods to optimize unconstrained
nonlinear problems.
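In practice, the NOF can be measured by wrapping the objective in a counting closure, as in this illustrative Python sketch (not the thesis's instrumentation):

    def counted(f):
        # Wrap an objective so every evaluation is counted; wrapper.nof
        # then gives the NOF consumed by the optimizer's line searches.
        def wrapper(x):
            wrapper.nof += 1
            return f(x)
        wrapper.nof = 0
        return wrapper

For example, after running cg_minimize(counted_f, grad_f, x0) with counted_f = counted(f), counted_f.nof reports the total number of function evaluations.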
It was found that for some functions, especially high-dimensional
ones, the NOF is reduced; in others, it is not.
It is therefore difficult to conclude whether this method is better or
worse than others; however, we may say it is competitive.
It gave good results with the Powell function, which is generally
accepted as a good test function, and it may add a new algorithm
for solving these types of problems. In general, this observation
holds for all algorithms for solving nonlinear equations: the
function itself is the main factor.
Description
A thesis submitted to the Faculty of Computer Science & Information Technology
Keywords
Gradient Algorithm