PhD Theses: Computer Science

Permanent URI for this collection: https://repository.neelain.edu.sd/handle/123456789/12169

Search Results

Now showing 1 - 3 of 3
  • A new Conjugate Gradient Algorithm of Unconstrained Optimization Using Parallel Processor
    (Al Neelain University, 2007-02) Nadwa Ali Ahmad Al-Abbas
    In optimization methods, we try to find the best solution to certain mathematically defined problems: minimize f(x), x ∈ Rⁿ. Optimization problems can be classified as constrained or unconstrained; however, since constrained problems can be transformed into unconstrained ones, the majority of recent research, including the new techniques, has focused on unconstrained optimization. Almost all numerical methods developed to minimize f(x) are iterative in nature: given an initial point x₀, the method generates a sequence of points x₁, x₂, ... until some stopping criterion is satisfied. Iterative methods are first developed theoretically to minimize convex quadratic functions in a finite number of iterations and are then extended to general problems. These numerical methods can be divided into two classes according to whether derivatives (first or second) are evaluated; a method that evaluates derivatives is called a gradient method.
    In this thesis, we first choose one of the well-known methods, the conjugate gradient (CG) method, which can iteratively solve both linear and nonlinear problems. The method finds a minimum (or maximum) using one of two kinds of line search: 1) exact line search, i.e. gᵢ₊₁ᵀdᵢ = 0 for i = 1, 2, ...; 2) inexact line search. We choose about ten nonlinear functions and use a program implementing the method to optimize them from special starting points, with different dimensions for many of them. A new algorithm is developed for minimizing quadratic and extended quadratic functions using the inexact line search, and the thesis is concerned with developing and testing this new algorithm on different standard functions. We extend our work to two other methods, the quasi-Newton and BFGS methods, which begin the search along the gradient direction, as the CG method does, and use gradient information to build a quadratic model.
    We then studied the parallel solution of these algorithms and the effect of parallelism on them. Programs were written in a sequential design (to be executed serially), and we used parallel models of these methods (design and analysis), parallelizing them in different ways. Further study was made of the important measures used in parallel computing. We found that parallelism is effective only for linear functions, and hence for linearization methods for solving nonlinear functions. The other important measure of the efficiency of these algorithms is NOF, the number of function evaluations. We tried to reduce NOF by using the inexact line search with extended conjugate gradient methods on unconstrained nonlinear problems. For some functions, especially high-dimensional ones, NOF is reduced; for others it is not, so it is difficult to conclude whether this method is better or worse than others. However, we may say it is competitive: it gave good results on the Powell function, which is generally accepted as a good test function, and it may add a new algorithm for solving these types of problems. In general, this observation is common to all algorithms for solving nonlinear equations: the function itself is the main factor.
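To make the general technique concrete, here is a minimal sketch of a nonlinear conjugate gradient iteration with an Armijo backtracking (inexact) line search that counts NOF, applied to the Powell test function mentioned in the abstract. It illustrates the class of method, not the thesis's new algorithm; the Polak-Ribiere+ update, all function names, and all parameter values are assumptions of this sketch.

```python
import numpy as np

def cg_minimize(f, grad, x0, tol=1e-5, max_iter=5000):
    """Nonlinear conjugate gradient (Polak-Ribiere+) with an Armijo
    backtracking (inexact) line search. Returns the final iterate and
    NOF, the number of function evaluations used."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # start along steepest descent
    nof = 0
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:               # safeguard: restart if d is not a descent direction
            d = -g
        fx = f(x); nof += 1
        alpha = 1.0
        # Armijo condition: accept the first step giving sufficient decrease.
        while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d):
            nof += 1
            alpha *= 0.5
        nof += 1                        # the accepted evaluation
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere beta, clipped at zero (the "PR+" restart rule).
        beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x, nof

# Powell's singular function, one of the standard test problems the
# abstract mentions; its minimum is 0 at the origin.
def powell(x):
    return ((x[0] + 10 * x[1]) ** 2 + 5 * (x[2] - x[3]) ** 2
            + (x[1] - 2 * x[2]) ** 4 + 10 * (x[0] - x[3]) ** 4)

def powell_grad(x):
    return np.array([
        2 * (x[0] + 10 * x[1]) + 40 * (x[0] - x[3]) ** 3,
        20 * (x[0] + 10 * x[1]) + 4 * (x[1] - 2 * x[2]) ** 3,
        10 * (x[2] - x[3]) - 8 * (x[1] - 2 * x[2]) ** 3,
        -10 * (x[2] - x[3]) - 40 * (x[0] - x[3]) ** 3,
    ])

x_final, nof = cg_minimize(powell, powell_grad, [3.0, -1.0, 0.0, 1.0])
print(f"f(x) = {powell(x_final):.3e} after NOF = {nof} function evaluations")
```

The inexact line search is what keeps NOF down: each exact minimization along d would cost many evaluations, while the Armijo test settles for any step with sufficient decrease.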
  • Recommended Approach for Software Project Planning and Cost Estimations in the Sudan
    (2002) Awad Elkarim Mohammed Yousif
    Most, if not all, software projects in demand in Sudan are small-sized or, in some cases, medium-sized. The software projects that the various institutions, developers and software houses have attempted during the past decades were not inclined to adopt software engineering basics: standard techniques, standard documentation and systematic project reviews were neglected. Consequently, the products were too feeble to satisfy customers' requirements or, in the overwhelming majority of cases, failed completely to serve the purpose for which they were originally intended. In other cases, projects were terminated before completion and left unfinished owing to the perplexed approaches taken during their development. Local software project development was thus born with the problems of immaturity, because it followed neither systematic planning nor proper cost estimation. Recently, as software projects have come into demand, the problem has escalated into a severe crisis. A tremendous effort has to be made to clear up the present messy situation; indeed, a start from scratch is needed to revive a proper conception of software project development, built on an understanding of software engineering fundamentals.
    The author conducted a careful survey of 26 companies that practice software project development, studying the ways they plan software projects and estimate their costs. Great effort was made to frame these methods as indicators that could be compared with the techniques, tools and systematic approaches adopted in Software Engineering. It was observed that Software Engineering training was lacking, leading to an insufficient Software Engineering infrastructure; methodology, tools, techniques and procedures were often absent. As a consequence, schedules suffered frequent changes, which led to budget overruns and was a major cause of schedule slips. Beyond that, no standards or proper software development procedures were maintained, and plans for Quality Assurance and Risk Management were rarely considered. The study proposes indicators and practical procedures that can be adopted and easily followed to achieve satisfactory results in this respect.
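The abstract does not name a specific estimation model; as a minimal sketch of one standard algorithmic technique that such a survey would compare local practice against, the example below computes Basic COCOMO effort and schedule estimates from program size. The coefficients are Boehm's published Basic COCOMO values; the 8 KLOC example project is an invented illustration.

```python
# Basic COCOMO (Boehm, 1981): effort and schedule from estimated size.
# Effort = a * KLOC^b person-months; schedule = c * Effort^d months.

COEFFS = {                    # (a, b, c, d) per project class
    "organic":       (2.4, 1.05, 2.5, 0.38),  # small teams, familiar problems
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),  # tight constraints, novel systems
}

def basic_cocomo(kloc: float, mode: str = "organic") -> tuple[float, float]:
    """Return (effort in person-months, schedule in months)."""
    a, b, c, d = COEFFS[mode]
    effort = a * kloc ** b
    schedule = c * effort ** d
    return effort, schedule

# A small project of the kind the survey describes, e.g. an assumed 8 KLOC:
effort, months = basic_cocomo(8.0, "organic")
print(f"effort ~ {effort:.1f} person-months over ~ {months:.1f} months")
```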
  • Monte Carlo Based Digital Image Simulation Processing
    (Neelain University, 2009) Hassan Hamad Abuelhassan Abdallah
    Image segmentation is a fundamental first-step image processing technique that helps attain the objectives of computer vision. The primary objective of this thesis is to enhance existing image segmentation methods based on Monte Carlo integration. We have suggested Monte Carlo techniques to find the ensemble average of neighboring image properties, making more precise image operations possible, such as edge detection, merging regions of similar properties, and meshing techniques. Examples from everyday activities using advanced computer graphics and image analysis techniques are used. Future work includes applying the enhanced algorithms to medical image segmentation and analysis.
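As a minimal sketch of the idea, assuming nothing about the thesis's actual algorithms: the example below estimates the ensemble average of a pixel's neighborhood by random sampling (Monte Carlo) rather than an exhaustive sum, and uses the gap between a pixel and that estimate as a crude edge indicator. All function names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_neighborhood_mean(img, y, x, radius=5, n_samples=32):
    """Monte Carlo estimate of the mean intensity in a (2r+1)^2 window:
    average a random sample of the pixels instead of summing them all."""
    h, w = img.shape
    dy = rng.integers(-radius, radius + 1, n_samples)
    dx = rng.integers(-radius, radius + 1, n_samples)
    ys = np.clip(y + dy, 0, h - 1)     # clamp samples at the image border
    xs = np.clip(x + dx, 0, w - 1)
    return img[ys, xs].mean()

# Toy image: dark left half, bright right half, plus noise.
img = np.hstack([np.zeros((64, 32)), np.ones((64, 32))])
img += rng.normal(0, 0.05, img.shape)

# A pixel whose MC neighborhood mean differs strongly from its own value
# is likely to sit near an edge between two regions of similar properties.
for x in (10, 31, 50):
    m = mc_neighborhood_mean(img, 32, x)
    print(f"x={x:2d}: pixel={img[32, x]:+.2f}  neighborhood mean={m:+.2f}")
```

Increasing n_samples tightens the estimate at the usual Monte Carlo rate (error shrinking like 1/sqrt(n)), which is the trade-off that makes sampling cheaper than exhaustive averaging on large windows.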