Search Results

Now showing 1 - 10 of 14
  • Item
    Parallel Particle Swarm Optimization
    (North Dakota State University, 2016) Manne, Priyanka
    PSO is a population-based evolutionary algorithm motivated by the simulation of social behavior, which differs from the natural selection scheme of genetic algorithms. It is an optimization technique based on swarm intelligence that simulates bio-inspired behavior. PSO is a popular global search method, and the algorithm is widely used in conjunction with several other algorithms in different fields of study. Modern computational problems demand highly capable processing machines and improved optimization techniques. Since PSO is so widely used, it is important to search for ways to speed it up as the complexity of the problems increases. This paper describes a way to improve it via parallelization, demonstrates the parallel PSO algorithm's robustness and efficiency, and evaluates the parallelized version of the algorithm using the Parallel Computing Toolbox available in Matlab.
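The serial algorithm being parallelized can be sketched in a few lines. A minimal global-best PSO minimizing the Sphere function, with illustrative (untuned) inertia and acceleration coefficients; this is a sketch, not the Matlab implementation the paper evaluates:

```python
import random

def pso_sphere(dim=2, swarm=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimize the Sphere function sum(x_i^2) with global-best PSO."""
    rng = random.Random(seed)
    f = lambda x: sum(v * v for v in x)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]                       # personal bests
    pbest_f = [f(p) for p in pos]
    g = min(range(swarm), key=lambda i: pbest_f[i])   # global best index
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # velocity = inertia + cognitive pull + social pull
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest_f
```

The inner fitness loop is the natural parallelization point: each particle's evaluation is independent within an iteration.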
  • Item
    Investigation of Strength Pareto Evolutionary Algorithm
    (North Dakota State University, 2019) Kakarlapudi, Madhumitha
    The Strength Pareto Evolutionary Algorithm (SPEA) (Zitzler and Thiele 1999) is one of the prominent techniques for approximating the Pareto-optimal set in Multiple Objective Optimization (MOO). The Strength Pareto Evolutionary Algorithm 2 (SPEA2) is an improved version of SPEA introduced in 2001. In contrast to SPEA, SPEA2 incorporates a fine-grained fitness assignment strategy, an improved archive truncation technique, and a density estimation procedure. In this paper, we studied the optimization ability of SPEA2 on different benchmark functions by evaluating several performance metrics. The benchmark functions used in the paper include 10 constrained functions (CFs) and 10 unconstrained functions (UFs), on which we performed our experiments while varying parameters such as the number of iterations, variable size, population size, and archive size.
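The "fine-grained fitness assignment" mentioned above combines each solution's strength (how many solutions it dominates) with a raw fitness (the summed strengths of its dominators). A minimal sketch of that computation for a bi-objective minimization population, independent of any particular SPEA2 implementation:

```python
def dominates(a, b):
    """Pareto dominance (minimization): a is no worse in every objective
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def spea2_raw_fitness(objs):
    """Strength S(i) = #solutions i dominates; raw fitness R(i) = sum of
    strengths of all solutions dominating i (0 means nondominated)."""
    n = len(objs)
    strength = [sum(dominates(objs[i], objs[j]) for j in range(n))
                for i in range(n)]
    return [sum(strength[j] for j in range(n) if dominates(objs[j], objs[i]))
            for i in range(n)]
```

For the points (1,4), (2,2), (4,1), (3,3), only (3,3) is dominated (by (2,2)), so it alone receives a nonzero raw fitness.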
  • Item
    Applied Nonparametric Statistical Tests to Compare Evolutionary and Swarm Intelligence Approaches
    (North Dakota State University, 2014) Amanchi, Srinivas Adithya
    Recently, the statistical analysis of nonparametric comparisons has grown in many experimental studies in the area of computational intelligence. The research refers to the application of different techniques used to compare algorithms in an experimental study. Pairwise statistical techniques perform individual comparisons between two algorithms, while multiple-comparison techniques compare more than two algorithms. The techniques include the sign test, the Wilcoxon signed-ranks test, the multiple sign test, the Friedman test, the Friedman aligned-ranks test, and the Quade test. In this paper, we used these tests to analyze the results obtained in an experimental study comparing well-known algorithms and optimization functions. The analyses showed that the application of statistical tests helps identify the algorithm that is significantly different from the remaining algorithms in a comparison. Different statistical analyses were conducted on the results of an experimental study obtained with varying dimension sizes.
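Of the pairwise techniques listed above, the sign test is the simplest. A minimal exact two-sided version using only the standard library, where the inputs are assumed to be paired per-benchmark scores (lower is better) for two algorithms:

```python
from math import comb

def sign_test(a, b):
    """Exact two-sided sign test on paired scores of two algorithms.
    Ties are dropped; returns (wins_for_a, p_value)."""
    wins = sum(x < y for x, y in zip(a, b))    # a beats b (lower score)
    losses = sum(x > y for x, y in zip(a, b))
    n = wins + losses                          # effective sample size
    k = min(wins, losses)
    # two-sided binomial tail probability under H0: win prob = 0.5
    p = min(1.0, 2 * sum(comb(n, i) for i in range(k + 1)) / 2 ** n)
    return wins, p
```

With 8 wins out of 8 benchmarks the p-value is 2/256 ≈ 0.0078, i.e. significant at the usual 0.05 level.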
  • Item
    Evaluation of a New Bio-Inspired Algorithm: Krill Herd
    (North Dakota State University, 2014) Madamanchi, Devi
    A number of nature-inspired algorithms have been proposed to solve complex optimization problems. The Krill Herd algorithm is one such biologically inspired algorithm, proposed in response to biological and environmental processes. It is based mainly on simulating the herding behavior of krill swarms. The objective function is defined as a combination of the krill individual's minimum distance from food and from the highest density of the swarm. The position of a krill individual is influenced mainly by three factors: (i) movement induced by other krill individuals, (ii) foraging activity, and (iii) random diffusion. This process is mimicked to find the optimum solution. In this paper, I implemented and evaluated the algorithm using five different benchmark functions, namely Alpine, Ackley, Griewank, Rastrigin, and Sphere. The results obtained are satisfactory and demonstrate the Krill Herd algorithm's efficiency in solving optimization problems.
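The three motion components can be illustrated with a heavily simplified toy version. The coefficients and the fitness-weighted "food" centroid below are illustrative assumptions, not the full Krill Herd formulation:

```python
import random

def krill_herd_sketch(f, dim=2, n=15, iters=100, seed=3):
    """Toy sketch of Krill Herd's three components (minimization):
    (i) motion induced by other krill (here: pull toward the best krill),
    (ii) foraging toward a fitness-weighted 'food' centroid,
    (iii) random diffusion that shrinks over time."""
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    fit = [f(x) for x in X]
    best = min(fit)
    for t in range(iters):
        # food position: centroid weighted by inverse fitness
        w = [1.0 / (1e-12 + fi) for fi in fit]
        food = [sum(wi * xi[d] for wi, xi in zip(w, X)) / sum(w)
                for d in range(dim)]
        b = min(range(n), key=lambda i: fit[i])
        scale = 0.5 * (1 - t / iters)               # (iii) shrinking diffusion
        for i in range(n):
            for d in range(dim):
                induced = 0.3 * (X[b][d] - X[i][d])  # (i) follow best krill
                forage = 0.3 * (food[d] - X[i][d])   # (ii) move toward food
                X[i][d] += induced + forage + scale * rng.uniform(-1, 1)
            fit[i] = f(X[i])
            best = min(best, fit[i])
    return best
```

On the Sphere function the best-so-far value only improves, so the herd contracts toward the optimum.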
  • Item
    Particle Swarm Optimization Algorithm: Variants and Comparisons
    (North Dakota State University, 2015) Mattaparthi, Sowjanya
    Since the introduction of Particle Swarm Optimization by Dr. Eberhart and Dr. Kennedy, many variations of the algorithm have been proposed and various applications presented. In this paper, we applied variants of Particle Swarm Optimization to various benchmark functions in multiple dimensions, using the computational procedure to find the optimal solutions for those functions. We ran each variant of the algorithm 51 times on each of the 17 benchmark functions and computed the average, variance, and standard deviation for 10, 30, and 50 dimensions. Using the results, we identified the suitable variants of the algorithm for the benchmark functions by considering the minimum optimal solution produced by each variant.
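The per-variant summary statistics described above (mean, variance, and standard deviation over repeated runs) can be computed with Python's standard statistics module; the run values in the example are placeholders, not results from the paper:

```python
import statistics

def summarize_runs(results):
    """Summarize repeated stochastic runs (e.g. 51 final objective values
    of one PSO variant on one benchmark function)."""
    return {
        "mean": statistics.mean(results),
        "variance": statistics.variance(results),  # sample variance (n - 1)
        "stdev": statistics.stdev(results),
    }
```

For the placeholder runs [1.0, 2.0, 3.0] this yields mean 2.0, sample variance 1.0, and standard deviation 1.0.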
  • Item
    Real Parameter Optimization Using Differential Evolution
    (North Dakota State University, 2013) Dawar, Deepak
    Over recent years, Evolutionary Algorithms (EAs) have emerged as a practical approach to solving hard optimization problems that arise in real life. The inherent advantage of EAs over other types of numerical optimization methods lies in the fact that they require very little or no prior knowledge of the objective function: information such as differentiability or continuity is not necessary. The inspiration to learn from evolutionary processes and emulate them on a computer comes from varied directions, the most pertinent of which is the field of optimization. This paper presents one such Evolutionary Algorithm, known as Differential Evolution (DE), and tests its performance on benchmark problems. Different variants of basic DE are discussed and their advantages and disadvantages listed. Through exhaustive experimentation, this paper proposes an acceptable set of control parameters that may be applied to most of the benchmark functions to achieve good performance.
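Basic DE, in its common DE/rand/1/bin form, can be sketched as follows; the population size and the F and CR values are illustrative defaults, not the parameter set the paper proposes:

```python
import random

def de_rand_1_bin(f, bounds, np_=20, F=0.5, CR=0.9, gens=200, seed=7):
    """DE/rand/1/bin (minimization): for each target vector, build a mutant
    from three distinct random vectors, cross it binomially with the target,
    and keep whichever of target/trial scores better."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(np_)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(np_):
            a, b, c = rng.sample([j for j in range(np_) if j != i], 3)
            jrand = rng.randrange(dim)   # force at least one mutant gene
            trial = [pop[a][d] + F * (pop[b][d] - pop[c][d])
                     if (rng.random() < CR or d == jrand) else pop[i][d]
                     for d in range(dim)]
            ft = f(trial)
            if ft <= fit[i]:             # greedy one-to-one selection
                pop[i], fit[i] = trial, ft
    return min(fit)
```

Note that DE never uses gradients, which is exactly the "no differentiability required" advantage the abstract highlights.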
  • Item
    Evaluation of Firefly Algorithm Using Benchmark Functions
    (North Dakota State University, 2013) Kundur, Anuroop
    The field of nature-inspired computing and optimization has evolved to solve difficult optimization problems in diverse fields of engineering, science, and technology. The Firefly algorithm is one of several nature-inspired algorithms developed in the recent past and is inspired by the flashing behavior of fireflies, which flash to attract other fireflies in the group for mating. A less bright firefly is attracted to a brighter one, and since all fireflies are assumed to be unisexual, each firefly can be attracted to any other. This process is mimicked in the algorithm to find the solution to the objective function. In this paper, we evaluate the algorithm using a few multi-dimensional benchmark functions. The simulation results are satisfactory, showing the algorithm to have good performance.
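The attraction rule described above can be sketched as follows; beta0, gamma, and the step schedule are illustrative assumptions, not tuned values from the paper:

```python
import math
import random

def firefly_sketch(f, dim=2, n=12, iters=100, beta0=1.0, gamma=1.0,
                   alpha=0.3, seed=5):
    """Firefly algorithm sketch (minimization): brightness is -f(x); a
    dimmer firefly moves toward every brighter one with attractiveness
    beta0 * exp(-gamma * r^2), plus a small shrinking random step."""
    rng = random.Random(seed)
    X = [[rng.uniform(-2, 2) for _ in range(dim)] for _ in range(n)]
    fit = [f(x) for x in X]
    best = min(fit)
    for t in range(iters):
        step = alpha * (1 - t / iters)      # random walk fades out
        for i in range(n):
            for j in range(n):
                if fit[j] < fit[i]:         # j is brighter (lower cost)
                    r2 = sum((X[i][d] - X[j][d]) ** 2 for d in range(dim))
                    beta = beta0 * math.exp(-gamma * r2)
                    for d in range(dim):
                        X[i][d] += (beta * (X[j][d] - X[i][d])
                                    + step * rng.uniform(-0.5, 0.5))
                    fit[i] = f(X[i])
                    best = min(best, fit[i])
    return best
```

The exponential decay of attractiveness with distance is what lets the swarm split into subgroups around multiple optima.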
  • Item
    Parallelization of Generic PSO Java Code Using MPJExpress
    (North Dakota State University, 2013) Madamanchi, Manoj Babu
    Many scientific, engineering, and economic problems involve the optimization of a set of parameters. Particle Swarm Optimization (PSO) is one of the newer techniques that has been empirically shown to perform well. The PSO algorithm is a population-based search algorithm that simulates the social behavior of birds within a flock. Large-scale engineering optimization problems impose large computational demands, resulting in long solution times even on modern high-end processors. To obtain enhanced computational throughput and global search capability, parallel algorithms and parallel architectures have drawn much attention, and parallelization of PSO has been shown to enhance both. In this paper, we detail the parallelization of the PSO algorithm, an increasingly popular global search method, using MPJ Express. Both synchronous and asynchronous parallel implementations are investigated. The parallel PSO algorithm's robustness and efficiency are demonstrated using four standard benchmark functions: Alpine, Rosenbrock, Rastrigin, and Schaffer.
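The synchronous variant's structure, evaluate every particle concurrently and wait at a barrier before updating the global best, can be sketched with a Python thread pool standing in for MPJ Express (the asynchronous variant would instead update the global best as each result arrives):

```python
from concurrent.futures import ThreadPoolExecutor

def synchronous_parallel_eval(f, swarm, workers=4):
    """One synchronous parallel PSO step: fan out all fitness evaluations,
    block until every worker has finished (the barrier), then pick the
    global best. A structural sketch only, not the MPJ Express code."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        fitness = list(pool.map(f, swarm))   # map returns only when all done
    g = min(range(len(swarm)), key=lambda i: fitness[i])
    return fitness, swarm[g]
```

The barrier makes the parallel run numerically identical to the serial one; the asynchronous version trades that determinism for better load balance.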
  • Item
    Implementation and Evaluation of CMA-ES Algorithm
    (North Dakota State University, 2015) Gagganapalli, Srikanth Reddy
    Over recent years, Evolutionary Algorithms (EAs) have emerged as a practical approach to solving hard optimization problems in the fields of science and technology. The inherent advantage of EAs over other types of numerical optimization methods lies in the fact that they require very little or no prior knowledge regarding the differentiability or continuity of the objective function. The inspiration to learn from evolutionary processes and emulate them on a computer comes from varied directions, the most pertinent of which is the field of optimization. In most applications of EAs, computational complexity is a prohibiting factor, and this complexity is due to the number of fitness evaluations. This paper presents one such Evolutionary Algorithm, the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) developed by Nikolaus Hansen. We implemented it and evaluated its performance on benchmark problems, aiming for the fewest fitness evaluations, and show that the CMA-ES algorithm is efficient in solving optimization problems.
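Full CMA-ES is too involved to sketch here, but the cost measure the paper optimizes, the number of fitness evaluations, can be illustrated with a much simpler evolution strategy: a (1+1)-ES with the 1/5th success rule (explicitly not CMA-ES, which additionally adapts a full covariance matrix):

```python
import random

def one_plus_one_es(f, x0, sigma=1.0, budget=500, seed=11):
    """(1+1)-ES with the 1/5th success rule, counting fitness evaluations
    as the cost measure. The step size sigma grows on success and shrinks
    on failure so that roughly 1 in 5 offspring succeed at equilibrium."""
    rng = random.Random(seed)
    x, fx, evals = list(x0), f(x0), 1
    while evals < budget:
        y = [xi + sigma * rng.gauss(0, 1) for xi in x]  # mutate parent
        fy = f(y)
        evals += 1
        if fy <= fx:
            x, fx = y, fy
            sigma *= 1.5              # success: widen the search
        else:
            sigma *= 1.5 ** -0.25     # failure: shrink (1/5th rule)
    return fx, evals
```

CMA-ES generalizes this idea from a single step size to a full covariance matrix adapted from successful steps.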
  • Item
    Implementation of a Clonal Selection Algorithm
    (North Dakota State University, 2014) Valluru, Srikanth
    Some of the best optimization solutions were inspired by nature. The clonal selection algorithm is a technique inspired by the behavior of the immune system and is widely applied in scientific and engineering fields. It is a population-based search algorithm modeling the immune response to antigens: the cells that recognize an antigen are cloned in greater numbers, and their affinity increases through an internal maturation process. In this paper, we have implemented the artificial immune network using the clonal selection principle within the optimal lab system. The basic working of the algorithm is to treat the individuals of the population as network cells, calculate the fitness of each cell by evaluating all the cells against the optimization function, and then clone cells accordingly. The algorithm is evaluated for efficiency and performance using a few standard benchmark functions, such as Alpine, Ackley, Rastrigin, Schaffer, and Sphere.
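The clone-and-mutate loop described above can be sketched as follows; the clone counts and mutation steps are illustrative assumptions in the spirit of CLONALG, not the implementation evaluated in the paper:

```python
import random

def clonal_selection_sketch(f, dim=2, pop=10, clones=5, iters=100, seed=13):
    """Clonal-selection sketch (minimization): rank cells by fitness,
    give better cells more clones, mutate clones with a step that grows
    with rank (higher affinity -> smaller mutation), and keep the best
    survivors (elitism, so the best value never worsens)."""
    rng = random.Random(seed)
    cells = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    for _ in range(iters):
        cells.sort(key=f)                      # best (lowest cost) first
        offspring = []
        for rank, cell in enumerate(cells):
            n_clones = max(1, clones - rank)   # better cells -> more clones
            step = 0.1 * (rank + 1)            # better cells -> smaller step
            for _ in range(n_clones):
                offspring.append([v + step * rng.gauss(0, 1) for v in cell])
        cells = sorted(cells + offspring, key=f)[:pop]
    return f(cells[0])
```

The inverse coupling between affinity and mutation rate is the core of the clonal selection principle: good solutions are refined locally, poor ones explore more widely.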