Genetic Programming Inc.
Los Altos, California, USA
Econometrics Inc.
Chicago, Illinois, USA
University of California
Berkeley, California, USA
Stanford University
Stanford, California, USA
INTRODUCTION
Design is a major activity of practicing engineers. The design
process entails the creation of a complex structure to satisfy
user-defined requirements. Since the design process typically
entails tradeoffs between competing considerations, the end
product of the process is usually a satisfactory and compliant
design as opposed to a perfect design. Design is usually viewed
as requiring creativity and human intelligence. Consequently,
the field of design is a source of challenging problems for
automated techniques of machine intelligence. In particular,
design problems are useful for determining whether an automated
technique can produce results that are competitive with
human-produced results.
Genetic Algorithms in Shape Optimization:
Finite and Boundary Element Applications
Institute for Materials and Structural Models, Faculty of Engineering
Central University of Venezuela
PO Box 50.361, Caracas 1050-A, Venezuela
Email: mcerrola@reacciun.ve, annwill@sagi.ucv.edu.ve
Abstract
This work deals with the continuous and discrete optimization
of 3D truss structures, as well as with the shape optimization of
two-dimensional Finite Element (FE) and Boundary Element (BE) models,
using Genetic Algorithms and B-spline boundary modelling.
Several successful attempts have been made
to optimize the shape of continuous models so as to reduce their area and
their internal stresses. Here, the authors present and discuss
a new methodology for shape optimization based on the use of GAs and
B-spline modelling. The internal variables to be controlled are discussed,
and some concepts of the B-spline formulation are then introduced. Software
based on Genetic Algorithms, developed for FE models, is also presented.
Finally, some engineering examples are included to show that
the proposed technique can deal with both FE and BE models.
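As a rough illustration of how such a coupling can be set up (a minimal sketch in Python, not the authors' implementation; the radial control-point parameterization and all names are our own assumptions), a GA individual might encode radial positions of B-spline control points that are decoded into a boundary curve before being handed to an FE/BE solver:

# Illustrative sketch (not the authors' code): a GA individual encodes radial
# positions of B-spline control points defining a 2D boundary segment; the
# decoded curve would then be meshed and evaluated by an FE/BE solver (not shown).
import numpy as np
from scipy.interpolate import BSpline

def decode_boundary(genes, n_samples=200, degree=3):
    """Map a chromosome of radii (one gene per control point) to boundary points."""
    n = len(genes)
    angles = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    ctrl = np.column_stack((genes * np.cos(angles), genes * np.sin(angles)))
    # Clamped knot vector: len(knots) must equal n + degree + 1 for BSpline.
    knots = np.concatenate(([0.0] * degree,
                            np.linspace(0.0, 1.0, n - degree + 1),
                            [1.0] * degree))
    curve = BSpline(knots, ctrl, degree)
    u = np.linspace(0.0, 1.0, n_samples)
    return curve(u)          # (n_samples, 2) array of boundary coordinates

boundary = decode_boundary(np.array([1.0, 1.2, 0.9, 1.1, 1.3, 1.0, 0.8, 1.05]))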
Evolutionary Computation: Recent Developments and Open Issues
Computer Science Department
George Mason University
Fairfax, VA 22030 USA
kdejong@gmu.edu
INTRODUCTION
The field of evolutionary computation (EC) is in a stage of tremendous growth as witnessed by the increasing number of conferences, workshops, and papers in the area as well as the emergence of several journals dedicated to the field. It is becoming increasingly difficult to keep track of and understand the wide variety of new algorithms and new applications.
I believe, however, that there is a coherent structure to the EC
field that can help us understand where we are and where we're headed.
The purpose of this paper is to present that view, use it to assess
the field, and then discuss important new directions for research and
applications.
Evolutionary Algorithms for Multi-Criterion Optimization
in Engineering Design
INTRODUCTION
Many real-world engineering design or decision making problems
involve simultaneous optimization of multiple objectives.
The principle of multi-criterion optimization is different from
that of single-objective optimization.
In single-objective optimization, the goal is to find the best design
solution, which corresponds to the minimum or maximum value of the objective
function [9, 34]. In contrast, in a multi-criterion optimization problem
with conflicting
objectives, there is no single optimal solution. The interaction among
the different objectives gives rise to a set of compromise solutions,
widely known as the Pareto-optimal solutions [40, 2].
Since none of these Pareto-optimal
solutions can be identified as better than the others without
further consideration, the goal in multi-criterion optimization is to
find as many Pareto-optimal solutions as possible. Once such solutions are
found,
choosing one of them for implementation usually requires higher-level
decision-making based on other considerations. Here, we address the first task of
finding multiple Pareto-optimal solutions. There are two objectives in a
multi-criterion optimization: (i) find solutions close to the true
Pareto-optimal solutions and (ii) find solutions that are widely different
from each other. The first objective ensures that the obtained solutions
satisfy the optimality conditions; the second ensures that the obtained set
is not biased towards any particular objective function.
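As a concrete illustration of the notion of Pareto-optimality used above, the following sketch (in Python; the function names and the example objective vectors are purely illustrative, not from the paper) filters a set of candidate solutions down to its non-dominated subset for a minimization problem:

# Illustrative sketch: extracting the non-dominated (Pareto) front from a set
# of candidate solutions, assuming all objectives are to be minimized.
# The objective vectors below are hypothetical example data.

def dominates(a, b):
    """Return True if objective vector a dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the points not dominated by any other point."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

if __name__ == "__main__":
    candidates = [(1.0, 9.0), (2.0, 5.0), (3.0, 6.0), (4.0, 2.0), (6.0, 1.5)]
    print(pareto_front(candidates))   # (3.0, 6.0) is dominated by (2.0, 5.0)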
ACO Algorithms for the Traveling Salesman Problem
IRIDIA, Université Libre de Bruxelles, Belgium
{tstutzle,mdorigo}@ulb.ac.be
Ant algorithms are a recently developed,
population-based approach which has been successfully applied to
several NP-hard combinatorial optimization problems. As the name
suggests, ant algorithms have been inspired by the behavior of real
ant colonies, in particular, by their foraging behavior. One of the
main ideas of ant algorithms is the indirect communication of a colony
of agents, called (artificial) ants, based on pheromone trails
(pheromones are also used by real ants for communication). The
(artificial) pheromone trails are a kind of distributed numeric
information which is modified by the ants to reflect their experience
while solving a particular problem. Recently, the Ant Colony
Optimization (ACO) meta-heuristic has been proposed which provides a
unifying framework for most applications of ant algorithms
to combinatorial optimization problems. In
particular, all the ant algorithms applied to the TSP fit perfectly
into the ACO meta-heuristic and, therefore, we will also refer to these
algorithms as ACO algorithms.
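As a hedged sketch of the construction step shared by these algorithms (not taken from any particular ACO variant; the parameters alpha and beta and the helper names are our own illustrative choices), an artificial ant building a TSP tour typically selects the next city at random with probability proportional to the pheromone on the connecting edge combined with a heuristic value such as the inverse distance:

# Illustrative sketch of ACO tour construction for the TSP: the next city is
# drawn with probability proportional to
# pheromone[i][j]**alpha * (1.0 / dist[i][j])**beta.
import random

def choose_next_city(current, unvisited, pheromone, dist, alpha=1.0, beta=2.0):
    weights = [(pheromone[current][j] ** alpha) * ((1.0 / dist[current][j]) ** beta)
               for j in unvisited]
    return random.choices(unvisited, weights=weights, k=1)[0]

def construct_tour(n, pheromone, dist):
    start = random.randrange(n)
    tour, unvisited = [start], [c for c in range(n) if c != start]
    while unvisited:
        nxt = choose_next_city(tour[-1], unvisited, pheromone, dist)
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

# Toy usage with made-up distances and uniform initial pheromone.
n = 5
dist = [[abs(i - j) + 1 for j in range(n)] for i in range(n)]
pheromone = [[1.0] * n for _ in range(n)]
print(construct_tour(n, pheromone, dist))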
Multidisciplinary Hybrid Constrained GA Optimization
Department of Aerospace Engineering, 233 Hammond Building
The Pennsylvania State University, University Park, PA 16802, USA
INTRODUCTION
Realistic engineering problems always involve interaction of several
disciplines like fluid dynamics, heat transfer, elasticity,
electromagnetism, dynamics, etc. Thus, realistic problems are
always multidisciplinary and the geometric space is typically
arbitrarily shaped and three-dimensional. Each of the individual
disciplines is governed by its own system of partial differential or
integral equations with differing degrees of non-linearity, often based
on widely disparate time and length scales. All of these factors make a typical multidisciplinary
optimization problem highly non-linear and interconnected. Consequently,
an objective function space for a typical multidisciplinary problem
could be expected to have a number of local minima. A typical
multidisciplinary optimization problem therefore requires the use
of optimization algorithms that can either avoid the local minima
or escape from them. Non-gradient-based optimizers
have these capabilities. On the other hand, once the neighborhood
of the global minimum has been found, non-gradient-based
optimizers have difficulty converging to the global minimum.
At that stage, it is more appropriate to use gradient-based optimizers.
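A minimal sketch of such a two-phase hybrid strategy (illustrative only and not the authors' method: the objective function, the crude GA-like global phase, and the use of SciPy's BFGS minimizer for the local phase are all our own example choices) might look like:

# Illustrative two-phase hybrid: a crude GA-like global search to locate the
# neighbourhood of the global minimum, followed by a gradient-based refinement.
# Objective, bounds, and population settings are arbitrary example choices.
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Rastrigin-like multimodal test function (example only)
    return 10 * len(x) + sum(xi**2 - 10 * np.cos(2 * np.pi * xi) for xi in x)

def rough_global_search(dim=2, pop=40, gens=50, lo=-5.0, hi=5.0):
    population = [np.random.uniform(lo, hi, dim) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=objective)
        parents = population[: pop // 2]
        children = [p + np.random.normal(0.0, 0.3, dim) for p in parents]
        population = parents + children
    return min(population, key=objective)

best_region = rough_global_search()
refined = minimize(objective, best_region, method="BFGS")  # gradient-based phase
print(refined.x, refined.fun)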
An Introduction to Evolutionary Computation and Some Applications
Natural Selection, Inc.
3333 N. Torrey Pines Ct., Suite 200
La Jolla, CA 92037
dfogel@natural-selection.com
Abstract
Evolutionary computation is becoming common in the solution
of difficult, real-world problems in industry, medicine, and defense.
This paper introduces evolutionary computation and reviews some
of the practical advantages of using evolutionary algorithms as
compared with classic methods of optimization or artificial intelligence.
Specific advantages include the flexibility of the procedures,
as well as the ability to self-adapt the search for optimum solutions
on the fly. In addition, several recent applications of evolutionary
computation are reviewed. As desktop computers increase in speed,
the application of evolutionary algorithms will become routine.
Some Recent Important Foundational Results in Evolutionary Computation
Natural Selection, Inc.
3333 N. Torrey Pines Ct., Suite 200
La Jolla, CA 92037, USA
dfogel@natural-selection.com
Abstract
A review of four foundational results in evolutionary
computation is offered. In each case, recent discoveries overturn
conventional wisdom. The importance of these new findings is discussed
in the context of future progress in the field of evolutionary computation.
Using Genetic Algorithms for Optimization:
Technology Transfer in Action
Center for Scientific Computing (CSC), Finland
WHY AND HOW TO OPTIMIZE?
Optimization aims for efficient allocation of scarce resources. You
find examples of optimization in any scientific or engineering discipline.
For example, the minimum principles of physics give rise to many kinds of
optimization problems. Agriculturists want to improve their cattle
so that the cows are healthier and produce better milk. Engineers
are interested in making better paper machines, or in optimizing the
fuel consumption of engines. As another type of example, consider
designing optimal schedules for school classes or aircraft crews, or
minimizing the lengths of wires in an electrical device.
Genetic Programming: Turing's Third Way
to Achieve Machine Intelligence
Stanford University
Stanford, California, USA
Genetic Programming Inc.
Los Altos, California, USA
University of California
Berkeley, California, USA
Econometrics Inc.
Chicago, Illinois, USA
INTRODUCTION
One of the central challenges of computer science is to get
a computer to solve a problem without explicitly programming it.
In particular, it would be desirable to have a problem-independent
system whose input is a high-level statement of a problem's
requirements and whose output is a working computer program
that solves the given problem. The challenge is to make computers
do what needs to be done, without having to tell the computer
exactly how to do it.
Three evolutionary approaches to clustering
"Al.I.Cuza" University of Iasi, Romania
e-mail: hluchian@infoiasi.ro
Abstract
This paper is an overview of our work on solving
clustering problems in an evolutionary setting. In approaching
these problems, we faced many of the general questions one has to
answer when solving real-world problems with a weak method such as
Genetic Algorithms; we use some of these questions here
as sub-titles for the paragraphs of the paper. However, it has not
been our intention to undertake a study of such questions or to
provide general answers to them. Instead, a particular experience
is presented: a sequence of four case studies - two problems
in their general versions and two real-world applications.
The structure of this paper does not closely follow the usual one:
the emphasis is on the design process rather than, for example,
on experiments and their interpretation. Our hope is that those
wishing to have a practical insight into either Genetic Algorithms
from the viewpoint of Clustering, or into some versions of Clustering
from the viewpoint of Genetic Algorithms, will find this paper interesting.
Genetic Algorithms and Fractals
INRIA Rocquencourt, B.P. 105, 78153 LE CHESNAY Cedex, France
Tel. +33 1 39 63 55 23, Fax +33 1 39 63 59 95
e-mail: Evelyne.Lutton@inria.fr, http://www-rocq.inria.fr/fractales/
Abstract
Fractals are widely known as a theory used to generate attractive
(and complex) images; they are less well known as efficient tools in
the framework of signal analysis. However, some fractal notions
developed for signal analysis are very convenient in the analysis of
GA behaviour. The aim of this paper is to show that there are deep
justifications for considering Genetic Algorithms (GAs) and
fractals together, from both applied and theoretical viewpoints. We present
below a synthesis of the research that has been done on this topic at INRIA
in the FRACTALES group.
Evolutionary Algorithms for Engineering Applications
University of North Carolina, Charlotte, NC 28223, USA and
at the Institute of Computer Science, Polish Academy of Sciences
ul. Ordona 21, 01-237 Warsaw, Poland
e-mail: zbyszek@uncc.edu
Department of Mechanical Engineering
Indian Institute of Technology, Kanpur, Pin 208 016, India
e-mail: deb@iitk.ac.in
Department of Computer Science, Aarhus University
DK-8000 Aarhus C, Denmark
e-mail: marsch@daimi.au.dk
Department of Computer Science, Aarhus University
DK-8000 Aarhus C, Denmark
e-mail: stidsen@daimi.au.dk
INTRODUCTION
Evolutionary computation techniques have drawn much attention as
optimization methods in the last two decades.
They are stochastic optimization
methods which are conveniently presented using the metaphor of
natural evolution: a randomly initialized population
of individuals (a set of points of the search space at hand) evolves
following a crude parody of the Darwinian principle of
the survival of the fittest. New individuals are generated using
some variation operators (e.g., simulated genetic operations such as
mutation and crossover). The probability of survival of the newly
generated solutions depends on their fitness (how well they
perform with respect to the optimization problem at hand): the best
are kept with a high probability, the worst are rapidly discarded.
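For readers new to this loop, the following minimal generation cycle (illustrative only; the fitness function, mutation scale, and population size are placeholders rather than recommended settings) shows variation followed by fitness-biased survival:

# Minimal illustrative EA generation loop: variation followed by fitness-biased
# survival. All numerical settings are placeholders, not recommendations.
import random

def evolve(fitness, dim=10, pop_size=30, generations=100, sigma=0.1):
    population = [[random.uniform(-1.0, 1.0) for _ in range(dim)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Variation: each parent produces one mutated offspring.
        offspring = [[x + random.gauss(0.0, sigma) for x in parent]
                     for parent in population]
        # Survival of the fittest: keep the best half of parents + offspring.
        pool = population + offspring
        pool.sort(key=fitness, reverse=True)   # higher fitness is better
        population = pool[:pop_size]
    return population[0]

best = evolve(lambda ind: -sum(x * x for x in ind))  # maximize = minimize sum of squares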
Embedded Path Tracing and Neighbourhood Search Techniques
in Genetic Algorithms
School of Mathematical and Information Sciences
Coventry University, UK
Email: C.Reeves@coventry.ac.uk
NTT Communication Science Laboratories
Kyoto, Japan
Email: yamada@cslab.kecl.ntt.co.jp
Abstract
Heuristic search methods have been increasingly applied to combinatorial optimization problems. However, it does not always seem to be recognized that there is an interaction between the type of search adopted (the representation, operators, and search strategy) and the nature of the problem. Thus, it sometimes appears as if such properties as local optimality are to be understood as invariants of the problem. In fact, while a specific problem defines a unique search space, different `landscapes' are created by the different heuristic search operators used to search it.
In this paper, after demonstrating this
fact, we turn to the question of how we can exploit such effects in
terms of a fruitful search strategy. Many approaches could be taken; the
one described here embeds a development of neighbourhood search methods
into a genetic algorithm. We present results where this approach is used
for flowshop scheduling problems to generate very high-quality
solutions.
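To make the landscape point concrete, the following sketch (illustrative, not from the paper) applies two common operators to the same permutation encoding of a schedule; because they generate different neighbour sets, the same sequence can be locally optimal under one operator yet improvable under the other:

# Two neighbourhood operators on the same permutation (e.g., a job sequence in
# a flowshop). They induce different search landscapes on the identical problem.

def swap_neighbours(perm):
    """All sequences obtained by exchanging two positions."""
    n = len(perm)
    for i in range(n):
        for j in range(i + 1, n):
            q = list(perm)
            q[i], q[j] = q[j], q[i]
            yield q

def insertion_neighbours(perm):
    """All sequences obtained by removing one job and reinserting it elsewhere."""
    n = len(perm)
    for i in range(n):
        rest = perm[:i] + perm[i + 1:]
        for j in range(n):
            if j != i:
                yield rest[:j] + [perm[i]] + rest[j:]

jobs = [0, 1, 2, 3]
print(len(list(swap_neighbours(jobs))),
      len(set(map(tuple, insertion_neighbours(jobs)))))  # neighbourhood sizes differ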
Genetic algorithm as a tool for solving electrical engineering problems
Institute of Computer Science
Technical University of Lodz
Sterlinga 16/18, 90-217 Lodz, Poland
e-mail: rudnicki@ics.p.lodz.pl, piotr@ics.p.lodz.pl
Systems Research Institute
Polish Academy of Sciences
Newelska 6, 01-447 Warsaw, Poland
e-mail: cholajda@ibspan.waw.pl
Abstract
The paper aims at presenting genetic algorithms as an efficient
numerical tool in the field of electrical engineering. After
an introduction to evolutionary algorithm methodologies we present
two real-world applications. The first one enables the creation of
an expert system which diagnoses the technical condition of an oil
transformer based on the results of chromatography of the gases dissolved
in the transformer oil (Dissolved Gas Analysis, DGA). The system
is designed to diagnose small and medium-sized units for which
DGA is an essential source of information on their technical
condition. The diagnosis is based on a set of rules which form
a database in the expert system. In order to select as small a group
of rules in the database as possible, a genetic algorithm is applied.
The second application is the identification of the parameters of
an induction motor from available performance data using the equivalent
circuit model of the machine. For identification we apply a multilayer
feedforward network, a genetic algorithm, and simulated annealing
strategies. Numerical results compare well with the actual parameters
of small induction motors.
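A hedged sketch of the rule-selection idea (the class definitions, the placeholder diagnose helper, the gas thresholds, and the accuracy/size trade-off weight are our own illustrative assumptions, not the authors' formulation): each GA individual is a bit string marking which rules remain in the database, and the fitness rewards diagnostic accuracy while penalizing the number of retained rules:

# Illustrative fitness for GA-based selection of a minimal diagnostic rule set.
# All rules, cases, and thresholds below are hypothetical examples.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    condition: Callable   # takes a Case, returns True if the rule fires
    diagnosis: str

@dataclass
class Case:
    gases: dict           # gas name -> concentration (hypothetical units)
    label: str            # known fault class, used to score the rule set

def diagnose(selected_rules, case):
    """Placeholder inference: return the diagnosis of the first rule that fires."""
    for rule in selected_rules:
        if rule.condition(case):
            return rule.diagnosis
    return None

def fitness(bitstring, rules, cases, size_penalty=0.01):
    """bitstring[i] == 1 means rule i is kept in the expert-system database."""
    selected = [r for r, keep in zip(rules, bitstring) if keep]
    if not selected:
        return 0.0
    correct = sum(1 for c in cases if diagnose(selected, c) == c.label)
    return correct / len(cases) - size_penalty * len(selected)

# Toy usage with made-up rules and cases.
rules = [Rule(lambda c: c.gases.get("C2H2", 0) > 1.0, "arcing"),
         Rule(lambda c: c.gases.get("H2", 0) > 100.0, "partial discharge")]
cases = [Case({"C2H2": 2.5}, "arcing"), Case({"H2": 150.0}, "partial discharge")]
print(fitness([1, 1], rules, cases))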
Parallel and Distributed Evolutionary Algorithms: A Review
Institute of Computer Science, University of Lausanne
1015 Lausanne, Switzerland
E-mail: Marco.Tomassini@di.epfl.ch
Web: www-iis.unil.ch
INTRODUCTION
Evolutionary algorithms (EAs) find their inspiration in the evolutionary processes occurring in Nature. The main idea is that in order for a population of individuals to adapt to some environment, it should behave like a natural system; survival, and therefore reproduction, is promoted by the elimination of useless or harmful traits and by rewarding useful behavior. Technically, evolutionary algorithms can be considered as a broad class of stochastic optimization techniques. They are particularly well suited for hard problems where little is known about the underlying search space. An evolutionary algorithm maintains a population of candidate solutions for the problem at hand, and makes it evolve by iteratively applying a (usually quite small) set of stochastic operators, known as mutation, recombination, and selection. The resulting process tends to find globally satisfactory, if not optimal, solutions to the problem much in the same way as, in Nature, populations of organisms tend to adapt to their surrounding environment.