Next: The ``NK'' Adaptive Landscape Up: Evolving Better Representations through Previous: Introduction

Constructional Selection

The method of evolving representations is to expand the genome one gene at a time by randomly altering the genotype-phenotype map function to include one additional argument. If that alteration reduces fitness, it is rejected, and another attempt is made until an alteration that increases fitness is found. Once this is done, a conventional GA is run with this expanded genome to adapt it to a new fitness peak. Then, another round of genome expansion is run. This is illustrated in Figure 1.


  
Figure: The genome growth algorithm using constructional selection to evolve the representation.
\begin{figure}\epsffile{EPS/GrowthAlgorithm.eps}\end{figure}
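The growth loop above can be sketched in code. This is a minimal illustration, not the paper's implementation; the helpers `expand_map` (the random map alteration), `run_ga` (a conventional GA pass), and `fitness` are hypothetical placeholders for the components the text describes:

```python
import random

def constructional_growth(fitness, expand_map, run_ga, psi, genotype, rounds=10):
    """Grow the genome one gene at a time: keep only fitness-increasing
    expansions of the genotype-phenotype map, then re-adapt with a GA."""
    for _ in range(rounds):
        # Retry random alterations of the map until one increases fitness;
        # fitness-reducing alterations are simply discarded.
        while True:
            new_psi, new_genotype = expand_map(psi, genotype)
            if fitness(new_psi(new_genotype)) > fitness(psi(genotype)):
                psi, genotype = new_psi, new_genotype
                break
        # Adapt the expanded genome to a new fitness peak.
        genotype = run_ga(psi, genotype, fitness)
    return psi, genotype
```

With a toy additive map (phenotype = sum of binary alleles), only expansions that append a 1-allele survive the acceptance test, so every retained gene increases fitness.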

This method is expressed formally as follows. Let there be n genes in the genome. The allelic value, xi, for each gene i is an element of the gene's own domain space, Gi (for binary coding, $G_i = \{0,1\}$; for real-valued GAs, $G_i = \Re$; in genetic programming, Gi could be the function set, the terminal set, or both). A genotype is an array, $\mathbf{x} = \Vert x_i\Vert _{i=1}^n \in {\cal G}_n$, of the allelic values, where ${\cal G}_n = G_1 \times G_2 \times \ldots \times G_n$ is the genotype space (in typical GAs, the domain spaces Gi are identical).
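The per-gene domain spaces can be heterogeneous, as the parenthetical notes. A small illustrative construction (the domain list and sampler below are not from the paper) makes this concrete:

```python
import random

# Each gene i has its own domain G_i; a genotype draws one allelic value per gene.
domains = [
    [0, 1],               # G_1: binary coding
    (-1.0, 1.0),          # G_2: a real interval (real-valued GA)
    ["sin", "cos", "x"],  # G_3: a terminal set (genetic programming)
]

def random_genotype(domains):
    """Sample one allelic value x_i from each gene's domain G_i."""
    genotype = []
    for d in domains:
        if isinstance(d, tuple):       # continuous interval
            genotype.append(random.uniform(*d))
        else:                          # discrete set
            genotype.append(random.choice(d))
    return genotype
```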

A genotype-phenotype map function, $\Psi_n:{\cal G}_n\mapsto {\cal S}$, maps the genotype $\mathbf{x}$ to the search space ${\cal S}$. A fitness function w(s) is defined on each point $s\in{\cal S}$ in the search space.

As the genome grows, one obtains a sequence of maps:

\begin{eqnarray*}
\Psi_1:& G_1 &\longmapsto {\cal S}\\
\Psi_2:& G_1 \times G_2 &\longmapsto {\cal S}\\
&\vdots&\\
\Psi_n:& G_1 \times G_2 \times \ldots \times G_n &\longmapsto {\cal S}.
\end{eqnarray*}


When a new gene is added to the genome, the genotype-phenotype map is modified from $\Psi_n$ to $\Psi_{n+1}$. The modification is made by a probabilistic operator with probabilities $T(\Psi_n \!\rightarrow\!\Psi_{n+1})$. It is here that heuristic knowledge of the search space must be incorporated by the designer, since random search through the space of all possible genotype-phenotype maps is no better than random search through ${\cal S}$. One must have an a priori expectation that some of the modifications will produce new maps that are only incrementally changed from old maps (as in [3]). This allows one to relocate the problem of knowledge incorporation to a level where it may be more naturally accommodated.
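One way to bias the operator $T(\Psi_n \!\rightarrow\!\Psi_{n+1})$ toward incremental modifications is to make the new map reproduce the old one exactly when the new gene takes a neutral value. The sketch below assumes an additive, real-valued phenotype, which is an assumption of this illustration rather than a requirement of the method:

```python
def extend_map(psi_n):
    """Return psi_{n+1}: the old map applied to the first n genes, plus a
    contribution from the new gene.  When the new allele x[-1] is 0, the
    new map equals the old one -- an incremental change of representation."""
    def psi_n_plus_1(x):
        return psi_n(x[:-1]) + x[-1]
    return psi_n_plus_1
```

Because the old genes' contributions pass through unchanged, searching over the new gene explores the neighborhood of the previously adapted phenotype rather than a random point of ${\cal S}$.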

For example, this would be the case with a series representation $s = \Psi_n(\mathbf{x}) = \sum_{i=1}^n [a_i\cos(\omega_i) + b_i\sin(\omega_i)]$, where $x_i = (a_i, b_i, \omega_i)\in G_i$. Another example is a genetic programming approach, in which new functions and terminals can be appended to the parse tree [7].
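The series representation makes the incremental property concrete: appending gene $n{+}1$ adds one term to the sum while every earlier term, and hence the already-adapted part of the phenotype, is untouched. A minimal sketch:

```python
import math

def series_phenotype(genes):
    """Psi_n for the series representation: each gene x_i = (a_i, b_i, w_i)
    contributes a_i*cos(w_i) + b_i*sin(w_i) to the phenotype s."""
    return sum(a * math.cos(w) + b * math.sin(w) for (a, b, w) in genes)
```

Appending a new tuple to `genes` changes the phenotype by exactly the new term's value, so the modified map is only incrementally different from the old one.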

Let us return to the idea that an evolved point in the search space has some aspects that are highly adapted, and others that are not yet adapted. If this partition into sub-goals is possible (and for some search spaces it may not be), then one would like to augment the genotype-phenotype map in a way that leaves the highly adapted traits unaffected while exploring the unadapted traits. Yet these ``traits'' (e.g. sub-goals) are usually not directly observable, but are abstract, emergent properties of the fitness function. If one has a priori reason to believe that at least some of the modifications $\Psi_n \!\rightarrow\!\Psi_{n+1}$ will be able to correctly partition the adapted traits from the unadapted ones, then constructional selection is a means by which to incorporate those modifications into the genotype-phenotype map. This is illustrated next in a model of genome growth on Kauffman's ``NK'' adaptive landscape.


Lee Altenberg
1998-05-27