What are genetic algorithms? Genetic algorithms are stochastic search heuristics that mimic the process of natural selection. To find a solution they maintain a population of individuals, each of which represents a single candidate solution to the problem. In every round of the algorithm, a new population is created from the old one via certain operators.

These operators are:

- Survival of the fittest (the best individuals of the previous round are carried over into the next one)
- Crossover (two individuals of the previous round act as "parents" of an individual of the new population)
- Mutation (random properties of an individual are changed)

We hope that the individuals of the population get "fitter" over time, so that after several rounds we end up with a fit individual, and thus a good solution.

Genetic algorithms can be used to solve complex problems, usually NP-hard ones, for which the search space is too big to traverse completely and for which one has little insight into the problem's structure. They are a good general-purpose approach to problems of all kinds.

But in this generality also lies a weakness of the heuristic: for many problems, including hard ones, there are very good problem-specific approaches with which genetic algorithms cannot compete. Furthermore, it is not obvious how the biological idea of two parents translates into an algorithmic principle. Procedures like simulated annealing inherently work with a single "parent", to stay with the analogy, and may therefore make more sense.

Readers interested in this broad topic are referred to the existing literature. Here I want to continue with the implementation of a genetic algorithm and use it to solve instances of the Travelling Salesman Problem (TSP).

The Travelling Salesman Problem is a well-known NP-hard problem. The task is to find a cheap round trip through a set of cities - that means every city has to be visited exactly once, and the salesman has to return to his starting city at the end. With n cities there are (n - 1)! possible orders in which to visit them. This function grows incredibly fast: for 10 cities we already get 362,880 combinations, for 15 cities there are about 87 billion, and for 70 cities there are more combinations than there are atoms in the observable universe.
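To get a feeling for this growth, here is a small sketch (the class and method names are mine) that computes (n - 1)! exactly using BigInteger:

```csharp
using System;
using System.Numerics;

class TourCounts
{
    // Number of distinct round trips through n cities with a fixed start city: (n - 1)!
    static BigInteger TourCount(int n)
    {
        BigInteger result = BigInteger.One;
        for (int k = 2; k <= n - 1; k++)
            result *= k;
        return result;
    }

    static void Main()
    {
        // TourCount(10) = 362880, TourCount(15) = 87178291200,
        // TourCount(70) has 99 digits - far more than the roughly 10^80
        // atoms in the observable universe.
        foreach (int n in new[] { 10, 15, 70 })
            Console.WriteLine($"n = {n}: {TourCount(n)}");
    }
}
```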

Thus, in principle, genetic algorithms seem like a good candidate for getting a relatively good approximate solution. On the other hand, there are very good approximation algorithms for this problem, so keep the introductory caveat in mind.

We manage a TSP instance in the class *TSPInstance*. For n cities the n x n integer matrix *Connections* is created, in which entry (i, j) holds the distance from city i to city j. If there is no connection between the two cities, the entry is set to infinity (*int.MaxValue*).

Such an instance is then passed to the genetic algorithm. The entry point is the function *Go()* of the class *GeneticAlgorithm*.

In it we see the basic flow of the algorithm: first we initialize the starting population, then we run for a certain number of iterations, in each of which we call the function *Breed()*, which creates the new population out of the previous one.

We represent a solution of the TSP instance simply as a permutation of the cities, which describes the order in which they are visited. ADBC, for example, stands for the route A-D-B-C-A. In the program the cities are numbered, so we use an integer array; its content would then be, for example, 0, 3, 1, 2. The "fitness" of an individual is the cost of its tour (infinity if the tour is not possible).
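As a toy illustration of this cost calculation (the 4 x 4 distance matrix below is made up for the example), the cost of the tour 0, 3, 1, 2 is the sum of the legs 0-3, 3-1, 1-2 and 2-0:

```csharp
using System;

class TourCostDemo
{
    // cost of a round trip: sum the legs, wrapping around to the starting city
    static int TourCost(int[][] connections, int[] tour)
    {
        int costs = 0;
        for (int i = 0; i < tour.Length; i++)
            costs += connections[tour[i]][tour[(i + 1) % tour.Length]];
        return costs;
    }

    static void Main()
    {
        // made-up distance matrix, cities 0 = A, 1 = B, 2 = C, 3 = D
        int[][] connections =
        {
            new[] { 0, 5, 9, 4 },
            new[] { 5, 0, 2, 7 },
            new[] { 9, 2, 0, 3 },
            new[] { 4, 7, 3, 0 }
        };
        int[] tour = { 0, 3, 1, 2 }; // the route A-D-B-C-A from the text
        // legs: 0->3 (4), 3->1 (7), 1->2 (2), 2->0 (9)
        Console.WriteLine(TourCost(connections, tour)); // prints 22
    }
}
```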

At the beginning of the algorithm we create an array of individuals and fill it with random tours, i.e. random permutations of the cities. The size of the array corresponds to the size of the population; we use 30. This parameter, like the following ones, has a huge impact on the quality of the solution, and for every problem different values should be tried. For the population size, 30 is a commonly used value.

In the function *Breed()* we first iterate over the current population and calculate the fitness of each individual. Then we sort the individuals by ascending fitness.

This way we can simply take the best individuals from the previous population into the new one. Their share is controlled by a parameter; here we use 10 % (0.1 in the code), which means that the 3 best individuals of the previous population are carried over into the new one.

We then have to fill the remaining 27 spots in the new population, which we do via crossover and mutation.

While iterating over the old population we build the array *SummedFitness*, which stores cumulative fitness values: position *i* holds the sum of the fitness values of the individuals 0 through i. With this we can do a so-called fitness proportionate selection (roulette wheel selection), in which individuals with a higher fitness value are chosen as parents more often. We draw a random number between 0 and the last value of *SummedFitness*; the individual into whose interval the number falls is selected. (Note that our fitness is the tour cost, where lower is better, so selection proportional to the raw cost actually favors expensive tours; inverting the fitness values before summing them is a common refinement.)
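The selection step on its own can be sketched like this (the fitness values are arbitrary example numbers):

```csharp
using System;

class RouletteDemo
{
    // returns the index of the first individual whose cumulative
    // fitness interval contains r
    static int Select(double[] summedFitness, double r)
    {
        int i = 0;
        while (summedFitness[i] < r)
            i++;
        return i;
    }

    static void Main()
    {
        double[] fitness = { 10, 30, 60 }; // example values
        double[] summed = new double[fitness.Length];
        double sum = 0;
        for (int i = 0; i < fitness.Length; i++)
        {
            sum += fitness[i];
            summed[i] = sum; // 10, 40, 100
        }
        Random rnd = new Random();
        double r = rnd.NextDouble() * summed[summed.Length - 1];
        // individual 2 owns the interval (40, 100] and is picked most often
        Console.WriteLine("Selected: " + Select(summed, r));
    }
}
```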

This way we choose two parents. A random coin flip then decides whether we perform a crossover operation; I chose a crossover probability of 80 % here. If no crossover is done, the child is simply a clone of parent 1. If a crossover is done, the child becomes a genetic combination of both parents. There are different possibilities for mixing the genomes; here we use the following approach: we randomly choose a crossover point X between 0 and n - 1.

In the child we take the first X cities of the tour directly from parent 1. The remaining cities follow in the order in which they occur in parent 2.

As an example, consider the parents AGEDCBF and FDAGEBC with crossover point 3. The child thus starts its tour with the cities AGE. The remaining cities D, C, B and F are ordered as in parent 2: FDBC. So the complete tour of the child is AGEFDBC. Again, this is just one possible way to do a crossover; for example, one can also cut the string at two points (two-point crossover).
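The crossover described above can be sketched on strings (the helper Crossover is my illustration, not part of the program below):

```csharp
using System;
using System.Linq;

class CrossoverDemo
{
    // One-point crossover: copy the first x cities from parent 1,
    // then append the remaining cities in parent-2 order.
    static string Crossover(string parent1, string parent2, int x)
    {
        string head = parent1.Substring(0, x);
        string tail = new string(parent2.Where(c => !head.Contains(c)).ToArray());
        return head + tail;
    }

    static void Main()
    {
        // the example from the text
        Console.WriteLine(Crossover("AGEDCBF", "FDAGEBC", 3)); // prints AGEFDBC
    }
}
```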

In the last step the child is randomly mutated. Here too there are different possibilities; we use the following one: we iterate over every position of the tour and mutate it with a probability of 10 %. If the mutation rate is too low, the algorithm cannot create enough new solution candidates and gets stuck in a local minimum; if it is too high, the algorithm degenerates into a random search. When we mutate, we randomly select another city in the tour and swap the two.
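The mutation step on its own can be sketched like this (the tour and the rate are example values):

```csharp
using System;

class MutationDemo
{
    // swap-mutate each position of the tour with probability rate
    static void Mutate(int[] tour, double rate, Random rnd)
    {
        for (int i = 0; i < tour.Length; i++)
        {
            if (rnd.NextDouble() <= rate)
            {
                int j = rnd.Next(tour.Length); // random partner position
                int temp = tour[i];
                tour[i] = tour[j];
                tour[j] = temp;
            }
        }
    }

    static void Main()
    {
        Random rnd = new Random();
        int[] tour = { 0, 3, 1, 2 };
        Mutate(tour, 0.1, rnd);
        // the result is still a permutation of 0..3
        Console.WriteLine(string.Join(", ", tour));
    }
}
```

Swapping two positions keeps the tour a valid permutation, which is why the mutation does not simply overwrite a position with a random city.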

With this we have created a new child from the old population. We repeat this process until the new population again contains 30 individuals.

Since the best individuals remain in the population, the fitness of the best individual never gets worse and hopefully improves over time. In the end we output the best solution found.

Here is the code of a console application; with the explanation above it should be relatively easy to read:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace GA_TSP
{
    class Program
    {
        static void Main(string[] args)
        {
            TSPInstance Instance = CreateRandomTSPInstance(20);
            GeneticAlgorithm GA = new GeneticAlgorithm();
            GA.Go(Instance, true, 0.8, 0.1, 0.1, 30, 10000);
        }

        // Create a random TSP instance: connect two cities with probability 70 %
        // and choose a length between 0 and 999.
        public static TSPInstance CreateRandomTSPInstance(int n)
        {
            TSPInstance Instance = new TSPInstance(n);
            Random Rnd = new Random();
            for (int i = 0; i < n; i++)
            {
                for (int j = 0; j < n; j++)
                {
                    if (Rnd.Next(100) < 70)
                    {
                        Instance.AddConnection(i, j, Rnd.Next(1000));
                    }
                }
            }
            return Instance;
        }
    }

    public class TSPInstance
    {
        public int n;
        public int[][] Connections;

        public TSPInstance(int _n)
        {
            n = _n;
            Connections = new int[n][];
            for (int i = 0; i < n; i++)
            {
                Connections[i] = new int[n];
                for (int j = 0; j < n; j++)
                {
                    Connections[i][j] = int.MaxValue;
                }
            }
        }

        public void AddConnection(int start, int end, int distance)
        {
            Connections[start][end] = distance;
        }
    }

    public class GeneticAlgorithm
    {
        public class Individual : ICloneable, IComparable<Individual>
        {
            public double Fitness;
            public int[] Tour;

            public Individual(int length)
            {
                Tour = new int[length];
            }

            // needed for sorting: order individuals by ascending fitness
            public int CompareTo(Individual other)
            {
                return Fitness.CompareTo(other.Fitness);
            }

            // creates a deep copy of an individual
            public object Clone()
            {
                Individual Cloned = new Individual(Tour.Length);
                for (int i = 0; i < Tour.Length; i++)
                {
                    Cloned.Tour[i] = Tour[i];
                }
                Cloned.Fitness = Fitness;
                return Cloned;
            }
        }

        // algorithm parameters, experiment!
        double CrossoverFrequency;
        double MutationFrequency;
        double ElitistFrequency;
        int PopulationSize;
        int NrIterations;

        Individual[] Population;
        Individual Best = null;
        TSPInstance Instance;
        bool Print;

        public Individual Go(TSPInstance inst, bool print, double co, double mu, double el, int popsize, int it)
        {
            CrossoverFrequency = co;
            MutationFrequency = mu;
            ElitistFrequency = el;
            PopulationSize = popsize;
            NrIterations = it;
            Instance = inst;
            Population = new Individual[PopulationSize];
            Print = print;
            for (int i = 0; i < PopulationSize; i++)
            {
                Population[i] = new Individual(inst.n);
            }
            Initialize();
            for (int i = 0; i < NrIterations; i++)
            {
                Breed();
                if (Print && i % 100 == 0)
                {
                    Console.WriteLine("Iteration: " + i.ToString());
                }
            }
            if (Print)
                Console.WriteLine("Best: " + Best.Fitness.ToString());
            return Best;
        }

        public void Initialize()
        {
            // fill the initial population, for each individual create a random permutation of the cities
            Random Rnd = new Random();
            for (int i = 0; i < PopulationSize; i++)
            {
                List<int> Cities = new List<int>();
                for (int j = 0; j < Instance.n; j++)
                {
                    Cities.Add(j);
                }
                int Counter = 0;
                while (Cities.Count > 0)
                {
                    int Index = Rnd.Next(Cities.Count);
                    Population[i].Tour[Counter++] = Cities[Index];
                    Cities.RemoveAt(Index);
                }
            }
        }

        public void Breed()
        {
            double[] SummedFitness = new double[PopulationSize];
            // first calculate the fitness of each individual and sort in ascending order
            double SumFitness = 0;
            for (int i = 0; i < PopulationSize; i++)
            {
                Population[i].Fitness = CalculateFitness(Population[i]);
            }
            Array.Sort(Population);
            for (int i = 0; i < PopulationSize; i++)
            {
                if (Best == null || Population[i].Fitness < Best.Fitness)
                {
                    Best = Population[i];
                    if (Print)
                        Console.WriteLine("New best found, fitness: " + Best.Fitness.ToString());
                }
                SumFitness += Population[i].Fitness;
                SummedFitness[i] = SumFitness;
            }

            Random Rnd = new Random();
            Individual[] NewPopulation = new Individual[PopulationSize];
            // take the best individuals from the last population
            for (int i = 0; i < Math.Ceiling(PopulationSize * ElitistFrequency); i++)
            {
                NewPopulation[i] = (Individual)Population[i].Clone();
            }
            // fill the rest of the new population
            for (int i = (int)Math.Ceiling(PopulationSize * ElitistFrequency); i < PopulationSize; i++)
            {
                double RParent1 = Rnd.NextDouble() * SummedFitness[PopulationSize - 1];
                double RParent2 = Rnd.NextDouble() * SummedFitness[PopulationSize - 1];
                Individual Parent1 = null;
                Individual Parent2 = null;
                // roulette wheel selection of the parents
                for (int j = 0; j < PopulationSize; j++)
                {
                    if (Parent1 == null && SummedFitness[j] >= RParent1)
                    {
                        Parent1 = Population[j];
                    }
                    if (Parent2 == null && SummedFitness[j] >= RParent2)
                    {
                        Parent2 = Population[j];
                    }
                }
                // start from a clone of parent 1, do a crossover with chance CrossoverFrequency
                Individual Child = (Individual)Parent1.Clone();
                if (Rnd.NextDouble() <= CrossoverFrequency)
                {
                    // when doing a crossover, first copy the tour from parent 1 up to COPoint,
                    // then fill in the remaining cities in the order of parent 2
                    int COPoint = Rnd.Next(Instance.n);
                    List<int> VisitedCities = new List<int>();
                    for (int j = 0; j < COPoint; j++)
                    {
                        VisitedCities.Add(Parent1.Tour[j]);
                    }
                    int Pos = COPoint;
                    for (int j = 0; j < Instance.n; j++)
                    {
                        if (!VisitedCities.Contains(Parent2.Tour[j]))
                        {
                            Child.Tour[Pos++] = Parent2.Tour[j];
                        }
                    }
                }
                // for each position of the tour, do a mutation with chance MutationFrequency
                for (int j = 0; j < Instance.n; j++)
                {
                    if (Rnd.NextDouble() <= MutationFrequency)
                    {
                        // when mutating, swap two cities in the child's tour
                        int End = Rnd.Next(Instance.n);
                        int Temp = Child.Tour[j];
                        Child.Tour[j] = Child.Tour[End];
                        Child.Tour[End] = Temp;
                    }
                }
                NewPopulation[i] = Child;
            }
            Population = NewPopulation;
        }

        public int CalculateFitness(Individual ind)
        {
            // sum over the entire tour and return the costs
            int Costs = 0;
            for (int i = 0; i < ind.Tour.Length; i++)
            {
                // the last leg leads back to the starting city
                int Next = (i == ind.Tour.Length - 1) ? ind.Tour[0] : ind.Tour[i + 1];
                if (Instance.Connections[ind.Tour[i]][Next] == int.MaxValue)
                    return int.MaxValue;
                Costs += Instance.Connections[ind.Tour[i]][Next];
            }
            return Costs;
        }
    }
}

Play with it, and write me in the comments what you use genetic algorithms for and whether you can find better parameters or improvements of the algorithm for the TSP - I am very interested in that!