In this section, we will develop an implementation of the genetic algorithm.

The first step is to create a population of random bitstrings. We could use boolean values True and False, string values ‘0’ and ‘1’, or integer values 0 and 1. In this case, we will use integer values.

We can generate an array of integer values in a range using the NumPy randint() function, specifying the range as values starting at 0 and less than 2, i.e. 0 or 1. We will also represent a candidate solution as a list instead of a NumPy array to keep things simple.
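For instance, a single five-bit candidate can be drawn like this (a quick sketch; the seed value is an arbitrary choice to make the draw repeatable):

```python
# draw one random bitstring of five bits as a plain Python list
from numpy.random import randint, seed

seed(1)  # arbitrary seed so the example is repeatable
bits = randint(0, 2, 5).tolist()  # each value is drawn from {0, 1}
print(bits)
```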

An initial population of random bitstrings can be created as follows, where “n_pop” is a hyperparameter that controls the population size and “n_bits” is a hyperparameter that defines the number of bits in a single candidate solution:

```
...
# initial population of random bitstrings
pop = [randint(0, 2, n_bits).tolist() for _ in range(n_pop)]
...
```


Next, we can enumerate a fixed number of algorithm iterations, in this case controlled by a hyperparameter named “n_iter”.

```
...
# enumerate generations
for gen in range(n_iter):
    ...
```


The first step in the algorithm iteration is to evaluate all candidate solutions.

We will use a function named objective() as a generic objective function and call it to get a fitness score, which we will minimize.
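Any function that maps a bitstring to a score to be minimized will do here; purely for illustration, a hypothetical objective could negate the count of ones, so that minimizing it maximizes the number of ones:

```python
# hypothetical objective: negated sum of bits (an assumption for illustration)
def objective(x):
    # smaller is better, so the all-ones string scores best
    return -sum(x)

print(objective([1, 1, 0, 1]))  # -3
```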

```
...
# evaluate all candidates in the population
scores = [objective(c) for c in pop]
...
```


We can then select parents that will be used to create children.

The tournament selection procedure can be implemented as a function that takes the population and returns one selected parent. The k value is fixed at 3 with a default argument, but you can experiment with different values if you like.

```
# tournament selection
def selection(pop, scores, k=3):
    # first random selection
    selection_ix = randint(len(pop))
    for ix in randint(0, len(pop), k-1):
        # check if better (e.g. perform a tournament)
        if scores[ix] < scores[selection_ix]:
            selection_ix = ix
    return pop[selection_ix]
```

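As a quick sanity check (a sketch that repeats the function above over a toy population with made-up scores), the winner of a tournament is always a member of the population:

```python
from numpy.random import randint

# tournament selection (repeated from above)
def selection(pop, scores, k=3):
    # first random selection
    selection_ix = randint(len(pop))
    for ix in randint(0, len(pop), k-1):
        # check if better (e.g. perform a tournament)
        if scores[ix] < scores[selection_ix]:
            selection_ix = ix
    return pop[selection_ix]

# toy population of four 3-bit candidates with made-up scores
pop = [[0, 0, 0], [0, 1, 0], [1, 1, 0], [1, 1, 1]]
scores = [3.0, 2.0, 1.0, 0.0]
parent = selection(pop, scores)
print(parent)
```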

We can then call this function one time for each position in the population to create a list of parents.

```
...
# select parents
selected = [selection(pop, scores) for _ in range(n_pop)]
...
```


We can then create the next generation.

This first requires a function to perform crossover. This function will take two parents and the crossover rate. The crossover rate is a hyperparameter that determines whether crossover is performed or not, and if not, the parents are copied into the next generation. It is a probability and typically has a large value close to 1.0.
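Because the decision is a single draw of a uniform random number, crossover fires on roughly a fraction r_cross of matings; a quick empirical check (a sketch with arbitrary settings):

```python
from numpy.random import rand, seed

seed(2)  # arbitrary seed for repeatability
r_cross = 0.9
# estimate how often the crossover test passes over many draws
hits = sum(rand() < r_cross for _ in range(10000)) / 10000
print(hits)  # close to 0.9
```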

The crossover() function below implements crossover using a draw of a random number in the range [0,1] to determine if crossover is performed, then selecting a valid split point if crossover is to be performed.

```
# crossover two parents to create two children
def crossover(p1, p2, r_cross):
    # children are copies of parents by default
    c1, c2 = p1.copy(), p2.copy()
    # check for recombination
    if rand() < r_cross:
        # select crossover point that is not on the end of the string
        pt = randint(1, len(p1)-2)
        # perform crossover
        c1 = p1[:pt] + p2[pt:]
        c2 = p2[:pt] + p1[pt:]
    return [c1, c2]
```

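With all-zero and all-one parents it is easy to see the exchange of tails (a sketch repeating the function above; the seed is arbitrary, and r_cross=1.0 guarantees that recombination occurs):

```python
from numpy.random import rand, randint, seed

# crossover two parents to create two children (repeated from above)
def crossover(p1, p2, r_cross):
    # children are copies of parents by default
    c1, c2 = p1.copy(), p2.copy()
    # check for recombination
    if rand() < r_cross:
        # select crossover point that is not on the end of the string
        pt = randint(1, len(p1)-2)
        # perform crossover
        c1 = p1[:pt] + p2[pt:]
        c2 = p2[:pt] + p1[pt:]
    return [c1, c2]

seed(5)  # arbitrary seed for repeatability
p1, p2 = [0] * 8, [1] * 8
c1, c2 = crossover(p1, p2, 1.0)  # r_cross=1.0 forces a crossover
print(c1)
print(c2)
```

Each child keeps its own parent's head and takes the other parent's tail, so the bits swapped out of one child always reappear in the other.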

We also need a function to perform mutation.

This procedure simply flips bits in place with a low probability controlled by the “r_mut” hyperparameter.

```
# mutation operator
def mutation(bitstring, r_mut):
    for i in range(len(bitstring)):
        # check for a mutation
        if rand() < r_mut:
            # flip the bit
            bitstring[i] = 1 - bitstring[i]
```

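Note that the function modifies the bitstring in place and returns nothing. Extreme rates make the behavior easy to see (a sketch repeating the function above):

```python
from numpy.random import rand

# mutation operator (repeated from above)
def mutation(bitstring, r_mut):
    for i in range(len(bitstring)):
        # check for a mutation
        if rand() < r_mut:
            # flip the bit
            bitstring[i] = 1 - bitstring[i]

bits = [0, 0, 0, 0]
mutation(bits, 1.0)   # a rate of 1.0 flips every bit in place
print(bits)  # [1, 1, 1, 1]

same = [1, 0, 1]
mutation(same, 0.0)   # a rate of 0.0 never flips a bit
print(same)  # [1, 0, 1]
```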

We can then loop over the list of parents and create a list of children to be used as the next generation, calling the crossover and mutation functions as needed.

```
...
# create the next generation
children = list()
for i in range(0, n_pop, 2):
    # get selected parents in pairs
    p1, p2 = selected[i], selected[i+1]
    # crossover and mutation
    for c in crossover(p1, p2, r_cross):
        # mutation
        mutation(c, r_mut)
        # store for next generation
        children.append(c)
...
```


We can tie all of this together into a function named genetic_algorithm() that takes the name of the objective function and the hyperparameters of the search, and returns the best solution found during the search along with its evaluation. Note that “n_pop” is assumed to be even so that parents can be drawn in pairs.

```
# genetic algorithm
def genetic_algorithm(objective, n_bits, n_iter, n_pop, r_cross, r_mut):
    # initial population of random bitstrings
    pop = [randint(0, 2, n_bits).tolist() for _ in range(n_pop)]
    # keep track of best solution
    best, best_eval = pop[0], objective(pop[0])
    # enumerate generations
    for gen in range(n_iter):
        # evaluate all candidates in the population
        scores = [objective(c) for c in pop]
        # check for new best solution
        for i in range(n_pop):
            if scores[i] < best_eval:
                best, best_eval = pop[i], scores[i]
                print(">%d, new best f(%s) = %.3f" % (gen, pop[i], scores[i]))
        # select parents
        selected = [selection(pop, scores) for _ in range(n_pop)]
        # create the next generation
        children = list()
        for i in range(0, n_pop, 2):
            # get selected parents in pairs
            p1, p2 = selected[i], selected[i+1]
            # crossover and mutation
            for c in crossover(p1, p2, r_cross):
                # mutation
                mutation(c, r_mut)
                # store for next generation
                children.append(c)
        # replace population
        pop = children
    return [best, best_eval]
```

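To confirm the pieces fit together, here is a minimal end-to-end run that wires the functions above to the hypothetical negated-OneMax objective; the objective, seed, and hyperparameter values are illustrative assumptions, not prescriptions, and the per-generation progress print is omitted for brevity:

```python
from numpy.random import rand, randint, seed

# hypothetical objective: negated count of ones, so minimization favors all ones
def onemax(x):
    return -sum(x)

# tournament selection
def selection(pop, scores, k=3):
    selection_ix = randint(len(pop))
    for ix in randint(0, len(pop), k-1):
        if scores[ix] < scores[selection_ix]:
            selection_ix = ix
    return pop[selection_ix]

# crossover two parents to create two children
def crossover(p1, p2, r_cross):
    c1, c2 = p1.copy(), p2.copy()
    if rand() < r_cross:
        pt = randint(1, len(p1)-2)
        c1 = p1[:pt] + p2[pt:]
        c2 = p2[:pt] + p1[pt:]
    return [c1, c2]

# mutation operator
def mutation(bitstring, r_mut):
    for i in range(len(bitstring)):
        if rand() < r_mut:
            bitstring[i] = 1 - bitstring[i]

# genetic algorithm (progress printing omitted for brevity)
def genetic_algorithm(objective, n_bits, n_iter, n_pop, r_cross, r_mut):
    pop = [randint(0, 2, n_bits).tolist() for _ in range(n_pop)]
    best, best_eval = pop[0], objective(pop[0])
    for gen in range(n_iter):
        scores = [objective(c) for c in pop]
        for i in range(n_pop):
            if scores[i] < best_eval:
                best, best_eval = pop[i], scores[i]
        selected = [selection(pop, scores) for _ in range(n_pop)]
        children = list()
        for i in range(0, n_pop, 2):
            p1, p2 = selected[i], selected[i+1]
            for c in crossover(p1, p2, r_cross):
                mutation(c, r_mut)
                children.append(c)
        pop = children
    return [best, best_eval]

seed(7)  # arbitrary seed for a repeatable run
n_bits = 20
best, score = genetic_algorithm(onemax, n_bits, n_iter=100, n_pop=100,
                                r_cross=0.9, r_mut=1.0 / n_bits)
print('best f(%s) = %f' % (best, score))
```

A mutation rate of 1/n_bits means one bit flip per candidate on average, which is a common starting point to experiment from.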