
Genetic Algorithm Class

Python implementation of a genetic algorithm to solve optimization problems with n control variables.

Description

A genetic algorithm (GA) is a search heuristic that belongs to a broader family of algorithms called evolutionary algorithms (EAs). EAs are population-based metaheuristics that integrate mechanisms inspired by biological evolution, such as reproduction, mutation, and selection. GAs are particularly useful in optimization problems where computing gradients of the objective function is difficult or impossible.

Steps in GA

  1. Initialization: initialize a population of individuals, i.e. candidate solutions to the problem. This initialization can be done by random sampling. Each individual is defined by an encoding whose components we call genes.
  2. Selection: evaluate the candidates with the fitness function we want to optimize and select the best j as parents to be combined. The parameter j is arbitrary.
  3. Crossover: combine the genes of the parents to produce offspring, adding s new individuals to the population.
  4. Mutation: add randomness to the generated offspring, e.g. by adding Gaussian noise to one of the genes of each offspring individual.
  5. Replacement: keep the l fittest individuals of the population to carry over to the next epoch.

We repeat these evolution steps for a certain number of epochs or until an exit condition is met. A minimal sketch of one such evolution cycle is shown below.
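The following is a minimal, self-contained sketch of the five steps on a toy maximization problem. The variable names, the gene-wise uniform crossover, and the noise parameters are illustrative assumptions for this sketch, not the internals of this repository.

import numpy as np

# Toy objective to maximize: negative squared distance to the origin.
def fitness(individual):
    return -np.sum(individual ** 2)

rng = np.random.default_rng(seed=0)

# 1. Initialization: `size` individuals, each an array of `n_genes` genes
#    sampled uniformly from [-2, 2].
size, n_genes, n_parents, n_offspring = 10, 2, 6, 2
population = rng.uniform(-2, 2, size=(size, n_genes))

for epoch in range(50):
    # 2. Selection: rank by fitness and keep the best j parents.
    scores = np.array([fitness(ind) for ind in population])
    parents = population[np.argsort(scores)[::-1][:n_parents]]

    # 3. Crossover: build s offspring by mixing the genes of two random parents.
    offspring = np.empty((n_offspring, n_genes))
    for i in range(n_offspring):
        a, b = rng.choice(n_parents, size=2, replace=False)
        mask = rng.random(n_genes) < 0.5            # gene-wise uniform crossover
        offspring[i] = np.where(mask, parents[a], parents[b])

    # 4. Mutation: perturb one random gene of each offspring with Gaussian noise.
    for child in offspring:
        child[rng.integers(n_genes)] += rng.normal(0.0, 0.5)

    # 5. Replacement: merge parents and offspring, keep only the l fittest.
    merged = np.vstack([population, offspring])
    merged_scores = np.array([fitness(ind) for ind in merged])
    population = merged[np.argsort(merged_scores)[::-1][:size]]

best = population[0]
print('Best individual:', best, 'Fitness:', fitness(best))

Note that this elitist replacement (merging parents and offspring and keeping the fittest) guarantees the best score never decreases between epochs.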

GA implementation

Dependencies

  • numpy (used for array operations and random sampling in the example below)
Hyperparameters

  • Individuals:

    • lower_bound: lower bound of the domain each gene is restricted to.
    • upper_bound: upper bound of that domain.
    • number_of_genes: dimension of the search space. In this implementation it indicates the shape of the array that represents each individual.
    Note: the number of genes of each individual must match the number of arguments of the fitness function (see the sketch after this list).
  • Population:

    • n_parents: the number of parents j selected at each epoch.
    • offspring_size: the s new individuals created by combining the j parents.
    • mutation_mean, mutation_sd: mean and standard deviation of the Gaussian noise added during the mutation step.
    • size: maximum size of the population, i.e. the l fittest individuals that survive to the next epoch.
  • Evolution:

    • epochs: number of times we repeat the evolution step.
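As a rough illustration of how these hyperparameters could map onto the individual steps (the uniform initialization and single-gene Gaussian mutation shown here are assumptions for the sketch, not necessarily this repository's exact implementation):

import numpy as np

# Placeholder objective with the same arity as number_of_genes.
def fitness(x, y):
    return -(x ** 2 + y ** 2)

rng = np.random.default_rng()

ind_parameters = {'lower_bound': -2, 'upper_bound': 2, 'number_of_genes': 2}
pop_parameters = {'mutation_mean': 0.25, 'mutation_sd': 0.5}

# Initialization: each gene is sampled uniformly from [lower_bound, upper_bound],
# so an individual is an array of length number_of_genes.
individual = rng.uniform(ind_parameters['lower_bound'],
                         ind_parameters['upper_bound'],
                         ind_parameters['number_of_genes'])

# number_of_genes must match the number of arguments the fitness function takes.
score = fitness(*individual)

# Mutation: Gaussian noise with mean mutation_mean and standard deviation
# mutation_sd is added to one randomly chosen gene.
gene = rng.integers(ind_parameters['number_of_genes'])
individual[gene] += rng.normal(pop_parameters['mutation_mean'],
                               pop_parameters['mutation_sd'])
print(individual, score)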

Example

An example fitness function could be something like this:

import numpy as np

def fitness(x, y):
    return x * (x - 1) * np.cos(2 * x - 1) * np.sin(2 * x - 1) * (y - 2)

We can limit our search to evaluate individuals (x, y) within the domain [-2, 2] × [-2, 2] with the ind_parameters dictionary. Likewise, we control the population parameters with the pop_parameters dictionary.

# example.py
import numpy as np
from ga.evolution import Evolution

# Define a fitness function
def fitness(x, y):
    return x * (x - 1) * np.cos(2 * x - 1) * np.sin(2 * x - 1) * (y - 2)

# Define parameters for each individual
ind_parameters = {'lower_bound': -2,
                  'upper_bound': 2,
                  'number_of_genes': 2}

# Define parameters for the entire population
pop_parameters = {'n_parents': 6,
                  'offspring_size': (2, ind_parameters['number_of_genes']),
                  'mutation_mean': 0.25,
                  'mutation_sd': 0.5,
                  'size': 10}
def example():
    # Instantiate an evolution
    evo = Evolution(pop_parameters, ind_parameters, fitness)
    # Repeat the evolution step for 200 epochs
    epochs = 200
    # Record fitness history 
    history = []
    x_history = []
    y_history = []
    for _ in range(epochs):
        print('Epoch {}/{}, Progress: {}%\r'.format(_+1, epochs, np.round(((_+1)/epochs)*100, 2)), end="")
        evo.step()
        history.append(evo._best_score)
        x_history.append(evo._best_individual[0][0])
        y_history.append(evo._best_individual[0][1])
    
    print('\nResults:')
    print('Best individual:', evo.solution.best_individual)
    print('Fitness value of best individual:', evo.solution.best_score)

example()

The result is close to the global optimum within this domain, and the best individual stops changing after roughly 50 epochs; this can be checked by plotting the recorded fitness history (see the sketch after the output below).

# Output
Epoch 200/200, Progress: 100.0%
Results:
Best individual: [-1.52637873, -2.        ]
Fitness value of best individual: 7.4697265870418414
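Since the example records history, x_history and y_history at every epoch, the convergence can be inspected visually. A minimal plotting sketch, assuming example() is modified to return the history list and that matplotlib is available (it is not listed as a dependency of this project), could look like this:

import matplotlib.pyplot as plt

def plot_history(history):
    # history: best fitness value recorded at each epoch
    plt.plot(range(1, len(history) + 1), history)
    plt.xlabel('Epoch')
    plt.ylabel('Best fitness')
    plt.title('Fitness of the best individual per epoch')
    plt.show()

# Usage with placeholder values; replace with the list built in example().
plot_history([0.5, 2.1, 5.0, 7.1, 7.4, 7.4])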
