Python Genetic Algorithm for Optimization
Write a Python program to implement a genetic algorithm for solving optimization problems.
A genetic algorithm (GA) is a heuristic optimization technique inspired by the process of natural selection. It involves generating an initial population of potential solutions, evaluating their fitness, and iteratively applying selection, crossover, and mutation operators to evolve the population towards better solutions. The algorithm continues until a stopping criterion, such as a maximum number of generations or a satisfactory fitness level, is met. GAs are particularly useful for solving complex optimization problems where traditional methods may be inefficient or infeasible.
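Before looking at the full program, it helps to see the overall shape of a minimization GA. The sketch below is illustrative only; the helper names (init_fn, select_fn, and so on) are placeholders that the sample solution further down fills in with concrete implementations.

# Minimal GA skeleton (illustrative; the helper names are placeholders)
def run_ga(fitness_fn, init_fn, select_fn, crossover_fn, mutate_fn, generations):
    population = init_fn()                                      # random initial population
    for _ in range(generations):                                # fixed number of generations
        scores = [fitness_fn(ind) for ind in population]        # evaluate fitness
        parents = select_fn(population, scores)                 # selection
        children = crossover_fn(parents)                        # crossover
        population = [mutate_fn(child) for child in children]   # mutation
    scores = [fitness_fn(ind) for ind in population]
    best = min(range(len(population)), key=lambda i: scores[i])
    return population[best]                                     # best individual of the final generation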
Sample Solution:
Python Code:
# Import the random module for generating random numbers
import random
# Import the numpy module for numerical operations
import numpy as np

# Problem definition: example function to optimize (the Rastrigin function)
def rastrigin(x):
    # Define the constant A
    A = 10
    # Compute the Rastrigin function value for the input x
    return A * len(x) + sum([(xi**2 - A * np.cos(2 * np.pi * xi)) for xi in x])
# GA Parameters
# Set the population size
population_size = 100
# Set the genome length (number of genes in a chromosome)
genome_length = 10
# Set the crossover rate
crossover_rate = 0.8
# Set the mutation rate
mutation_rate = 0.01
# Set the number of generations
num_generations = 100
# Define the bounds for the gene values
bounds = [-5.12, 5.12]
# Initialization
def initialize_population(pop_size, genome_len, bounds):
    # Generate a population of random individuals within the given bounds
    return [np.random.uniform(bounds[0], bounds[1], genome_len) for _ in range(pop_size)]
# Fitness function
def evaluate_fitness(population):
    # Evaluate the fitness of each individual in the population using the Rastrigin function
    return [rastrigin(individual) for individual in population]
# Selection (Tournament Selection)
def tournament_selection(population, fitness, tournament_size=3):
    # Initialize the list of selected individuals
    selected = []
    # Select individuals based on tournament selection
    for _ in range(len(population)):
        # Choose random aspirants for the tournament
        aspirants = [random.randint(0, len(population) - 1) for _ in range(tournament_size)]
        # Select the aspirant with the best (lowest, since we are minimizing) fitness
        selected.append(min(aspirants, key=lambda aspirant: fitness[aspirant]))
    # Return the selected individuals
    return [population[i] for i in selected]
# Crossover (Single Point Crossover)
def single_point_crossover(parent1, parent2):
    # Perform crossover with a given probability
    if random.random() < crossover_rate:
        # Choose a random crossover point
        point = random.randint(1, len(parent1) - 1)
        # Create children by combining the parents at the crossover point
        child1 = np.concatenate([parent1[:point], parent2[point:]])
        child2 = np.concatenate([parent2[:point], parent1[point:]])
        # Return the children
        return child1, child2
    # If no crossover, return the parents as is
    return parent1, parent2
# Mutation
def mutate(individual, mutation_rate, bounds):
    # Work on a copy so that arrays shared between selected parents are not modified in place
    individual = individual.copy()
    # Mutate each gene with a given probability
    for i in range(len(individual)):
        if random.random() < mutation_rate:
            # Replace the gene with a random value within the bounds
            individual[i] = np.random.uniform(bounds[0], bounds[1])
    # Return the mutated individual
    return individual
# Genetic Algorithm
def genetic_algorithm():
    # Initialize the population
    population = initialize_population(population_size, genome_length, bounds)
    # Iterate over the number of generations
    for generation in range(num_generations):
        # Evaluate the fitness of the population
        fitness = evaluate_fitness(population)
        # Selection: select individuals based on their fitness
        selected_population = tournament_selection(population, fitness)
        # Crossover: create the next generation through crossover
        next_population = []
        for i in range(0, len(selected_population), 2):
            parent1 = selected_population[i]
            parent2 = selected_population[min(i + 1, len(selected_population) - 1)]
            child1, child2 = single_point_crossover(parent1, parent2)
            next_population.extend([child1, child2])
        # Mutation: mutate the individuals in the next generation
        population = [mutate(individual, mutation_rate, bounds) for individual in next_population]
        # Evaluation: re-evaluate the fitness of the new population
        fitness = evaluate_fitness(population)
        # Find the best fitness and the corresponding individual
        best_fitness = min(fitness)
        best_individual = population[fitness.index(best_fitness)]
        # Print the best fitness of the current generation
        print(f'Generation {generation + 1}: Best Fitness = {best_fitness}')
    # Return the best individual found
    return best_individual
# Run the genetic algorithm
best_solution = genetic_algorithm()
# Print the best solution found
print("Best solution found:", best_solution)
Output:
Generation 1: Best Fitness = 110.85587631209354
Generation 2: Best Fitness = 83.55558953479647
Generation 3: Best Fitness = 83.55558953479647
Generation 4: Best Fitness = 73.01303728939105
Generation 5: Best Fitness = 63.49980511636931
Generation 6: Best Fitness = 54.95357841273925
Generation 7: Best Fitness = 52.164657308731805
Generation 8: Best Fitness = 41.53442355084143
Generation 9: Best Fitness = 40.77655529061664
Generation 10: Best Fitness = 32.3028275826227
Generation 11: Best Fitness = 31.703605071195994
Generation 12: Best Fitness = 26.945249768931816
Generation 13: Best Fitness = 25.17456565689244
Generation 14: Best Fitness = 24.305561725047113
Generation 15: Best Fitness = 23.7537672807303
Generation 16: Best Fitness = 20.538189461578256
Generation 17: Best Fitness = 20.538189461578256
Generation 18: Best Fitness = 17.82192103629376
Generation 19: Best Fitness = 15.660458167383922
Generation 20: Best Fitness = 15.660458167383922
Generation 21: Best Fitness = 15.660458167383922
Generation 22: Best Fitness = 15.660458167383922
Generation 23: Best Fitness = 15.660458167383922
Generation 24: Best Fitness = 15.660458167383922
Generation 25: Best Fitness = 15.660458167383922
Generation 26: Best Fitness = 15.660458167383922
Generation 27: Best Fitness = 11.219758723842418
Generation 28: Best Fitness = 11.219758723842418
Generation 29: Best Fitness = 11.219758723842418
Generation 30: Best Fitness = 10.304030841022396
Generation 31: Best Fitness = 9.879221918189913
Generation 32: Best Fitness = 9.879221918189913
Generation 33: Best Fitness = 8.96349403536989
Generation 34: Best Fitness = 8.96349403536989
Generation 35: Best Fitness = 8.96349403536989
Generation 36: Best Fitness = 8.520190466016146
Generation 37: Best Fitness = 8.520190466016146
Generation 38: Best Fitness = 8.520190466016146
Generation 39: Best Fitness = 8.520190466016146
Generation 40: Best Fitness = 8.520190466016146
Generation 41: Best Fitness = 7.490863078858169
Generation 42: Best Fitness = 8.50795948797564
Generation 43: Best Fitness = 8.50795948797564
Generation 44: Best Fitness = 8.50795948797564
Generation 45: Best Fitness = 8.50795948797564
Generation 46: Best Fitness = 8.1156522074876
Generation 47: Best Fitness = 8.1156522074876
Generation 48: Best Fitness = 8.1156522074876
Generation 49: Best Fitness = 8.1156522074876
Generation 50: Best Fitness = 8.1156522074876
Generation 51: Best Fitness = 8.045668696125716
Generation 52: Best Fitness = 8.045668696125716
Generation 53: Best Fitness = 8.045668696125716
Generation 54: Best Fitness = 8.045668696125716
Generation 55: Best Fitness = 8.045668696125716
Generation 56: Best Fitness = 8.045668696125716
Generation 57: Best Fitness = 8.045668696125716
Generation 58: Best Fitness = 6.103864693961185
Generation 59: Best Fitness = 6.103864693961185
Generation 60: Best Fitness = 6.103864693961185
Generation 61: Best Fitness = 6.103864693961185
Generation 62: Best Fitness = 6.103864693961185
Generation 63: Best Fitness = 6.103864693961185
Generation 64: Best Fitness = 6.103864693961185
Generation 65: Best Fitness = 5.093520351378899
Generation 66: Best Fitness = 5.093520351378899
Generation 67: Best Fitness = 5.093520351378899
Generation 68: Best Fitness = 5.093520351378899
Generation 69: Best Fitness = 5.093520351378899
Generation 70: Best Fitness = 5.093520351378899
Generation 71: Best Fitness = 5.093520351378899
Generation 72: Best Fitness = 5.093520351378899
Generation 73: Best Fitness = 5.093520351378899
Generation 74: Best Fitness = 5.093520351378899
Generation 75: Best Fitness = 5.093520351378899
Generation 76: Best Fitness = 5.093520351378899
Generation 77: Best Fitness = 5.093520351378899
Generation 78: Best Fitness = 5.093520351378899
Generation 79: Best Fitness = 5.093520351378899
Generation 80: Best Fitness = 5.093520351378899
Generation 81: Best Fitness = 5.093520351378899
Generation 82: Best Fitness = 5.093520351378899
Generation 83: Best Fitness = 5.093520351378899
Generation 84: Best Fitness = 5.093520351378899
Generation 85: Best Fitness = 5.093520351378899
Generation 86: Best Fitness = 5.093520351378899
Generation 87: Best Fitness = 5.093520351378899
Generation 88: Best Fitness = 5.0417512241617
Generation 89: Best Fitness = 5.0417512241617
Generation 90: Best Fitness = 5.0417512241617
Generation 91: Best Fitness = 5.0417512241617
Generation 92: Best Fitness = 5.0417512241617
Generation 93: Best Fitness = 5.0417512241617
Generation 94: Best Fitness = 5.0417512241617
Generation 95: Best Fitness = 5.0417512241617
Generation 96: Best Fitness = 5.0417512241617
Generation 97: Best Fitness = 5.0417512241617
Generation 98: Best Fitness = 5.0417512241617
Generation 99: Best Fitness = 5.0417512241617
Generation 100: Best Fitness = 5.0417512241617
Best solution found: [-1.01648779 -0.01999725 0.05078325 1.00710074 1.00313334 0.01685534 0.00182741 0.00903347 1.02140595 -0.02554318]
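For context, the Rastrigin function has its global minimum of 0 at the origin, so this run stops well short of the true optimum: several genes of the best individual are trapped near ±1, and each such gene contributes roughly 1 to the total, which is consistent with a final best fitness of about 5. A quick check (assuming the definitions above have already been executed):

# The global optimum of the Rastrigin function is the zero vector, with value 0
print(rastrigin(np.zeros(genome_length)))  # prints 0.0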
Explanation:
- Import the necessary modules: "random" for generating random numbers and "numpy" for numerical operations.
- Define the Rastrigin function to optimize.
- Set genetic algorithm parameters: population size, genome length, crossover rate, mutation rate, number of generations, and bounds for gene values.
- Define a function to initialize the population with random individuals within the given bounds.
- Define a function to evaluate the fitness of each individual in the population using the Rastrigin function.
- Implement tournament selection to select individuals based on their fitness.
- Implement single-point crossover to create new individuals by combining parts of parent chromosomes.
- Implement mutation to introduce random changes to individuals with a certain probability.
- Define the main genetic algorithm function:
  - Initialize the population.
  - Iterate over the number of generations:
    - Evaluate the fitness of the population.
    - Select individuals based on fitness.
    - Perform crossover to create the next generation.
    - Apply mutation to the next generation.
    - Re-evaluate the fitness of the new population.
    - Track and print the best fitness of each generation.
  - Return the best individual found.
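Note that the sample solution reports only the best individual of the current generation and uses no elitism, which is why the best fitness can occasionally get worse (for example between generations 41 and 42 in the run above). The sketch below is one possible refinement, not part of the exercise: under the assumption that the functions and parameters defined above are available, it tracks the best individual seen across all generations and also lets you pass in a different objective function.

# Sketch: same GA loop, but tracking the best individual across all generations.
# Assumes initialize_population, tournament_selection, single_point_crossover,
# mutate and the GA parameters above are already defined.
def genetic_algorithm_with_best_tracking(objective=rastrigin):
    population = initialize_population(population_size, genome_length, bounds)
    best_individual, best_fitness = None, float('inf')
    for _ in range(num_generations):
        fitness = [objective(individual) for individual in population]
        # Remember the best individual seen so far, not just the best of this generation
        gen_best = min(range(len(population)), key=lambda i: fitness[i])
        if fitness[gen_best] < best_fitness:
            best_fitness, best_individual = fitness[gen_best], population[gen_best].copy()
        selected = tournament_selection(population, fitness)
        next_population = []
        for i in range(0, len(selected), 2):
            parent1 = selected[i]
            parent2 = selected[min(i + 1, len(selected) - 1)]
            next_population.extend(single_point_crossover(parent1, parent2))
        population = [mutate(individual, mutation_rate, bounds) for individual in next_population]
    # Also check the final, mutated population before returning
    fitness = [objective(individual) for individual in population]
    gen_best = min(range(len(population)), key=lambda i: fitness[i])
    if fitness[gen_best] < best_fitness:
        best_fitness, best_individual = fitness[gen_best], population[gen_best].copy()
    return best_individual, best_fitness

# Example usage with a different (hypothetical) objective, the sphere function:
# sphere = lambda x: sum(xi**2 for xi in x)
# best, score = genetic_algorithm_with_best_tracking(objective=sphere)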