niapy.algorithms
Module with implementations of basic and hybrid algorithms.
- class niapy.algorithms.Algorithm(population_size=50, initialization_function=<function default_numpy_init>, individual_type=None, seed=None, *args, **kwargs)[source]¶
Bases:
object
Class for implementing algorithms.
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Variables
Name (List[str]) – List of names for algorithm.
rng (numpy.random.Generator) – Random generator.
population_size (int) – Population size.
initialization_function (Callable[[int, Task, numpy.random.Generator, Dict[str, Any]], Tuple[numpy.ndarray, numpy.ndarray[float]]]) – Population initialization function.
individual_type (Optional[Type[Individual]]) – Type of individuals used in population, default value is None for Numpy arrays.
Initialize algorithm and create name for an algorithm.
- Parameters
population_size (Optional[int]) – Population size.
initialization_function (Optional[Callable[[int, Task, numpy.random.Generator, Dict[str, Any]], Tuple[numpy.ndarray, numpy.ndarray[float]]]]) – Population initialization function.
individual_type (Optional[Type[Individual]]) – Individual type used in population, default is Numpy array.
seed (Optional[int]) – Starting seed for random generator.
- Name = ['Algorithm', 'AAA']¶
- __init__(population_size=50, initialization_function=<function default_numpy_init>, individual_type=None, seed=None, *args, **kwargs)[source]¶
Initialize algorithm and create name for an algorithm.
- Parameters
population_size (Optional[int]) – Population size.
initialization_function (Optional[Callable[[int, Task, numpy.random.Generator, Dict[str, Any]], Tuple[numpy.ndarray, numpy.ndarray[float]]]]) – Population initialization function.
individual_type (Optional[Type[Individual]]) – Individual type used in population, default is Numpy array.
seed (Optional[int]) – Starting seed for random generator.
- bad_run()[source]¶
Check whether any exceptions were thrown while the algorithm was running.
- Returns
True if an error was detected during the run of the algorithm, otherwise False.
- Return type
bool
- static get_best(population, population_fitness, best_x=None, best_fitness=inf)[source]¶
Get the best individual from the population.
- Parameters
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray) – Fitness/function values of the current population, aligned with the individuals.
best_x (Optional[numpy.ndarray]) – Best individual.
best_fitness (float) – Fitness value of best individual.
- Returns
Coordinates of the best solution.
Best fitness/function value.
- Return type
Tuple[numpy.ndarray, float]
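A short illustrative sketch of get_best follows; because it is a static method that works on plain NumPy arrays, it can be called directly on the class (the array values below are made up for illustration):

    import numpy as np
    from niapy.algorithms import Algorithm

    population = np.array([[0.0, 1.0], [2.0, 2.0], [0.5, 0.5]])
    fitness = np.array([1.0, 8.0, 0.5])

    # No previous best yet, so pass the documented defaults best_x=None, best_fitness=inf.
    best_x, best_fitness = Algorithm.get_best(population, fitness, best_x=None, best_fitness=np.inf)
    # best_x -> array([0.5, 0.5]), best_fitness -> 0.5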
- get_parameters()[source]¶
Get parameters of the algorithm.
- Returns
A dictionary mapping parameter names (str) to parameter values (Any).
- Return type
Dict[str, Any]
- init_population(task)[source]¶
Initialize starting population of optimization algorithm.
- Parameters
task (Task) – Optimization task.
- Returns
New population.
New population fitness values.
Additional arguments.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, Dict[str, Any]]
- integers(low, high=None, size=None, skip=None)[source]¶
Get random integers from a discrete uniform distribution in the range from “low” (inclusive) to “high” (exclusive).
- Parameters
low (Union[int, Iterable[int]]) – Lower integer bound (inclusive). If high is None, then 0 is used as the lower bound and this value as the upper bound.
high (Union[int, Iterable[int]]) – One above the upper integer bound.
size (Union[None, int, Iterable[int]]) – Shape of the returned array of random integers.
skip (Union[None, int, Iterable[int], numpy.ndarray[int]]) – Numbers to skip.
- Returns
Randomly generated integer(s).
- Return type
Union[int, numpy.ndarray[int]]
- iteration_generator(task)[source]¶
Run the algorithm one iteration at a time, yielding the best solution after each iteration.
- Parameters
task (Task) – Task with bounds and objective function for optimization.
- Returns
Generator yielding the current global best solution and its fitness value.
- Return type
Generator[Tuple[numpy.ndarray, float], None, None]
- Yields
Tuple[numpy.ndarray, float] – 1. New population best individuals coordinates. 2. Fitness value of the best solution.
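A sketch of driving the optimization one iteration at a time with iteration_generator. The Task construction and the stopping_condition()/next_iter() helpers are assumptions based on the niapy 2.x sources (run_task drives the generator in the same way):

    from niapy.algorithms.basic import BatAlgorithm
    from niapy.task import Task

    task = Task(problem='sphere', dimension=10, max_evals=10000)
    algorithm = BatAlgorithm(population_size=40, seed=1)

    generator = algorithm.iteration_generator(task)
    while not task.stopping_condition():
        best_x, best_fitness = next(generator)   # best solution after this iteration
        task.next_iter()                         # advance the iteration counter, mirroring run_task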
- normal(loc, scale, size=None)[source]¶
Get normal random distribution of shape size with mean “loc” and standard deviation “scale”.
- Parameters
loc (float) – Mean of the normal random distribution.
scale (float) – Standard deviation of the normal random distribution.
size (Union[int, Iterable[int]]) – Shape of returned normal random distribution.
- Returns
Array of numbers.
- Return type
Union[numpy.ndarray[float], float]
- random(size=None)[source]¶
Get random distribution of shape size in range from 0 to 1.
- Parameters
size (Union[None, int, Iterable[int]]) – Shape of returned random distribution.
- Returns
Random number or numbers \(\in [0, 1]\).
- Return type
Union[numpy.ndarray[float], float]
- run(task)[source]¶
Start the optimization.
- Parameters
task (Task) – Optimization task.
- Returns
Best individual's components found during the optimization process.
Best fitness value found during the optimization process.
- Return type
Tuple[numpy.ndarray, float]
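For reference, a minimal end-to-end sketch of run(); the Task arguments (a built-in problem name, dimension and a budget) are assumptions based on niapy 2.x:

    from niapy.algorithms.basic import BareBonesFireworksAlgorithm
    from niapy.task import Task

    task = Task(problem='sphere', dimension=10, max_iters=1000)
    algorithm = BareBonesFireworksAlgorithm(num_sparks=10, seed=42)
    best_x, best_fitness = algorithm.run(task)   # returns (coordinates, fitness) of the best solution found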
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core functionality of algorithm.
This function is called on every algorithm iteration.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population coordinates.
population_fitness (numpy.ndarray) – Current population fitness values.
best_x (numpy.ndarray) – Coordinates of the current generation's best individual.
best_fitness (float) – Fitness value of the current generation's best individual.
**params (Dict[str, Any]) – Additional arguments for algorithms.
- Returns
New population's coordinates.
New population's fitness values.
New global best position/solution.
New global best fitness/objective value.
Additional arguments of the algorithm.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
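run_iteration is the main method to override when implementing a custom algorithm. Below is a hedged, minimal sketch of a random-perturbation search that illustrates the expected inputs and the five-element return value; the class is hypothetical (not shipped with niapy), and it assumes Task exposes eval() and repair() as in the niapy 2.x sources:

    import numpy as np
    from niapy.algorithms import Algorithm

    class RandomPerturbationSearch(Algorithm):   # hypothetical example class
        Name = ['RandomPerturbationSearch', 'RPS']

        def run_iteration(self, task, population, population_fitness, best_x, best_fitness, **params):
            # Propose Gaussian perturbations with the seeded helper methods.
            candidates = population + self.normal(0.0, 0.1, population.shape)
            candidates = np.apply_along_axis(task.repair, 1, candidates, rng=self.rng)
            candidates_fitness = np.apply_along_axis(task.eval, 1, candidates)
            # Greedy per-individual selection.
            improved = candidates_fitness < population_fitness
            population[improved] = candidates[improved]
            population_fitness[improved] = candidates_fitness[improved]
            best_x, best_fitness = self.get_best(population, population_fitness, best_x, best_fitness)
            return population, population_fitness, best_x, best_fitness, params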
- run_task(task)[source]¶
Start the optimization.
- Parameters
task (Task) – Task with bounds and objective function for optimization.
- Returns
Best individual's components found during the optimization process.
Best fitness value found during the optimization process.
- Return type
Tuple[numpy.ndarray, float]
- set_parameters(population_size=50, initialization_function=<function default_numpy_init>, individual_type=None, *args, **kwargs)[source]¶
Set the parameters/arguments of the algorithm.
- Parameters
population_size (Optional[int]) – Population size.
initialization_function (Optional[Callable[[int, Task, numpy.random.Generator, Dict[str, Any]], Tuple[numpy.ndarray, numpy.ndarray[float]]]]) – Population initialization function.
individual_type (Optional[Type[Individual]]) – Individual type used in population, default is Numpy array.
- standard_normal(size=None)[source]¶
Get standard normal distribution of shape size.
- Parameters
size (Union[int, Iterable[int]]) – Shape of returned standard normal distribution.
- Returns
Randomly generated numbers or a single number drawn from the standard normal distribution.
- Return type
Union[numpy.ndarray[float], float]
- uniform(low, high, size=None)[source]¶
Get uniform random distribution of shape size in range from “low” to “high”.
- Parameters
low (Union[float, Iterable[float]]) – Lower bound.
high (Union[float, Iterable[float]]) – Upper bound.
size (Union[None, int, Iterable[int]]) – Shape of returned uniform random distribution.
- Returns
Array of numbers \(\in [\mathit{low}, \mathit{high}]\).
- Return type
Union[numpy.ndarray[float], float]
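The helpers above (integers, normal, random, standard_normal, uniform) all draw from the algorithm's seeded rng, so subclasses should prefer them over numpy.random to keep runs reproducible for a given seed. A small sketch of typical calls inside a run_iteration body (self is an Algorithm subclass; task.dimension is the problem dimension, as used elsewhere in these docs):

    step = self.uniform(-1.0, 1.0, task.dimension)     # per-dimension step in [-1, 1]
    noise = self.normal(0.0, 0.1, task.dimension)      # Gaussian noise with mean 0 and std 0.1
    index = self.integers(0, self.population_size)     # random population index in [0, population_size)
    z = self.standard_normal(task.dimension)           # standard normal samples
    flip = self.random() < 0.5                         # single draw in [0, 1) used as a coin flip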
- class niapy.algorithms.Individual(x=None, task=None, e=True, rng=None, **kwargs)[source]¶
Bases:
object
Class that represents one solution in population of solutions.
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Variables
x (numpy.ndarray) – Coordinates of individual.
f (float) – Function/fitness value of individual.
Initialize new individual.
- Parameters
task (Optional[Task]) – Optimization task.
rng (Optional[numpy.random.Generator]) – Random generator.
x (Optional[numpy.ndarray]) – Individual's components.
e (Optional[bool]) – True to evaluate the individual on initialization. Default value is True.
- __eq__(other)[source]¶
Compare individuals for equality.
- Parameters
other (Union[Any, numpy.ndarray]) – Object that we want to compare this object to.
- Returns
True if equal, otherwise False.
- Return type
bool
- __getitem__(i)[source]¶
Get the value of i-th component of the solution.
- Parameters
i (int) – Position of the solution component.
- Returns
Value of the i-th component.
- Return type
Any
- __init__(x=None, task=None, e=True, rng=None, **kwargs)[source]¶
Initialize new individual.
- Parameters
task (Optional[Task]) – Optimization task.
rng (Optional[numpy.random.Generator]) – Random generator.
x (Optional[numpy.ndarray]) – Individual's components.
e (Optional[bool]) – True to evaluate the individual on initialization. Default value is True.
- __len__()[source]¶
Get the length of the solution or the number of components.
- Returns
Number of components.
- Return type
int
- __setitem__(i, v)[source]¶
Set the value of the i-th component of the solution to v.
- Parameters
i (int) – Position of the solution component.
v (Any) – Value to set to i-th component.
- __str__()[source]¶
Return a string representation of the individual with its solution and objective value.
- Returns
String representation of self.
- Return type
str
- copy()[source]¶
Return a copy of self.
This method returns a copy of this object, so the copy is safe to edit.
- Returns
Copy of self.
- Return type
Individual
- evaluate(task, rng=None)[source]¶
Evaluate the solution.
Evaluate the solution self.x with the help of the task. The task is used for repairing the solution and then evaluating it.
- Parameters
task (Task) – Objective function object.
rng (Optional[numpy.random.Generator]) – Random generator.
- generate_solution(task, rng)[source]¶
Generate new solution.
Generate a new solution for this individual and set it to self.x. This method uses rng for getting random numbers; rng and task are used together to generate the random components.
- Parameters
task (Task) – Optimization task.
rng (numpy.random.Generator) – Random numbers generator object.
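A short usage sketch of Individual; the Task arguments are assumptions based on niapy 2.x, and with e=True the solution is generated and evaluated on construction:

    import numpy as np
    from niapy.algorithms import Individual
    from niapy.task import Task

    task = Task(problem='sphere', dimension=5, max_evals=1000)
    ind = Individual(task=task, rng=np.random.default_rng(42), e=True)

    print(len(ind), ind.f)   # number of components and fitness value
    ind[0] = 0.0             # modify a single component
    ind.evaluate(task)       # repair and re-evaluate, updating ind.f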
- niapy.algorithms.default_individual_init(task, population_size, rng, individual_type=None, **_kwargs)[source]¶
Initialize population_size individuals of type individual_type.
- Parameters
task (Task) – Optimization task.
population_size (int) – Number of individuals in population.
rng (numpy.random.Generator) – Random number generator.
individual_type (Optional[Individual]) – Class of individual in population.
- Returns
Initialized individuals.
Initialized individuals' function/fitness values.
- Return type
Tuple[numpy.ndarray[Individual], numpy.ndarray[float]]
- niapy.algorithms.default_numpy_init(task, population_size, rng, **_kwargs)[source]¶
Initialize starting population that is represented with numpy.ndarray with shape (population_size, task.dimension).
- Parameters
task (Task) – Optimization task.
population_size (int) – Number of individuals in population.
rng (numpy.random.Generator) – Random number generator.
- Returns
New population with shape (population_size, task.dimension).
New population function/fitness values.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float]]
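A custom initialization function can be passed to an algorithm via initialization_function as long as it matches the call signature of default_numpy_init above. A hedged sketch (corner_biased_init is a made-up name, and the task.lower, task.range, task.dimension and task.eval members are assumptions based on the niapy 2.x Task):

    import numpy as np
    from niapy.algorithms.basic import BatAlgorithm

    def corner_biased_init(task, population_size, rng, **_kwargs):
        # Sample uniformly in [0, 1), square to bias components towards the lower bound, then scale.
        pop = task.lower + task.range * rng.random((population_size, task.dimension)) ** 2
        fitness = np.apply_along_axis(task.eval, 1, pop)
        return pop, fitness

    # Any algorithm that relies on the default population initialization can use it.
    algorithm = BatAlgorithm(population_size=40, initialization_function=corner_biased_init, seed=7)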
niapy.algorithms.basic
Implementation of basic nature-inspired algorithms.
- class niapy.algorithms.basic.AgingNpDifferentialEvolution(min_lifetime=0, max_lifetime=12, delta_np=0.3, omega=0.3, age=<function proportional>, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.de.DifferentialEvolution
Implementation of Differential evolution algorithm with aging individuals.
- Algorithm:
Differential Evolution algorithm with a dynamic population size that is determined by the quality of the population.
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Variables
Name (List[str]) – list of strings representing algorithm names.
Lt_min (int) – Minimal age of individual.
Lt_max (int) – Maximal age of individual.
delta_np (float) – Proportion of how many individuals shall die.
omega (float) – Acceptance rate for individuals to die.
mu (int) – Mean of individual max and min age.
age (Callable[[int, int, float, float, float, float, float], int]) – Function for calculation of age for individual.
Initialize AgingNpDifferentialEvolution.
- Parameters
min_lifetime (Optional[int]) – Minimum life time.
max_lifetime (Optional[int]) – Maximum life time.
delta_np (Optional[float]) – Proportion of how many individuals shall die.
omega (Optional[float]) – Acceptance rate for individuals to die.
age (Optional[Callable[[int, int, float, float, float, float, float], int]]) – Function for calculation of age for individual.
- Name = ['AgingNpDifferentialEvolution', 'ANpDE']¶
- __init__(min_lifetime=0, max_lifetime=12, delta_np=0.3, omega=0.3, age=<function proportional>, *args, **kwargs)[source]¶
Initialize AgingNpDifferentialEvolution.
- Parameters
min_lifetime (Optional[int]) – Minimum life time.
max_lifetime (Optional[int]) – Maximum life time.
delta_np (Optional[float]) – Proportion of how many individuals shall die.
omega (Optional[float]) – Acceptance rate for individuals to die.
age (Optional[Callable[[int, int, float, float, float, float, float], int]]) – Function for calculation of age for individual.
- aging(task, pop)[source]¶
Apply aging to individuals.
- Parameters
task (Task) – Optimization task.
pop (numpy.ndarray[Individual]) – Current population.
- Returns
New population.
- Return type
numpy.ndarray[Individual]
- decrement_population(pop, task)[source]¶
Decrement population.
- Parameters
pop (numpy.ndarray) – Current population.
task (Task) – Optimization task.
- Returns
Decreased population.
- Return type
numpy.ndarray[Individual]
- delta_pop_created(t)[source]¶
Calculate how many individuals are going to be created.
- Parameters
t (int) – Number of generations made by the algorithm.
- Returns
Number of individuals to be born.
- Return type
int
- delta_pop_eliminated(t)[source]¶
Calculate how many individuals are going to die.
- Parameters
t (int) – Number of generations made by the algorithm.
- Returns
Number of individuals to die.
- Return type
int
- increment_population(task)[source]¶
Increment population.
- Parameters
task (Task) – Optimization task.
- Returns
Increased population.
- Return type
numpy.ndarray[Individual]
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
- post_selection(pop, task, xb, fxb, **kwargs)[source]¶
Post selection operator.
- Parameters
pop (numpy.ndarray) – Current population.
task (Task) – Optimization task.
xb (Individual) – Global best individual.
fxb (float) – Global best fitness.
- Returns
New population.
New global best solution
New global best solutions fitness/objective value
- Return type
Tuple[numpy.ndarray, numpy.ndarray, float]
- selection(population, new_population, best_x, best_fitness, task, **kwargs)[source]¶
Select operator for individuals with aging.
- Parameters
population (numpy.ndarray) – Current population.
new_population (numpy.ndarray) – New population.
best_x (numpy.ndarray) – Current global best solution.
best_fitness (float) – Current global best solutions fitness/objective value.
task (Task) – Optimization task.
- Returns
New population of individuals.
New global best solution.
New global best solutions fitness/objective value.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, float]
- set_parameters(min_lifetime=0, max_lifetime=12, delta_np=0.3, omega=0.3, age=<function proportional>, **kwargs)[source]¶
Set the algorithm parameters.
- Parameters
min_lifetime (Optional[int]) – Minimum life time.
max_lifetime (Optional[int]) – Maximum life time.
delta_np (Optional[float]) – Proportion of how many individuals shall die.
omega (Optional[float]) – Acceptance rate for individuals to die.
age (Optional[Callable[[int, int, float, float, float, float, float], int]]) – Function for calculation of age for individual.
- class niapy.algorithms.basic.ArtificialBeeColonyAlgorithm(population_size=10, limit=100, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Artificial Bee Colony algorithm.
- Algorithm:
Artificial Bee Colony algorithm
- Date:
2018
- Author:
Uros Mlakar and Klemen Berkovič
- License:
MIT
- Reference paper:
Karaboga, D., and Bahriye B. “A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm.” Journal of global optimization 39.3 (2007): 459-471.
- Variables
Name (List[str]) – List containing strings that represent algorithm names.
limit (Union[float, numpy.ndarray[float]]) – Maximum number of cycles without improvement.
Initialize ArtificialBeeColonyAlgorithm.
- Parameters
population_size (Optional[int]) – Population size.
limit (Optional[int]) – Maximum number of cycles without improvement.
- Name = ['ArtificialBeeColonyAlgorithm', 'ABC']¶
- __init__(population_size=10, limit=100, *args, **kwargs)[source]¶
Initialize ArtificialBeeColonyAlgorithm.
- Parameters
population_size (Optional[int]) – Population size.
limit (Optional[int]) – Maximum number of cycles without improvement.
- calculate_probabilities(foods)[source]¶
Calculate the probabilities.
- Parameters
foods (numpy.ndarray) – Current population.
- Returns
Probabilities.
- Return type
numpy.ndarray
- static info()[source]¶
Get algorithms information.
- Returns
Algorithm information.
- Return type
str
- init_population(task)[source]¶
Initialize the starting population.
- Parameters
task (Task) – Optimization task
- Returns
New population
New population fitness/function values
- Additional arguments:
trials (numpy.ndarray): Number of cycles without improvement.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], Dict[str, Any]]
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of the algorithm.
- Parameters
task (Task) – Optimization task
population (numpy.ndarray) – Current population
population_fitness (numpy.ndarray[float]) – Function/fitness values of current population
best_x (numpy.ndarray) – Current best individual
best_fitness (float) – Current best individual fitness/function value
params (Dict[str, Any]) – Additional parameters
- Returns
New population
New population fitness/function values
New global best solution
New global best fitness/objective value
- Additional arguments:
trials (numpy.ndarray): Number of cycles without improvement.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
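A minimal usage sketch for ArtificialBeeColonyAlgorithm; the Task arguments ('rastrigin' as a built-in problem name, dimension and evaluation budget) are assumptions based on niapy 2.x:

    from niapy.algorithms.basic import ArtificialBeeColonyAlgorithm
    from niapy.task import Task

    task = Task(problem='rastrigin', dimension=10, max_evals=20000)
    algorithm = ArtificialBeeColonyAlgorithm(population_size=20, limit=100, seed=1)
    best_x, best_fitness = algorithm.run(task)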
- class niapy.algorithms.basic.BacterialForagingOptimization(population_size=50, n_chemotactic=100, n_swim=4, n_reproduction=4, n_elimination=2, prob_elimination=0.25, step_size=0.1, swarming=True, d_attract=0.1, w_attract=0.2, h_repel=0.1, w_repel=10.0, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of the Bacterial foraging optimization algorithm.
- Algorithm:
Bacterial Foraging Optimization
- Date:
2021
- Author:
Žiga Stupan
- License:
MIT
- Reference paper:
Passino, “Biomimicry of bacterial foraging for distributed optimization and control,” in IEEE Control Systems Magazine, vol. 22, no. 3, pp. 52-67, June 2002, doi: 10.1109/MCS.2002.1004010.
- Variables
Name (List[str]) – list of strings representing algorithm names.
population_size (Optional[int]) – Number of individuals in population \(\in [1, \infty]\).
n_chemotactic (Optional[int]) – Number of chemotactic steps.
n_swim (Optional[int]) – Number of swim steps.
n_reproduction (Optional[int]) – Number of reproduction steps.
n_elimination (Optional[int]) – Number of elimination and dispersal steps.
prob_elimination (Optional[float]) – Probability of a bacterium being eliminated and a new one being created at a random location in the search space.
step_size (Optional[float]) – Size of a chemotactic step.
d_attract (Optional[float]) – Depth of the attractant released by the cell (a quantification of how much attractant is released).
w_attract (Optional[float]) – Width of the attractant signal (a quantification of the diffusion rate of the chemical).
h_repel (Optional[float]) – Height of the repellent effect (magnitude of its effect).
w_repel (Optional[float]) – Width of the repellent.
Initialize algorithm.
- Parameters
population_size (Optional[int]) – Number of individuals in population \(\in [1, \infty]\).
n_chemotactic (Optional[int]) – Number of chemotactic steps.
n_swim (Optional[int]) – Number of swim steps.
n_reproduction (Optional[int]) – Number of reproduction steps.
n_elimination (Optional[int]) – Number of elimination and dispersal steps.
prob_elimination (Optional[float]) – Probability of a bacterium being eliminated and a new one being created at a random location in the search space.
step_size (Optional[float]) – Size of a chemotactic step.
swarming (Optional[bool]) – If True use swarming.
d_attract (Optional[float]) – Depth of the attractant released by the cell (a quantification of how much attractant is released).
w_attract (Optional[float]) – Width of the attractant signal (a quantification of the diffusion rate of the chemical).
h_repel (Optional[float]) – Height of the repellent effect (magnitude of its effect).
w_repel (Optional[float]) – Width of the repellent.
- Name = ['BacterialForagingOptimization', 'BFO', 'BFOA']¶
- __init__(population_size=50, n_chemotactic=100, n_swim=4, n_reproduction=4, n_elimination=2, prob_elimination=0.25, step_size=0.1, swarming=True, d_attract=0.1, w_attract=0.2, h_repel=0.1, w_repel=10.0, *args, **kwargs)[source]¶
Initialize algorithm.
- Parameters
population_size (Optional[int]) – Number of individuals in population \(\in [1, \infty]\).
n_chemotactic (Optional[int]) – Number of chemotactic steps.
n_swim (Optional[int]) – Number of swim steps.
n_reproduction (Optional[int]) – Number of reproduction steps.
n_elimination (Optional[int]) – Number of elimination and dispersal steps.
prob_elimination (Optional[float]) – Probability of a bacterium being eliminated and a new one being created at a random location in the search space.
step_size (Optional[float]) – Size of a chemotactic step.
swarming (Optional[bool]) – If True use swarming.
d_attract (Optional[float]) – Depth of the attractant released by the cell (a quantification of how much attractant is released).
w_attract (Optional[float]) – Width of the attractant signal (a quantification of the diffusion rate of the chemical).
h_repel (Optional[float]) – Height of the repellent effect (magnitude of its effect).
w_repel (Optional[float]) – Width of the repellent.
- get_parameters()[source]¶
Get parameters of the algorithm.
- Returns
Algorithm parameters.
- Return type
Dict[str, Any]
- init_population(task)[source]¶
Initialize the starting population.
- Parameters
task (Task) – Optimization task
- Returns
New population.
New population fitness/function values.
- Additional arguments:
cost (numpy.ndarray): Costs of cells, i.e. fitness + cell interaction.
health (numpy.ndarray): Cell health, i.e. the accumulation of costs over all chemotactic steps.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, Dict[str, Any]]
- interaction(cell, population)[source]¶
Compute cell to cell interaction J_cc.
- Parameters
cell (numpy.ndarray) – Cell to compute interaction for.
population (numpy.ndarray) – Population
- Returns
Cell to cell interaction J_cc
- Return type
float
- random_direction(dimension)[source]¶
Generate a random direction vector.
- Parameters
dimension (int) – Problem dimension
- Returns
Normalised random direction vector
- Return type
numpy.ndarray
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of Bacterial Foraging Optimization algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray) – Current populations fitness/function values.
best_x (numpy.ndarray) – Global best individual.
best_fitness (float) – Global best individuals function/fitness value.
**params (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New populations function/fitness values.
New global best solution.
New global best solutions fitness/objective value.
- Additional arguments:
cost (numpy.ndarray): Costs of cells, i.e. fitness + cell interaction.
health (numpy.ndarray): Cell health, i.e. the accumulation of costs over all chemotactic steps.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- set_parameters(population_size=50, n_chemotactic=100, n_swim=4, n_reproduction=4, n_elimination=2, prob_elimination=0.25, step_size=0.1, swarming=True, d_attract=0.1, w_attract=0.2, h_repel=0.1, w_repel=10.0, **kwargs)[source]¶
Set the parameters/arguments of the algorithm.
- Parameters
population_size (Optional[int]) – Number of individuals in population \(\in [1, \infty]\).
n_chemotactic (Optional[int]) – Number of chemotactic steps.
n_swim (Optional[int]) – Number of swim steps.
n_reproduction (Optional[int]) – Number of reproduction steps.
n_elimination (Optional[int]) – Number of elimination and dispersal steps.
prob_elimination (Optional[float]) – Probability of a bacterium being eliminated and a new one being created at a random location in the search space.
step_size (Optional[float]) – Size of a chemotactic step.
swarming (Optional[bool]) – If True use swarming.
d_attract (Optional[float]) – Depth of the attractant released by the cell (a quantification of how much attractant is released).
w_attract (Optional[float]) – Width of the attractant signal (a quantification of the diffusion rate of the chemical).
h_repel (Optional[float]) – Height of the repellent effect (magnitude of its effect).
w_repel (Optional[float]) – Width of the repellent.
- class niapy.algorithms.basic.BareBonesFireworksAlgorithm(num_sparks=10, amplification_coefficient=1.5, reduction_coefficient=0.5, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Bare Bones Fireworks Algorithm.
- Algorithm:
Bare Bones Fireworks Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
https://www.sciencedirect.com/science/article/pii/S1568494617306609
- Reference paper:
Junzhi Li, Ying Tan, The bare bones fireworks algorithm: A minimalist global optimizer, Applied Soft Computing, Volume 62, 2018, Pages 454-462, ISSN 1568-4946, https://doi.org/10.1016/j.asoc.2017.10.046.
- Variables
Name (List[str]) – List of strings representing algorithm names
num_sparks (int) – Number of sparks
amplification_coefficient (float) – amplification coefficient
reduction_coefficient (float) – reduction coefficient
Initialize BareBonesFireworksAlgorithm.
- Parameters
num_sparks (int) – Number of sparks \(\in[1, \infty)\).
amplification_coefficient (float) – Amplification coefficient \(\in [1, \infty)\).
reduction_coefficient (float) – Reduction coefficient \(\in (0, 1)\).
- Name = ['BareBonesFireworksAlgorithm', 'BBFWA']¶
- __init__(num_sparks=10, amplification_coefficient=1.5, reduction_coefficient=0.5, *args, **kwargs)[source]¶
Initialize BareBonesFireworksAlgorithm.
- Parameters
num_sparks (int) – Number of sparks \(\in[1, \infty)\).
amplification_coefficient (float) – Amplification coefficient \(\in [1, \infty)\).
reduction_coefficient (float) – Reduction coefficient \(\in (0, 1)\).
- static info()[source]¶
Get default information of algorithm.
- Returns
Basic information.
- Return type
str
- init_population(task)[source]¶
Initialize starting population.
- Parameters
task (Task) – Optimization task.
- Returns
Initial solution.
Initial solution function/fitness value.
- Additional arguments:
A (numpy.ndarray): Starting amplitude or search range.
- Return type
Tuple[numpy.ndarray, float, Dict[str, Any]]
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of Bare Bones Fireworks Algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current solution.
population_fitness (float) – Current solution fitness/function value.
best_x (numpy.ndarray) – Current best solution.
best_fitness (float) – Current best solution fitness/function value.
params (Dict[str, Any]) – Additional parameters.
- Returns
New solution.
New solution fitness/function value.
New global best solution.
New global best solutions fitness/objective value.
- Additional arguments:
amplitude (numpy.ndarray): Search range.
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray, float, Dict[str, Any]]
- set_parameters(num_sparks=10, amplification_coefficient=1.5, reduction_coefficient=0.5, **kwargs)[source]¶
Set the arguments of an algorithm.
- Parameters
num_sparks (int) – Number of sparks \(\in [1, \infty)\).
amplification_coefficient (float) – Amplification coefficient \(\in [1, \infty)\).
reduction_coefficient (float) – Reduction coefficient \(\in (0, 1)\).
- class niapy.algorithms.basic.BatAlgorithm(population_size=40, loudness=1.0, pulse_rate=1.0, alpha=0.97, gamma=0.1, min_frequency=0.0, max_frequency=2.0, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Bat algorithm.
- Algorithm:
Bat algorithm
- Date:
2015
- Authors:
Iztok Fister Jr., Marko Burjek and Klemen Berkovič
- License:
MIT
- Reference paper:
Yang, Xin-She. “A new metaheuristic bat-inspired algorithm.” Nature inspired cooperative strategies for optimization (NICSO 2010). Springer, Berlin, Heidelberg, 2010. 65-74.
- Variables
Name (List[str]) – List of strings representing algorithm name.
loudness (float) – Initial loudness.
pulse_rate (float) – Initial pulse rate.
alpha (float) – Parameter for controlling loudness decrease.
gamma (float) – Parameter for controlling pulse rate increase.
min_frequency (float) – Minimum frequency.
max_frequency (float) – Maximum frequency.
Initialize BatAlgorithm.
- Parameters
population_size (Optional[int]) – Population size.
loudness (Optional[float]) – Initial loudness.
pulse_rate (Optional[float]) – Initial pulse rate.
alpha (Optional[float]) – Parameter for controlling loudness decrease.
gamma (Optional[float]) – Parameter for controlling pulse rate increase.
min_frequency (Optional[float]) – Minimum frequency.
max_frequency (Optional[float]) – Maximum frequency.
- Name = ['BatAlgorithm', 'BA']¶
- __init__(population_size=40, loudness=1.0, pulse_rate=1.0, alpha=0.97, gamma=0.1, min_frequency=0.0, max_frequency=2.0, *args, **kwargs)[source]¶
Initialize BatAlgorithm.
- Parameters
population_size (Optional[int]) – Population size.
loudness (Optional[float]) – Initial loudness.
pulse_rate (Optional[float]) – Initial pulse rate.
alpha (Optional[float]) – Parameter for controlling loudness decrease.
gamma (Optional[float]) – Parameter for controlling pulse rate increase.
min_frequency (Optional[float]) – Minimum frequency.
max_frequency (Optional[float]) – Maximum frequency.
- get_parameters()[source]¶
Get parameters of the algorithm.
- Returns
Algorithm parameters.
- Return type
Dict[str, Any]
- static info()[source]¶
Get algorithms information.
- Returns
Algorithm information.
- Return type
str
- init_population(task)[source]¶
Initialize the starting population.
- Parameters
task (Task) – Optimization task
- Returns
New population.
New population fitness/function values.
- Additional arguments:
velocities (numpy.ndarray[float]): Velocities.
alpha (float): Previous iterations loudness.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], Dict[str, Any]]
- local_search(best, loudness, task, **kwargs)[source]¶
Improve the best solution according to Yang (2010).
- Parameters
best (numpy.ndarray) – Global best individual.
loudness (float) – Current loudness.
task (Task) – Optimization task.
- Returns
New solution based on global best individual.
- Return type
numpy.ndarray
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of Bat Algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population
population_fitness (numpy.ndarray[float]) – Current population fitness/function values
best_x (numpy.ndarray) – Current best individual
best_fitness (float) – Current best individual function/fitness value
params (Dict[str, Any]) – Additional algorithm arguments
- Returns
New population
New population fitness/function values
New global best solution
New global best fitness/objective value
- Additional arguments:
velocities (numpy.ndarray): Velocities.
alpha (float): Previous iterations loudness.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- set_parameters(population_size=20, loudness=1.0, pulse_rate=1.0, alpha=0.97, gamma=0.1, min_frequency=0.0, max_frequency=2.0, **kwargs)[source]¶
Set the parameters of the algorithm.
- Parameters
population_size (Optional[int]) – Population size.
loudness (Optional[float]) – Initial loudness.
pulse_rate (Optional[float]) – Initial pulse rate.
alpha (Optional[float]) – Parameter for controlling loudness decrease.
gamma (Optional[float]) – Parameter for controlling pulse rate increase.
min_frequency (Optional[float]) – Minimum frequency.
max_frequency (Optional[float]) – Maximum frequency.
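A sketch of how get_parameters() and set_parameters() pair up for inspecting and re-tuning a configured BatAlgorithm (the exact keys in the returned dictionary depend on the installed niapy version):

    from niapy.algorithms.basic import BatAlgorithm

    algorithm = BatAlgorithm(population_size=40, loudness=0.9, pulse_rate=0.5, seed=1)
    print(algorithm.get_parameters())   # e.g. population_size, loudness, pulse_rate, alpha, gamma, ...

    # Re-tune the loudness/pulse-rate schedule without rebuilding the object.
    algorithm.set_parameters(population_size=40, loudness=0.8, pulse_rate=0.6, alpha=0.95, gamma=0.05)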
- class niapy.algorithms.basic.BeesAlgorithm(population_size=40, m=5, e=4, ngh=1, nep=4, nsp=2, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Bees algorithm.
- Algorithm:
The Bees algorithm
- Date:
2019
- Authors:
Rok Potočnik
- License:
MIT
- Reference paper:
DT Pham, A Ghanbarzadeh, E Koc, S Otri, S Rahim, and M Zaidi. The bees algorithm-a novel tool for complex optimisation problems. In Proceedings of the 2nd Virtual International Conference on Intelligent Production Machines and Systems (IPROMS 2006), pages 454–459, 2006
- Variables
population_size (Optional[int]) – Number of scout bees parameter.
m (Optional[int]) – Number of sites selected out of n visited sites parameter.
e (Optional[int]) – Number of best sites out of m selected sites parameter.
nep (Optional[int]) – Number of bees recruited for best e sites parameter.
nsp (Optional[int]) – Number of bees recruited for the other selected sites parameter.
ngh (Optional[float]) – Initial size of patches parameter.
Initialize BeesAlgorithm.
- Parameters
population_size (Optional[int]) – Number of scout bees parameter.
m (Optional[int]) – Number of sites selected out of n visited sites parameter.
e (Optional[int]) – Number of best sites out of m selected sites parameter.
nep (Optional[int]) – Number of bees recruited for best e sites parameter.
nsp (Optional[int]) – Number of bees recruited for the other selected sites parameter.
ngh (Optional[float]) – Initial size of patches parameter.
- Name = ['BeesAlgorithm', 'BEA']¶
- __init__(population_size=40, m=5, e=4, ngh=1, nep=4, nsp=2, *args, **kwargs)[source]¶
Initialize BeesAlgorithm.
- Parameters
population_size (Optional[int]) – Number of scout bees parameter.
m (Optional[int]) – Number of sites selected out of n visited sites parameter.
e (Optional[int]) – Number of best sites out of m selected sites parameter.
nep (Optional[int]) – Number of bees recruited for best e sites parameter.
nsp (Optional[int]) – Number of bees recruited for the other selected sites parameter.
ngh (Optional[float]) – Initial size of patches parameter.
- bee_dance(x, task, ngh)[source]¶
Bees Dance. Search for new positions.
- Parameters
x (numpy.ndarray) – One individual from the population.
task (Task) – Optimization task.
ngh (float) – A small value for patch search.
- Returns
New individual.
New individual fitness/function values.
- Return type
Tuple[numpy.ndarray, float]
- get_parameters()[source]¶
Get parameters of the algorithm.
- Returns
Algorithm Parameters.
- Return type
Dict[str, Any]
- static info()[source]¶
Get information about algorithm.
- Returns
Algorithm information
- Return type
str
- init_population(task)[source]¶
Initialize the starting population.
- Parameters
task (Task) – Optimization task
- Returns
New population.
New population fitness/function values.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, Dict[str, Any]]
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of the Bees Algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray[float]) – Current population.
population_fitness (numpy.ndarray[float]) – Current population function/fitness values.
best_x (numpy.ndarray) – Global best individual.
best_fitness (float) – Global best individual fitness/function value.
**params (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New population fitness/function values.
New global best solution.
New global best fitness/objective value.
- Additional arguments:
ngh (float): A small value used for patches.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- set_parameters(population_size=40, m=5, e=4, ngh=1, nep=4, nsp=2, **kwargs)[source]¶
Set the parameters of the algorithm.
- Parameters
population_size (Optional[int]) – Number of scout bees parameter.
m (Optional[int]) – Number of sites selected out of n visited sites parameter.
e (Optional[int]) – Number of best sites out of m selected sites parameter.
nep (Optional[int]) – Number of bees recruited for best e sites parameter.
nsp (Optional[int]) – Number of bees recruited for the other selected sites parameter.
ngh (Optional[float]) – Initial size of patches parameter.
- class niapy.algorithms.basic.CamelAlgorithm(population_size=50, burden_factor=0.25, death_rate=0.5, visibility=0.5, supply_init=10, endurance_init=10, min_temperature=- 10, max_temperature=10, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of the Camel algorithm, modeling camel traveling behavior.
- Algorithm:
Camel algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference paper:
Ali, Ramzy. (2016). Novel Optimization Algorithm Inspired by Camel Traveling Behavior. Iraq J. Electrical and Electronic Engineering. 12. 167-177.
- Variables
Name (List[str]) – List of strings representing name of the algorithm.
population_size (Optional[int]) – Population size \(\in [1, \infty)\).
burden_factor (Optional[float]) – Burden factor \(\in [0, 1]\).
death_rate (Optional[float]) – Dying rate \(\in [0, 1]\).
visibility (Optional[float]) – View range of camel.
supply_init (Optional[float]) – Initial supply \(\in (0, \infty)\).
endurance_init (Optional[float]) – Initial endurance \(\in (0, \infty)\).
min_temperature (Optional[float]) – Minimum temperature, must be true \(T_{min} < T_{max}\).
max_temperature (Optional[float]) – Maximum temperature, must be true \(T_{min} < T_{max}\).
Initialize CamelAlgorithm.
- Parameters
population_size (Optional[int]) – Population size \(\in [1, \infty)\).
burden_factor (Optional[float]) – Burden factor \(\in [0, 1]\).
death_rate (Optional[float]) – Dying rate \(\in [0, 1]\).
visibility (Optional[float]) – View range of camel.
supply_init (Optional[float]) – Initial supply \(\in (0, \infty)\).
endurance_init (Optional[float]) – Initial endurance \(\in (0, \infty)\).
min_temperature (Optional[float]) – Minimum temperature, must be true \(T_{min} < T_{max}\).
max_temperature (Optional[float]) – Maximum temperature, must be true \(T_{min} < T_{max}\).
- Name = ['CamelAlgorithm', 'CA']¶
- __init__(population_size=50, burden_factor=0.25, death_rate=0.5, visibility=0.5, supply_init=10, endurance_init=10, min_temperature=- 10, max_temperature=10, *args, **kwargs)[source]¶
Initialize CamelAlgorithm.
- Parameters
population_size (Optional[int]) – Population size \(\in [1, \infty)\).
burden_factor (Optional[float]) – Burden factor \(\in [0, 1]\).
death_rate (Optional[float]) – Dying rate \(\in [0, 1]\).
visibility (Optional[float]) – View range of camel.
supply_init (Optional[float]) – Initial supply \(\in (0, \infty)\).
endurance_init (Optional[float]) – Initial endurance \(\in (0, \infty)\).
min_temperature (Optional[float]) – Minimum temperature, must be true \(T_{min} < T_{max}\).
max_temperature (Optional[float]) – Maximum temperature, must be true \(T_{min} < T_{max}\).
- get_parameters()[source]¶
Get parameters of the algorithm.
- Returns
Algorithm Parameters.
- Return type
Dict[str, Any]
- static info()[source]¶
Get information about algorithm.
- Returns
Algorithm information
- Return type
str
- init_pop(task, population_size, rng, individual_type, **_kwargs)[source]¶
Initialize starting population.
- Parameters
task (Task) – Optimization task.
population_size (int) – Number of camels in population.
rng (numpy.random.Generator) – Random number generator.
individual_type (Type[Individual]) – Individual type.
- Returns
Initialized population of camels.
Initialized population's function/fitness values.
- Return type
Tuple[numpy.ndarray[Camel], numpy.ndarray[float]]
- life_cycle(camel, task)[source]¶
Apply life cycle to Camel.
- Parameters
camel (Camel) – Camel to apply life cycle.
task (Task) – Optimization task.
- Returns
Camel with life cycle applied to it.
- Return type
Camel
- oasis(c)[source]¶
Apply oasis function to camel.
- Parameters
c (Camel) – Camel to apply oasis on.
- Returns
Camel with oasis applied to it.
- Return type
Camel
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of Camel Algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray[Camel]) – Current population of Camels.
population_fitness (numpy.ndarray[float]) – Current population fitness/function values.
best_x (numpy.ndarray) – Current best Camel.
best_fitness (float) – Current best Camel fitness/function value.
**params (Dict[str, Any]) – Additional arguments.
- Returns
New population
New population function/fitness value
New global best solution
New global best fitness/objective value
Additional arguments
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, dict]
- set_parameters(population_size=50, burden_factor=0.25, death_rate=0.5, visibility=0.5, supply_init=10, endurance_init=10, min_temperature=- 10, max_temperature=10, **kwargs)[source]¶
Set the arguments of an algorithm.
- Parameters
population_size (Optional[int]) – Population size \(\in [1, \infty)\).
burden_factor (Optional[float]) – Burden factor \(\in [0, 1]\).
death_rate (Optional[float]) – Dying rate \(\in [0, 1]\).
visibility (Optional[float]) – View range of camel.
supply_init (Optional[float]) – Initial supply \(\in (0, \infty)\).
endurance_init (Optional[float]) – Initial endurance \(\in (0, \infty)\).
min_temperature (Optional[float]) – Minimum temperature, must be true \(T_{min} < T_{max}\).
max_temperature (Optional[float]) – Maximum temperature, must be true \(T_{min} < T_{max}\).
- class niapy.algorithms.basic.CatSwarmOptimization(population_size=30, mixture_ratio=0.1, c1=2.05, smp=3, spc=True, cdc=0.85, srd=0.2, max_velocity=1.9, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Cat swarm optimization algorithm.
- Algorithm:
Cat swarm optimization
- Date:
2019
- Author:
Mihael Baketarić
- License:
MIT
- Reference paper:
Chu, S. C., Tsai, P. W., & Pan, J. S. (2006). Cat swarm optimization. In Pacific Rim international conference on artificial intelligence (pp. 854-858). Springer, Berlin, Heidelberg.
Initialize CatSwarmOptimization.
- Parameters
population_size (int) – Number of individuals in population.
mixture_ratio (float) – Mixture ratio.
c1 (float) – Constant in tracing mode.
smp (int) – Seeking memory pool.
spc (bool) – Self-position considering.
cdc (float) – Decides how many dimensions will be varied.
srd (float) – Seeking range of the selected dimension.
max_velocity (float) – Maximal velocity.
- Name = ['CatSwarmOptimization', 'CSO']¶
- __init__(population_size=30, mixture_ratio=0.1, c1=2.05, smp=3, spc=True, cdc=0.85, srd=0.2, max_velocity=1.9, *args, **kwargs)[source]¶
Initialize CatSwarmOptimization.
- Parameters
population_size (int) – Number of individuals in population.
mixture_ratio (float) – Mixture ratio.
c1 (float) – Constant in tracing mode.
smp (int) – Seeking memory pool.
spc (bool) – Self-position considering.
cdc (float) – Decides how many dimensions will be varied.
srd (float) – Seeking range of the selected dimension.
max_velocity (float) – Maximal velocity.
- static info()[source]¶
Get algorithm information.
- Returns
Algorithm information.
- Return type
str
- init_population(task)[source]¶
Initialize population.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized population.
Initialized populations fitness/function values.
- Additional arguments:
Dictionary of modes (seek or trace) and velocities for each cat
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], Dict[str, Any]]
- random_seek_trace()[source]¶
Set cats into seeking/tracing mode randomly.
- Returns
An array of ones and zeros, one per cat: one means tracing mode, zero means seeking mode. The length is equal to population_size.
- Return type
numpy.ndarray
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of Cat Swarm Optimization algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray) – Current population fitness/function values.
best_x (numpy.ndarray) – Current best individual.
best_fitness (float) – Current best cat fitness/function value.
**params (Dict[str, Any]) – Additional function arguments.
- Returns
New population.
New population fitness/function values.
New global best solution.
New global best solutions fitness/objective value.
- Additional arguments:
velocities (numpy.ndarray): velocities of cats.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- seeking_mode(task, cat, cat_fitness, pop, fpop, fxb)[source]¶
Seeking mode.
- Parameters
task (Task) – Optimization task.
cat (numpy.ndarray) – Individual from population.
cat_fitness (float) – Current individual’s fitness/function value.
pop (numpy.ndarray) – Current population.
fpop (numpy.ndarray) – Current population fitness/function values.
fxb (float) – Current best cat fitness/function value.
- Returns
Updated individual’s position
Updated individual’s fitness/function value
Updated global best position
Updated global best fitness/function value
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray, float]
- set_parameters(population_size=30, mixture_ratio=0.1, c1=2.05, smp=3, spc=True, cdc=0.85, srd=0.2, max_velocity=1.9, **kwargs)[source]¶
Set the algorithm parameters.
- Parameters
population_size (int) – Number of individuals in population.
mixture_ratio (float) – Mixture ratio.
c1 (float) – Constant in tracing mode.
smp (int) – Seeking memory pool.
spc (bool) – Self-position considering.
cdc (float) – Decides how many dimensions will be varied.
srd (float) – Seeking range of the selected dimension.
max_velocity (float) – Maximal velocity.
- tracing_mode(task, cat, velocity, xb)[source]¶
Tracing mode.
- Parameters
task (Task) – Optimization task.
cat (numpy.ndarray) – Individual from population.
velocity (numpy.ndarray) – Velocity of individual.
xb (numpy.ndarray) – Current best individual.
- Returns
Updated individual’s position
Updated individual’s fitness/function value
Updated individual’s velocity vector
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray]
- class niapy.algorithms.basic.CenterParticleSwarmOptimization(*args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.pso.ParticleSwarmAlgorithm
Implementation of Center Particle Swarm Optimization.
- Algorithm:
Center Particle Swarm Optimization
- Date:
2019
- Authors:
Klemen Berkovič
- License:
MIT
- Reference paper:
H.-C. Tsai, Predicting strengths of concrete-type specimens using hybrid multilayer perceptrons with center-Unified particle swarm optimization, Adv. Eng. Softw. 37 (2010) 1104–1112.
See also
niapy.algorithms.basic.WeightedVelocityClampingParticleSwarmAlgorithm
Initialize CPSO.
- Name = ['CenterParticleSwarmOptimization', 'CPSO']¶
- get_parameters()[source]¶
Get value of parameters for this instance of algorithm.
- Returns
Dictionary which has parameters mapped to values.
- Return type
Dict[str, Union[int, float, numpy.ndarray]]
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
- run_iteration(task, pop, fpop, xb, fxb, **params)[source]¶
Core function of algorithm.
- Parameters
task (Task) – Optimization task.
pop (numpy.ndarray) – Current population of particles.
fpop (numpy.ndarray) – Current particles function/fitness values.
xb (numpy.ndarray) – Current global best particle.
fxb (numpy.float) – Current global best particles function/fitness value.
- Returns
New population of particles.
New populations function/fitness values.
New global best particle.
New global best particle function/fitness value.
Additional arguments.
Additional keyword arguments.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, dict]
See also
niapy.algorithm.basic.WeightedVelocityClampingParticleSwarmAlgorithm.run_iteration()
- class niapy.algorithms.basic.ClonalSelectionAlgorithm(population_size=10, clone_factor=0.1, mutation_factor=- 2.5, num_rand=1, bits_per_param=16, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Clonal Selection Algorithm.
- Algorithm:
Clonal selection algorithm
- Date:
2021
- Authors:
Andraž Peršon
- License:
MIT
- Reference paper:
Brownlee, J. “Clever Algorithms: Nature-Inspired Programming Recipes” Revision 2. 2012. 280-286.
- Variables
population_size (int) – Population size.
clone_factor (float) – Clone factor.
mutation_factor (float) – Mutation factor.
num_rand (int) – Number of random antibodies to be added to the population each generation.
bits_per_param (int) – Number of bits per parameter of solution vector.
Initialize ClonalSelectionAlgorithm.
- Parameters
population_size (Optional[int]) – Population size.
clone_factor (Optional[float]) – Clone factor.
mutation_factor (Optional[float]) – Mutation factor.
num_rand (Optional[int]) – Number of random antibodies to be added to the population each generation.
bits_per_param (Optional[int]) – Number of bits per parameter of solution vector.
- Name = ['ClonalSelectionAlgorithm', 'CLONALG']¶
- __init__(population_size=10, clone_factor=0.1, mutation_factor=- 2.5, num_rand=1, bits_per_param=16, *args, **kwargs)[source]¶
Initialize ClonalSelectionAlgorithm.
- Parameters
population_size (Optional[int]) – Population size.
clone_factor (Optional[float]) – Clone factor.
mutation_factor (Optional[float]) – Mutation factor.
num_rand (Optional[int]) – Number of random antibodies to be added to the population each generation.
bits_per_param (Optional[int]) – Number of bits per parameter of solution vector.
- get_parameters()[source]¶
Get parameters of the algorithm.
- Returns
Algorithm parameters.
- Return type
Dict[str, Any]
- static info()[source]¶
Get algorithms information.
- Returns
Algorithm information.
- Return type
str
- init_population(task)[source]¶
Initialize the starting population.
- Parameters
task (Task) – Optimization task
- Returns
New population.
New population fitness/function values.
- Additional arguments:
bitstring (numpy.ndarray): Binary representation of the population.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], Dict[str, Any]]
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of Clonal Selection Algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population
population_fitness (numpy.ndarray[float]) – Current population fitness/function values
best_x (numpy.ndarray) – Current best individual
best_fitness (float) – Current best individual function/fitness value
params (Dict[str, Any]) – Additional algorithm arguments
- Returns
New population
New population fitness/function values
New global best solution
New global best fitness/objective value
- Additional arguments:
bitstring (numpy.ndarray): Binary representation of the population.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- set_parameters(population_size=10, clone_factor=0.1, mutation_factor=- 2.5, num_rand=1, bits_per_param=16, **kwargs)[source]¶
Set the parameters of the algorithm.
- Parameters
population_size (Optional[int]) – Population size.
clone_factor (Optional[float]) – Clone factor.
mutation_factor (Optional[float]) – Mutation factor.
num_rand (Optional[int]) – Number of random antibodies to be added to the population each generation.
bits_per_param (Optional[int]) – Number of bits per parameter of solution vector.
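A usage sketch for ClonalSelectionAlgorithm; note that the algorithm works internally on a binary encoding with bits_per_param bits per dimension (the bitstring mentioned in run_iteration above). The Task arguments are assumptions based on niapy 2.x:

    from niapy.algorithms.basic import ClonalSelectionAlgorithm
    from niapy.task import Task

    task = Task(problem='sphere', dimension=5, max_iters=200)
    algorithm = ClonalSelectionAlgorithm(population_size=10, clone_factor=0.1, bits_per_param=16, seed=3)
    best_x, best_fitness = algorithm.run(task)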
- class niapy.algorithms.basic.ComprehensiveLearningParticleSwarmOptimizer(m=10, w0=0.9, w1=0.4, c=1.49445, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.pso.ParticleSwarmAlgorithm
Implementation of the Comprehensive Learning Particle Swarm Optimizer.
- Algorithm:
Comprehensive Learning Particle Swarm Optimizer
- Date:
2019
- Authors:
Klemen Berkovič
- License:
MIT
- Reference paper:
J. J. Liang, A. K. Qin, P. N. Suganthan and S. Baskar, “Comprehensive learning particle swarm optimizer for global optimization of multimodal functions,” in IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281-295, June 2006. doi: 10.1109/TEVC.2005.857610
- Reference URL:
http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1637688&isnumber=34326
- Variables
w0 (float) – Inertia weight.
w1 (float) – Inertia weight.
c (float) – Velocity constant.
m (int) – Refresh rate.
Initialize CLPSO.
- Name = ['ComprehensiveLearningParticleSwarmOptimizer', 'CLPSO']¶
- generate_personal_best_cl(i, pc, personal_best, personal_best_fitness)[source]¶
Generate new personal best position for learning.
- Parameters
i (int) – Current particle.
pc (float) – Learning probability.
personal_best (numpy.ndarray) – Personal best positions for population.
personal_best_fitness (numpy.ndarray) – Personal best positions function/fitness values for personal best position.
- Returns
Personal best for learning.
- Return type
numpy.ndarray
- get_parameters()[source]¶
Get value of parameters for this instance of algorithm.
- Returns
Dictionary which has parameters mapped to values.
- Return type
Dict[str, Union[int, float, numpy.ndarray]]
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
- init(task)[source]¶
Initialize dynamic arguments of Particle Swarm Optimization algorithm.
- Parameters
task (Task) – Optimization task.
- Returns
vMin: Minimal velocity.
vMax: Maximal velocity.
V: Initial velocity of particle.
flag: Refresh gap counter.
- Return type
Dict[str, numpy.ndarray]
- run_iteration(task, pop, fpop, xb, fxb, **params)[source]¶
Core function of algorithm.
- Parameters
task (Task) – Optimization task.
pop (numpy.ndarray) – Current populations.
fpop (numpy.ndarray) – Current population fitness/function values.
xb (numpy.ndarray) – Current best particle.
fxb (float) – Current best particle fitness/function value.
params (dict) – Additional function keyword arguments.
- Returns
New population.
New population fitness/function values.
New global best position.
New global best positions function/fitness value.
Additional arguments.
- Additional keyword arguments:
personal_best: Particles best population.
personal_best_fitness: Particles best positions function/fitness value.
min_velocity: Minimal velocity.
max_velocity: Maximal velocity.
V: Initial velocity of particle.
flag: Refresh gap counter.
pc: Learning rate.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, list, dict]
- set_parameters(m=10, w0=0.9, w1=0.4, c=1.49445, **kwargs)[source]¶
Set Particle Swarm Algorithm main parameters.
- Parameters
w0 (float) – Inertia weight.
w1 (float) – Inertia weight.
c (float) – Velocity constant.
m (int) – Refresh rate.
kwargs (dict) – Additional arguments
- update_velocity_cl(v, p, pb, w, min_velocity, max_velocity, task, **_kwargs)[source]¶
Update particle velocity.
- Parameters
v (numpy.ndarray) – Current velocity of particle.
p (numpy.ndarray) – Current position of particle.
pb (numpy.ndarray) – Personal best position of particle.
w (numpy.ndarray) – Weights for velocity adjustment.
min_velocity (numpy.ndarray) – Minimal velocity allowed.
max_velocity (numpy.ndarray) – Maximal velocity allowed.
task (Task) – Optimization task.
- Returns
Updated velocity of particle.
- Return type
numpy.ndarray
- class niapy.algorithms.basic.CoralReefsOptimization(population_size=25, phi=0.4, asexual_reproduction_prob=0.5, broadcast_prob=0.5, depredation_prob=0.3, k=25, crossover_rate=0.5, mutation_rate=0.36, sexual_crossover=<function default_sexual_crossover>, brooding=<function default_brooding>, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Coral Reefs Optimization Algorithm.
- Algorithm:
Coral Reefs Optimization Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference Paper:
S. Salcedo-Sanz, J. Del Ser, I. Landa-Torres, S. Gil-López, and J. A. Portilla-Figueras, “The Coral Reefs Optimization Algorithm: A Novel Metaheuristic for Efficiently Solving Optimization Problems,” The Scientific World Journal, vol. 2014, Article ID 739768, 15 pages, 2014.
- Reference URL:
- Variables
Name (List[str]) – List of strings representing algorithm name.
phi (float) – Range of neighborhood.
num_asexual_reproduction (int) – Number of corals used in asexual reproduction.
num_broadcast (int) – Number of corals used in brooding.
num_depredation (int) – Number of corals used in depredation.
k (int) – Number of tries for larva setting.
mutation_rate (float) – Mutation variable \(\in [0, \infty]\).
crossover_rate (float) – Crossover rate in [0, 1].
sexual_crossover (Callable[[numpy.ndarray, float, Task, numpy.random.Generator, Dict[str, Any]], Tuple[numpy.ndarray, numpy.ndarray[float]]]) – Crossover function.
brooding (Callable[[numpy.ndarray, float, Task, numpy.random.Generator, Dict[str, Any]], Tuple[numpy.ndarray, numpy.ndarray]]) – Brooding function.
See also
Initialize CoralReefsOptimization.
- Parameters
population_size (int) – Population size for population initialization.
phi (float) – Range of neighborhood (distance).
asexual_reproduction_prob (float) – Value \(\in [0, 1]\) for asexual reproduction size.
broadcast_prob (float) – Value \(\in [0, 1]\) for brooding size.
depredation_prob (float) – Value \(\in [0, 1]\) for depredation size.
k (int) – Tries for larvae setting.
sexual_crossover (Callable[[numpy.ndarray, float, Task, numpy.random.Generator, Dict[str, Any]], Tuple[numpy.ndarray, numpy.ndarray]]) – Crossover function.
crossover_rate (float) – Crossover rate \(\in [0, 1]\).
brooding (Callable[[numpy.ndarray, float, Task, numpy.random.Generator, Dict[str, Any]], Tuple[numpy.ndarray, numpy.ndarray]]) – Brooding function.
mutation_rate (float) – Mutation rate \(\in [0, 1]\).
- Name = ['CoralReefsOptimization', 'CRO']¶
- __init__(population_size=25, phi=0.4, asexual_reproduction_prob=0.5, broadcast_prob=0.5, depredation_prob=0.3, k=25, crossover_rate=0.5, mutation_rate=0.36, sexual_crossover=<function default_sexual_crossover>, brooding=<function default_brooding>, *args, **kwargs)[source]¶
Initialize CoralReefsOptimization.
- Parameters
population_size (int) – Population size for population initialization.
phi (float) – Range of neighborhood (distance).
asexual_reproduction_prob (float) – Value \(\in [0, 1]\) for asexual reproduction size.
broadcast_prob (float) – Value \(\in [0, 1]\) for brooding size.
depredation_prob (float) – Value \(\in [0, 1]\) for depredation size.
k (int) – Tries for larvae setting.
sexual_crossover (Callable[[numpy.ndarray, float, Task, numpy.random.Generator, Dict[str, Any]], Tuple[numpy.ndarray, numpy.ndarray]]) – Crossover function.
crossover_rate (float) – Crossover rate \(\in [0, 1]\).
brooding (Callable[[numpy.ndarray, float, Task, numpy.random.Generator, Dict[str, Any]], Tuple[numpy.ndarray, numpy.ndarray]]) – Brooding function.
mutation_rate (float) – Mutation rate \(\in [0, 1]\).
- asexual_reproduction(reef, reef_fitness, best_x, best_fitness, task)[source]¶
Asexual reproduction of corals.
- Parameters
reef (numpy.ndarray) – Current population of reefs.
reef_fitness (numpy.ndarray) – Current populations function/fitness values.
best_x (numpy.ndarray) – Global best coordinates.
best_fitness (float) – Global best fitness.
task (Task) – Optimization task.
- Returns
New population.
New population fitness/function values.
- Return type
Tuple[numpy.ndarray, numpy.ndarray]
See also
niapy.algorithms.basic.CoralReefsOptimization.settling()
niapy.algorithms.basic.default_brooding()
- depredation(reef, reef_fitness)[source]¶
Depredation operator for reefs.
- Parameters
reef (numpy.ndarray) – Current reefs.
reef_fitness (numpy.ndarray) – Current reefs function/fitness values.
- Returns
Best individual
Best individual fitness/function value
- Return type
Tuple[numpy.ndarray, numpy.ndarray]
- get_parameters()[source]¶
Get parameters values of the algorithm.
- Returns
Algorithm parameters.
- Return type
Dict[str, Any]
- static info()[source]¶
Get algorithms information.
- Returns
Algorithm information.
- Return type
str
See also
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of Coral Reefs Optimization algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray) – Current population fitness/function value.
best_x (numpy.ndarray) – Global best solution.
best_fitness (float) – Global best solution fitness/function value.
**params – Additional arguments
- Returns
New population.
New population fitness/function values.
New global best solution
New global best solutions fitness/objective value
Additional arguments:
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
See also
niapy.algorithms.basic.CoralReefsOptimization.sexual_crossover()
niapy.algorithms.basic.CoralReefsOptimization.brooding()
- set_parameters(population_size=25, phi=0.4, asexual_reproduction_prob=0.5, broadcast_prob=0.5, depredation_prob=0.3, k=25, crossover_rate=0.5, mutation_rate=0.36, sexual_crossover=<function default_sexual_crossover>, brooding=<function default_brooding>, **kwargs)[source]¶
Set the parameters of the algorithm.
- Parameters
population_size (int) – Population size for population initialization.
phi (float) – Range of neighborhood (distance).
asexual_reproduction_prob (float) – Value \(\in [0, 1]\) for asexual reproduction size.
broadcast_prob (float) – Value \(\in [0, 1]\) for brooding size.
depredation_prob (float) – Value \(\in [0, 1]\) for depredation size.
k (int) – Tries for larvae setting.
sexual_crossover (Callable[[numpy.ndarray, float, Task, numpy.random.Generator, Dict[str, Any]], Tuple[numpy.ndarray, numpy.ndarray]]) – Crossover function.
crossover_rate (float) – Crossover rate \(\in [0, 1]\).
brooding (Callable[[numpy.ndarray, float, Task, numpy.random.Generator, Dict[str, Any]], Tuple[numpy.ndarray, numpy.ndarray]]) – Brooding function.
mutation_rate (float) – Mutation rate \(\in [0, 1]\).
- settling(reef, reef_fitness, new_reef, new_reef_fitness, best_x, best_fitness, task)[source]¶
Operator for setting reefs.
New reefs try to settle at a selected position in the search space. A new reef settles successfully if its fitness value is better than that of the reef occupying that position, or if the position is unoccupied.
- Parameters
reef (numpy.ndarray) – Current population of reefs.
reef_fitness (numpy.ndarray) – Current populations function/fitness values.
new_reef (numpy.ndarray) – New population of reefs.
new_reef_fitness (numpy.ndarray) – New populations function/fitness values.
best_x (numpy.ndarray) – Global best solution.
best_fitness (float) – Global best solutions fitness/objective value.
task (Task) – Optimization task.
- Returns
New settled population.
New settled population fitness/function values.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float]
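A hedged usage sketch with the parameters documented above (Task and Sphere come from the wider niapy API and are not part of this module):

    from niapy.algorithms.basic import CoralReefsOptimization
    from niapy.problems import Sphere
    from niapy.task import Task

    task = Task(problem=Sphere(dimension=10), max_evals=10000)
    algo = CoralReefsOptimization(population_size=25, phi=0.4, asexual_reproduction_prob=0.5,
                                  broadcast_prob=0.5, depredation_prob=0.3, k=25,
                                  crossover_rate=0.5, mutation_rate=0.36, seed=1)
    best_x, best_fitness = algo.run(task)
    print(algo.get_parameters())  # inspect the effective parameter values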
- class niapy.algorithms.basic.CuckooSearch(population_size=25, pa=0.25, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Cuckoo behaviour and levy flights.
- Algorithm:
Cuckoo Search
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference:
Yang, Xin-She, and Suash Deb. “Cuckoo search via Lévy flights.” Nature & Biologically Inspired Computing, 2009. NaBIC 2009. World Congress on. IEEE, 2009.
- Variables
Name (List[str]) – list of strings representing algorithm names.
pa (float) – Probability of a nest being abandoned.
See also
Initialize CuckooSearch.
- Parameters
population_size (int) – Population size.
pa (float) – Probability of a nest being abandoned.
- Name = ['CuckooSearch', 'CS']¶
- __init__(population_size=25, pa=0.25, *args, **kwargs)[source]¶
Initialize CuckooSearch.
- Parameters
population_size (int) – Population size.
pa (float) – Probability of a nest being abandoned.
- static info()[source]¶
Get algorithms information.
- Returns
Algorithm information.
- Return type
str
See also
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of CuckooSearch algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray) – Current populations fitness/function values.
best_x (numpy.ndarray) – Global best individual.
best_fitness (float) – Global best individual function/fitness values.
**params (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New population fitness/function values.
New global best solution.
New global best solutions fitness/objective value.
Additional arguments.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
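A short sketch of typical usage (assuming niapy.task.Task and niapy.problems.Sphere from the wider library):

    from niapy.algorithms.basic import CuckooSearch
    from niapy.problems import Sphere
    from niapy.task import Task

    task = Task(problem=Sphere(dimension=10), max_iters=500)
    algo = CuckooSearch(population_size=25, pa=0.25, seed=1)  # pa: nest abandonment probability
    best_x, best_fitness = algo.run(task)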
- class niapy.algorithms.basic.DifferentialEvolution(population_size=50, differential_weight=1, crossover_probability=0.8, strategy=<function cross_rand1>, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Differential evolution algorithm.
- Algorithm:
Differential evolution algorithm
- Date:
2018
- Author:
Uros Mlakar and Klemen Berkovič
- License:
MIT
- Reference paper:
Storn, Rainer, and Kenneth Price. “Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces.” Journal of global optimization 11.4 (1997): 341-359.
- Variables
Name (List[str]) – List of string of names for algorithm.
differential_weight (float) – Scale factor.
crossover_probability (float) – Crossover probability.
strategy (Callable[[numpy.ndarray, int, numpy.ndarray, float, float, numpy.random.Generator, Dict[str, Any]], numpy.ndarray]) – Crossover and mutation strategy.
See also
Initialize DifferentialEvolution.
- Parameters
population_size (Optional[int]) – Population size.
differential_weight (Optional[float]) – Differential weight (scale factor).
crossover_probability (Optional[float]) – Crossover rate.
strategy (Optional[Callable[[numpy.ndarray, int, numpy.ndarray, float, float, numpy.random.Generator, list], numpy.ndarray]]) – Crossover and mutation strategy.
- Name = ['DifferentialEvolution', 'DE']¶
- __init__(population_size=50, differential_weight=1, crossover_probability=0.8, strategy=<function cross_rand1>, *args, **kwargs)[source]¶
Initialize DifferentialEvolution.
- Parameters
population_size (Optional[int]) – Population size.
differential_weight (Optional[float]) – Differential weight (scale factor).
crossover_probability (Optional[float]) – Crossover rate.
strategy (Optional[Callable[[numpy.ndarray, int, numpy.ndarray, float, float, numpy.random.Generator, list], numpy.ndarray]]) – Crossover and mutation strategy.
- evolve(pop, xb, task, **kwargs)[source]¶
Evolve population.
- Parameters
pop (numpy.ndarray) – Current population.
xb (numpy.ndarray) – Current best individual.
task (Task) – Optimization task.
- Returns
New evolved populations.
- Return type
numpy.ndarray
- get_parameters()[source]¶
Get parameters values of the algorithm.
- Returns
Algorithm parameters.
- Return type
Dict[str, Any]
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
See also
- post_selection(pop, task, xb, fxb, **kwargs)[source]¶
Apply additional operation after selection.
- Parameters
pop (numpy.ndarray) – Current population.
task (Task) – Optimization task.
xb (numpy.ndarray) – Global best solution.
fxb (float) – Global best fitness.
- Returns
New population.
New global best solution.
New global best solutions fitness/objective value.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, float]
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of Differential Evolution algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray) – Current populations fitness/function values.
best_x (numpy.ndarray) – Current best individual.
best_fitness (float) – Current best individual function/fitness value.
**params (dict) – Additional arguments.
- Returns
New population.
New population fitness/function values.
New global best solution.
New global best solutions fitness/objective value.
Additional arguments.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- selection(population, new_population, best_x, best_fitness, task, **kwargs)[source]¶
Operator for selection.
- Parameters
population (numpy.ndarray) – Current population.
new_population (numpy.ndarray) – New Population.
best_x (numpy.ndarray) – Current global best solution.
best_fitness (float) – Current global best solutions fitness/objective value.
task (Task) – Optimization task.
- Returns
New selected individuals.
New global best solution.
New global best solutions fitness/objective value.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, float]
- set_parameters(population_size=50, differential_weight=1, crossover_probability=0.8, strategy=<function cross_rand1>, **kwargs)[source]¶
Set the algorithm parameters.
- Parameters
population_size (Optional[int]) – Population size.
differential_weight (Optional[float]) – Differential weight (scale factor).
crossover_probability (Optional[float]) – Crossover rate.
strategy (Optional[Callable[[numpy.ndarray, int, numpy.ndarray, float, float, numpy.random.Generator, list], numpy.ndarray]]) – Crossover and mutation strategy.
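The strategy parameter accepts any of the crossover/mutation helpers whose defaults appear in the signatures above (cross_rand1, cross_best1, ...). A hedged sketch, assuming those helpers are importable from niapy.algorithms.basic.de and that Task/Sphere come from the wider niapy API:

    from niapy.algorithms.basic import DifferentialEvolution
    from niapy.algorithms.basic.de import cross_best1  # assumed location of the strategy helpers
    from niapy.problems import Sphere
    from niapy.task import Task

    task = Task(problem=Sphere(dimension=10), max_iters=300)
    # DE/best/1 variant with a smaller differential weight and a higher crossover probability.
    algo = DifferentialEvolution(population_size=50, differential_weight=0.5,
                                 crossover_probability=0.9, strategy=cross_best1, seed=1)
    best_x, best_fitness = algo.run(task)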
- class niapy.algorithms.basic.DynNpDifferentialEvolution(population_size=10, p_max=50, rp=3, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.de.DifferentialEvolution
Implementation of Dynamic population size Differential evolution algorithm.
- Algorithm:
Dynamic population size Differential evolution algorithm
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Variables
Name (List[str]) – List of strings representing algorithm names.
p_max (int) – Number of population reductions.
rp (int) – Small non-negative number which is added to value of generations.
Initialize DynNpDifferentialEvolution.
- Parameters
p_max (Optional[int]) – Number of population reductions.
rp (Optional[int]) – Small non-negative number which is added to value of generations.
- Name = ['DynNpDifferentialEvolution', 'dynNpDE']¶
- __init__(population_size=10, p_max=50, rp=3, *args, **kwargs)[source]¶
Initialize DynNpDifferentialEvolution.
- Parameters
p_max (Optional[int]) – Number of population reductions.
rp (Optional[int]) – Small non-negative number which is added to value of generations.
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
See also
- post_selection(pop, task, xb, fxb, **kwargs)[source]¶
Post selection operator.
In this algorithm the post selection operator reduces the population size at specific iterations/generations.
- Parameters
pop (numpy.ndarray) – Current population.
task (Task) – Optimization task.
xb (numpy.ndarray) – Global best individual coordinates.
fxb (float) – Global best fitness.
kwargs (Dict[str, Any]) – Additional arguments.
- Returns
Changed current population.
New global best solution.
New global best solutions fitness/objective value.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, float]
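A brief sketch of the dynamic-population variant: the population is reduced p_max times over the run, and the parent DifferentialEvolution parameters can be passed through the keyword arguments (Task/Sphere assumed from the wider niapy API):

    from niapy.algorithms.basic import DynNpDifferentialEvolution
    from niapy.problems import Sphere
    from niapy.task import Task

    task = Task(problem=Sphere(dimension=10), max_iters=1000)
    # Start with a larger population and shrink it four times during the run.
    algo = DynNpDifferentialEvolution(population_size=100, p_max=4, rp=3,
                                      differential_weight=0.8, crossover_probability=0.9, seed=1)
    best_x, best_fitness = algo.run(task)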
- class niapy.algorithms.basic.DynNpMultiStrategyDifferentialEvolution(population_size=40, strategies=(<function cross_rand1>, <function cross_best1>, <function cross_curr2best1>, <function cross_rand2>), *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.de.MultiStrategyDifferentialEvolution
niapy.algorithms.basic.de.DynNpDifferentialEvolution
Implementation of Differential Evolution algorithm with multiple mutation strategies and a dynamic population size defined by the quality of the population.
- Algorithm:
Multi-strategy Differential Evolution algorithm with dynamic population size defined by the quality of the population
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Variables
Name (List[str]) – List of strings representing algorithm name.
See also
Initialize MultiStrategyDifferentialEvolution.
- Parameters
strategies (Optional[Iterable[Callable[[numpy.ndarray[Individual], int, Individual, float, float, numpy.random.Generator], numpy.ndarray[Individual]]]]) – List of mutation strategies.
- Name = ['DynNpMultiStrategyDifferentialEvolution', 'dynNpMsDE']¶
- evolve(pop, xb, task, **kwargs)[source]¶
Evolve the current population.
- Parameters
pop (numpy.ndarray) – Current population.
xb (numpy.ndarray) – Global best solution.
task (Task) – Optimization task.
- Returns
Evolved new population.
- Return type
numpy.ndarray
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
See also
- post_selection(pop, task, xb, fxb, **kwargs)[source]¶
Post selection operator.
- Parameters
pop (numpy.ndarray) – Current population.
task (Task) – Optimization task.
xb (numpy.ndarray) – Global best individual
fxb (float) – Global best fitness.
- Returns
New population.
New global best solution.
New global best solutions fitness/objective value.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, float]
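Because this variant combines several mutation strategies, the strategies argument takes an iterable of strategy callables, mirroring the defaults in the class signature (a hedged sketch; the helpers are assumed to be importable from niapy.algorithms.basic.de, and Task/Sphere from the wider niapy API):

    from niapy.algorithms.basic import DynNpMultiStrategyDifferentialEvolution
    from niapy.algorithms.basic.de import cross_rand1, cross_best1, cross_curr2best1  # assumed helper location
    from niapy.problems import Sphere
    from niapy.task import Task

    task = Task(problem=Sphere(dimension=10), max_iters=1000)
    algo = DynNpMultiStrategyDifferentialEvolution(population_size=80,
                                                   strategies=(cross_rand1, cross_best1, cross_curr2best1),
                                                   seed=1)
    best_x, best_fitness = algo.run(task)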
- class niapy.algorithms.basic.DynamicFireworksAlgorithm(amplification_coeff=1.2, reduction_coeff=0.9, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.fwa.DynamicFireworksAlgorithmGauss
Implementation of dynamic fireworks algorithm.
- Algorithm:
Dynamic Fireworks Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6900485&isnumber=6900223
- Reference paper:
S. Zheng, A. Janecek, J. Li and Y. Tan, “Dynamic search in fireworks algorithm,” 2014 IEEE Congress on Evolutionary Computation (CEC), Beijing, 2014, pp. 3222-3229. doi: 10.1109/CEC.2014.6900485
- Variables
Name (List[str]) – List of strings representing algorithm name.
Initialize dynFWAG.
- Parameters
amplification_coeff (Union[int, float]) – Amplification coefficient.
reduction_coeff (Union[int, float]) – Reduction coefficient.
See also
- Name = ['DynamicFireworksAlgorithm', 'dynFWA']¶
- static info()[source]¶
Get default information of algorithm.
- Returns
Basic information.
- Return type
str
See also
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of Dynamic Fireworks Algorithm.
- Parameters
task (Task) – Optimization task
population (numpy.ndarray) – Current population
population_fitness (numpy.ndarray[float]) – Current population fitness/function values
best_x (numpy.ndarray) – Current best solution
best_fitness (float) – Current best solution’s fitness/function value
**params – Additional arguments.
- Returns
New population.
New population function/fitness values.
New global best solution.
New global best fitness.
Additional arguments.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], numpy.ndarray, float, Dict[str, Any]]
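Usage mirrors the other fireworks variants; a hedged sketch with the two documented coefficients, which control how the amplitude of the core firework is adapted between iterations (Task/Sphere assumed from the wider niapy API):

    from niapy.algorithms.basic import DynamicFireworksAlgorithm
    from niapy.problems import Sphere
    from niapy.task import Task

    task = Task(problem=Sphere(dimension=10), max_evals=20000)
    algo = DynamicFireworksAlgorithm(amplification_coeff=1.2, reduction_coeff=0.9, seed=1)
    best_x, best_fitness = algo.run(task)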
- class niapy.algorithms.basic.DynamicFireworksAlgorithmGauss(amplification_coeff=1.2, reduction_coeff=0.9, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.fwa.EnhancedFireworksAlgorithm
Implementation of dynamic fireworks algorithm.
- Algorithm:
Dynamic Fireworks Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6900485&isnumber=6900223
- Reference paper:
S. Zheng, A. Janecek, J. Li and Y. Tan, “Dynamic search in fireworks algorithm,” 2014 IEEE Congress on Evolutionary Computation (CEC), Beijing, 2014, pp. 3222-3229. doi: 10.1109/CEC.2014.6900485
- Variables
Name (List[str]) – List of strings representing algorithm names.
amplitude_cf (Union[float, int]) – Amplitude of the core firework.
amplification_coeff (Union[float, int]) – Amplification coefficient.
reduction_coeff (Union[float, int]) – Reduction coefficient.
Initialize dynFWAG.
- Parameters
amplification_coeff (Union[int, float]) – Amplification coefficient.
reduction_coeff (Union[int, float]) – Reduction coefficient.
See also
- Name = ['DynamicFireworksAlgorithmGauss', 'dynFWAG']¶
- __init__(amplification_coeff=1.2, reduction_coeff=0.9, *args, **kwargs)[source]¶
Initialize dynFWAG.
- Parameters
amplification_coeff (Union[int, float]) – Amplification coefficient.
reduction_coeff (Union[int, float]) – Reduction coefficient.
See also
- explosion_amplitudes(population_fitness, task=None)[source]¶
Calculate explosion amplitude for other fireworks.
- static info()[source]¶
Get default information of algorithm.
- Returns
Basic information.
- Return type
str
See also
- init_population(task)[source]¶
Initialize population.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized population.
Initialized population function/fitness values.
- Additional arguments:
amplitude_cf (numpy.ndarray): Initial amplitude of the core firework.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], Dict[str, Any]]
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of DynamicFireworksAlgorithmGauss algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray) – Current populations function/fitness values.
best_x (numpy.ndarray) – Global best individual.
best_fitness (float) – Global best fitness/function value.
**params (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New populations fitness/function values.
New global best solution.
New global best solutions fitness/objective value.
- Additional arguments:
amplitude_cf (numpy.ndarray): Amplitude of the core firework.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- selection(population, population_fitness, sparks, task)[source]¶
Select fireworks for the next generation.
- set_parameters(amplification_coeff=1.2, reduction_coeff=0.9, **kwargs)[source]¶
Set core arguments of DynamicFireworksAlgorithmGauss.
- Parameters
amplification_coeff (Union[int, float]) – Amplification coefficient.
reduction_coeff (Union[int, float]) – Reduction coefficient.
See also
- update_cf(xnb, xcb, xcb_f, xb, xb_f, amplitude_cf, task)[source]¶
Update the core firework.
- Parameters
xnb – Sparks generated by core fireworks.
xcb – Current generations best spark.
xcb_f – Current generations best fitness.
xb – Global best individual.
xb_f – Global best fitness.
amplitude_cf – Amplitude of the core firework.
task (Task) – Optimization task.
- Returns
New core firework.
New core firework’s fitness.
New core firework amplitude.
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray]
- class niapy.algorithms.basic.EnhancedFireworksAlgorithm(amplitude_init=0.2, amplitude_final=0.01, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.fwa.FireworksAlgorithm
Implementation of enhanced fireworks algorithm.
- Algorithm:
Enhanced Fireworks Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
S. Zheng, A. Janecek and Y. Tan, “Enhanced Fireworks Algorithm,” 2013 IEEE Congress on Evolutionary Computation, Cancun, 2013, pp. 2069-2077. doi: 10.1109/CEC.2013.6557813
- Variables
Name (List[str]) – List of strings representing algorithm names.
amplitude_init (float) – Initial amplitude of sparks.
amplitude_final (float) – Final amplitude of sparks.
Initialize EFWA.
- Parameters
amplitude_init (float) – Initial amplitude.
amplitude_final (float) – Final amplitude.
See also
- Name = ['EnhancedFireworksAlgorithm', 'EFWA']¶
- __init__(amplitude_init=0.2, amplitude_final=0.01, *args, **kwargs)[source]¶
Initialize EFWA.
- Parameters
amplitude_init (float) – Initial amplitude.
amplitude_final (float) – Final amplitude.
See also
- explosion_amplitudes(population_fitness, task=None)[source]¶
Calculate explosion amplitude.
- Parameters
population_fitness (numpy.ndarray) – Population fitness values.
task (Task) – Optimization task.
- Returns
New amplitude.
- Return type
numpy.ndarray
- explosion_spark(x, amplitude, task)[source]¶
Explode a spark.
- Parameters
x (numpy.ndarray) – Individuals creating spark.
amplitude (float) – Amplitude of spark.
task (Task) – Optimization task.
- Returns
Sparks exploded with the specified amplitude.
- Return type
numpy.ndarray
- gaussian_spark(x, task, best_x=None)[source]¶
Create new individual.
- Parameters
x (numpy.ndarray) – Individual creating a spark.
task (Task) – Optimization task.
best_x (numpy.ndarray) – Current global best individual.
- Returns
New individual generated by gaussian noise.
- Return type
numpy.ndarray
- static info()[source]¶
Get default information of algorithm.
- Returns
Basic information.
- Return type
str
See also
- mapping(x, task)[source]¶
Fix value to bounds.
- Parameters
x (numpy.ndarray) – Individual to fix.
task (Task) – Optimization task.
- Returns
Individual in search range.
- Return type
numpy.ndarray
- selection(population, population_fitness, sparks, task)[source]¶
Generate new population.
- Parameters
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray[float]) – Current populations fitness/function values.
sparks (numpy.ndarray) – New population.
task (Task) – Optimization task.
- Returns
New population.
New populations fitness/function values.
New global best individual.
New global best fitness.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], numpy.ndarray, float]
- class niapy.algorithms.basic.EvolutionStrategy1p1(mu=1, k=10, c_a=1.1, c_r=0.5, epsilon=1e-20, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of (1 + 1) evolution strategy algorithm. Uses just one individual.
- Algorithm:
(1 + 1) Evolution Strategy Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
Reference URL:
- Reference paper:
KALYANMOY, Deb. “Multi-Objective optimization using evolutionary algorithms”. John Wiley & Sons, Ltd. Kanpur, India. 2001.
- Variables
Name (List[str]) – List of strings representing algorithm names.
mu (int) – Number of parents.
k (int) – Number of iterations before checking and fixing rho.
c_a (float) – Search range amplification factor.
c_r (float) – Search range reduction factor.
See also
Initialize EvolutionStrategy1p1.
- Parameters
mu (Optional[int]) – Number of parents
k (Optional[int]) – Number of iterations before checking and fixing rho
c_a (Optional[float]) – Search range amplification factor
c_r (Optional[float]) – Search range reduction factor
epsilon (Optional[float]) – Small number.
- Name = ['EvolutionStrategy1p1', 'EvolutionStrategy(1+1)', 'ES(1+1)']¶
- __init__(mu=1, k=10, c_a=1.1, c_r=0.5, epsilon=1e-20, *args, **kwargs)[source]¶
Initialize EvolutionStrategy1p1.
- Parameters
mu (Optional[int]) – Number of parents
k (Optional[int]) – Number of iterations before checking and fixing rho
c_a (Optional[float]) – Search range amplification factor
c_r (Optional[float]) – Search range reduction factor
epsilon (Optional[float]) – Small number.
- static info()[source]¶
Get algorithms information.
- Returns
Algorithm information.
- Return type
str
See also
- init_population(task)[source]¶
Initialize starting individual.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized individual.
Initialized individual fitness/function value.
- Additional arguments:
ki (int): Number of successful rho update.
- Return type
Tuple[Individual, float, Dict[str, Any]]
- mutate(x, rho)[source]¶
Mutate individual.
- Parameters
x (numpy.ndarray) – Current individual.
rho (float) – Current standard deviation.
- Returns
Mutated individual.
- Return type
numpy.ndarray
- run_iteration(task, c, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of EvolutionStrategy(1+1) algorithm.
- Parameters
task (Task) – Optimization task.
c (Individual) – Current position.
population_fitness (float) – Current position function/fitness value.
best_x (numpy.ndarray) – Global best position.
best_fitness (float) – Global best function/fitness value.
**params (Dict[str, Any]) – Additional arguments.
- Returns
Initialized individual.
Initialized individual fitness/function value.
New global best solution.
New global best solutions fitness/objective value.
- Additional arguments:
ki (int): Number of successful rho update.
- Return type
Tuple[Individual, float, Individual, float, Dict[str, Any]]
- set_parameters(mu=1, k=10, c_a=1.1, c_r=0.5, epsilon=1e-20, **kwargs)[source]¶
Set the arguments of an algorithm.
- Parameters
mu (Optional[int]) – Number of parents
k (Optional[int]) – Number of iterations before checking and fixing rho
c_a (Optional[float]) – Search range amplification factor
c_r (Optional[float]) – Search range reduction factor
epsilon (Optional[float]) – Small number.
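A small sketch of the single-individual strategy; every k iterations rho is checked and then amplified by c_a or reduced by c_r (Task and Sphere are assumed from the wider niapy API):

    from niapy.algorithms.basic import EvolutionStrategy1p1
    from niapy.problems import Sphere
    from niapy.task import Task

    task = Task(problem=Sphere(dimension=10), max_iters=2000)
    algo = EvolutionStrategy1p1(mu=1, k=10, c_a=1.1, c_r=0.5, seed=1)
    best_x, best_fitness = algo.run(task)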
- class niapy.algorithms.basic.EvolutionStrategyML(lam=45, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.es.EvolutionStrategyMpL
Implementation of (mu, lambda) evolution strategy algorithm. The algorithm is well suited to dynamic environments: mu individuals create lambda children, only the best mu children go to the new generation, and the mu parents are discarded.
- Algorithm:
(\(\mu, \lambda\)) Evolution Strategy Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
Reference URL:
Reference paper:
- Variables
Name (List[str]) – List of strings representing algorithm names
See also
niapy.algorithms.basic.es.EvolutionStrategyMpL
Initialize EvolutionStrategyMpL.
- Parameters
lam (int) – Number of new individuals generated by mutation.
- Name = ['EvolutionStrategyML', 'EvolutionStrategy(mu,lambda)', 'ES(m,l)']¶
- static info()[source]¶
Get algorithms information.
- Returns
Algorithm information.
- Return type
str
See also
- init_population(task)[source]¶
Initialize starting population.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized population.
Initialized populations fitness/function values.
Additional arguments.
- Return type
Tuple[numpy.ndarray[Individual], numpy.ndarray[float], Dict[str, Any]]
See also
niapy.algorithms.basic.es.EvolutionStrategyMpL.init_population()
- new_pop(pop)[source]¶
Return new population.
- Parameters
pop (numpy.ndarray) – Current population.
- Returns
New population.
- Return type
numpy.ndarray
- run_iteration(task, c, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of EvolutionStrategyML algorithm.
- Parameters
task (Task) – Optimization task.
c (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray) – Current population fitness/function values.
best_x (numpy.ndarray) – Global best individual.
best_fitness (float) – Global best individuals fitness/function value.
**params (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New populations fitness/function values.
New global best solution.
New global best solutions fitness/objective value.
Additional arguments.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- class niapy.algorithms.basic.EvolutionStrategyMp1(mu=40, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.es.EvolutionStrategy1p1
Implementation of (mu + 1) evolution strategy algorithm. The algorithm creates mu mutants, but only one individual goes into the new generation.
- Algorithm:
(\(\mu + 1\)) Evolution Strategy Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
Reference URL:
Reference paper:
- Variables
Name (List[str]) – List of strings representing algorithm names.
Initialize EvolutionStrategyMp1.
- Name = ['EvolutionStrategyMp1', 'EvolutionStrategy(mu+1)', 'ES(m+1)']¶
- class niapy.algorithms.basic.EvolutionStrategyMpL(lam=45, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.es.EvolutionStrategy1p1
Implementation of (mu + lambda) evolution strategy algorithm. Mutation creates lambda individuals. The lambda individuals compete with the mu individuals for survival, so only mu individuals go to the new generation.
- Algorithm:
(\(\mu + \lambda\)) Evolution Strategy Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
Reference URL:
Reference paper:
- Variables
Name (List[str]) – List of strings representing algorithm names
lam (int) – Lambda.
Initialize EvolutionStrategyMpL.
- Parameters
lam (int) – Number of new individuals generated by mutation.
- Name = ['EvolutionStrategyMpL', 'EvolutionStrategy(mu+lambda)', 'ES(m+l)']¶
- __init__(lam=45, *args, **kwargs)[source]¶
Initialize EvolutionStrategyMpL.
- Parameters
lam (int) – Number of new individuals generated by mutation.
- static change_count(c, cn)[source]¶
Update number of successful mutations for population.
- Parameters
c (numpy.ndarray[Individual]) – Current population.
cn (numpy.ndarray[Individual]) – New population.
- Returns
Number of successful mutations.
- Return type
int
- static info()[source]¶
Get algorithms information.
- Returns
Algorithm information.
- Return type
str
See also
- init_population(task)[source]¶
Initialize starting population.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized population.
Initialized populations function/fitness values.
- Additional arguments:
ki (int): Number of successful mutations.
- Return type
Tuple[numpy.ndarray[Individual], numpy.ndarray[float], Dict[str, Any]]
See also
niapy.algorithms.algorithm.Algorithm.init_population()
- mutate_rand(pop, task)[source]¶
Mutate a random individual from the population.
- Parameters
pop (numpy.ndarray[Individual]) – Current population.
task (Task) – Optimization task.
- Returns
Random individual from population that was mutated.
- Return type
numpy.ndarray
- run_iteration(task, c, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of EvolutionStrategyMpL algorithm.
- Parameters
task (Task) – Optimization task.
c (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray) – Current populations fitness/function values.
best_x (numpy.ndarray) – Global best individual.
best_fitness (float) – Global best individuals fitness/function value.
**params (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New populations function/fitness values.
New global best solution.
New global best solutions fitness/objective value.
- Additional arguments:
ki (int): Number of successful mutations.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- set_parameters(lam=45, **kwargs)[source]¶
Set the arguments of an algorithm.
- Parameters
lam (int) – Number of new individuals generated by mutation.
See also
niapy.algorithms.basic.es.EvolutionStrategy1p1.set_parameters()
- update_rho(pop, k)[source]¶
Update standard deviation for population.
- Parameters
pop (numpy.ndarray[Individual]) – Current population.
k (int) – Number of successful mutations.
- class niapy.algorithms.basic.FireflyAlgorithm(population_size=20, alpha=1, beta0=1, gamma=0.01, theta=0.97, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Firefly algorithm.
- Algorithm:
Firefly algorithm
- Date:
2016
- Authors:
Iztok Fister Jr, Iztok Fister and Klemen Berkovič
- License:
MIT
- Reference paper:
Fister, I., Fister Jr, I., Yang, X. S., & Brest, J. (2013). A comprehensive review of firefly algorithms. Swarm and Evolutionary Computation, 13, 34-46.
- Variables
Name (List[str]) – List of strings representing algorithm name.
alpha (float) – Randomness strength.
beta0 (float) – Attractiveness constant.
gamma (float) – Absorption coefficient.
theta (float) – Randomness reduction factor.
See also
Initialize FireflyAlgorithm.
- Parameters
population_size (Optional[int]) – Population size.
alpha (Optional[float]) – Randomness strength in [0, 1] (values near 1 are highly random).
beta0 (Optional[float]) – Attractiveness constant.
gamma (Optional[float]) – Absorption coefficient.
theta (Optional[float]) – Randomness reduction factor.
- Name = ['FireflyAlgorithm', 'FA']¶
- __init__(population_size=20, alpha=1, beta0=1, gamma=0.01, theta=0.97, *args, **kwargs)[source]¶
Initialize FireflyAlgorithm.
- Parameters
population_size (Optional[int]) – Population size.
alpha (Optional[float]) – Randomness strength in [0, 1] (values near 1 are highly random).
beta0 (Optional[float]) – Attractiveness constant.
gamma (Optional[float]) – Absorption coefficient.
theta (Optional[float]) – Randomness reduction factor.
- static info()[source]¶
Get algorithms information.
- Returns
Algorithm information.
- Return type
str
See also
- init_population(task)[source]¶
Initialize the starting population.
- Parameters
task (Task) – Optimization task
- Returns
New population.
New population fitness/function values.
- Additional arguments:
alpha (float): Randomness strength.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], Dict[str, Any]]
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of Firefly Algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray) – Current population function/fitness values.
best_x (numpy.ndarray) – Global best individual.
best_fitness (float) – Global best individual fitness/function value.
**params (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New population fitness/function values.
New global best solution
New global best solutions fitness/objective value
- Additional arguments:
alpha (float): Randomness strength.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
See also
niapy.algorithms.basic.FireflyAlgorithm.move_ffa()
- set_parameters(population_size=20, alpha=1, beta0=1, gamma=0.01, theta=0.97, **kwargs)[source]¶
Set the parameters of the algorithm.
- Parameters
population_size (Optional[int]) – Population size.
alpha (Optional[float]) – Randomness strength in [0, 1] (values near 1 are highly random).
beta0 (Optional[float]) – Attractiveness constant.
gamma (Optional[float]) – Absorption coefficient.
theta (Optional[float]) – Randomness reduction factor.
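A hedged usage sketch with the documented parameters; theta is the factor by which the randomness strength alpha is reduced (Task/Sphere assumed from the wider niapy API):

    from niapy.algorithms.basic import FireflyAlgorithm
    from niapy.problems import Sphere
    from niapy.task import Task

    task = Task(problem=Sphere(dimension=10), max_iters=500)
    algo = FireflyAlgorithm(population_size=20, alpha=1.0, beta0=1.0, gamma=0.01, theta=0.97, seed=1)
    best_x, best_fitness = algo.run(task)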
- class niapy.algorithms.basic.FireworksAlgorithm(population_size=5, num_sparks=50, a=0.04, b=0.8, max_amplitude=40, num_gaussian=5, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of fireworks algorithm.
- Algorithm:
Fireworks Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Tan, Ying. “Fireworks algorithm.” Heidelberg, Germany: Springer 10 (2015): 978-3
- Variables
Name (List[str]) – List of strings representing algorithm names.
Initialize FWA.
- Parameters
population_size (int) – Number of Fireworks
num_sparks (int) – Number of sparks
a (float) – Lower bound coefficient for the number of sparks.
b (float) – Upper bound coefficient for the number of sparks.
max_amplitude (float) – Initial amplitude.
num_gaussian (int) – Number of sparks to apply gaussian mutation to.
- Name = ['FireworksAlgorithm', 'FWA']¶
- __init__(population_size=5, num_sparks=50, a=0.04, b=0.8, max_amplitude=40, num_gaussian=5, *args, **kwargs)[source]¶
Initialize FWA.
- Parameters
population_size (int) – Number of Fireworks
num_sparks (int) – Number of sparks
a (float) – Lower bound coefficient for the number of sparks.
b (float) – Upper bound coefficient for the number of sparks.
max_amplitude (float) – Initial amplitude.
num_gaussian (int) – Number of sparks to apply gaussian mutation to.
- explosion_amplitudes(population_fitness, task=None)[source]¶
Calculate explosion amplitude.
- Parameters
population_fitness (numpy.ndarray) – Population fitness values.
task (Optional[Task]) – Optimization task (Unused in this version of the algorithm).
- Returns
Explosion amplitude of sparks.
- Return type
numpy.ndarray
- explosion_spark(x, amplitude, task)[source]¶
Explode a spark.
- Parameters
x (numpy.ndarray) – Individuals creating spark.
amplitude (float) – Amplitude of spark.
task (Task) – Optimization task.
- Returns
Sparks exploded with the specified amplitude.
- Return type
numpy.ndarray
- gaussian_spark(x, task, best_x=None)[source]¶
Create gaussian spark.
- Parameters
x (numpy.ndarray) – Individual creating a spark.
task (Task) – Optimization task.
best_x (numpy.ndarray) – Current best individual. Unused in this version of the algorithm.
- Returns
Spark exploded based on gaussian amplitude.
- Return type
numpy.ndarray
- static info()[source]¶
Get default information of algorithm.
- Returns
Basic information.
- Return type
str
See also
- mapping(x, task)[source]¶
Fix value to bounds.
- Parameters
x (numpy.ndarray) – Individual to fix.
task (Task) – Optimization task.
- Returns
Individual in search range.
- Return type
numpy.ndarray
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of Fireworks algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray[float]) – Current populations function/fitness values.
best_x (numpy.ndarray) – Global best individual.
best_fitness (float) – Global best individuals fitness/function value.
**params (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New population function/fitness values.
New global best solution.
New global best solutions fitness/objective value.
- Additional arguments:
Ah (numpy.ndarray): Initialized amplitudes.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- selection(population, population_fitness, sparks, task)[source]¶
Generate new generation of individuals.
- Parameters
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray[float]) – Currents population fitness/function values.
sparks (numpy.ndarray) – New population.
task (Task) – Optimization task.
- Returns
New population.
New populations fitness/function values.
New global best individual.
New global best fitness.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], numpy.ndarray, float]
- set_parameters(population_size=5, num_sparks=50, a=0.04, b=0.8, max_amplitude=40, num_gaussian=5, **kwargs)[source]¶
Set the arguments of an algorithm.
- Parameters
population_size (int) – Number of Fireworks
num_sparks (int) – Number of sparks
a (float) – Lower bound coefficient for the number of sparks.
b (float) – Upper bound coefficient for the number of sparks.
max_amplitude (float) – Initial amplitude.
num_gaussian (int) – Number of sparks to apply gaussian mutation to.
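A sketch of typical use; num_sparks together with a and b controls how many sparks each firework generates, and max_amplitude bounds how far they spread (Task/Sphere assumed from the wider niapy API):

    from niapy.algorithms.basic import FireworksAlgorithm
    from niapy.problems import Sphere
    from niapy.task import Task

    task = Task(problem=Sphere(dimension=10), max_evals=20000)
    algo = FireworksAlgorithm(population_size=5, num_sparks=50, a=0.04, b=0.8,
                              max_amplitude=40, num_gaussian=5, seed=1)
    best_x, best_fitness = algo.run(task)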
- class niapy.algorithms.basic.FishSchoolSearch(population_size=30, step_individual_init=0.1, step_individual_final=0.0001, step_volitive_init=0.01, step_volitive_final=0.001, min_w=1.0, w_scale=500.0, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Fish School Search algorithm.
- Algorithm:
Fish School Search algorithm
- Date:
2019
- Authors:
Clodomir Santana Jr, Elliackin Figueredo, Mariana Maceds, Pedro Santos. Ported to niapy with small changes by Kristian Järvenpää (2018). Ported to niapy 2.0 by Klemen Berkovič (2019).
- License:
MIT
- Reference paper:
Bastos Filho, Lima Neto, Lins, D. O. Nascimento and P. Lima, “A novel search algorithm based on fish school behavior,” in 2008 IEEE International Conference on Systems, Man and Cybernetics, Oct 2008, pp. 2646–2651.
- Variables
Name (List[str]) – List of strings representing algorithm name.
step_individual_init (float) – Length of initial individual step.
step_individual_final (float) – Length of final individual step.
step_volitive_init (float) – Length of initial volitive step.
step_volitive_final (float) – Length of final volitive step.
min_w (float) – Minimum weight of a fish.
w_scale (float) – Maximum weight of a fish.
See also
Initialize FishSchoolSearch.
- Parameters
population_size (Optional[int]) – Number of fishes in school.
step_individual_init (Optional[float]) – Length of initial individual step.
step_individual_final (Optional[float]) – Length of final individual step.
step_volitive_init (Optional[float]) – Length of initial volitive step.
step_volitive_final (Optional[float]) – Length of final volitive step.
min_w (Optional[float]) – Minimum weight of a fish.
w_scale (Optional[float]) – Maximum weight of a fish. Recommended value: max_iterations / 2
- Name = ['FSS', 'FishSchoolSearch']¶
- __init__(population_size=30, step_individual_init=0.1, step_individual_final=0.0001, step_volitive_init=0.01, step_volitive_final=0.001, min_w=1.0, w_scale=500.0, *args, **kwargs)[source]¶
Initialize FishSchoolSearch.
- Parameters
population_size (Optional[int]) – Number of fishes in school.
step_individual_init (Optional[float]) – Length of initial individual step.
step_individual_final (Optional[float]) – Length of final individual step.
step_volitive_init (Optional[float]) – Length of initial volitive step.
step_volitive_final (Optional[float]) – Length of final volitive step.
min_w (Optional[float]) – Minimum weight of a fish.
w_scale (Optional[float]) – Maximum weight of a fish. Recommended value: max_iterations / 2
- collective_instinctive_movement(school, task)[source]¶
Perform collective instinctive movement.
- Parameters
school (numpy.ndarray) – Current population.
task (Task) – Optimization task.
- Returns
New population
- Return type
numpy.ndarray
- collective_volitive_movement(school, step_volitive, school_weight, xb, fxb, task)[source]¶
Perform collective volitive movement.
- Parameters
school (numpy.ndarray) – Current school fish population.
step_volitive (float) – Current volitive step.
school_weight (float) – Current school weight.
xb (numpy.ndarray) – Global best solution.
fxb (float) – Global best solutions fitness/objective value.
task (Task) – Optimization task.
- Returns
New population.
New global best individual.
New global best fitness.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, float]
- feeding(school)[source]¶
Feed all fishes.
- Parameters
school (numpy.ndarray) – Current school fish population.
- Returns
New school fish population.
- Return type
numpy.ndarray
- get_parameters()[source]¶
Get algorithm parameters.
- Returns
Algorithm parameters.
- Return type
Dict[str, Any]
- individual_movement(school, step_individual, xb, fxb, task)[source]¶
Perform individual movement for each fish.
- Parameters
school (numpy.ndarray) – School fish population.
step_individual (numpy.ndarray) – Current individual step.
xb (numpy.ndarray) – Global best solution.
fxb (float) – Global best solutions fitness/objective value.
task (Task) – Optimization task.
- Returns
New school of fishes.
New global best position.
New global best fitness.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, float]
- static info()[source]¶
Get default information of algorithm.
- Returns
Basic information.
- Return type
str
See also
- init_population(task)[source]¶
Initialize the school.
- Parameters
task (Task) – Optimization task.
- Returns
Population.
Population fitness.
- Additional arguments:
step_individual (float): Current individual step.
step_volitive (float): Current volitive step.
school_weight (float): Current school weight.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, dict]
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray) – Current population fitness.
best_x (numpy.ndarray) – Current global best individual.
best_fitness (float) – Current global best fitness.
**params – Additional parameters.
- Returns
New Population.
New Population fitness.
New global best individual.
New global best fitness.
- Additional parameters:
step_individual (float): Current individual step.
step_volitive (float): Current volitive step.
school_weight (float): Current school weight.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, dict]
- set_parameters(population_size=30, step_individual_init=0.1, step_individual_final=0.0001, step_volitive_init=0.01, step_volitive_final=0.001, min_w=1.0, w_scale=5000.0, **kwargs)[source]¶
Set core arguments of FishSchoolSearch algorithm.
- Parameters
population_size (Optional[int]) – Number of fishes in school.
step_individual_init (Optional[float]) – Length of initial individual step.
step_individual_final (Optional[float]) – Length of final individual step.
step_volitive_init (Optional[float]) – Length of initial volitive step.
step_volitive_final (Optional[float]) – Length of final volitive step.
min_w (Optional[float]) – Minimum weight of a fish.
w_scale (Optional[float]) – Maximum weight of a fish. Recommended value: max_iterations / 2
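A hedged sketch that follows the recommendation above of tying w_scale to the iteration budget (Task/Sphere assumed from the wider niapy API):

    from niapy.algorithms.basic import FishSchoolSearch
    from niapy.problems import Sphere
    from niapy.task import Task

    max_iters = 1000
    task = Task(problem=Sphere(dimension=10), max_iters=max_iters)
    algo = FishSchoolSearch(population_size=30, step_individual_init=0.1, step_individual_final=0.0001,
                            step_volitive_init=0.01, step_volitive_final=0.001,
                            min_w=1.0, w_scale=max_iters / 2, seed=1)
    best_x, best_fitness = algo.run(task)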
- class niapy.algorithms.basic.FlowerPollinationAlgorithm(population_size=20, p=0.8, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Flower Pollination algorithm.
- Algorithm:
Flower Pollination algorithm
- Date:
2018
- Authors:
Dusan Fister, Iztok Fister Jr. and Klemen Berkovič
- License:
MIT
- Reference paper:
Yang, Xin-She. “Flower pollination algorithm for global optimization.” International Conference on Unconventional Computing and Natural Computation. Springer, Berlin, Heidelberg, 2012.
- References URL:
Implementation is based on the following MATLAB code: https://www.mathworks.com/matlabcentral/fileexchange/45112-flower-pollination-algorithm?requestedDomain=true
- Variables
Name (List[str]) – List of strings representing algorithm names.
p (float) – Switch probability.
See also
Initialize FlowerPollinationAlgorithm.
- Parameters
population_size (int) – Population size.
p (float) – Switch probability.
- Name = ['FlowerPollinationAlgorithm', 'FPA']¶
- __init__(population_size=20, p=0.8, *args, **kwargs)[source]¶
Initialize FlowerPollinationAlgorithm.
- Parameters
population_size (int) – Population size.
p (float) – Switch probability.
- static info()[source]¶
Get default information of algorithm.
- Returns
Basic information.
- Return type
str
See also
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of FlowerPollinationAlgorithm algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray) – Current population fitness/function values.
best_x (numpy.ndarray) – Global best solution.
best_fitness (float) – Global best solution function/fitness value.
**params (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New populations fitness/function values.
New global best solution.
New global best solution fitness/objective value.
Additional arguments.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
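A minimal sketch; p is the switch probability between global (Lévy-flight) and local pollination (Task/Sphere assumed from the wider niapy API):

    from niapy.algorithms.basic import FlowerPollinationAlgorithm
    from niapy.problems import Sphere
    from niapy.task import Task

    task = Task(problem=Sphere(dimension=10), max_iters=500)
    algo = FlowerPollinationAlgorithm(population_size=20, p=0.8, seed=1)
    best_x, best_fitness = algo.run(task)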
- class niapy.algorithms.basic.ForestOptimizationAlgorithm(population_size=10, lifetime=3, area_limit=10, local_seeding_changes=1, global_seeding_changes=1, transfer_rate=0.3, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Forest Optimization Algorithm.
- Algorithm:
Forest Optimization Algorithm
- Date:
2019
- Authors:
Luka Pečnik
- License:
MIT
- Reference paper:
Manizheh Ghaemi, Mohammad-Reza Feizi-Derakhshi, Forest Optimization Algorithm, Expert Systems with Applications, Volume 41, Issue 15, 2014, Pages 6676-6687, ISSN 0957-4174, https://doi.org/10.1016/j.eswa.2014.05.009.
- References URL:
Implementation is based on the following MATLAB code: https://github.com/cominsys/FOA
- Variables
Name (List[str]) – List of strings representing algorithm name.
lifetime (int) – Life time of trees parameter.
area_limit (int) – Area limit parameter.
local_seeding_changes (int) – Local seeding changes parameter.
global_seeding_changes (int) – Global seeding changes parameter.
transfer_rate (float) – Transfer rate parameter.
See also
Initialize ForestOptimizationAlgorithm.
- Parameters
population_size (Optional[int]) – Population size.
lifetime (Optional[int]) – Life time parameter.
area_limit (Optional[int]) – Area limit parameter.
local_seeding_changes (Optional[int]) – Local seeding changes parameter.
global_seeding_changes (Optional[int]) – Global seeding changes parameter.
transfer_rate (Optional[float]) – Transfer rate parameter.
- Name = ['ForestOptimizationAlgorithm', 'FOA']¶
- __init__(population_size=10, lifetime=3, area_limit=10, local_seeding_changes=1, global_seeding_changes=1, transfer_rate=0.3, *args, **kwargs)[source]¶
Initialize ForestOptimizationAlgorithm.
- Parameters
population_size (Optional[int]) – Population size.
lifetime (Optional[int]) – Life time parameter.
area_limit (Optional[int]) – Area limit parameter.
local_seeding_changes (Optional[int]) – Local seeding changes parameter.
global_seeding_changes (Optional[int]) – Global seeding changes parameter.
transfer_rate (Optional[float]) – Transfer rate parameter.
- get_parameters()[source]¶
Get parameters values of the algorithm.
- Returns
Algorithm parameters.
- Return type
Dict[str, Any]
- global_seeding(task, candidates, size)[source]¶
Global optimum search stage that should prevent getting stuck in a local optimum.
- Parameters
task (Task) – Optimization task.
candidates (numpy.ndarray) – Candidate population for global seeding.
size (int) – Number of trees to produce.
- Returns
Resulting trees.
- Return type
numpy.ndarray
- static info()[source]¶
Get algorithms information.
- Returns
Algorithm information.
- Return type
str
See also
- init_population(task)[source]¶
Initialize the starting population.
- Parameters
task (Task) – Optimization task
- Returns
New population.
New population fitness/function values.
- Additional arguments:
age (numpy.ndarray[int32]): Age of trees.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], Dict[str, Any]]
- local_seeding(task, trees)[source]¶
Local optimum search stage.
- Parameters
task (Task) – Optimization task.
trees (numpy.ndarray) – Zero age trees for local seeding.
- Returns
Resulting zero age trees.
- Return type
numpy.ndarray
- remove_lifetime_exceeded(trees, age)[source]¶
Remove dead trees.
- Parameters
trees (numpy.ndarray) – Population to test.
age (numpy.ndarray[int32]) – Age of trees.
- Returns
Alive trees.
New candidate population.
Age of trees.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray[int32]]
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of Forest Optimization Algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray[float]) – Current population function/fitness values.
best_x (numpy.ndarray) – Global best individual.
best_fitness (float) – Global best individual fitness/function value.
**params (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New population fitness/function values.
New global best solution.
New global best solutions fitness/objective value.
- Additional arguments:
age (numpy.ndarray[int32]): Age of trees.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], numpy.ndarray, float, Dict[str, Any]]
- set_parameters(population_size=10, lifetime=3, area_limit=10, local_seeding_changes=1, global_seeding_changes=1, transfer_rate=0.3, **kwargs)[source]¶
Set the parameters of the algorithm.
- Parameters
population_size (Optional[int]) – Population size.
lifetime (Optional[int]) – Life time parameter.
area_limit (Optional[int]) – Area limit parameter.
local_seeding_changes (Optional[int]) – Local seeding changes parameter.
global_seeding_changes (Optional[int]) – Global seeding changes parameter.
transfer_rate (Optional[float]) – Transfer rate parameter.
- survival_of_the_fittest(task, trees, candidates, age)[source]¶
Evaluate and filter current population.
- Parameters
task (Task) – Optimization task.
trees (numpy.ndarray) – Population to evaluate.
candidates (numpy.ndarray) – Candidate population array to be updated.
age (numpy.ndarray[int32]) – Age of trees.
- Returns
Trees sorted by fitness value.
Updated candidate population.
Population fitness values.
Age of trees
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray[float], numpy.ndarray[int32]]
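A hedged usage sketch with the documented parameters (Task/Sphere assumed from the wider niapy API):

    from niapy.algorithms.basic import ForestOptimizationAlgorithm
    from niapy.problems import Sphere
    from niapy.task import Task

    task = Task(problem=Sphere(dimension=10), max_iters=500)
    algo = ForestOptimizationAlgorithm(population_size=10, lifetime=3, area_limit=10,
                                       local_seeding_changes=1, global_seeding_changes=1,
                                       transfer_rate=0.3, seed=1)
    best_x, best_fitness = algo.run(task)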
- class niapy.algorithms.basic.GeneticAlgorithm(population_size=25, tournament_size=5, mutation_rate=0.25, crossover_rate=0.25, selection=<function tournament_selection>, crossover=<function uniform_crossover>, mutation=<function uniform_mutation>, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Genetic Algorithm.
- Algorithm:
Genetic algorithm
- Date:
2018
- Author:
Klemen Berkovič
- Reference paper:
Goldberg, David (1989). Genetic Algorithms in Search, Optimization and Machine Learning. Reading, MA: Addison-Wesley Professional.
- License:
MIT
- Variables
Name (List[str]) – List of strings representing algorithm name.
tournament_size (int) – Tournament size.
mutation_rate (float) – Mutation rate.
crossover_rate (float) – Crossover rate.
selection (Callable[[numpy.ndarray[Individual], int, int, Individual, numpy.random.Generator], Individual]) – selection operator.
crossover (Callable[[numpy.ndarray[Individual], int, float, numpy.random.Generator], Individual]) – Crossover operator.
mutation (Callable[[numpy.ndarray[Individual], int, float, Task, numpy.random.Generator], Individual]) – Mutation operator.
See also
Initialize GeneticAlgorithm.
- Parameters
population_size (Optional[int]) – Population size.
tournament_size (Optional[int]) – Tournament size.
mutation_rate (Optional[float]) – Mutation rate.
crossover_rate (Optional[float]) – Crossover rate.
selection (Optional[Callable[[numpy.ndarray[Individual], int, int, Individual, numpy.random.Generator], Individual]]) – Selection operator.
crossover (Optional[Callable[[numpy.ndarray[Individual], int, float, numpy.random.Generator], Individual]]) – Crossover operator.
mutation (Optional[Callable[[numpy.ndarray[Individual], int, float, Task, numpy.random.Generator], Individual]]) – Mutation operator.
See also
- selection:
niapy.algorithms.basic.tournament_selection()
niapy.algorithms.basic.roulette_selection()
- Crossover:
niapy.algorithms.basic.uniform_crossover()
niapy.algorithms.basic.two_point_crossover()
niapy.algorithms.basic.multi_point_crossover()
niapy.algorithms.basic.crossover_uros()
- Mutations:
niapy.algorithms.basic.uniform_mutation()
niapy.algorithms.basic.creep_mutation()
niapy.algorithms.basic.mutation_uros()
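The operator functions listed above can be passed in when constructing the algorithm. The sketch below assumes they are importable from niapy.algorithms.basic, as the references above suggest, plus the usual Task/problem helpers; it is an illustration, not the canonical configuration.

from niapy.algorithms.basic import (GeneticAlgorithm, tournament_selection,
                                    two_point_crossover, creep_mutation)
from niapy.problems import Rastrigin
from niapy.task import Task

# Swap the default operators for alternatives from the lists above.
algorithm = GeneticAlgorithm(population_size=50, tournament_size=5,
                             mutation_rate=0.2, crossover_rate=0.8,
                             selection=tournament_selection,
                             crossover=two_point_crossover,
                             mutation=creep_mutation)
task = Task(problem=Rastrigin(dimension=20), max_evals=20000)
best_x, best_fitness = algorithm.run(task)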
- Name = ['GeneticAlgorithm', 'GA']¶
- __init__(population_size=25, tournament_size=5, mutation_rate=0.25, crossover_rate=0.25, selection=<function tournament_selection>, crossover=<function uniform_crossover>, mutation=<function uniform_mutation>, *args, **kwargs)[source]¶
Initialize GeneticAlgorithm.
- Parameters
population_size (Optional[int]) – Population size.
tournament_size (Optional[int]) – Tournament size.
mutation_rate (Optional[float]) – Mutation rate.
crossover_rate (Optional[float]) – Crossover rate.
selection (Optional[Callable[[numpy.ndarray[Individual], int, int, Individual, numpy.random.Generator], Individual]]) – Selection operator.
crossover (Optional[Callable[[numpy.ndarray[Individual], int, float, numpy.random.Generator], Individual]]) – Crossover operator.
mutation (Optional[Callable[[numpy.ndarray[Individual], int, float, Task, numpy.random.Generator], Individual]]) – Mutation operator.
See also
- selection:
niapy.algorithms.basic.tournament_selection()
niapy.algorithms.basic.roulette_selection()
- Crossover:
niapy.algorithms.basic.uniform_crossover()
niapy.algorithms.basic.two_point_crossover()
niapy.algorithms.basic.multi_point_crossover()
niapy.algorithms.basic.crossover_uros()
- Mutations:
niapy.algorithms.basic.uniform_mutation()
niapy.algorithms.basic.creep_mutation()
niapy.algorithms.basic.mutation_uros()
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
See also
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of GeneticAlgorithm algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray) – Current populations fitness/function values.
best_x (numpy.ndarray) – Global best individual.
best_fitness (float) – Global best individuals function/fitness value.
**params (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New populations function/fitness values.
New global best solution
New global best solutions fitness/objective value
Additional arguments.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- set_parameters(population_size=25, tournament_size=5, mutation_rate=0.25, crossover_rate=0.25, selection=<function tournament_selection>, crossover=<function uniform_crossover>, mutation=<function uniform_mutation>, **kwargs)[source]¶
Set the parameters of the algorithm.
- Parameters
population_size (Optional[int]) – Population size.
tournament_size (Optional[int]) – Tournament size.
mutation_rate (Optional[float]) – Mutation rate.
crossover_rate (Optional[float]) – Crossover rate.
selection (Optional[Callable[[numpy.ndarray[Individual], int, int, Individual, numpy.random.Generator], Individual]]) – selection operator.
crossover (Optional[Callable[[numpy.ndarray[Individual], int, float, numpy.random.Generator], Individual]]) – Crossover operator.
mutation (Optional[Callable[[numpy.ndarray[Individual], int, float, Task, numpy.random.Generator], Individual]]) – Mutation operator.
See also
- selection:
niapy.algorithms.basic.tournament_selection()
niapy.algorithms.basic.roulette_selection()
- Crossover:
niapy.algorithms.basic.uniform_crossover()
niapy.algorithms.basic.two_point_crossover()
niapy.algorithms.basic.multi_point_crossover()
niapy.algorithms.basic.crossover_uros()
- Mutations:
niapy.algorithms.basic.uniform_mutation()
niapy.algorithms.basic.creep_mutation()
niapy.algorithms.basic.mutation_uros()
- class niapy.algorithms.basic.GlowwormSwarmOptimization(population_size=25, l0=5, nt=5, rho=0.4, gamma=0.6, beta=0.08, s=0.03, distance=<function euclidean>, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of glowworm swarm optimization.
- Algorithm:
Glowworm Swarm Optimization Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Kaipa, Krishnanand N., and Debasish Ghose. Glowworm swarm optimization: theory, algorithms, and applications. Vol. 698. Springer, 2017.
- Variables
Name (List[str]) – List of strings representing algorithm name.
l0 (float) – Initial luciferin quantity for each glowworm.
nt (float) – Number of neighbors.
rho (float) – Luciferin decay constant.
gamma (float) – Luciferin enhancement constant.
beta (float) – Constant.
s (float) – Step size.
distance (Callable[[numpy.ndarray, numpy.ndarray], float]) – Measure distance between two individuals.
See also
NiaPy.algorithms.algorithm.Algorithm
Initialize GlowwormSwarmOptimization.
- Parameters
population_size (Optional[int]) – Number of glowworms in population.
l0 (Optional[float]) – Initial luciferin quantity for each glowworm.
nt (Optional[int]) – Number of neighbors.
rho (Optional[float]) – Luciferin decay constant.
gamma (Optional[float]) – Luciferin enhancement constant.
beta (Optional[float]) – Constant.
s (Optional[float]) – Step size.
distance (Optional[Callable[[numpy.ndarray, numpy.ndarray], float]]) – Measure distance between two individuals.
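A short usage sketch (Task/problem helpers and the inherited run() method are assumed, as in the other examples in this reference); the custom distance below is only there to show a callable matching the documented Callable[[numpy.ndarray, numpy.ndarray], float] signature.

import numpy as np
from niapy.algorithms.basic import GlowwormSwarmOptimization
from niapy.problems import Ackley
from niapy.task import Task

def manhattan(a, b):
    # Any callable with the documented signature can replace the default euclidean.
    return float(np.sum(np.abs(a - b)))

algorithm = GlowwormSwarmOptimization(population_size=25, l0=5, nt=5, rho=0.4,
                                      gamma=0.6, beta=0.08, s=0.03,
                                      distance=manhattan)
task = Task(problem=Ackley(dimension=10), max_iters=200)
best_x, best_fitness = algorithm.run(task)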
- Name = ['GlowwormSwarmOptimization', 'GSO']¶
- __init__(population_size=25, l0=5, nt=5, rho=0.4, gamma=0.6, beta=0.08, s=0.03, distance=<function euclidean>, *args, **kwargs)[source]¶
Initialize GlowwormSwarmOptimization.
- Parameters
population_size (Optional[int]) – Number of glowworms in population.
l0 (Optional[float]) – Initial luciferin quantity for each glowworm.
nt (Optional[int]) – Number of neighbors.
rho (Optional[float]) – Luciferin decay constant.
gamma (Optional[float]) – Luciferin enhancement constant.
beta (Optional[float]) – Constant.
s (Optional[float]) – Step size.
distance (Optional[Callable[[numpy.ndarray, numpy.ndarray], float]]) – Measure distance between two individuals.
- get_neighbors(i, r, glowworms, luciferin)[source]¶
Get neighbours of glowworm.
- Parameters
i (int) – Index of glowworm.
r (float) – Neighborhood distance.
glowworms (numpy.ndarray) – Current glowworm population.
luciferin (numpy.ndarray[float]) – Luciferin values of glowworms.
- Returns
Indexes of neighborhood glowworms.
- Return type
numpy.ndarray[int]
- get_parameters()[source]¶
Get algorithms parameters values.
- Returns
Algorithm parameters.
- Return type
Dict[str, Any]
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information.
- Return type
str
- init_population(task)[source]¶
Initialize population.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized population of glowworms.
Initialized populations function/fitness values.
- Additional arguments:
luciferin (numpy.ndarray): Luciferin values of glowworms.
ranges (numpy.ndarray): Ranges.
sensing_range (float): Sensing range.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], Dict[str, Any]]
- move_select(pb, i)[source]¶
Get move index for the i-th glowworm.
- Parameters
pb (numpy.ndarray) – Probabilities.
i (int) – Index of the glowworm.
- Returns
Index i-th glowworm will move towards.
- Return type
int
- probabilities(i, neighbors, luciferin)[source]¶
Calculate movement probabilities for a glowworm.
- Parameters
i (int) – Index of glowworm to search for probable movement.
neighbors (numpy.ndarray[float]) – Indexes of neighboring glowworms.
luciferin (numpy.ndarray[float]) – Luciferin values of glowworms.
- Returns
Probabilities for each glowworm in swarm.
- Return type
numpy.ndarray[float]
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of GlowwormSwarmOptimization algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray) – Current populations fitness/function values.
best_x (numpy.ndarray) – Global best individual.
best_fitness (float) – Global best individuals function/fitness value.
**params (Dict[str, Any]) – Additional arguments.
- Returns
New population of glowworms.
New population fitness/function values.
New global best solution
New global best solutions fitness/objective value.
- Additional arguments:
luciferin (numpy.ndarray): Luciferin values of glowworms.
ranges (numpy.ndarray): Ranges.
sensing_range (float): Sensing range.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- set_parameters(population_size=25, l0=5, nt=5, rho=0.4, gamma=0.6, beta=0.08, s=0.03, distance=<function euclidean>, **kwargs)[source]¶
Set the arguments of an algorithm.
- Parameters
population_size (Optional[int]) – Number of glowworms in population.
l0 (Optional[float]) – Initial luciferin quantity for each glowworm.
nt (Optional[int]) – Number of neighbors.
rho (Optional[float]) – Luciferin decay constant.
gamma (Optional[float]) – Luciferin enhancement constant.
beta (Optional[float]) – Constant.
s (Optional[float]) – Step size.
distance (Optional[Callable[[numpy.ndarray, numpy.ndarray], float]]) – Measure distance between two individuals.
- class niapy.algorithms.basic.GlowwormSwarmOptimizationV1(population_size=25, l0=5, nt=5, rho=0.4, gamma=0.6, beta=0.08, s=0.03, distance=<function euclidean>, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.gso.GlowwormSwarmOptimization
Implementation of glowworm swarm optimization.
- Algorithm:
Glowworm Swarm Optimization Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Kaipa, Krishnanand N., and Debasish Ghose. Glowworm swarm optimization: theory, algorithms, and applications. Vol. 698. Springer, 2017.
- Variables
Name (List[str]) – List of strings representing algorithm names.
See also
NiaPy.algorithms.basic.GlowwormSwarmOptimization
Initialize GlowwormSwarmOptimization.
- Parameters
population_size (Optional[int]) – Number of glowworms in population.
l0 (Optional[float]) – Initial luciferin quantity for each glowworm.
nt (Optional[int]) – Number of neighbors.
rho (Optional[float]) – Luciferin decay constant.
gamma (Optional[float]) – Luciferin enhancement constant.
beta (Optional[float]) – Constant.
s (Optional[float]) – Step size.
distance (Optional[Callable[[numpy.ndarray, numpy.ndarray], float]]) – Measure distance between two individuals.
- Name = ['GlowwormSwarmOptimizationV1', 'GSOv1']¶
- class niapy.algorithms.basic.GlowwormSwarmOptimizationV2(alpha=0.2, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.gso.GlowwormSwarmOptimization
Implementation of glowworm swarm optimization.
- Algorithm:
Glowworm Swarm Optimization Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Kaipa, Krishnanand N., and Debasish Ghose. Glowworm swarm optimization: theory, algorithms, and applications. Vol. 698. Springer, 2017.
- Variables
Name (List[str]) – List of strings representing algorithm names.
alpha (float) – Alpha parameter.
See also
NiaPy.algorithms.basic.GlowwormSwarmOptimization
Initialize GlowwormSwarmOptimizationV2.
- Parameters
alpha (Optional[float]) – Alpha parameter.
- Name = ['GlowwormSwarmOptimizationV2', 'GSOv2']¶
- __init__(alpha=0.2, *args, **kwargs)[source]¶
Initialize GlowwormSwarmOptimizationV2.
- Parameters
alpha (Optional[float]) – Alpha parameter.
- class niapy.algorithms.basic.GlowwormSwarmOptimizationV3(beta1=0.2, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.gso.GlowwormSwarmOptimization
Implementation of glowworm swarm optimization.
- Algorithm:
Glowworm Swarm Optimization Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Kaipa, Krishnanand N., and Debasish Ghose. Glowworm swarm optimization: theory, algorithms, and applications. Vol. 698. Springer, 2017.
- Variables
Name (List[str]) – List of strings representing algorithm names.
beta1 (float) – Beta1 parameter.
See also
NiaPy.algorithms.basic.GlowwormSwarmOptimization
Initialize GlowwormSwarmOptimizationV3.
- Parameters
beta1 (Optional[float]) – Beta1 parameter.
- Name = ['GlowwormSwarmOptimizationV3', 'GSOv3']¶
- __init__(beta1=0.2, *args, **kwargs)[source]¶
Initialize GlowwormSwarmOptimizationV3.
- Parameters
beta1 (Optional[float]) – Beta1 parameter.
- class niapy.algorithms.basic.GravitationalSearchAlgorithm(population_size=40, g0=2.467, epsilon=1e-17, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Gravitational Search Algorithm.
- Algorithm:
Gravitational Search Algorithm
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Esmat Rashedi, Hossein Nezamabadi-pour, Saeid Saryazdi, GSA: A Gravitational Search Algorithm, Information Sciences, Volume 179, Issue 13, 2009, Pages 2232-2248, ISSN 0020-0255
- Variables
Name (List[str]) – List of strings representing algorithm name.
See also
Initialize GravitationalSearchAlgorithm.
- Parameters
population_size (Optional[int]) – Population size.
g0 (Optional[float]) – Starting gravitational constant.
epsilon (Optional[float]) – Small number.
See also
niapy.algorithms.algorithm.Algorithm.__init__()
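A minimal sketch of constructing, inspecting, and running the algorithm (same assumptions about Task, problems, and run() as in the earlier examples):

from niapy.algorithms.basic import GravitationalSearchAlgorithm
from niapy.problems import Griewank
from niapy.task import Task

algorithm = GravitationalSearchAlgorithm(population_size=40, g0=2.467, epsilon=1e-17)
print(algorithm.get_parameters())  # dictionary of the parameters documented here
task = Task(problem=Griewank(dimension=15), max_iters=300)
best_x, best_fitness = algorithm.run(task)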
- Name = ['GravitationalSearchAlgorithm', 'GSA']¶
- __init__(population_size=40, g0=2.467, epsilon=1e-17, *args, **kwargs)[source]¶
Initialize GravitationalSearchAlgorithm.
- Parameters
population_size (Optional[int]) – Population size.
g0 (Optional[float]) – Starting gravitational constant.
epsilon (Optional[float]) – Small number.
See also
niapy.algorithms.algorithm.Algorithm.__init__()
- get_parameters()[source]¶
Get algorithm parameters values.
- Returns
Algorithm parameters.
- Return type
Dict[str, Any]
See also
niapy.algorithms.algorithm.Algorithm.get_parameters()
- gravity(t)[source]¶
Get new gravitational constant.
- Parameters
t (int) – Time (Current iteration).
- Returns
New gravitational constant.
- Return type
float
- init_population(task)[source]¶
Initialize starting population.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized population.
Initialized populations fitness/function values.
- Additional arguments:
velocities (numpy.ndarray[float]): Velocities
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], Dict[str, Any]]
See also
niapy.algorithms.algorithm.Algorithm.init_population()
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of GravitationalSearchAlgorithm algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray) – Current populations fitness/function values.
best_x (numpy.ndarray) – Global best solution.
best_fitness (float) – Global best fitness/function value.
**params (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New populations fitness/function values.
New global best solution
New global best solutions fitness/objective value
- Additional arguments:
velocities (numpy.ndarray): Velocities.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- set_parameters(population_size=40, g0=2.467, epsilon=1e-17, **kwargs)[source]¶
Set the algorithm parameters.
- Parameters
population_size (Optional[int]) – Population size.
g0 (Optional[float]) – Starting gravitational constant.
epsilon (Optional[float]) – Small number.
See also
niapy.algorithms.algorithm.Algorithm.set_parameters()
- class niapy.algorithms.basic.GreyWolfOptimizer(population_size=50, initialization_function=<function default_numpy_init>, individual_type=None, seed=None, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Grey wolf optimizer.
- Algorithm:
Grey wolf optimizer
- Date:
2018
- Author:
Iztok Fister Jr. and Klemen Berkovič
- License:
MIT
- Reference paper:
Mirjalili, Seyedali, Seyed Mohammad Mirjalili, and Andrew Lewis. “Grey wolf optimizer.” Advances in engineering software 69 (2014): 46-61.
Grey Wolf Optimizer (GWO) source code version 1.0 (MATLAB) from MathWorks
- Variables
Name (List[str]) – List of strings representing algorithm names.
See also
Initialize algorithm and create name for an algorithm.
- Parameters
population_size (Optional[int]) – Population size.
initialization_function (Optional[Callable[[int, Task, numpy.random.Generator, Dict[str, Any]], Tuple[numpy.ndarray, numpy.ndarray[float]]]]) – Population initialization function.
individual_type (Optional[Type[Individual]]) – Individual type used in population, default is Numpy array.
seed (Optional[int]) – Starting seed for random generator.
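Since GreyWolfOptimizer only takes the base Algorithm parameters, a usage sketch is short (Task, problems, and run() assumed as above):

from niapy.algorithms.basic import GreyWolfOptimizer
from niapy.problems import Sphere
from niapy.task import Task

algorithm = GreyWolfOptimizer(population_size=30, seed=1)  # fixed seed for a repeatable run
task = Task(problem=Sphere(dimension=10), max_iters=150)
best_x, best_fitness = algorithm.run(task)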
- Name = ['GreyWolfOptimizer', 'GWO']¶
- static info()[source]¶
Get algorithm information.
- Returns
Algorithm information.
- Return type
str
See also
- init_population(task)[source]¶
Initialize population.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized population.
Initialized populations fitness/function values.
- Additional arguments:
alpha (numpy.ndarray): Alpha of the pack (Best solution)
alpha_fitness (float): Best fitness.
beta (numpy.ndarray): Beta of the pack (Second best solution)
beta_fitness (float): Second best fitness.
delta (numpy.ndarray): Delta of the pack (Third best solution)
delta_fitness (float): Third best fitness.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, Dict[str, Any]]
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of GreyWolfOptimizer algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray) – Current populations function/fitness values.
best_x (numpy.ndarray) –
best_fitness (float) –
**params (Dict[str, Any]) – Additional arguments.
- Returns
New population
New population fitness/function values
- Additional arguments:
alpha (numpy.ndarray): Alpha of the pack (Best solution)
alpha_fitness (float): Best fitness.
beta (numpy.ndarray): Beta of the pack (Second best solution)
beta_fitness (float): Second best fitness.
delta (numpy.ndarray): Delta of the pack (Third best solution)
delta_fitness (float): Third best fitness.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- class niapy.algorithms.basic.HarmonySearch(population_size=30, r_accept=0.7, r_pa=0.35, b_range=1.42, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Harmony Search algorithm.
- Algorithm:
Harmony Search Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Geem, Z. W., Kim, J. H., & Loganathan, G. V. (2001). A new heuristic optimization algorithm: harmony search. Simulation, 76(2), 60-68.
- Variables
Name (List[str]) – List of strings representing algorithm names
r_accept (float) – Probability of accepting new bandwidth into harmony.
r_pa (float) – Probability of accepting random bandwidth into harmony.
b_range (float) – Range of bandwidth.
See also
Initialize HarmonySearch.
- Parameters
population_size (Optional[int]) – Number of harmonies in the memory.
r_accept (Optional[float]) – Probability of accepting new bandwidth to harmony.
r_pa (Optional[float]) – Probability of accepting random bandwidth into harmony.
b_range (Optional[float]) – Bandwidth range.
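A usage sketch for Harmony Search and its bandwidth-scheduling variant HarmonySearchV1 (illustrative parameter values; Task/problem helpers and run() assumed as above):

from niapy.algorithms.basic import HarmonySearch, HarmonySearchV1
from niapy.problems import Rastrigin
from niapy.task import Task

hs = HarmonySearch(population_size=30, r_accept=0.7, r_pa=0.35, b_range=1.42)
hs_v1 = HarmonySearchV1(bw_min=0.5, bw_max=2.0)  # variant with a scheduled bandwidth

for algorithm in (hs, hs_v1):
    task = Task(problem=Rastrigin(dimension=10), max_evals=15000)  # fresh budget per run
    best_x, best_fitness = algorithm.run(task)
    print(algorithm.Name[1], best_fitness)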
- Name = ['HarmonySearch', 'HS']¶
- __init__(population_size=30, r_accept=0.7, r_pa=0.35, b_range=1.42, *args, **kwargs)[source]¶
Initialize HarmonySearch.
- Parameters
population_size (Optional[int]) – Number of harmonies in the memory.
r_accept (Optional[float]) – Probability of accepting new bandwidth to harmony.
r_pa (Optional[float]) – Probability of accepting random bandwidth into harmony.
b_range (Optional[float]) – Bandwidth range.
- adjustment(x, task)[source]¶
Adjust value based on bandwidth.
- Parameters
x (Union[int, float]) – Current position.
task (Task) – Optimization task.
- Returns
New position.
- Return type
float
- bw(task)[source]¶
Get bandwidth.
- Parameters
task (Task) – Optimization task.
- Returns
Bandwidth.
- Return type
float
- improvise(harmonies, task)[source]¶
Create new individual.
- Parameters
harmonies (numpy.ndarray) – Current population.
task (Task) – Optimization task.
- Returns
New individual.
- Return type
numpy.ndarray
- static info()[source]¶
Get basic information about the algorithm.
- Returns
Basic information.
- Return type
str
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of HarmonySearch algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray) – Current populations function/fitness values.
best_x (numpy.ndarray) – Global best individual.
best_fitness (float) – Global best fitness/function value.
**params (Dict[str, Any]) – Additional arguments.
- Returns
New harmony/population.
New populations function/fitness values.
New global best solution
New global best solution fitness/objective value
Additional arguments.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- set_parameters(population_size=30, r_accept=0.7, r_pa=0.35, b_range=1.42, **kwargs)[source]¶
Set the arguments of the algorithm.
- Parameters
population_size (Optional[int]) – Number of harmonies in the memory.
r_accept (Optional[float]) – Probability of accepting new bandwidth to harmony.
r_pa (Optional[float]) – Probability of accepting random bandwidth into harmony.
b_range (Optional[float]) – Bandwidth range.
See also
niapy.algorithms.algorithm.Algorithm.set_parameters()
- class niapy.algorithms.basic.HarmonySearchV1(bw_min=1, bw_max=2, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.hs.HarmonySearch
Implementation of harmony search algorithm.
- Algorithm:
Harmony Search Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
https://link.springer.com/chapter/10.1007/978-3-642-00185-7_1
- Reference paper:
Yang, Xin-She. “Harmony search as a metaheuristic algorithm.” Music-inspired harmony search algorithm. Springer, Berlin, Heidelberg, 2009. 1-14.
- Variables
Name (List[str]) – List of strings representing algorithm name.
bw_min (float) – Minimal bandwidth.
bw_max (float) – Maximal bandwidth.
Initialize HarmonySearchV1.
- Parameters
bw_min (Optional[float]) – Minimal bandwidth.
bw_max (Optional[float]) – Maximal bandwidth.
- Name = ['HarmonySearchV1', 'HSv1']¶
- __init__(bw_min=1, bw_max=2, *args, **kwargs)[source]¶
Initialize HarmonySearchV1.
- Parameters
bw_min (Optional[float]) – Minimal bandwidth.
bw_max (Optional[float]) – Maximal bandwidth.
- bw(task)[source]¶
Get new bandwidth.
- Parameters
task (Task) – Optimization task.
- Returns
New bandwidth.
- Return type
float
- class niapy.algorithms.basic.HarrisHawksOptimization(population_size=40, levy=0.01, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Harris Hawks Optimization algorithm.
- Algorithm:
Harris Hawks Optimization
- Date:
2020
- Authors:
Francisco Jose Solis-Munoz
- License:
MIT
- Reference paper:
Heidari et al. “Harris hawks optimization: Algorithm and applications”. Future Generation Computer Systems. 2019. Vol. 97. 849-872.
- Variables
Name (List[str]) – List of strings representing algorithm name.
levy (float) – Levy factor.
See also
Initialize HarrisHawksOptimization.
- Parameters
population_size (Optional[int]) – Population size.
levy (Optional[float]) – Levy factor.
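Usage sketch (same assumptions about Task, problems, and run() as the previous examples):

from niapy.algorithms.basic import HarrisHawksOptimization
from niapy.problems import Ackley
from niapy.task import Task

algorithm = HarrisHawksOptimization(population_size=40, levy=0.01)
task = Task(problem=Ackley(dimension=10), max_iters=250)
best_x, best_fitness = algorithm.run(task)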
- Name = ['HarrisHawksOptimization', 'HHO']¶
- __init__(population_size=40, levy=0.01, *args, **kwargs)[source]¶
Initialize HarrisHawksOptimization.
- Parameters
population_size (Optional[int]) – Population size.
levy (Optional[float]) – Levy factor.
- get_parameters()[source]¶
Get parameters of the algorithm.
- Returns
Algorithm parameters.
- Return type
Dict[str, Any]
- static info()[source]¶
Get algorithms information.
- Returns
Algorithm information.
- Return type
str
See also
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of Harris Hawks Optimization.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population
population_fitness (numpy.ndarray[float]) – Current population fitness/function values
best_x (numpy.ndarray) – Current best individual
best_fitness (float) – Current best individual function/fitness value
params (Dict[str, Any]) – Additional algorithm arguments
- Returns
New population
New population fitness/function values
New global best solution
New global best fitness/objective value
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- class niapy.algorithms.basic.KrillHerd(population_size=50, n_max=0.01, foraging_speed=0.02, diffusion_speed=0.002, c_t=0.93, w_neighbor=0.42, w_foraging=0.38, d_s=2.63, max_neighbors=5, crossover_rate=0.2, mutation_rate=0.05, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of krill herd algorithm.
- Algorithm:
Krill Herd Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
http://www.sciencedirect.com/science/article/pii/S1007570412002171
- Reference paper:
Amir Hossein Gandomi, Amir Hossein Alavi, Krill herd: A new bio-inspired optimization algorithm, Communications in Nonlinear Science and Numerical Simulation, Volume 17, Issue 12, 2012, Pages 4831-4845, ISSN 1007-5704, https://doi.org/10.1016/j.cnsns.2012.05.010.
- Variables
Name (List[str]) – List of strings representing algorithm names.
population_size (int) – Number of krill herds in population.
N_max (float) – Maximum induced speed.
V_f (float) – Foraging speed.
D_max (float) – Maximum diffusion speed.
C_t (float) – Constant \(\in [0, 2]\)
W_n (Union[int, float, numpy.ndarray]) – Inertia weights of the motion induced from neighbors \(\in [0, 1]\).
W_f (Union[int, float, numpy.ndarray]) – Inertia weights of the motion induced from foraging \(\in [0, 1]\).
d_s (float) – Maximum euclidean distance for neighbors.
nn (int) – Maximum neighbors for neighbors effect.
epsilon (float) – Small numbers for division.
See also
Initialize KrillHerd.
- Parameters
population_size (Optional[int]) – Number of krill herds in population.
n_max (Optional[float]) – Maximum induced speed.
foraging_speed (Optional[float]) – Foraging speed.
diffusion_speed (Optional[float]) – Maximum diffusion speed.
c_t (Optional[float]) – Constant \(\in [0, 2]\).
w_neighbor (Optional[Union[int, float, numpy.ndarray]]) – Inertia weights of the motion induced from neighbors \(\in [0, 1]\).
w_foraging (Optional[Union[int, float, numpy.ndarray]]) – Inertia weights of the motion induced from foraging \(\in [0, 1]\).
d_s (Optional[float]) – Maximum euclidean distance for neighbors.
max_neighbors (Optional[int]) – Maximum neighbors for neighbors effect.
crossover_rate (Optional[float]) – Crossover probability.
mutation_rate (Optional[float]) – Mutation probability.
See also
niapy.algorithms.algorithm.Algorithm.__init__()
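Usage sketch with the documented defaults spelled out explicitly (Task/problem helpers and run() assumed as above):

from niapy.algorithms.basic import KrillHerd
from niapy.problems import Griewank
from niapy.task import Task

algorithm = KrillHerd(population_size=50, n_max=0.01, foraging_speed=0.02,
                      diffusion_speed=0.002, c_t=0.93, w_neighbor=0.42,
                      w_foraging=0.38, d_s=2.63, max_neighbors=5,
                      crossover_rate=0.2, mutation_rate=0.05)
task = Task(problem=Griewank(dimension=10), max_iters=200)
best_x, best_fitness = algorithm.run(task)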
- Name = ['KrillHerd', 'KH']¶
- __init__(population_size=50, n_max=0.01, foraging_speed=0.02, diffusion_speed=0.002, c_t=0.93, w_neighbor=0.42, w_foraging=0.38, d_s=2.63, max_neighbors=5, crossover_rate=0.2, mutation_rate=0.05, *args, **kwargs)[source]¶
Initialize KrillHerd.
- Parameters
population_size (Optional[int]) – Number of krill herds in population.
n_max (Optional[float]) – Maximum induced speed.
foraging_speed (Optional[float]) – Foraging speed.
diffusion_speed (Optional[float]) – Maximum diffusion speed.
c_t (Optional[float]) – Constant \(\in [0, 2]\).
w_neighbor (Optional[Union[int, float, numpy.ndarray]]) – Inertia weights of the motion induced from neighbors \(\in [0, 1]\).
w_foraging (Optional[Union[int, float, numpy.ndarray]]) – Inertia weights of the motion induced from foraging \(\in [0, 1]\).
d_s (Optional[float]) – Maximum euclidean distance for neighbors.
max_neighbors (Optional[int]) – Maximum neighbors for neighbors effect.
crossover_rate (Optional[float]) – Crossover probability.
mutation_rate (Optional[float]) – Mutation probability.
See also
niapy.algorithms.algorithm.Algorithm.__init__()
- crossover(x, xo, crossover_rate)[source]¶
Crossover operator.
- Parameters
x (numpy.ndarray) – Krill/individual being applied with operator.
xo (numpy.ndarray) – Krill/individual being used in conjunction within operator.
crossover_rate (float) – Crossover probability.
- Returns
New krill/individual.
- Return type
numpy.ndarray
- crossover_rate(xf, yf, xf_best, xf_worst)[source]¶
Get crossover probability.
- Parameters
xf (float) – Function/fitness value of the first krill.
yf (float) – Function/fitness value of the second krill.
xf_best (float) – Function/fitness value of the best krill.
xf_worst (float) – Function/fitness value of the worst krill.
- Returns
New crossover probability.
- Return type
float
- delta_t(task)[source]¶
Get new delta for all dimensions.
- Parameters
task (Task) – Optimization task.
- Returns
–
- Return type
numpy.ndarray
- get_food_location(population, population_fitness, task)[source]¶
Get food location for the krill herd.
- Parameters
population (numpy.ndarray) – Current herd/population.
population_fitness (numpy.ndarray[float]) – Current herd/population function/fitness values.
task (Task) – Optimization task.
- Returns
Location of food.
Foods function/fitness value.
- Return type
Tuple[numpy.ndarray, float]
- get_k(x, y, b, w)[source]¶
Get k values.
- Parameters
x (float) – First krill/individual.
y (float) – Second krill/individual.
b (float) – Best krill/individual.
w (float) – Worst krill/individual.
- Returns
- Return type
numpy.ndarray
- get_neighbours(i, ids, population)[source]¶
Get neighbours.
- Parameters
i (int) – Individual looking for neighbours.
ids (float) – Maximal distance for being a neighbour.
population (numpy.ndarray) – Current population.
- Returns
Neighbours of the krill herd.
- Return type
numpy.ndarray
- get_parameters()[source]¶
Get parameter values for the algorithm.
- Returns
Algorithm parameters.
- Return type
Dict[str, Any]
- get_x(x, y)[source]¶
Get x values.
- Parameters
x (numpy.ndarray) – First krill/individual.
y (numpy.ndarray) – Second krill/individual.
- Returns
–
- Return type
numpy.ndarray
- induce_foraging_motion(i, x, x_f, f, weights, population, population_fitness, best_index, worst_index, task)[source]¶
Induced foraging motion operator.
- Parameters
i (int) – Index of current krill being operated.
x (numpy.ndarray) – Position of food.
x_f (float) – Fitness/function values of food.
f –
weights (numpy.ndarray[float]) – Weights for this operator.
population (numpy.ndarray) – Current population/herd.
population_fitness (numpy.ndarray[float]) – Current herd/population function/fitness values.
best_index (numpy.ndarray) – Index of current best krill in the herd.
worst_index (numpy.ndarray) – Index of current worst krill in the herd.
task (Task) – Optimization task.
- Returns
Moved krill.
- Return type
numpy.ndarray
- induce_neighbors_motion(i, n, weights, population, population_fitness, best_index, worst_index, task)[source]¶
Induced neighbours motion operator.
- Parameters
i (int) – Index of individual being applied with operator.
n –
weights (numpy.ndarray[float]) – Weights for this operator.
population (numpy.ndarray) – Current herd/population.
population_fitness (numpy.ndarray[float]) – Current herd/population function/fitness values.
best_index (numpy.ndarray) – Current best krill in the herd/population.
worst_index (numpy.ndarray) – Current worst krill in the herd/population.
task (Task) – Optimization task.
- Returns
Moved krill.
- Return type
numpy.ndarray
- induce_physical_diffusion(task)[source]¶
Induced physical diffusion operator.
- Parameters
task (Task) – Optimization task.
- Returns
- Return type
numpy.ndarray
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
See also
- init_population(task)[source]¶
Initialize starting population.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized population.
Initialized populations function/fitness values.
- Additional arguments:
w_neighbor (numpy.ndarray): Weights for neighborhood motion.
w_foraging (numpy.ndarray): Weights for foraging motion.
induced_speed (numpy.ndarray): Induced speed.
foraging_speed (numpy.ndarray): Foraging speed.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, Dict[str, Any]]
See also
niapy.algorithms.algorithm.Algorithm.init_population()
- init_weights(task)[source]¶
Initialize weights.
- Parameters
task (Task) – Optimization task.
- Returns
Weights for neighborhood.
Weights for foraging.
- Return type
Tuple[numpy.ndarray, numpy.ndarray]
- mutate(x, x_b, mutation_rate)[source]¶
Mutate operator.
- Parameters
x (numpy.ndarray) – Individual being mutated.
x_b (numpy.ndarray) – Global best individual.
mutation_rate (float) – Probability of mutations.
- Returns
Mutated krill.
- Return type
numpy.ndarray
- mutation_rate(xf, yf, xf_best, xf_worst)[source]¶
Get mutation probability.
- Parameters
xf (float) – Function/fitness value of the first krill.
yf (float) – Function/fitness value of the second krill.
xf_best (float) – Function/fitness value of the best krill.
xf_worst (float) – Function/fitness value of the worst krill.
- Returns
New mutation probability.
- Return type
float
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of KrillHerd algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current herd/population.
population_fitness (numpy.ndarray[float]) – Current herd/population function/fitness values.
best_x (numpy.ndarray) – Global best individual.
best_fitness (float) – Global best individual's function/fitness value.
**params (Dict[str, Any]) – Additional arguments.
- Returns
New herd/population
New herd/populations function/fitness values.
New global best solution.
New global best solutions fitness/objective value.
- Additional arguments:
w_neighbor (numpy.ndarray): Weights for neighborhood motion.
w_foraging (numpy.ndarray): Weights for foraging motion.
induced_speed (numpy.ndarray): Induced speed.
foraging_speed (numpy.ndarray): Foraging speed.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- sense_range(ki, population)[source]¶
Calculate sense range for selected individual.
- Parameters
ki (int) – Selected individual.
population (numpy.ndarray) – Krill herd population.
- Returns
Sense range for krill.
- Return type
float
- set_parameters(population_size=50, n_max=0.01, foraging_speed=0.02, diffusion_speed=0.002, c_t=0.93, w_neighbor=0.42, w_foraging=0.38, d_s=2.63, max_neighbors=5, crossover_rate=0.2, mutation_rate=0.05, **kwargs)[source]¶
Set the arguments of an algorithm.
- Parameters
population_size (Optional[int]) – Number of krill herds in population.
n_max (Optional[float]) – Maximum induced speed.
foraging_speed (Optional[float]) – Foraging speed.
diffusion_speed (Optional[float]) – Maximum diffusion speed.
c_t (Optional[float]) – Constant \(\in [0, 2]\).
w_neighbor (Optional[Union[int, float, numpy.ndarray]]) – Inertia weights of the motion induced from neighbors \(\in [0, 1]\).
w_foraging (Optional[Union[int, float, numpy.ndarray]]) – Inertia weights of the motion induced from foraging \(\in [0, 1]\).
d_s (Optional[float]) – Maximum euclidean distance for neighbors.
max_neighbors (Optional[int]) – Maximum neighbors for neighbors effect.
crossover_rate (Optional[float]) – Crossover probability.
mutation_rate (Optional[float]) – Mutation probability.
See also
niapy.algorithms.algorithm.Algorithm.set_parameters()
- class niapy.algorithms.basic.MonarchButterflyOptimization(population_size=20, partition=0.4166666666666667, period=1.2, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Monarch Butterfly Optimization.
- Algorithm:
Monarch Butterfly Optimization
- Date:
2019
- Authors:
Jan Banko
- License:
MIT
- Reference paper:
Wang, G. G., Deb, S., & Cui, Z. (2019). Monarch butterfly optimization. Neural computing and applications, 31(7), 1995-2014.
- Variables
Name (List[str]) – List of strings representing algorithm name.
PAR (float) – Partition.
PER (float) – Period.
See also
Initialize MonarchButterflyOptimization.
- Parameters
population_size (Optional[int]) – Population size.
partition (Optional[float]) – Partition.
period (Optional[float]) – Period.
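Usage sketch (parameter values mirror the documented defaults; Task/problem helpers and run() assumed as above):

from niapy.algorithms.basic import MonarchButterflyOptimization
from niapy.problems import Sphere
from niapy.task import Task

algorithm = MonarchButterflyOptimization(population_size=20, partition=5 / 12, period=1.2)
task = Task(problem=Sphere(dimension=10), max_iters=200)
best_x, best_fitness = algorithm.run(task)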
- Name = ['MonarchButterflyOptimization', 'MBO']¶
- __init__(population_size=20, partition=0.4166666666666667, period=1.2, *args, **kwargs)[source]¶
Initialize MonarchButterflyOptimization.
- Parameters
population_size (Optional[int]) – Population size.
partition (Optional[float]) – Partition.
period (Optional[float]) – Period.
- adjusting_operator(t, max_t, dimension, np1, np2, butterflies, best)[source]¶
Apply the adjusting operator.
- Parameters
t (int) – Current generation.
max_t (int) – Maximum generation.
dimension (int) – Number of dimensions.
np1 (int) – Number of butterflies in Land 1.
np2 (int) – Number of butterflies in Land 2.
butterflies (numpy.ndarray) – Current butterfly population.
best (numpy.ndarray) – The best butterfly currently.
- Returns
Adjusted butterfly population.
- Return type
numpy.ndarray
- static evaluate_and_sort(task, butterflies)[source]¶
Evaluate and sort the butterfly population.
- Parameters
task (Task) – Optimization task
butterflies (numpy.ndarray) – Current butterfly population.
- Returns
Best butterfly according to the evaluation.
The best fitness value.
Butterfly population.
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray]
- get_parameters()[source]¶
Get parameters values for the algorithm.
- Returns
Algorithm parameters.
- Return type
Dict[str, Any]
- static info()[source]¶
Get information of the algorithm.
- Returns
Algorithm information.
- Return type
str
See also
niapy.algorithms.algorithm.Algorithm.info()
- init_population(task)[source]¶
Initialize the starting population.
- Parameters
task (Task) – Optimization task
- Returns
New population.
New population fitness/function values.
- Additional arguments:
current_best (numpy.ndarray): Current generation’s best individual.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], Dict[str, Any]]
- levy(_step_size, dimension)[source]¶
Calculate levy flight.
- Parameters
_step_size (float) – Size of the walk step.
dimension (int) – Number of dimensions.
- Returns
Calculated values for levy flight.
- Return type
numpy.ndarray
- migration_operator(dimension, np1, np2, butterflies)[source]¶
Apply the migration operator.
- Parameters
dimension (int) – Number of dimensions.
np1 (int) – Number of butterflies in Land 1.
np2 (int) – Number of butterflies in Land 2.
butterflies (numpy.ndarray) – Current butterfly population.
- Returns
Adjusted butterfly population.
- Return type
numpy.ndarray
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of Monarch Butterfly Optimization.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray[float]) – Current population function/fitness values.
best_x (numpy.ndarray) – Global best individual.
best_fitness (float) – Global best individual fitness/function value.
**params (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New population fitness/function values.
New global best solution.
New global best solutions fitness/objective value.
- Additional arguments:
current_best (numpy.ndarray): Current generation’s best individual.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- class niapy.algorithms.basic.MonkeyKingEvolutionV1(population_size=40, fluctuation_coeff=0.7, population_rate=0.3, c=3, fc=0.5, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of monkey king evolution algorithm version 1.
- Algorithm:
Monkey King Evolution version 1
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
https://www.sciencedirect.com/science/article/pii/S0950705116000198
- Reference paper:
Zhenyu Meng, Jeng-Shyang Pan, Monkey King Evolution: A new memetic evolutionary algorithm and its application in vehicle fuel consumption optimization, Knowledge-Based Systems, Volume 97, 2016, Pages 144-157, ISSN 0950-7051, https://doi.org/10.1016/j.knosys.2016.01.009.
- Variables
Name (List[str]) – List of strings representing algorithm names.
fluctuation_coeff (float) – Scale factor for normal particles.
population_rate (float) – Percentage of new particles that the Monkey King particle creates.
c (int) – Number of new particles generated by Monkey King particle.
fc (float) – Scale factor for Monkey King particles.
See also
Initialize MonkeyKingEvolutionV1.
- Parameters
population_size (int) – Population size.
fluctuation_coeff (float) – Scale factor for normal particle.
population_rate (float) – Percentage of new particles that the Monkey King particle creates. Value in range [0, 1].
c (int) – Number of new particles generated by Monkey King particle.
fc (float) – Scale factor for Monkey King particles.
See also
niapy.algorithms.algorithm.Algorithm.__init__()
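A sketch comparing the three Monkey King Evolution variants on the same problem (Task/problem helpers and run() assumed as above; results will differ between versions and seeds):

from niapy.algorithms.basic import (MonkeyKingEvolutionV1, MonkeyKingEvolutionV2,
                                    MonkeyKingEvolutionV3)
from niapy.problems import Rastrigin
from niapy.task import Task

for cls in (MonkeyKingEvolutionV1, MonkeyKingEvolutionV2, MonkeyKingEvolutionV3):
    algorithm = cls(population_size=40)
    task = Task(problem=Rastrigin(dimension=10), max_iters=200)  # fresh budget per run
    best_x, best_fitness = algorithm.run(task)
    print(cls.Name[1], best_fitness)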
- Name = ['MonkeyKingEvolutionV1', 'MKEv1']¶
- __init__(population_size=40, fluctuation_coeff=0.7, population_rate=0.3, c=3, fc=0.5, *args, **kwargs)[source]¶
Initialize MonkeyKingEvolutionV1.
- Parameters
population_size (int) – Population size.
fluctuation_coeff (float) – Scale factor for normal particle.
population_rate (float) – Percentage of new particles that the Monkey King particle creates. Value in range [0, 1].
c (int) – Number of new particles generated by Monkey King particle.
fc (float) – Scale factor for Monkey King particles.
See also
niapy.algorithms.algorithm.Algorithm.__init__()
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information.
- Return type
str
See also
- init_population(task)[source]¶
Init population.
- Parameters
task (Task) – Optimization task
- Returns
Initialized solutions
Fitness/function values of solution
Additional arguments
- Return type
Tuple[numpy.ndarray[MkeSolution], numpy.ndarray[float], Dict[str, Any]]
- move_mk(x, task)[source]¶
Move Monkey King particle.
For moving Monkey King particles the algorithm uses the following formula: \(\mathbf{x} + \mathit{fc} \odot \mathbf{population\_rate} \odot \mathbf{x}\), where \(\mathbf{population\_rate}\) is a two-dimensional array with shape \((c \cdot D, D)\) whose components are in the range [0, 1].
- Parameters
x (numpy.ndarray) – Monkey King particle position.
task (Task) – Optimization task.
- Returns
New particles generated by Monkey King particle.
- Return type
numpy.ndarray
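The formula above can be illustrated with a short NumPy sketch; it is illustrative only (the names rng, D, c, fc, and x are placeholders, not the library's internals):

import numpy as np

rng = np.random.default_rng(42)
D, c, fc = 5, 3, 0.5                 # dimension and the c/fc parameters from set_parameters()
x = rng.uniform(-5, 5, D)            # Monkey King particle position

r = rng.uniform(0, 1, (c * D, D))    # the random [0, 1] matrix from the formula above
new_particles = x + fc * r * x       # broadcasts x across the c*D generated rows
print(new_particles.shape)           # (c * D, D): c*D candidate particles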
- move_monkey_king_particle(p, task)[source]¶
Move Monkey King Particles.
- Parameters
p (MkeSolution) – Monkey King particle to apply this function on.
task (Task) – Optimization task.
- move_p(x, x_pb, x_b, task)[source]¶
Move normal particle in search space.
For moving particles the algorithm uses the following formula: \(\mathbf{x_{pb}} - \mathit{differential\_weight} \odot \mathbf{r} \odot (\mathbf{x_b} - \mathbf{x})\), where \(\mathbf{r}\) is a one-dimensional array with D components in the range [0, 1].
- Parameters
x (numpy.ndarray) – Particle position.
x_pb (numpy.ndarray) – Particle best position.
x_b (numpy.ndarray) – Best particle position.
task (Task) – Optimization task.
- Returns
Particle new position.
- Return type
numpy.ndarray
- move_particle(p, p_b, task)[source]¶
Move particles.
- Parameters
p (MkeSolution) – Monkey particle.
p_b (numpy.ndarray) – Population best particle.
task (Task) – Optimization task.
- move_population(pop, xb, task)[source]¶
Move population.
- Parameters
pop (numpy.ndarray[MkeSolution]) – Current population.
xb (numpy.ndarray) – Current best solution.
task (Task) – Optimization task.
- Returns
New particles.
- Return type
numpy.ndarray[MkeSolution]
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of Monkey King Evolution v1 algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray[MkeSolution]) – Current population.
population_fitness (numpy.ndarray[float]) – Current population fitness/function values.
best_x (numpy.ndarray) – Current best solution.
best_fitness (float) – Current best solutions function/fitness value.
**params (Dict[str, Any]) – Additional arguments.
- Returns
Initialized solutions.
Fitness/function values of solution.
Additional arguments.
- Return type
Tuple[numpy.ndarray[MkeSolution], numpy.ndarray[float], Dict[str, Any]]
- set_parameters(population_size=40, fluctuation_coeff=0.7, population_rate=0.3, c=3, fc=0.5, **kwargs)[source]¶
Set Monkey King Evolution v1 algorithms static parameters.
- Parameters
population_size (int) – Population size.
fluctuation_coeff (float) – Scale factor for normal particle.
population_rate (float) – Percentage of new particles that the Monkey King particle creates. Value in range [0, 1].
c (int) – Number of new particles generated by Monkey King particle.
fc (float) – Scale factor for Monkey King particles.
See also
niapy.algorithms.algorithm.Algorithm.set_parameters()
- class niapy.algorithms.basic.MonkeyKingEvolutionV2(population_size=40, fluctuation_coeff=0.7, population_rate=0.3, c=3, fc=0.5, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.mke.MonkeyKingEvolutionV1
Implementation of monkey king evolution algorithm version 2.
- Algorithm:
Monkey King Evolution version 2
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
https://www.sciencedirect.com/science/article/pii/S0950705116000198
- Reference paper:
Zhenyu Meng, Jeng-Shyang Pan, Monkey King Evolution: A new memetic evolutionary algorithm and its application in vehicle fuel consumption optimization, Knowledge-Based Systems, Volume 97, 2016, Pages 144-157, ISSN 0950-7051, https://doi.org/10.1016/j.knosys.2016.01.009.
- Variables
Name (List[str]) – List of strings representing algorithm names.
Initialize MonkeyKingEvolutionV1.
- Parameters
population_size (int) – Population size.
fluctuation_coeff (float) – Scale factor for normal particle.
population_rate (float) – Percentage of new particles that the Monkey King particle creates. Value in range [0, 1].
c (int) – Number of new particles generated by Monkey King particle.
fc (float) – Scale factor for Monkey King particles.
See also
niapy.algorithms.algorithm.Algorithm.__init__()
- Name = ['MonkeyKingEvolutionV2', 'MKEv2']¶
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information.
- Return type
str
See also
- move_mk(x, task, dx=None)[source]¶
Move Monkey King particle.
For moving particles the algorithm uses the following formula: \(\mathbf{x} - \mathit{fc} \odot \mathbf{dx}\).
- Parameters
x (numpy.ndarray) – Particle to apply movement on.
task (Task) – Optimization task.
dx (numpy.ndarray) – Difference between two random particles in the population.
- Returns
Moved particles.
- Return type
numpy.ndarray
- class niapy.algorithms.basic.MonkeyKingEvolutionV3(*args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.mke.MonkeyKingEvolutionV1
Implementation of monkey king evolution algorithm version 3.
- Algorithm:
Monkey King Evolution version 3
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
https://www.sciencedirect.com/science/article/pii/S0950705116000198
- Reference paper:
Zhenyu Meng, Jeng-Shyang Pan, Monkey King Evolution: A new memetic evolutionary algorithm and its application in vehicle fuel consumption optimization, Knowledge-Based Systems, Volume 97, 2016, Pages 144-157, ISSN 0950-7051, https://doi.org/10.1016/j.knosys.2016.01.009.
- Variables
Name (List[str]) – List of strings that represent algorithm names.
Initialize MonkeyKingEvolutionV3.
- Name = ['MonkeyKingEvolutionV3', 'MKEv3']¶
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information.
- Return type
str
See also
- init_population(task)[source]¶
Initialize the population.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized population.
Initialized population function/fitness values.
- Additional arguments:
k (int): Starting number of rows to include from lower triangular matrix.
c (int): Constant.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], Dict[str, Any]]
See also
niapy.algorithms.algorithm.Algorithm.init_population()
- static neg(x)[source]¶
Transform function.
- Parameters
x (Union[int, float]) – Should be 0 or 1.
- Returns
If 0 then 1 else 0.
- Return type
float
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of Monkey King Evolution v3 algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray[float]) – Current population fitness/function values.
best_x (numpy.ndarray) – Current best individual.
best_fitness (float) – Current best individual function/fitness value.
**params – Additional arguments
- Returns
Initialized population.
Initialized population function/fitness values.
- Additional arguments:
k (int): Starting number of rows to include from lower triangular matrix.
c (int): Constant.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], Dict[str, Any]]
- class niapy.algorithms.basic.MothFlameOptimizer(population_size=50, initialization_function=<function default_numpy_init>, individual_type=None, seed=None, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Moth flame optimizer.
- Algorithm:
Moth flame optimizer
- Date:
2018
- Author:
Kivanc Guckiran and Klemen Berkovič
- License:
MIT
- Reference paper:
Mirjalili, Seyedali. “Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm.” Knowledge-Based Systems 89 (2015): 228-249.
- Variables
Name (List[str]) – List of strings representing algorithm name.
See also
Initialize algorithm and create name for an algorithm.
- Parameters
population_size (Optional[int]) – Population size.
initialization_function (Optional[Callable[[int, Task, numpy.random.Generator, Dict[str, Any]], Tuple[numpy.ndarray, numpy.ndarray[float]]]]) – Population initialization function.
individual_type (Optional[Type[Individual]]) – Individual type used in population, default is Numpy array.
seed (Optional[int]) – Starting seed for random generator.
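Usage sketch (same Task/problem/run() assumptions as elsewhere in this reference):

from niapy.algorithms.basic import MothFlameOptimizer
from niapy.problems import Sphere
from niapy.task import Task

algorithm = MothFlameOptimizer(population_size=50, seed=7)
task = Task(problem=Sphere(dimension=20), max_evals=25000)
best_x, best_fitness = algorithm.run(task)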
- Name = ['MothFlameOptimizer', 'MFO']¶
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information.
- Return type
str
See also
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of MothFlameOptimizer algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray) – Current population fitness/function values.
best_x (numpy.ndarray) – Current population best individual.
best_fitness (float) – Current best individual's fitness/function value.
**params (Dict[str, Any]) – Additional parameters
- Returns
New population.
New population fitness/function values.
New global best solution.
New global best fitness/objective value.
- Additional arguments:
best_flames (numpy.ndarray): Best individuals.
best_flame_fitness (numpy.ndarray): Best individuals fitness/function values.
previous_population (numpy.ndarray): Previous population.
previous_fitness (numpy.ndarray): Previous population fitness/function values.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- class niapy.algorithms.basic.MultiStrategyDifferentialEvolution(population_size=40, strategies=(<function cross_rand1>, <function cross_best1>, <function cross_curr2best1>, <function cross_rand2>), *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.de.DifferentialEvolution
Implementation of Differential evolution algorithm with multiple mutation strategies.
- Algorithm:
Implementation of Differential evolution algorithm with multiple mutation strategies
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Variables
Name (List[str]) – List of strings representing algorithm names.
strategies (Iterable[Callable[[numpy.ndarray[Individual], int, Individual, float, float, numpy.random.Generator], numpy.ndarray[Individual]]]) – List of mutation strategies.
Initialize MultiStrategyDifferentialEvolution.
- Parameters
strategies (Optional[Iterable[Callable[[numpy.ndarray[Individual], int, Individual, float, float, numpy.random.Generator], numpy.ndarray[Individual]]]]) – List of mutation strategies.
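Usage sketch; the default strategy tuple (cross_rand1, cross_best1, cross_curr2best1, cross_rand2) is kept, and any iterable of callables matching the documented signature could be passed via strategies instead (Task/problem helpers and run() assumed as above):

from niapy.algorithms.basic import MultiStrategyDifferentialEvolution
from niapy.problems import Rastrigin
from niapy.task import Task

algorithm = MultiStrategyDifferentialEvolution(population_size=40)
task = Task(problem=Rastrigin(dimension=10), max_evals=20000)
best_x, best_fitness = algorithm.run(task)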
- Name = ['MultiStrategyDifferentialEvolution', 'MsDE']¶
- __init__(population_size=40, strategies=(<function cross_rand1>, <function cross_best1>, <function cross_curr2best1>, <function cross_rand2>), *args, **kwargs)[source]¶
Initialize MultiStrategyDifferentialEvolution.
- Parameters
strategies (Optional[Iterable[Callable[[numpy.ndarray[Individual], int, Individual, float, float, numpy.random.Generator], numpy.ndarray[Individual]]]]) – List of mutation strategies.
- evolve(pop, xb, task, **kwargs)[source]¶
Evolve population with the help of multiple mutation strategies.
- Parameters
pop (numpy.ndarray) – Current population.
xb (numpy.ndarray) – Current best individual.
task (Task) – Optimization task.
- Returns
New population of individuals.
- Return type
numpy.ndarray
- get_parameters()[source]¶
Get parameters values of the algorithm.
- Returns
Algorithm parameters.
- Return type
Dict[str, Any]
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
See also
- set_parameters(strategies=(<function cross_rand1>, <function cross_best1>, <function cross_curr2best1>, <function cross_rand2>), **kwargs)[source]¶
Set the arguments of the algorithm.
- Parameters
strategies (Optional[Iterable[Callable[[numpy.ndarray[Individual], int, Individual, float, float, numpy.random.Generator], numpy.ndarray[Individual]]]]) – List of mutation strategies.
- class niapy.algorithms.basic.MutatedCenterParticleSwarmOptimization(num_mutations=10, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.pso.CenterParticleSwarmOptimization
Implementation of Mutated Particle Swarm Optimization.
- Algorithm:
Mutated Center Particle Swarm Optimization
- Date:
2019
- Authors:
Klemen Berkovič
- License:
MIT
- Reference paper:
TODO find one
- Variables
num_mutations (int) – Number of mutations of global best particle.
Initialize MCPSO.
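Usage sketch (num_mutations is the parameter documented above; remaining keyword arguments are forwarded to the parent particle swarm classes; Task/problem helpers and run() assumed as above):

from niapy.algorithms.basic import MutatedCenterParticleSwarmOptimization
from niapy.problems import Sphere
from niapy.task import Task

algorithm = MutatedCenterParticleSwarmOptimization(num_mutations=10, population_size=25)
task = Task(problem=Sphere(dimension=10), max_iters=300)
best_x, best_fitness = algorithm.run(task)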
- Name = ['MutatedCenterParticleSwarmOptimization', 'MCPSO']¶
- get_parameters()[source]¶
Get value of parameters for this instance of algorithm.
- Returns
Dictionary which has parameters mapped to values.
- Return type
Dict[str, Union[int, float, numpy.ndarray]]
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
See also
- run_iteration(task, pop, fpop, xb, fxb, **params)[source]¶
Core function of algorithm.
- Parameters
task (Task) – Optimization task.
pop (numpy.ndarray) – Current population of particles.
fpop (numpy.ndarray) – Current particles function/fitness values.
xb (numpy.ndarray) – Current global best particle.
fxb (float) – Current global best particle's function/fitness value.
- Returns
New population of particles.
New populations function/fitness values.
New global best particle.
New global best particle function/fitness value.
Additional arguments.
Additional keyword arguments.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, list, dict]
See also
niapy.algorithm.basic.WeightedVelocityClampingParticleSwarmAlgorithm.run_iteration()
- class niapy.algorithms.basic.MutatedCenterUnifiedParticleSwarmOptimization(num_mutations=10, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.pso.MutatedCenterParticleSwarmOptimization
Implementation of Mutated Particle Swarm Optimization.
- Algorithm:
Mutated Center Unified Particle Swarm Optimization
- Date:
2019
- Authors:
Klemen Berkovič
- License:
MIT
- Reference paper:
Tsai, Hsing-Chih. “Unified particle swarm delivers high efficiency to particle swarm optimization.” Applied Soft Computing 55 (2017): 371-383.
- Variables
Name (List[str]) – Names of algorithm.
Initialize MCPSO.
- Name = ['MutatedCenterUnifiedParticleSwarmOptimization', 'MCUPSO']¶
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
See also
- update_velocity(v, p, pb, gb, w, min_velocity, max_velocity, task, **kwargs)[source]¶
Update particle velocity.
- Parameters
v (numpy.ndarray) – Current velocity of particle.
p (numpy.ndarray) – Current position of particle.
pb (numpy.ndarray) – Personal best position of particle.
gb (numpy.ndarray) – Global best position of particle.
w (numpy.ndarray) – Weights for velocity adjustment.
min_velocity (numpy.ndarray) – Minimal velocity allowed.
max_velocity (numpy.ndarray) – Maximal velocity allowed.
task (Task) – Optimization task.
kwargs (dict) – Additional arguments.
- Returns
Updated velocity of particle.
- Return type
numpy.ndarray
- class niapy.algorithms.basic.MutatedParticleSwarmOptimization(num_mutations=10, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.pso.ParticleSwarmAlgorithm
Implementation of Mutated Particle Swarm Optimization.
- Algorithm:
Mutated Particle Swarm Optimization
- Date:
2019
- Authors:
Klemen Berkovič
- License:
MIT
- Reference paper:
Wang, C. Li, Y. Liu, S. Zeng, A hybrid particle swarm algorithm with Cauchy mutation, Proceedings of the 2007 IEEE Swarm Intelligence Symposium (2007) 356–360.
- Variables
num_mutations (int) – Number of mutations of global best particle.
See also
niapy.algorithms.basic.WeightedVelocityClampingParticleSwarmAlgorithm
Initialize MPSO.
- Name = ['MutatedParticleSwarmOptimization', 'MPSO']¶
- get_parameters()[source]¶
Get value of parameters for this instance of algorithm.
- Returns
Dictionary which has parameters mapped to values.
- Return type
Dict[str, Union[int, float, numpy.ndarray]]
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
See also
- run_iteration(task, pop, fpop, xb, fxb, **params)[source]¶
Core function of algorithm.
- Parameters
task (Task) – Optimization task.
pop (numpy.ndarray) – Current population of particles.
fpop (numpy.ndarray) – Current particles function/fitness values.
xb (numpy.ndarray) – Current global best particle.
fxb (float) – Current global best particles function/fitness value.
- Returns
New population of particles.
New populations function/fitness values.
New global best particle.
New global best particle function/fitness value.
Additional arguments.
Additional keyword arguments.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, list, dict]
See also
niapy.algorithm.basic.WeightedVelocityClampingParticleSwarmAlgorithm.run_iteration()
- class niapy.algorithms.basic.OppositionVelocityClampingParticleSwarmOptimization(p0=0.3, w_min=0.4, w_max=0.9, sigma=0.1, c1=1.49612, c2=1.49612, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.pso.ParticleSwarmAlgorithm
Implementation of Opposition-Based Particle Swarm Optimization with Velocity Clamping.
- Algorithm:
Opposition-Based Particle Swarm Optimization with Velocity Clamping
- Date:
2019
- Authors:
Klemen Berkovič
- License:
MIT
- Reference paper:
Shahzad, Farrukh, et al. “Opposition-based particle swarm optimization with velocity clamping (OVCPSO).” Advances in Computational Intelligence. Springer, Berlin, Heidelberg, 2009. 339-348
- Variables
p0 – Probability of opposite learning phase.
w_min – Minimum inertial weight.
w_max – Maximum inertial weight.
sigma – Velocity scaling factor.
Initialize OppositionVelocityClampingParticleSwarmOptimization.
- Parameters
p0 (float) – Probability of running Opposite learning.
w_min (numpy.ndarray) – Minimal value of weights.
w_max (numpy.ndarray) – Maximum value of weights.
sigma (numpy.ndarray) – Velocity range factor.
c1 (float) – Cognitive component.
c2 (float) – Social component.
See also
niapy.algorithm.basic.ParticleSwarmAlgorithm.__init__()
- Name = ['OppositionVelocityClampingParticleSwarmOptimization', 'OVCPSO']¶
- __init__(p0=0.3, w_min=0.4, w_max=0.9, sigma=0.1, c1=1.49612, c2=1.49612, *args, **kwargs)[source]¶
Initialize OppositionVelocityClampingParticleSwarmOptimization.
- Parameters
p0 (float) – Probability of running Opposite learning.
w_min (numpy.ndarray) – Minimal value of weights.
w_max (numpy.ndarray) – Maximum value of weights.
sigma (numpy.ndarray) – Velocity range factor.
c1 (float) – Cognitive component.
c2 (float) – Social component.
See also
niapy.algorithm.basic.ParticleSwarmAlgorithm.__init__()
- get_parameters()[source]¶
Get value of parameters for this instance of algorithm.
- Returns
Dictionary which has parameters mapped to values.
- Return type
Dict[str, Union[int, float, numpy.ndarray]]
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
See also
- init_population(task)[source]¶
Initialize starting population and dynamic parameters.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized population.
Initialized populations function/fitness values.
Additional arguments.
- Additional keyword arguments:
personal_best (numpy.ndarray): particles best population.
personal_best_fitness (numpy.ndarray[float]): particles best positions function/fitness value.
vMin (numpy.ndarray): Minimal velocity.
vMax (numpy.ndarray): Maximal velocity.
V (numpy.ndarray): Initial velocity of particle.
S_u (numpy.ndarray): upper bound for opposite learning.
S_l (numpy.ndarray): lower bound for opposite learning.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, list, dict]
- static opposite_learning(s_l, s_h, pop, fpop, task)[source]¶
Run opposite learning phase.
- Parameters
s_l (numpy.ndarray) – lower limit of opposite particles.
s_h (numpy.ndarray) – upper limit of opposite particles.
pop (numpy.ndarray) – Current populations positions.
fpop (numpy.ndarray) – Current populations functions/fitness values.
task (Task) – Optimization task.
- Returns
New particles position
New particles function/fitness values
New best position of opposite learning phase
New best function/fitness value of opposite learning phase
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float]
- run_iteration(task, pop, fpop, xb, fxb, **params)[source]¶
Core function of Opposition-based Particle Swarm Optimization with velocity clamping algorithm.
- Parameters
task (Task) – Optimization task.
pop (numpy.ndarray) – Current population.
fpop (numpy.ndarray) – Current populations function/fitness values.
xb (numpy.ndarray) – Current global best position.
fxb (float) – Current global best positions function/fitness value.
- Returns
New population.
New populations function/fitness values.
New global best position.
New global best positions function/fitness value.
Additional arguments.
- Additional keyword arguments:
personal_best: particles best population.
personal_best_fitness: particles best positions function/fitness value.
min_velocity: Minimal velocity.
max_velocity: Maximal velocity.
v: Initial velocity of particle.
s_h: upper bound for opposite learning.
s_l: lower bound for opposite learning.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, list, dict]
- set_parameters(p0=0.3, w_min=0.4, w_max=0.9, sigma=0.1, c1=1.49612, c2=1.49612, **kwargs)[source]¶
Set core algorithm parameters.
- Parameters
p0 (float) – Probability of running Opposite learning.
w_min (numpy.ndarray) – Minimal value of weights.
w_max (numpy.ndarray) – Maximum value of weights.
sigma (numpy.ndarray) – Velocity range factor.
c1 (float) – Cognitive component.
c2 (float) – Social component.
See also
niapy.algorithm.basic.ParticleSwarmAlgorithm.set_parameters()
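Illustrative sketch only; the parameter values mirror the documented defaults, while the Task and Rastrigin imports assume the NiaPy 2.x interfaces:
>>> from niapy.task import Task
>>> from niapy.problems import Rastrigin
>>> from niapy.algorithms.basic import OppositionVelocityClampingParticleSwarmOptimization
>>> algo = OppositionVelocityClampingParticleSwarmOptimization(p0=0.3, w_min=0.4, w_max=0.9, sigma=0.1)
>>> best_x, best_fitness = algo.run(Task(problem=Rastrigin(dimension=10), max_iters=150))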
- class niapy.algorithms.basic.ParticleSwarmAlgorithm(population_size=25, c1=2.0, c2=2.0, w=0.7, min_velocity=-1.5, max_velocity=1.5, repair=<function reflect>, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Particle Swarm Optimization algorithm.
- Algorithm:
Particle Swarm Optimization algorithm
- Date:
2018
- Authors:
Lucija Brezočnik, Grega Vrbančič, Iztok Fister Jr. and Klemen Berkovič
- License:
MIT
- Reference paper:
Kennedy, J. and Eberhart, R. “Particle Swarm Optimization”. Proceedings of IEEE International Conference on Neural Networks. IV. pp. 1942–1948, 1995.
- Variables
Name (List[str]) – List of strings representing algorithm names
c1 (float) – Cognitive component.
c2 (float) – Social component.
w (Union[float, numpy.ndarray[float]]) – Inertial weight.
min_velocity (Union[float, numpy.ndarray[float]]) – Minimal velocity.
max_velocity (Union[float, numpy.ndarray[float]]) – Maximal velocity.
repair (Callable[[numpy.ndarray, numpy.ndarray, numpy.ndarray, Optional[numpy.random.Generator]], numpy.ndarray]) – Repair method for velocity.
See also
Initialize ParticleSwarmAlgorithm.
- Parameters
population_size (int) – Population size
c1 (float) – Cognitive component.
c2 (float) – Social component.
w (Union[float, numpy.ndarray]) – Inertial weight.
min_velocity (Union[float, numpy.ndarray]) – Minimal velocity.
max_velocity (Union[float, numpy.ndarray]) – Maximal velocity.
repair (Callable[[np.ndarray, np.ndarray, np.ndarray, dict], np.ndarray]) – Repair method for velocity.
- Name = ['WeightedVelocityClampingParticleSwarmAlgorithm', 'WVCPSO']¶
- __init__(population_size=25, c1=2.0, c2=2.0, w=0.7, min_velocity=-1.5, max_velocity=1.5, repair=<function reflect>, *args, **kwargs)[source]¶
Initialize ParticleSwarmAlgorithm.
- Parameters
population_size (int) – Population size
c1 (float) – Cognitive component.
c2 (float) – Social component.
w (Union[float, numpy.ndarray]) – Inertial weight.
min_velocity (Union[float, numpy.ndarray]) – Minimal velocity.
max_velocity (Union[float, numpy.ndarray]) – Maximal velocity.
repair (Callable[[np.ndarray, np.ndarray, np.ndarray, dict], np.ndarray]) – Repair method for velocity.
- get_parameters()[source]¶
Get value of parameters for this instance of algorithm.
- Returns
Dictionary which has parameters mapped to values.
- Return type
Dict[str, Union[int, float, numpy.ndarray]]
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
See also
- init(task)[source]¶
Initialize dynamic arguments of Particle Swarm Optimization algorithm.
- Parameters
task (Task) – Optimization task.
- Returns
w (numpy.ndarray): Inertial weight.
min_velocity (numpy.ndarray): Minimal velocity.
max_velocity (numpy.ndarray): Maximal velocity.
v (numpy.ndarray): Initial velocity of particle.
- Return type
Dict[str, Union[float, numpy.ndarray]]
- init_population(task)[source]¶
Initialize population and dynamic arguments of the Particle Swarm Optimization algorithm.
- Parameters
task – Optimization task.
- Returns
Initial population.
Initial population fitness/function values.
Additional arguments.
- Additional keyword arguments:
personal_best (numpy.ndarray): particles best population.
personal_best_fitness (numpy.ndarray[float]): particles best positions function/fitness value.
w (numpy.ndarray): Inertial weight.
min_velocity (numpy.ndarray): Minimal velocity.
max_velocity (numpy.ndarray): Maximal velocity.
v (numpy.ndarray): Initial velocity of particle.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, list, dict]
- run_iteration(task, pop, fpop, xb, fxb, **params)[source]¶
Core function of Particle Swarm Optimization algorithm.
- Parameters
task (Task) – Optimization task.
pop (numpy.ndarray) – Current populations.
fpop (numpy.ndarray) – Current population fitness/function values.
xb (numpy.ndarray) – Current best particle.
fxb (float) – Current best particle fitness/function value.
params (dict) – Additional function keyword arguments.
- Returns
New population.
New population fitness/function values.
New global best position.
New global best positions function/fitness value.
Additional arguments.
- Additional keyword arguments:
personal_best (numpy.ndarray): Particles best population.
personal_best_fitness (numpy.ndarray[float]): Particles best positions function/fitness value.
w (numpy.ndarray): Inertial weight.
min_velocity (numpy.ndarray): Minimal velocity.
max_velocity (numpy.ndarray): Maximal velocity.
v (numpy.ndarray): Initial velocity of particle.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, dict]
See also
niapy.algorithms.algorithm.Algorithm.run_iteration
- set_parameters(population_size=25, c1=2.0, c2=2.0, w=0.7, min_velocity=-1.5, max_velocity=1.5, repair=<function reflect>, **kwargs)[source]¶
Set Particle Swarm Algorithm main parameters.
- Parameters
population_size (int) – Population size
c1 (float) – Cognitive component.
c2 (float) – Social component.
w (Union[float, numpy.ndarray]) – Inertial weight.
min_velocity (Union[float, numpy.ndarray]) – Minimal velocity.
max_velocity (Union[float, numpy.ndarray]) – Maximal velocity.
repair (Callable[[np.ndarray, np.ndarray, np.ndarray, dict], np.ndarray]) – Repair method for velocity.
- update_velocity(v, p, pb, gb, w, min_velocity, max_velocity, task, **kwargs)[source]¶
Update particle velocity.
- Parameters
v (numpy.ndarray) – Current velocity of particle.
p (numpy.ndarray) – Current position of particle.
pb (numpy.ndarray) – Personal best position of particle.
gb (numpy.ndarray) – Global best position of particle.
w (Union[float, numpy.ndarray]) – Weights for velocity adjustment.
min_velocity (numpy.ndarray) – Minimal velocity allowed.
max_velocity (numpy.ndarray) – Maximal velocity allowed.
task (Task) – Optimization task.
kwargs – Additional arguments.
- Returns
Updated velocity of particle.
- Return type
numpy.ndarray
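A typical run of ParticleSwarmAlgorithm might look as follows; this is a sketch, assuming the NiaPy 2.x Task/problem interfaces, with illustrative parameter values:
>>> from niapy.task import Task
>>> from niapy.problems import Sphere
>>> from niapy.algorithms.basic import ParticleSwarmAlgorithm
>>> task = Task(problem=Sphere(dimension=10), max_iters=100)  # 10-dimensional sphere, 100 iterations
>>> algo = ParticleSwarmAlgorithm(population_size=40, c1=2.0, c2=2.0, w=0.7, min_velocity=-1.5, max_velocity=1.5)
>>> best_x, best_fitness = algo.run(task)  # best position found and its function value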
- class niapy.algorithms.basic.ParticleSwarmOptimization(*args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.pso.ParticleSwarmAlgorithm
Implementation of Particle Swarm Optimization algorithm.
- Algorithm:
Particle Swarm Optimization algorithm
- Date:
2018
- Authors:
Lucija Brezočnik, Grega Vrbančič, Iztok Fister Jr. and Klemen Berkovič
- License:
MIT
- Reference paper:
Kennedy, J. and Eberhart, R. “Particle Swarm Optimization”. Proceedings of IEEE International Conference on Neural Networks. IV. pp. 1942–1948, 1995.
- Variables
Name (List[str]) – List of strings representing algorithm names
See also
niapy.algorithms.basic.WeightedVelocityClampingParticleSwarmAlgorithm
Initialize ParticleSwarmOptimization.
- Name = ['ParticleSwarmAlgorithm', 'PSO']¶
- class niapy.algorithms.basic.SineCosineAlgorithm(population_size=25, a=3, r_min=0, r_max=2, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of sine cosine algorithm.
- Algorithm:
Sine Cosine Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
https://www.sciencedirect.com/science/article/pii/S0950705115005043
- Reference paper:
Seyedali Mirjalili, SCA: A Sine Cosine Algorithm for solving optimization problems, Knowledge-Based Systems, Volume 96, 2016, Pages 120-133, ISSN 0950-7051, https://doi.org/10.1016/j.knosys.2015.12.022.
- Variables
Name (List[str]) – List of string representing algorithm names.
a (float) – Parameter for control in \(r_1\) value
r_min (float) – Minimum value for \(r_3\) value
r_max (float) – Maximum value for \(r_3\) value
See also
Initialize SineCosineAlgorithm.
- Parameters
population_size (Optional[int]) – Number of individuals in the population
a (Optional[float]) – Parameter for control in \(r_1\) value
r_min (Optional[float]) – Minimum value for \(r_3\) value
r_max (Optional[float]) – Maximum value for \(r_3\) value
See also
niapy.algorithms.algorithm.Algorithm.__init__()
- Name = ['SineCosineAlgorithm', 'SCA']¶
- __init__(population_size=25, a=3, r_min=0, r_max=2, *args, **kwargs)[source]¶
Initialize SineCosineAlgorithm.
- Parameters
population_size (Optional[int]) – Number of individuals in the population
a (Optional[float]) – Parameter for control in \(r_1\) value
r_min (Optional[float]) – Minimum value for \(r_3\) value
r_max (Optional[float]) – Maximum value for \(r_3\) value
See also
niapy.algorithms.algorithm.Algorithm.__init__()
- get_parameters()[source]¶
Get algorithm parameter values.
- Returns
Algorithm parameter values.
- Return type
Dict[str, Any]
See also
niapy.algorithms.algorithm.Algorithm.get_parameters()
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
See also
- next_position(x, best_x, r1, r2, r3, r4, task)[source]¶
Move individual to new position in search space.
- Parameters
x (numpy.ndarray) – Individual represented with components.
best_x (numpy.ndarray) – Best individual represented with components.
r1 (float) – Number dependent on algorithm iteration/generations.
r2 (float) – Random number in range of 0 and 2 * PI.
r3 (float) – Random number in range [r_min, r_max].
r4 (float) – Random number in range [0, 1].
task (Task) – Optimization task.
- Returns
New individual that is moved based on individual x.
- Return type
numpy.ndarray
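For orientation, the update performed by next_position follows the sine cosine rule of the reference paper: a component moves to \(x_i + r_1 \sin(r_2) |r_3 x^{best}_i - x_i|\) when \(r_4 < 0.5\), and to \(x_i + r_1 \cos(r_2) |r_3 x^{best}_i - x_i|\) otherwise (Mirjalili, 2016). This is a summary of the published equation, not a description of the exact code path.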
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of Sine Cosine Algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population individuals.
population_fitness (numpy.ndarray[float]) – Current population individuals function/fitness values.
best_x (numpy.ndarray) – Current best solution to optimization task.
best_fitness (float) – Current best function/fitness value.
params (Dict[str, Any]) – Additional parameters.
- Returns
New population.
New populations fitness/function values.
New global best solution.
New global best fitness/objective value.
Additional arguments.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- set_parameters(population_size=25, a=3, r_min=0, r_max=2, **kwargs)[source]¶
Set the arguments of an algorithm.
- Parameters
population_size (Optional[int]) – Number of individuals in the population
a (Optional[float]) – Parameter for control in \(r_1\) value
r_min (Optional[float]) – Minimum value for \(r_3\) value
r_max (Optional[float]) – Maximum value for \(r_3\) value
See also
niapy.algorithms.algorithm.Algorithm.set_parameters()
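A short illustrative run of SineCosineAlgorithm, assuming the NiaPy 2.x Task/problem interfaces; the evaluation budget is arbitrary:
>>> from niapy.task import Task
>>> from niapy.problems import Sphere
>>> from niapy.algorithms.basic import SineCosineAlgorithm
>>> algo = SineCosineAlgorithm(population_size=25, a=3, r_min=0, r_max=2)
>>> best_x, best_fitness = algo.run(Task(problem=Sphere(dimension=10), max_evals=5000))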
- niapy.algorithms.basic.multi_mutations(pop, i, xb, differential_weight, crossover_probability, rng, task, individual_type, strategies, **_kwargs)[source]¶
Mutation strategy that takes more than one strategy and applies them to individual.
- Parameters
pop (numpy.ndarray[Individual]) – Current population.
i (int) – Index of current individual.
xb (Individual) – Current best individual.
differential_weight (float) – Scale factor.
crossover_probability (float) – Crossover probability.
rng (numpy.random.Generator) – Random generator.
task (Task) – Optimization task.
individual_type (Type[Individual]) – Individual type used in algorithm.
strategies (Iterable[Callable[[numpy.ndarray[Individual], int, Individual, float, float, numpy.random.Generator], numpy.ndarray[Individual]]]) – List of mutation strategies.
- Returns
Best individual from applied mutations strategies.
- Return type
Individual
niapy.algorithms.modified
¶
Implementation of modified nature-inspired algorithms.
- class niapy.algorithms.modified.AdaptiveBatAlgorithm(population_size=100, starting_loudness=0.5, epsilon=0.001, alpha=1.0, pulse_rate=0.5, min_frequency=0.0, max_frequency=2.0, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Adaptive bat algorithm.
- Algorithm:
Adaptive bat algorithm
- Date:
April 2019
- Authors:
Klemen Berkovič
- License:
MIT
- Variables
Name (List[str]) – List of strings representing algorithm name.
epsilon (float) – Scaling factor.
alpha (float) – Constant for updating loudness.
pulse_rate (float) – Pulse rate.
min_frequency (float) – Minimum frequency.
max_frequency (float) – Maximum frequency.
See also
Initialize AdaptiveBatAlgorithm.
- Parameters
population_size (Optional[int]) – Population size.
starting_loudness (Optional[float]) – Starting loudness.
epsilon (Optional[float]) – Scaling factor.
alpha (Optional[float]) – Constant for updating loudness.
pulse_rate (Optional[float]) – Pulse rate.
min_frequency (Optional[float]) – Minimum frequency.
max_frequency (Optional[float]) – Maximum frequency.
- Name = ['AdaptiveBatAlgorithm', 'ABA']¶
- __init__(population_size=100, starting_loudness=0.5, epsilon=0.001, alpha=1.0, pulse_rate=0.5, min_frequency=0.0, max_frequency=2.0, *args, **kwargs)[source]¶
Initialize AdaptiveBatAlgorithm.
- Parameters
population_size (Optional[int]) – Population size.
starting_loudness (Optional[float]) – Starting loudness.
epsilon (Optional[float]) – Scaling factor.
alpha (Optional[float]) – Constant for updating loudness.
pulse_rate (Optional[float]) – Pulse rate.
min_frequency (Optional[float]) – Minimum frequency.
max_frequency (Optional[float]) – Maximum frequency.
- get_parameters()[source]¶
Get algorithm parameters.
- Returns
Arguments values.
- Return type
Dict[str, Any]
See also
niapy.algorithms.algorithm.Algorithm.get_parameters()
- static info()[source]¶
Get basic information about the algorithm.
- Returns
Basic information.
- Return type
str
See also
- init_population(task)[source]¶
Initialize the starting population.
- Parameters
task (Task) – Optimization task
- Returns
New population.
New population fitness/function values.
- Additional arguments:
loudness (float): Loudness.
velocities (numpy.ndarray[float]): Velocity.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], Dict[str, Any]]
- local_search(best, loudness, task, **kwargs)[source]¶
Improve the best solution according to Yang (2010).
- Parameters
best (numpy.ndarray) – Global best individual.
loudness (float) – Loudness.
task (Task) – Optimization task.
- Returns
New solution based on global best individual.
- Return type
numpy.ndarray
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of Bat Algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population
population_fitness (numpy.ndarray[float]) – Current population fitness/function values
best_x (numpy.ndarray) – Current best individual
best_fitness (float) – Current best individual function/fitness value
params (Dict[str, Any]) – Additional algorithm arguments
- Returns
New population
New population fitness/function values
- Additional arguments:
loudness (numpy.ndarray[float]): Loudness.
velocities (numpy.ndarray[float]): Velocities.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], Dict[str, Any]]
- set_parameters(population_size=100, starting_loudness=0.5, epsilon=0.001, alpha=1.0, pulse_rate=0.5, min_frequency=0.0, max_frequency=2.0, **kwargs)[source]¶
Set the parameters of the algorithm.
- Parameters
population_size (Optional[int]) – Population size.
starting_loudness (Optional[float]) – Starting loudness.
epsilon (Optional[float]) – Scaling factor.
alpha (Optional[float]) – Constant for updating loudness.
pulse_rate (Optional[float]) – Pulse rate.
min_frequency (Optional[float]) – Minimum frequency.
max_frequency (Optional[float]) – Maximum frequency.
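Illustrative usage of AdaptiveBatAlgorithm; a sketch in which the imports assume NiaPy 2.x and alpha deliberately differs from the default to show loudness decay:
>>> from niapy.task import Task
>>> from niapy.problems import Griewank
>>> from niapy.algorithms.modified import AdaptiveBatAlgorithm
>>> algo = AdaptiveBatAlgorithm(population_size=100, starting_loudness=0.5, alpha=0.99)
>>> best_x, best_fitness = algo.run(Task(problem=Griewank(dimension=10), max_iters=300))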
- class niapy.algorithms.modified.DifferentialEvolutionMTS(population_size=40, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.de.DifferentialEvolution
,niapy.algorithms.other.mts.MultipleTrajectorySearch
Implementation of Differential Evolution with MTS local searches.
- Algorithm:
Differential Evolution with MTS local searches
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Variables
Name (List[str]) – List of strings representing algorithm names.
See also
Initialize DifferentialEvolutionMTS.
- Name = ['DifferentialEvolutionMTS', 'DEMTS']¶
- static info()[source]¶
Get basic information about the algorithm.
- Returns
Basic information.
- Return type
str
See also
- post_selection(population, task, xb, fxb, **kwargs)[source]¶
Post selection operator.
- Parameters
population (numpy.ndarray) – Current population.
task (Task) – Optimization task.
xb (numpy.ndarray) – Global best individual.
fxb (float) – Global best fitness.
- Returns
New population.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, float]
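A minimal sketch for DifferentialEvolutionMTS under the same assumptions as the examples above (NiaPy 2.x Task interface, illustrative budget):
>>> from niapy.task import Task
>>> from niapy.problems import Sphere
>>> from niapy.algorithms.modified import DifferentialEvolutionMTS
>>> algo = DifferentialEvolutionMTS(population_size=40)
>>> best_x, best_fitness = algo.run(Task(problem=Sphere(dimension=10), max_evals=20000))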
- class niapy.algorithms.modified.DifferentialEvolutionMTSv1(*args, **kwargs)[source]¶
Bases:
niapy.algorithms.modified.hde.DifferentialEvolutionMTS
Implementation of Differential Evolution with MTSv1 local searches.
- Algorithm:
Differential Evolution with MTSv1 local searches
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Variables
Name (List[str]) – List of strings representing algorithm name.
Initialize DifferentialEvolutionMTSv1.
- Name = ['DifferentialEvolutionMTSv1', 'DEMTSv1']¶
- class niapy.algorithms.modified.DynNpDifferentialEvolutionMTS(*args, **kwargs)[source]¶
Bases:
niapy.algorithms.modified.hde.DifferentialEvolutionMTS
,niapy.algorithms.basic.de.DynNpDifferentialEvolution
Implementation of Differential Evolution with MTS local searches and dynamic population size.
- Algorithm:
Differential Evolution with MTS local searches and dynamic population size
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Variables
Name (List[str]) – List of strings representing algorithm name
See also
Initialize DynNpDifferentialEvolutionMTS.
- Name = ['DynNpDifferentialEvolutionMTS', 'dynNpDEMTS']¶
- static info()[source]¶
Get basic information about the algorithm.
- Returns
Basic information.
- Return type
str
See also
- post_selection(population, task, xb, fxb, **kwargs)[source]¶
Post selection operator.
- Parameters
population (numpy.ndarray) – Current population.
task (Task) – Optimization task.
xb (numpy.ndarray) – Global best individual.
fxb (float) – Global best fitness.
- Returns
New population.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, float]
- set_parameters(p_max=10, rp=3, **kwargs)[source]¶
Set core parameters of DynNpDifferentialEvolutionMTS algorithm.
- Parameters
p_max (Optional[int]) –
rp (Optional[float]) –
See also
niapy.algorithms.modified.hde.DifferentialEvolutionMTS.set_parameters()
niapy.algorithms.basic.de.DynNpDifferentialEvolution.set_parameters()
- class niapy.algorithms.modified.DynNpDifferentialEvolutionMTSv1(*args, **kwargs)[source]¶
Bases:
niapy.algorithms.modified.hde.DynNpDifferentialEvolutionMTS
Implementation of Differential Evolution with MTSv1 local searches and dynamic population size.
- Algorithm:
Differential Evolution with MTSv1 local searches and dynamic population size
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Variables
Name (List[str]) – List of strings representing algorithm name.
Initialize DynNpDifferentialEvolutionMTSv1.
- Name = ['DynNpDifferentialEvolutionMTSv1', 'dynNpDEMTSv1']¶
- class niapy.algorithms.modified.DynNpMultiStrategyDifferentialEvolutionMTS(*args, **kwargs)[source]¶
Bases:
niapy.algorithms.modified.hde.MultiStrategyDifferentialEvolutionMTS
,niapy.algorithms.modified.hde.DynNpDifferentialEvolutionMTS
Implementation of Differential Evolution with MTS local searches, multiple mutation strategies and dynamic population size.
- Algorithm:
Differential Evolution with MTS local searches, multiple mutation strategies and dynamic population size
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Variables
Name (List[str]) – List of strings representing algorithm name
See also
Initialize DynNpMultiStrategyDifferentialEvolutionMTS.
- Name = ['DynNpMultiStrategyDifferentialEvolutionMTS', 'dynNpMSDEMTS']¶
- class niapy.algorithms.modified.DynNpMultiStrategyDifferentialEvolutionMTSv1(*args, **kwargs)[source]¶
Bases:
niapy.algorithms.modified.hde.DynNpMultiStrategyDifferentialEvolutionMTS
Implementation of Differential Evolution with MTSv1 local searches, multiple mutation strategies and dynamic population size.
- Algorithm:
Differential Evolution with MTSv1 local searches, multiple mutation strategies and dynamic population size
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Variables
Name (List[str]) – List of strings representing algorithm name.
See also
niapy.algorithm.modified.DynNpMultiStrategyDifferentialEvolutionMTS
Initialize DynNpMultiStrategyDifferentialEvolutionMTSv1.
- Name = ['DynNpMultiStrategyDifferentialEvolutionMTSv1', 'dynNpMSDEMTSv1']¶
- class niapy.algorithms.modified.HybridBatAlgorithm(differential_weight=0.5, crossover_probability=0.9, strategy=<function cross_best1>, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.ba.BatAlgorithm
Implementation of Hybrid bat algorithm.
- Algorithm:
Hybrid bat algorithm
- Date:
2018
- Author:
Grega Vrbančič and Klemen Berkovič
- License:
MIT
- Reference paper:
Fister Jr., Iztok and Fister, Dusan and Yang, Xin-She. “A Hybrid Bat Algorithm”. Elektrotehniški vestnik, 2013. 1-7.
- Variables
Name (List[str]) – List of strings representing algorithm name.
F (float) – Scaling factor.
CR (float) – Crossover rate.
See also
Initialize HybridBatAlgorithm.
- Parameters
differential_weight (Optional[float]) – Differential weight.
crossover_probability (Optional[float]) – Crossover rate.
strategy (Optional[Callable]) – DE Crossover and mutation strategy.
- Name = ['HybridBatAlgorithm', 'HBA']¶
- __init__(differential_weight=0.5, crossover_probability=0.9, strategy=<function cross_best1>, *args, **kwargs)[source]¶
Initialize HybridBatAlgorithm.
- Parameters
differential_weight (Optional[float]) – Differential weight.
crossover_probability (Optional[float]) – Crossover rate.
strategy (Optional[Callable]) – DE Crossover and mutation strategy.
- static info()[source]¶
Get basic information about the algorithm.
- Returns
Basic information.
- Return type
str
See also
- local_search(best, task, i=None, population=None, **kwargs)[source]¶
Improve the best solution.
- Parameters
best (numpy.ndarray) – Global best individual.
task (Task) – Optimization task.
i (int) – Index of current individual.
population (numpy.ndarray) – Current best population.
- Returns
New solution based on global best individual.
- Return type
numpy.ndarray
- set_parameters(differential_weight=0.5, crossover_probability=0.9, strategy=<function cross_best1>, **kwargs)[source]¶
Set core parameters of HybridBatAlgorithm algorithm.
- Parameters
differential_weight (Optional[float]) – Differential weight.
crossover_probability (Optional[float]) – Crossover rate.
strategy (Callable) – DE Crossover and mutation strategy.
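Hedged example of plugging a different DE strategy into HybridBatAlgorithm; the import path niapy.algorithms.basic.de for cross_rand1 is an assumption based on the defaults shown above:
>>> from niapy.task import Task
>>> from niapy.problems import Ackley
>>> from niapy.algorithms.modified import HybridBatAlgorithm
>>> from niapy.algorithms.basic.de import cross_rand1  # assumed import path for DE strategies
>>> algo = HybridBatAlgorithm(differential_weight=0.5, crossover_probability=0.9, strategy=cross_rand1)
>>> best_x, best_fitness = algo.run(Task(problem=Ackley(dimension=10), max_iters=250))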
- class niapy.algorithms.modified.HybridSelfAdaptiveBatAlgorithm(differential_weight=0.9, crossover_probability=0.85, strategy=<function cross_best1>, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.modified.saba.SelfAdaptiveBatAlgorithm
Implementation of Hybrid self adaptive bat algorithm.
- Algorithm:
Hybrid self adaptive bat algorithm
- Date:
April 2019
- Author:
Klemen Berkovič
- License:
MIT
- Reference paper:
Fister, Iztok, Simon Fong, and Janez Brest. “A novel hybrid self-adaptive bat algorithm.” The Scientific World Journal 2014 (2014).
- Reference URL:
- Variables
Name (List[str]) – List of strings representing algorithm name.
F (float) – Scaling factor for local search.
CR (float) – Probability of crossover for local search.
CrossMutt (Callable[[numpy.ndarray, int, numpy.ndarray, float, float, numpy.random.Generator, Dict[str, Any]], numpy.ndarray]) – Local search method based on a Differential Evolution strategy.
See also
Initialize HybridSelfAdaptiveBatAlgorithm.
- Parameters
differential_weight (Optional[float]) – Scaling factor for local search.
crossover_probability (Optional[float]) – Probability of crossover for local search.
strategy (Optional[Callable[[numpy.ndarray, int, numpy.ndarray, float, float, numpy.random.Generator, Dict[str, Any]], numpy.ndarray]]) – Local search method based on a Differential Evolution strategy.
- Name = ['HybridSelfAdaptiveBatAlgorithm', 'HSABA']¶
- __init__(differential_weight=0.9, crossover_probability=0.85, strategy=<function cross_best1>, *args, **kwargs)[source]¶
Initialize HybridSelfAdaptiveBatAlgorithm.
- Parameters
differential_weight (Optional[float]) – Scaling factor for local search.
crossover_probability (Optional[float]) – Probability of crossover for local search.
strategy (Optional[Callable[[numpy.ndarray, int, numpy.ndarray, float, float, numpy.random.Generator, Dict[str, Any]], numpy.ndarray]]) – Local search method based on a Differential Evolution strategy.
- get_parameters()[source]¶
Get parameters of the algorithm.
- Returns
Parameters of the algorithm.
- Return type
Dict[str, Any]
- static info()[source]¶
Get basic information about the algorithm.
- Returns
Basic information.
- Return type
str
See also
- local_search(best, loudness, task, i=None, population=None, **kwargs)[source]¶
Improve the best solution.
- Parameters
best (numpy.ndarray) – Global best individual.
loudness (float) – Loudness.
task (Task) – Optimization task.
i (int) – Index of current individual.
population (numpy.ndarray) – Current best population.
- Returns
New solution based on global best individual.
- Return type
numpy.ndarray
- set_parameters(differential_weight=0.9, crossover_probability=0.85, strategy=<function cross_best1>, **kwargs)[source]¶
Set core parameters of HybridSelfAdaptiveBatAlgorithm algorithm.
- Parameters
differential_weight (Optional[float]) – Scaling factor for local search.
crossover_probability (Optional[float]) – Probability of crossover for local search.
strategy (Optional[Callable[[numpy.ndarray, int, numpy.ndarray, float, float, numpy.random.Generator, Dict[str, Any]], numpy.ndarray]]) – Local search method based on a Differential Evolution strategy.
- class niapy.algorithms.modified.MultiStrategyDifferentialEvolutionMTS(*args, **kwargs)[source]¶
Bases:
niapy.algorithms.modified.hde.DifferentialEvolutionMTS
,niapy.algorithms.basic.de.MultiStrategyDifferentialEvolution
Implementation of Differential Evolution with MTS local searches and multiple mutation strategies.
- Algorithm:
Differential Evolution with MTS local searches and multiple mutation strategies
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Variables
Name (List[str]) – List of strings representing algorithm name.
See also
Initialize MultiStrategyDifferentialEvolutionMTS.
- Name = ['MultiStrategyDifferentialEvolutionMTS', 'MSDEMTS']¶
- evolve(pop, xb, task, **kwargs)[source]¶
Evolve population.
- Parameters
pop (numpy.ndarray[Individual]) – Current population of individuals.
xb (numpy.ndarray) – Global best individual.
task (Task) – Optimization task.
- Returns
Evolved population.
- Return type
numpy.ndarray[Individual]
- class niapy.algorithms.modified.MultiStrategyDifferentialEvolutionMTSv1(*args, **kwargs)[source]¶
Bases:
niapy.algorithms.modified.hde.MultiStrategyDifferentialEvolutionMTS
Implementation of Differential Evolution with MTSv1 local searches and multiple mutation strategies.
- Algorithm:
Differential Evolution with MTSv1 local searches and multiple mutation strategies
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Variables
Name (List[str]) – List of strings representing algorithm name.
Initialize MultiStrategyDifferentialEvolutionMTSv1.
- Name = ['MultiStrategyDifferentialEvolutionMTSv1', 'MSDEMTSv1']¶
- class niapy.algorithms.modified.MultiStrategySelfAdaptiveDifferentialEvolution(strategies=(<function cross_curr2rand1>, <function cross_curr2best1>, <function cross_rand1>, <function cross_best1>, <function cross_best2>), *args, **kwargs)[source]¶
Bases:
niapy.algorithms.modified.jde.SelfAdaptiveDifferentialEvolution
Implementation of self-adaptive differential evolution algorithm with multiple mutation strategies.
- Algorithm:
Self-adaptive differential evolution algorithm with multiple mutation strategies
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Variables
Name (List[str]) – List of strings representing algorithm name
Initialize MultiStrategySelfAdaptiveDifferentialEvolution.
- Parameters
strategies (Optional[Iterable[Callable]]) – Mutations strategies to use in algorithm.
- Name = ['MultiStrategySelfAdaptiveDifferentialEvolution', 'MsjDE']¶
- __init__(strategies=(<function cross_curr2rand1>, <function cross_curr2best1>, <function cross_rand1>, <function cross_best1>, <function cross_best2>), *args, **kwargs)[source]¶
Initialize MultiStrategySelfAdaptiveDifferentialEvolution.
- Parameters
strategies (Optional[Iterable[Callable]]) – Mutations strategies to use in algorithm.
- evolve(pop, xb, task, **kwargs)[source]¶
Evolve the population with the help of multiple mutation strategies.
- Parameters
pop (numpy.ndarray[Individual]) – Current population.
xb (Individual) – Current best individual.
task (Task) – Optimization task.
- Returns
New population of individuals.
- Return type
numpy.ndarray[Individual]
- set_parameters(strategies=(<function cross_curr2rand1>, <function cross_curr2best1>, <function cross_rand1>, <function cross_best1>, <function cross_best2>), **kwargs)[source]¶
Set core parameters of MultiStrategySelfAdaptiveDifferentialEvolution algorithm.
- Parameters
strategies (Optional[Iterable[Callable]]) – Mutations strategies to use in algorithm.
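Sketch of restricting MultiStrategySelfAdaptiveDifferentialEvolution to two strategies; again, the niapy.algorithms.basic.de import path for the cross_* functions is assumed, and the budget is illustrative:
>>> from niapy.task import Task
>>> from niapy.problems import Rastrigin
>>> from niapy.algorithms.modified import MultiStrategySelfAdaptiveDifferentialEvolution
>>> from niapy.algorithms.basic.de import cross_rand1, cross_best2  # assumed import path
>>> algo = MultiStrategySelfAdaptiveDifferentialEvolution(strategies=(cross_rand1, cross_best2))
>>> best_x, best_fitness = algo.run(Task(problem=Rastrigin(dimension=20), max_evals=30000))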
- class niapy.algorithms.modified.ParameterFreeBatAlgorithm(*args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Parameter-free Bat algorithm.
- Algorithm:
Parameter-free Bat algorithm
- Date:
2020
- Authors:
Iztok Fister Jr. This implementation is based on the implementation of the basic BA from niapy.
- License:
MIT
- Reference paper:
Iztok Fister Jr., Iztok Fister, Xin-She Yang. Towards the development of a parameter-free bat algorithm. In: FISTER Jr., Iztok (Ed.), BRODNIK, Andrej (Ed.). StuCoSReC : proceedings of the 2015 2nd Student Computer Science Research Conference. Koper: University of Primorska, 2015, pp. 31-34.
- Variables
Name (List[str]) – List of strings representing algorithm name.
See also
Initialize ParameterFreeBatAlgorithm.
- Name = ['ParameterFreeBatAlgorithm', 'PLBA']¶
- static info()[source]¶
Get algorithms information.
- Returns
Algorithm information.
- Return type
str
See also
- init_population(task)[source]¶
Initialize the initial population.
- Parameters
task (Task) – Optimization task
- Returns
New population.
New population fitness/function values.
- Additional arguments:
velocities (numpy.ndarray[float]): Velocities
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], Dict[str, Any]]
- local_search(best, task, **_kwargs)[source]¶
Improve the best solution according to Yang (2010).
- Parameters
best (numpy.ndarray) – Global best individual.
task (Task) – Optimization task.
- Returns
New solution based on global best individual.
- Return type
numpy.ndarray
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of Parameter-free Bat Algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population
population_fitness (numpy.ndarray[float]) – Current population fitness/function values
best_x (numpy.ndarray) – Current best individual
best_fitness (float) – Current best individual function/fitness value
params (Dict[str, Any]) – Additional algorithm arguments
- Returns
New population
New population fitness/function values
New global best solution
New global best fitness/objective value
- Additional arguments:
velocities (numpy.ndarray): Velocities
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
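Since the algorithm is parameter-free, an illustrative run needs only a task; the sketch below assumes the NiaPy 2.x interfaces:
>>> from niapy.task import Task
>>> from niapy.problems import Sphere
>>> from niapy.algorithms.modified import ParameterFreeBatAlgorithm
>>> algo = ParameterFreeBatAlgorithm()  # no control parameters to tune
>>> best_x, best_fitness = algo.run(Task(problem=Sphere(dimension=10), max_iters=500))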
- class niapy.algorithms.modified.SelfAdaptiveBatAlgorithm(min_loudness=0.9, max_loudness=1.0, min_pulse_rate=0.001, max_pulse_rate=0.1, tao_1=0.1, tao_2=0.1, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.modified.saba.AdaptiveBatAlgorithm
Implementation of Self Adaptive Bat Algorithm.
- Algorithm:
Self Adaptive Bat Algorithm
- Date:
April 2019
- Author:
Klemen Berkovič
- License:
MIT
- Reference paper:
Fister Jr., Iztok and Fister, Dusan and Yang, Xin-She. “A Hybrid Bat Algorithm”. Elektrotehniški vestnik, 2013. 1-7.
- Variables
Name (List[str]) – List of strings representing algorithm name.
A_l (Optional[float]) – Lower limit of loudness.
A_u (Optional[float]) – Upper limit of loudness.
r_l (Optional[float]) – Lower limit of pulse rate.
r_u (Optional[float]) – Upper limit of pulse rate.
tao_1 (Optional[float]) – Learning rate for loudness.
tao_2 (Optional[float]) – Learning rate for pulse rate.
See also
Initialize SelfAdaptiveBatAlgorithm.
- Parameters
min_loudness (Optional[float]) – Lower limit of loudness.
max_loudness (Optional[float]) – Upper limit of loudness.
min_pulse_rate (Optional[float]) – Lower limit of pulse rate.
max_pulse_rate (Optional[float]) – Upper limit of pulse rate.
tao_1 (Optional[float]) – Learning rate for loudness.
tao_2 (Optional[float]) – Learning rate for pulse rate.
- Name = ['SelfAdaptiveBatAlgorithm', 'SABA']¶
- __init__(min_loudness=0.9, max_loudness=1.0, min_pulse_rate=0.001, max_pulse_rate=0.1, tao_1=0.1, tao_2=0.1, *args, **kwargs)[source]¶
Initialize SelfAdaptiveBatAlgorithm.
- Parameters
min_loudness (Optional[float]) – Lower limit of loudness.
max_loudness (Optional[float]) – Upper limit of loudness.
min_pulse_rate (Optional[float]) – Lower limit of pulse rate.
max_pulse_rate (Optional[float]) – Upper limit of pulse rate.
tao_1 (Optional[float]) – Learning rate for loudness.
tao_2 (Optional[float]) – Learning rate for pulse rate.
- get_parameters()[source]¶
Get parameters of the algorithm.
- Returns
Parameters of the algorithm.
- Return type
Dict[str, Any]
- static info()[source]¶
Get basic information about the algorithm.
- Returns
Basic information.
- Return type
str
See also
- init_population(task)[source]¶
Initialize the starting population.
- Parameters
task (Task) – Optimization task
- Returns
New population.
New population fitness/function values.
- Additional arguments:
loudness (float): Loudness.
velocities (numpy.ndarray[float]): Velocity.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], Dict[str, Any]]
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of Bat Algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population
population_fitness (numpy.ndarray[float]) – Current population fitness/function values
best_x (numpy.ndarray) – Current best individual
best_fitness (float) – Current best individual function/fitness value
params (Dict[str, Any]) – Additional algorithm arguments
- Returns
New population
New population fitness/function values
- Additional arguments:
loudness (numpy.ndarray[float]): Loudness.
pulse_rates (numpy.ndarray[float]): Pulse rate.
velocities (numpy.ndarray[float]): Velocities.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], Dict[str, Any]]
- self_adaptation(loudness, pulse_rate)[source]¶
Adaptation step.
- Parameters
loudness (float) – Current loudness.
pulse_rate (float) – Current pulse rate.
- Returns
New loudness.
New pulse rate.
- Return type
Tuple[float, float]
- set_parameters(min_loudness=0.9, max_loudness=1.0, min_pulse_rate=0.001, max_pulse_rate=0.1, tao_1=0.1, tao_2=0.1, **kwargs)[source]¶
Set core parameters of SelfAdaptiveBatAlgorithm algorithm.
- Parameters
min_loudness (Optional[float]) – Lower limit of loudness.
max_loudness (Optional[float]) – Upper limit of loudness.
min_pulse_rate (Optional[float]) – Lower limit of pulse rate.
max_pulse_rate (Optional[float]) – Upper limit of pulse rate.
tao_1 (Optional[float]) – Learning rate for loudness.
tao_2 (Optional[float]) – Learning rate for pulse rate.
- class niapy.algorithms.modified.SelfAdaptiveDifferentialEvolution(f_lower=0.0, f_upper=1.0, tao1=0.4, tao2=0.2, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.basic.de.DifferentialEvolution
Implementation of Self-adaptive differential evolution algorithm.
- Algorithm:
Self-adaptive differential evolution algorithm
- Date:
2018
- Author:
Uros Mlakar and Klemen Berkovič
- License:
MIT
- Reference paper:
Brest, J., Greiner, S., Boskovic, B., Mernik, M., Zumer, V. Self-adapting control parameters in differential evolution: A comparative study on numerical benchmark problems. IEEE transactions on evolutionary computation, 10(6), 646-657, 2006.
- Variables
Name (List[str]) – List of strings representing algorithm name
f_lower (float) – Scaling factor lower limit.
f_upper (float) – Scaling factor upper limit.
tao1 (float) – Change rate for differential_weight parameter update.
tao2 (float) – Change rate for crossover_probability parameter update.
Initialize SelfAdaptiveDifferentialEvolution.
- Parameters
f_lower (Optional[float]) – Scaling factor lower limit.
f_upper (Optional[float]) – Scaling factor upper limit.
tao1 (Optional[float]) – Change rate for differential_weight parameter update.
tao2 (Optional[float]) – Change rate for crossover_probability parameter update.
- Name = ['SelfAdaptiveDifferentialEvolution', 'jDE']¶
- __init__(f_lower=0.0, f_upper=1.0, tao1=0.4, tao2=0.2, *args, **kwargs)[source]¶
Initialize SelfAdaptiveDifferentialEvolution.
- Parameters
f_lower (Optional[float]) – Scaling factor lower limit.
f_upper (Optional[float]) – Scaling factor upper limit.
tao1 (Optional[float]) – Change rate for differential_weight parameter update.
tao2 (Optional[float]) – Change rate for crossover_probability parameter update.
- adaptive_gen(x)[source]¶
Adaptively update the scale factor and crossover probability.
- Parameters
x (IndividualJDE) – Individual to apply function on.
- Returns
New individual with new parameters
- Return type
- evolve(pop, xb, task, **_kwargs)[source]¶
Evolve current population.
- Parameters
pop (numpy.ndarray[Individual]) – Current population.
xb (Individual) – Global best individual.
task (Task) – Optimization task.
- Returns
New population.
- Return type
numpy.ndarray
- get_parameters()[source]¶
Get algorithm parameters.
- Returns
Algorithm parameters.
- Return type
Dict[str, Any]
- static info()[source]¶
Get algorithm information.
- Returns
Algorithm information.
- Return type
str
See also
- set_parameters(f_lower=0.0, f_upper=1.0, tao1=0.4, tao2=0.2, **kwargs)[source]¶
Set the parameters of an algorithm.
- Parameters
f_lower (Optional[float]) – Scaling factor lower limit.
f_upper (Optional[float]) – Scaling factor upper limit.
tao1 (Optional[float]) – Change rate for differential_weight parameter update.
tao2 (Optional[float]) – Change rate for crossover_probability parameter update.
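Illustrative jDE run; a sketch in which the Task and Griewank imports assume NiaPy 2.x and the constructor arguments repeat the documented defaults:
>>> from niapy.task import Task
>>> from niapy.problems import Griewank
>>> from niapy.algorithms.modified import SelfAdaptiveDifferentialEvolution
>>> algo = SelfAdaptiveDifferentialEvolution(f_lower=0.0, f_upper=1.0, tao1=0.4, tao2=0.2)
>>> best_x, best_fitness = algo.run(Task(problem=Griewank(dimension=10), max_evals=20000))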
niapy.algorithms.other
¶
Implementation of other algorithms.
- class niapy.algorithms.other.AnarchicSocietyOptimization(population_size=43, alpha=(1, 0.83), gamma=(1.17, 0.56), theta=(0.932, 0.832), d=<function euclidean>, dn=<function euclidean>, nl=1, mutation_rate=1.2, crossover_rate=0.25, combination=<function elitism>, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Anarchic Society Optimization algorithm.
- Algorithm:
Anarchic Society Optimization algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference paper:
Ahmadi-Javid, Amir. “Anarchic Society Optimization: A human-inspired method.” Evolutionary Computation (CEC), 2011 IEEE Congress on. IEEE, 2011.
- Variables
Name (list of str) – List of strings representing the name of the algorithm.
alpha (List[float]) – Factor for fickleness index function \(\in [0, 1]\).
gamma (List[float]) – Factor for external irregularity index function \(\in [0, \infty)\).
theta (List[float]) – Factor for internal irregularity index function \(\in [0, \infty)\).
d (Callable[[float, float], float]) – function that takes two arguments that are function values and calculates the distance between them.
dn (Callable[[numpy.ndarray, numpy.ndarray], float]) – function that takes two arguments that are points in function landscape and calculates the distance between them.
nl (float) – Normalized range for neighborhood search \(\in (0, 1]\).
F (float) – Mutation parameter.
CR (float) – Crossover parameter \(\in [0, 1]\).
Combination (Callable[numpy.ndarray, numpy.ndarray, numpy.ndarray, numpy.ndarray, float, float, float, float, float, float, Task, numpy.random.Generator]) – Function for combining individuals to get new position/individual.
See also
Initialize AnarchicSocietyOptimization.
- Parameters
population_size (Optional[int]) – Population size.
alpha (Optional[Tuple[float, ...]]) – Factor for fickleness index function \(\in [0, 1]\).
gamma (Optional[Tuple[float, ...]]) – Factor for external irregularity index function \(\in [0, \infty)\).
theta (Optional[List[float]]) – Factor for internal irregularity index function \(\in [0, \infty)\).
d (Optional[Callable[[float, float], float]]) – function that takes two arguments that are function values and calculates the distance between them.
dn (Optional[Callable[[numpy.ndarray, numpy.ndarray], float]]) – function that takes two arguments that are points in function landscape and calculates the distance between them.
nl (Optional[float]) – Normalized range for neighborhood search \(\in (0, 1]\).
mutation_rate (Optional[float]) – Mutation parameter.
crossover_rate (Optional[float]) – Crossover parameter \(\in [0, 1]\).
combination (Optional[Callable[numpy.ndarray, numpy.ndarray, numpy.ndarray, numpy.ndarray, float, float, float, float, float, float, Task, numpy.random.Generator]]) – Function for combining individuals to get new position/individual.
- Name = ['AnarchicSocietyOptimization', 'ASO']¶
- __init__(population_size=43, alpha=(1, 0.83), gamma=(1.17, 0.56), theta=(0.932, 0.832), d=<function euclidean>, dn=<function euclidean>, nl=1, mutation_rate=1.2, crossover_rate=0.25, combination=<function elitism>, *args, **kwargs)[source]¶
Initialize AnarchicSocietyOptimization.
- Parameters
population_size (Optional[int]) – Population size.
alpha (Optional[Tuple[float, ...]]) – Factor for fickleness index function \(\in [0, 1]\).
gamma (Optional[Tuple[float, ...]]) – Factor for external irregularity index function \(\in [0, \infty)\).
theta (Optional[List[float]]) – Factor for internal irregularity index function \(\in [0, \infty)\).
d (Optional[Callable[[float, float], float]]) – function that takes two arguments that are function values and calculates the distance between them.
dn (Optional[Callable[[numpy.ndarray, numpy.ndarray], float]]) – function that takes two arguments that are points in function landscape and calculates the distance between them.
nl (Optional[float]) – Normalized range for neighborhood search \(\in (0, 1]\).
mutation_rate (Optional[float]) – Mutation parameter.
crossover_rate (Optional[float]) – Crossover parameter \(\in [0, 1]\).
combination (Optional[Callable[numpy.ndarray, numpy.ndarray, numpy.ndarray, numpy.ndarray, float, float, float, float, float, float, Task, numpy.random.Generator]]) – Function for combining individuals to get new position/individual.
- external_irregularity(x_f, xnb_f, gamma)[source]¶
Get external irregularity index.
- Parameters
x_f (float) – Individuals fitness/function value.
xnb_f (float) – Individuals new fitness/function value.
gamma (float) – External irregularity factor.
- Returns
External irregularity index.
- Return type
float
- static fickleness_index(x_f, xpb_f, xb_f, alpha)[source]¶
Get fickleness index.
- Parameters
x_f (float) – Individuals fitness/function value.
xpb_f (float) – Individuals personal best fitness/function value.
xb_f (float) – Current best found individuals fitness/function value.
alpha (float) – Fickleness factor.
- Returns
Fickleness index.
- Return type
float
- get_best_neighbors(i, population, population_fitness, rs)[source]¶
Get neighbors of individual.
Measurement of distance for the neighborhood is defined with self.nl. The function for calculating distances is defined with self.dn.
- Parameters
i (int) – Index of the individual for which we are looking for neighbours.
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray[float]) – Current population fitness/function values.
rs (numpy.ndarray[float]) – distance between individuals.
- Returns
Indexes that represent individuals closest to i-th individual.
- Return type
numpy.ndarray[int]
- static info()[source]¶
Get basic information about the algorithm.
- Returns
Basic information.
- Return type
str
See also
niapy.algorithms.algorithm.Algorithm.info()
- init(_task)[source]¶
Initialize dynamic parameters of algorithm.
- Parameters
_task (Task) – Optimization task.
- Returns
Array of self.alpha propagated values.
Array of self.gamma propagated values.
Array of self.theta propagated values.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray]
- init_population(task)[source]¶
Initialize first population and additional arguments.
- Parameters
task (Task) – Optimization task
- Returns
Initialized population
Initialized population fitness/function values
- Dict[str, Any]:
x_best (numpy.ndarray): Initialized populations best positions.
x_best_fitness (numpy.ndarray): Initialized populations best positions function/fitness values.
alpha (numpy.ndarray):
gamma (numpy.ndarray):
theta (numpy.ndarray):
rs (float): distance of search space.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, dict]
See also
niapy.algorithms.algorithm.Algorithm.init_population()
niapy.algorithms.other.aso.AnarchicSocietyOptimization.init()
- irregularity_index(x_f, xpb_f, theta)[source]¶
Get internal irregularity index.
- Parameters
x_f (float) – Individuals fitness/function value.
xpb_f (float) – Individuals personal best fitness/function value.
theta (float) – Internal irregularity factor.
- Returns
Internal irregularity index
- Return type
float
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of AnarchicSocietyOptimization algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current populations positions.
population_fitness (numpy.ndarray) – Current populations function/fitness values.
best_x (numpy.ndarray) – Current global best individuals position.
best_fitness (float) – Current global best individual function/fitness value.
**params – Additional arguments.
- Returns
Initialized population
Initialized population fitness/function values
New global best solution
New global best solutions fitness/objective value
- Dict[str, Union[float, int, numpy.ndarray]]:
x_best (numpy.ndarray): Initialized populations best positions.
x_best_fitness (numpy.ndarray): Initialized populations best positions function/fitness values.
alpha (numpy.ndarray):
gamma (numpy.ndarray):
theta (numpy.ndarray):
rs (float): distance of search space.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, dict]
- set_parameters(population_size=43, alpha=(1, 0.83), gamma=(1.17, 0.56), theta=(0.932, 0.832), d=<function euclidean>, dn=<function euclidean>, nl=1, mutation_rate=1.2, crossover_rate=0.25, combination=<function elitism>, **kwargs)[source]¶
Set the parameters for the algorithm.
- Parameters
population_size (Optional[int]) – Population size.
alpha (Optional[Tuple[float, ...]]) – Factor for fickleness index function \(\in [0, 1]\).
gamma (Optional[Tuple[float, ...]]) – Factor for external irregularity index function \(\in [0, \infty)\).
theta (Optional[List[float]]) – Factor for internal irregularity index function \(\in [0, \infty)\).
d (Optional[Callable[[float, float], float]]) – function that takes two arguments that are function values and calculates the distance between them.
dn (Optional[Callable[[numpy.ndarray, numpy.ndarray], float]]) – function that takes two arguments that are points in function landscape and calculates the distance between them.
nl (Optional[float]) – Normalized range for neighborhood search \(\in (0, 1]\).
mutation_rate (Optional[float]) – Mutation parameter.
crossover_rate (Optional[float]) – Crossover parameter \(\in [0, 1]\).
combination (Optional[Callable[numpy.ndarray, numpy.ndarray, numpy.ndarray, numpy.ndarray, float, float, float, float, float, float, Task, numpy.random.Generator]]) – Function for combining individuals to get new position/individual.
See also
- Combination methods:
niapy.algorithms.other.elitism()
niapy.algorithms.other.crossover()
niapy.algorithms.other.sequential()
- static update_personal_best(population, population_fitness, personal_best, personal_best_fitness)[source]¶
Update personal best solution of all individuals in population.
- Parameters
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray[float]) – Current population fitness/function values.
personal_best (numpy.ndarray) – Current population best positions.
personal_best_fitness (numpy.ndarray[float]) – Current populations best positions fitness/function values.
- Returns
New personal best positions for current population.
New personal best positions function/fitness values for current population.
New best individual.
New best individual fitness/function value.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], numpy.ndarray, float]
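A hedged usage sketch for AnarchicSocietyOptimization using the documented defaults; the Task/problem imports assume NiaPy 2.x:
>>> from niapy.task import Task
>>> from niapy.problems import Sphere
>>> from niapy.algorithms.other import AnarchicSocietyOptimization
>>> algo = AnarchicSocietyOptimization(population_size=43, nl=1, mutation_rate=1.2, crossover_rate=0.25)
>>> best_x, best_fitness = algo.run(Task(problem=Sphere(dimension=10), max_iters=400))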
- class niapy.algorithms.other.HillClimbAlgorithm(delta=0.5, neighborhood_function=<function neighborhood>, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of iterative hill climbing algorithm.
- Algorithm:
Hill Climbing Algorithm
- Date:
2018
- Authors:
Jan Popič
- License:
MIT
Reference URL:
Reference paper:
See also
- Variables
delta (float) – Change for searching in neighborhood.
neighborhood_function (Callable[[numpy.ndarray, float, Task], Tuple[numpy.ndarray, float]]) – Function for getting neighbours.
Initialize HillClimbAlgorithm.
- Parameters
delta (Optional[float]) – Change for searching in neighborhood.
neighborhood_function (Optional[Callable[[numpy.ndarray, float, Task], Tuple[numpy.ndarray, float]]]) – Function for getting neighbours.
- Name = ['HillClimbAlgorithm', 'HC']¶
- __init__(delta=0.5, neighborhood_function=<function neighborhood>, *args, **kwargs)[source]¶
Initialize HillClimbAlgorithm.
- Parameters
delta (Optional[float]) – Change for searching in neighborhood.
neighborhood_function (Optional[Callable[[numpy.ndarray, float, Task], Tuple[numpy.ndarray, float]]]) – Function for getting neighbours.
- get_parameters()[source]¶
Get parameters of the algorithm.
- Returns
Parameter name (str): Represents a parameter name
Value of parameter (Any): Represents the value of the parameter
- Return type
Dict[str, Any]
- static info()[source]¶
Get basic information about the algorithm.
- Returns
Basic information.
- Return type
str
See also
niapy.algorithms.algorithm.Algorithm.info()
- init_population(task)[source]¶
Initialize starting point.
- Parameters
task (Task) – Optimization task.
- Returns
New individual.
New individual function/fitness value.
Additional arguments.
- Return type
Tuple[numpy.ndarray, float, Dict[str, Any]]
- run_iteration(task, x, fx, best_x, best_fitness, **params)[source]¶
Core function of HillClimbAlgorithm algorithm.
- Parameters
task (Task) – Optimization task.
x (numpy.ndarray) – Current solution.
fx (float) – Current solutions fitness/function value.
best_x (numpy.ndarray) – Global best solution.
best_fitness (float) – Global best solutions function/fitness value.
**params (Dict[str, Any]) – Additional arguments.
- Returns
New solution.
New solutions function/fitness value.
Additional arguments.
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray, float, Dict[str, Any]]
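A minimal usage sketch for the class above, assuming niapy's Task API (problem selected by name, with a dimension and an evaluation budget) and Algorithm.run(); names may differ slightly between versions.

from niapy.algorithms.other import HillClimbAlgorithm
from niapy.task import Task

task = Task(problem='sphere', dimension=10, max_evals=10000)
algorithm = HillClimbAlgorithm(delta=0.5)
best_x, best_fitness = algorithm.run(task)  # returns the best solution found and its fitness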
- class niapy.algorithms.other.MultipleTrajectorySearch(population_size=40, num_tests=5, num_searches=5, num_searches_best=5, num_enabled=17, bonus1=10, bonus2=1, local_searches=(<function mts_ls1>, <function mts_ls2>, <function mts_ls3>), *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Multiple trajectory search.
- Algorithm:
Multiple trajectory search
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Lin-Yu Tseng and Chun Chen, “Multiple trajectory search for Large Scale Global Optimization,” 2008 IEEE Congress on Evolutionary Computation (IEEE World Congress on Computational Intelligence), Hong Kong, 2008, pp. 3052-3059. doi: 10.1109/CEC.2008.4631210
- Variables
Name (List[str]) – List of strings representing algorithm name.
local_searches (Iterable[Callable[[numpy.ndarray, float, numpy.ndarray, float, bool, numpy.ndarray, Task, Dict[str, Any]], Tuple[numpy.ndarray, float, numpy.ndarray, float, bool, int, numpy.ndarray]]]) – Local searches to use.
bonus1 (int) – Bonus for improving global best solution.
bonus2 (int) – Bonus for improving solution.
num_tests (int) – Number of test runs on local search algorithms.
num_searches (int) – Number of local search algorithm runs.
num_searches_best (int) – Number of local search algorithm runs on the best solution.
num_enabled (int) – Number of best solutions used for testing.
See also
Initialize MultipleTrajectorySearch.
- Parameters
population_size (int) – Number of individuals in population.
num_tests (int) – Number of test runs on local search algorithms.
num_searches (int) – Number of local search algorithm runs.
num_searches_best (int) – Number of local search algorithm runs on the best solution.
num_enabled (int) – Number of best solutions used for testing.
bonus1 (int) – Bonus for improving global best solution.
bonus2 (int) – Bonus for improving own solution.
local_searches (Iterable[Callable[[numpy.ndarray, float, numpy.ndarray, float, bool, numpy.ndarray, Task, Dict[str, Any]], Tuple[numpy.ndarray, float, numpy.ndarray, float, bool, int, numpy.ndarray]]]) – Local searches to use.
- Name = ['MultipleTrajectorySearch', 'MTS']¶
- __init__(population_size=40, num_tests=5, num_searches=5, num_searches_best=5, num_enabled=17, bonus1=10, bonus2=1, local_searches=(<function mts_ls1>, <function mts_ls2>, <function mts_ls3>), *args, **kwargs)[source]¶
Initialize MultipleTrajectorySearch.
- Parameters
population_size (int) – Number of individuals in population.
num_tests (int) – Number of test runs on local search algorithms.
num_searches (int) – Number of local search algorithm runs.
num_searches_best (int) – Number of local search algorithm runs on the best solution.
num_enabled (int) – Number of best solutions used for testing.
bonus1 (int) – Bonus for improving global best solution.
bonus2 (int) – Bonus for improving own solution.
local_searches (Iterable[Callable[[numpy.ndarray, float, numpy.ndarray, float, bool, numpy.ndarray, Task, Dict[str, Any]], Tuple[numpy.ndarray, float, numpy.ndarray, float, bool, int, numpy.ndarray]]]) – Local searches to use.
- get_parameters()[source]¶
Get parameters values for the algorithm.
- Returns
Algorithm parameters.
- Return type
Dict[str, Any]
- grading_run(x, x_f, xb, fxb, improve, search_range, task)[source]¶
Run the local searches to obtain their grading scores.
- Parameters
x (numpy.ndarray) – Solution for grading.
x_f (float) – Solutions fitness/function value.
xb (numpy.ndarray) – Global best solution.
fxb (float) – Global best solutions function/fitness value.
improve (bool) – Info if solution has improved.
search_range (numpy.ndarray) – Search range.
task (Task) – Optimization task.
- Returns
New solution.
New solutions function/fitness value.
Global best solution.
Global best solutions fitness/function value.
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray, float]
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
See also
- init_population(task)[source]¶
Initialize starting population.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized population.
Initialized populations function/fitness value.
- Additional arguments:
enable (numpy.ndarray): If solution/individual is enabled.
improve (numpy.ndarray): If solution/individual is improved.
search_range (numpy.ndarray): Search range.
grades (numpy.ndarray): Grade of solution/individual.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, Dict[str, Any]]
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core function of MultipleTrajectorySearch algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population of individuals.
population_fitness (numpy.ndarray) – Current individuals function/fitness values.
best_x (numpy.ndarray) – Global best individual.
best_fitness (float) – Global best individual function/fitness value.
**params (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New population function/fitness values.
New global best solution.
New global best solutions fitness/objective value.
- Additional arguments:
enable (numpy.ndarray): If solution/individual is enabled.
improve (numpy.ndarray): If solution/individual is improved.
search_range (numpy.ndarray): Search range.
grades (numpy.ndarray): Grade of solution/individual.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- run_local_search(k, x, x_f, xb, fxb, improve, search_range, g, task)[source]¶
Run a selected local search.
- Parameters
k (int) – Index of local search.
x (numpy.ndarray) – Current solution.
x_f (float) – Current solutions function/fitness value.
xb (numpy.ndarray) – Global best solution.
fxb (float) – Global best solutions fitness/function value.
improve (bool) – If the solution has improved.
search_range (numpy.ndarray) – Search range.
g (int) – Grade.
task (Task) – Optimization task.
- Returns
New best solution found.
New best solutions found function/fitness value.
Global best solution.
Global best solutions function/fitness value.
If the solution has improved.
Grade of local search run.
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray, float, bool, numpy.ndarray, int]
- set_parameters(population_size=40, num_tests=5, num_searches=5, num_searches_best=5, num_enabled=17, bonus1=10, bonus2=1, local_searches=(<function mts_ls1>, <function mts_ls2>, <function mts_ls3>), **kwargs)[source]¶
Set the arguments of the algorithm.
- Parameters
population_size (int) – Number of individuals in population.
num_tests (int) – Number of test runs on local search algorithms.
num_searches (int) – Number of local search algorithm runs.
num_searches_best (int) – Number of local search algorithm runs on the best solution.
num_enabled (int) – Number of best solutions used for testing.
bonus1 (int) – Bonus for improving global best solution.
bonus2 (int) – Bonus for improving own solution.
local_searches (Iterable[Callable[[numpy.ndarray, float, numpy.ndarray, float, bool, numpy.ndarray, Task, Dict[str, Any]], Tuple[numpy.ndarray, float, numpy.ndarray, float, bool, int, numpy.ndarray]]]) – Local searches to use.
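A usage sketch showing how the local-search portfolio can be customized, under the same Task/run() assumptions as the earlier examples; restricting local_searches to mts_ls1 and mts_ls3 is purely illustrative.

from niapy.algorithms.other import MultipleTrajectorySearch, mts_ls1, mts_ls3
from niapy.task import Task

task = Task(problem='sphere', dimension=20, max_evals=50000)
# Use only local searches 1 and 3 and shorten the grading phase.
algorithm = MultipleTrajectorySearch(population_size=40, num_tests=3,
                                     local_searches=(mts_ls1, mts_ls3))
best_x, best_fitness = algorithm.run(task)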
- class niapy.algorithms.other.MultipleTrajectorySearchV1(population_size=40, num_tests=5, num_searches=5, num_enabled=17, bonus1=10, bonus2=1, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.other.mts.MultipleTrajectorySearch
Implementation of Multiple trajectory search.
- Algorithm:
Multiple trajectory search
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Tseng, Lin-Yu, and Chun Chen. “Multiple trajectory search for unconstrained/constrained multi-objective optimization.” Evolutionary Computation, 2009. CEC’09. IEEE Congress on. IEEE, 2009.
- Variables
Name (List[str]) – List of strings representing algorithm name.
See also
niapy.algorithms.other.MultipleTrajectorySearch
Initialize MultipleTrajectorySearchV1.
- Parameters
population_size (int) – Number of individuals in population.
num_tests (int) – Number of test runs on local search algorithms.
num_searches (int) – Number of local search algorithm runs.
num_enabled (int) – Number of best solutions used for testing.
bonus1 (int) – Bonus for improving global best solution.
bonus2 (int) – Bonus for improving own solution.
- Name = ['MultipleTrajectorySearchV1', 'MTSv1']¶
- __init__(population_size=40, num_tests=5, num_searches=5, num_enabled=17, bonus1=10, bonus2=1, *args, **kwargs)[source]¶
Initialize MultipleTrajectorySearchV1.
- Parameters
population_size (int) – Number of individuals in population.
num_tests (int) – Number of test runs on local search algorithms.
num_searches (int) – Number of local search algorithm runs.
num_enabled (int) – Number of best solutions used for testing.
bonus1 (int) – Bonus for improving global best solution.
bonus2 (int) – Bonus for improving own solution.
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
See also
- set_parameters(population_size=40, num_tests=5, num_searches=5, num_enabled=17, bonus1=10, bonus2=1, **kwargs)[source]¶
Set core parameters of MultipleTrajectorySearchV1 algorithm.
- Parameters
population_size (int) – Number of individuals in population.
num_tests (int) – Number of test runs on local search algorithms.
num_searches (int) – Number of local search algorithm runs.
num_enabled (int) – Number of best solutions used for testing.
bonus1 (int) – Bonus for improving global best solution.
bonus2 (int) – Bonus for improving own solution.
- class niapy.algorithms.other.NelderMeadMethod(population_size=None, alpha=0.1, gamma=0.3, rho=-0.2, sigma=-0.2, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of the Nelder-Mead method, also known as the downhill simplex or amoeba method.
- Algorithm:
Nelder Mead Method
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Variables
Name (List[str]) – list of strings representing algorithm name
alpha (float) – Reflection coefficient parameter
gamma (float) – Expansion coefficient parameter
rho (float) – Contraction coefficient parameter
sigma (float) – Shrink coefficient parameter
See also
Initialize NelderMeadMethod.
- Parameters
population_size (Optional[int]) – Number of individuals.
alpha (Optional[float]) – Reflection coefficient parameter
gamma (Optional[float]) – Expansion coefficient parameter
rho (Optional[float]) – Contraction coefficient parameter
sigma (Optional[float]) – Shrink coefficient parameter
- Name = ['NelderMeadMethod', 'NMM']¶
- __init__(population_size=None, alpha=0.1, gamma=0.3, rho=-0.2, sigma=-0.2, *args, **kwargs)[source]¶
Initialize NelderMeadMethod.
- Parameters
population_size (Optional[int]) – Number of individuals.
alpha (Optional[float]) – Reflection coefficient parameter
gamma (Optional[float]) – Expansion coefficient parameter
rho (Optional[float]) – Contraction coefficient parameter
sigma (Optional[float]) – Shrink coefficient parameter
- get_parameters()[source]¶
Get parameters of the algorithm.
- Returns
Parameter name (str): Represents a parameter name
Value of parameter (Any): Represents the value of the parameter
- Return type
Dict[str, Any]
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
See also
- init_pop(task, population_size, **_kwargs)[source]¶
Initialize starting population.
- Parameters
population_size (int) – Number of individuals in population.
task (Task) – Optimization task.
- Returns
New initialized population.
New initialized population fitness/function values.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float]]
- method(population, population_fitness, task)[source]¶
Run the main Nelder-Mead simplex step (reflection, expansion, contraction, shrink).
- Parameters
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray[float]) – Current population function/fitness values.
task (Task) – Optimization task.
- Returns
New population.
New population fitness/function values.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float]]
- run_iteration(task, population, population_fitness, best_x, best_fitness, **params)[source]¶
Core iteration function of NelderMeadMethod algorithm.
- Parameters
task (Task) – Optimization task.
population (numpy.ndarray) – Current population.
population_fitness (numpy.ndarray) – Current populations fitness/function values.
best_x (numpy.ndarray) – Global best individual.
best_fitness (float) – Global best function/fitness value.
**params (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New population fitness/function values.
New global best solution
New global best solutions fitness/objective value
Additional arguments.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
- set_parameters(population_size=None, alpha=0.1, gamma=0.3, rho=-0.2, sigma=-0.2, **kwargs)[source]¶
Set the arguments of an algorithm.
- Parameters
population_size (Optional[int]) – Number of individuals.
alpha (Optional[float]) – Reflection coefficient parameter
gamma (Optional[float]) – Expansion coefficient parameter
rho (Optional[float]) – Contraction coefficient parameter
sigma (Optional[float]) – Shrink coefficient parameter
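A usage sketch for the class above, under the same Task/run() assumptions as the earlier examples; when population_size is None the simplex size is presumably derived from the problem dimension.

from niapy.algorithms.other import NelderMeadMethod
from niapy.task import Task

task = Task(problem='sphere', dimension=5, max_evals=5000)
algorithm = NelderMeadMethod(alpha=0.1, gamma=0.3, rho=-0.2, sigma=-0.2)
best_x, best_fitness = algorithm.run(task)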
- class niapy.algorithms.other.RandomSearch(*args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of a simple Random Search algorithm.
- Algorithm:
Random Search
- Date:
11.10.2020
- Authors:
Iztok Fister Jr., Grega Vrbančič
- License:
MIT
Reference URL: https://en.wikipedia.org/wiki/Random_search
- Variables
Name (List[str]) – List of strings representing algorithm name.
See also
Initialize RandomSearch.
- Name = ['RandomSearch', 'RS']¶
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
See also
- init_population(task)[source]¶
Initialize the starting population.
- Parameters
task (Task) – Optimization task.
- Returns
Initial solution
Initial solutions fitness/objective value
Additional arguments
- Return type
Tuple[numpy.ndarray, float, dict]
- run_iteration(task, x, x_fit, best_x, best_fitness, **params)[source]¶
Core function of the algorithm.
- Parameters
task (Task) – Optimization task.
x (numpy.ndarray) – Current solution.
x_fit (float) – Current solution's fitness/objective value.
best_x (numpy.ndarray) – Global best solution.
best_fitness (float) – Global best solution's fitness/objective value.
**params (dict) – Additional arguments.
- Returns
New solution
New solutions fitness/objective value
New global best solution
New global best solutions fitness/objective value
Additional arguments
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray, float, dict]
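To illustrate the idea behind the algorithm (a self-contained sketch, not the library's internal code): candidates are drawn uniformly within the box bounds and the best one seen is kept.

import numpy as np

def random_search(evaluate, lower, upper, max_evals, rng=None):
    rng = rng or np.random.default_rng()
    best_x, best_fitness = None, np.inf
    for _ in range(max_evals):
        x = rng.uniform(lower, upper)          # sample uniformly within the bounds
        fx = evaluate(x)
        if fx < best_fitness:                  # keep the best candidate seen so far
            best_x, best_fitness = x, fx
    return best_x, best_fitness

best_x, best_fitness = random_search(lambda x: float(np.sum(x ** 2)),
                                     lower=np.full(10, -5.12), upper=np.full(10, 5.12),
                                     max_evals=1000)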
- class niapy.algorithms.other.SimulatedAnnealing(delta=0.5, starting_temperature=2000, delta_temperature=0.8, cooling_method=<function cool_delta>, epsilon=1e-23, *args, **kwargs)[source]¶
Bases:
niapy.algorithms.algorithm.Algorithm
Implementation of Simulated Annealing Algorithm.
- Algorithm:
Simulated Annealing Algorithm
- Date:
2018
- Authors:
Jan Popič and Klemen Berkovič
- License:
MIT
Reference URL:
Reference paper:
- Variables
Name (List[str]) – List of strings representing algorithm name.
delta (float) – Movement for neighbour search.
starting_temperature (float) – Starting temperature.
delta_temperature (float) – Change in temperature.
cooling_method (Callable) – Cooling (temperature update) function.
epsilon (float) – Error value.
See also
Initialize SimulatedAnnealing.
- Parameters
delta (Optional[float]) – Movement for neighbour search.
starting_temperature (Optional[float]) – Starting temperature.
delta_temperature (Optional[float]) – Change in temperature.
cooling_method (Optional[Callable]) – Cooling (temperature update) function.
epsilon (Optional[float]) – Error value.
- Name = ['SimulatedAnnealing', 'SA']¶
- __init__(delta=0.5, starting_temperature=2000, delta_temperature=0.8, cooling_method=<function cool_delta>, epsilon=1e-23, *args, **kwargs)[source]¶
Initialize SimulatedAnnealing.
- Parameters
delta (Optional[float]) – Movement for neighbour search.
starting_temperature (Optional[float]) – Starting temperature.
delta_temperature (Optional[float]) – Change in temperature.
cooling_method (Optional[Callable]) – Cooling (temperature update) function.
epsilon (Optional[float]) – Error value.
- static info()[source]¶
Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
See also
- init_population(task)[source]¶
Initialize the starting population.
- Parameters
task (Task) – Optimization task.
- Returns
Initial solution
Initial solutions fitness/objective value
Additional arguments
- Return type
Tuple[numpy.ndarray, float, dict]
- run_iteration(task, x, x_fit, best_x, best_fitness, **params)[source]¶
Core function of the algorithm.
- Parameters
task (Task) – Optimization task.
x (numpy.ndarray) – Current solution.
x_fit (float) – Current solution's fitness/objective value.
best_x (numpy.ndarray) – Global best solution.
best_fitness (float) – Global best solution's fitness/objective value.
**params (dict) – Additional arguments.
- Returns
New solution
New solutions fitness/objective value
New global best solution
New global best solutions fitness/objective value
Additional arguments
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray, float, dict]
- set_parameters(delta=0.5, starting_temperature=2000, delta_temperature=0.8, cooling_method=<function cool_delta>, epsilon=1e-23, **kwargs)[source]¶
Set the algorithm parameters/arguments.
- Parameters
delta (Optional[float]) – Movement for neighbour search.
starting_temperature (Optional[float]) – Starting temperature.
delta_temperature (Optional[float]) – Change in temperature.
cooling_method (Optional[Callable]) – Cooling (temperature update) function.
epsilon (Optional[float]) – Error value.
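To illustrate the acceptance rule behind simulated annealing (a sketch of the general method, not the library's internal code): a worse neighbour is accepted with probability exp(-(f_new - f_current) / temperature), and the temperature is lowered each step, here with a simple subtractive schedule chosen for illustration.

import numpy as np

def accept(f_current, f_new, temperature, rng):
    # Always accept improvements; accept worse moves with a temperature-dependent probability.
    if f_new <= f_current:
        return True
    return rng.random() < np.exp(-(f_new - f_current) / temperature)

rng = np.random.default_rng(0)
temperature, delta_temperature = 2000.0, 0.8
x = rng.uniform(-5, 5, 10)
fx = float(np.sum(x ** 2))
while temperature > 1e-23:
    candidate = x + rng.uniform(-0.5, 0.5, x.shape)       # neighbour within +/- delta
    f_candidate = float(np.sum(candidate ** 2))
    if accept(fx, f_candidate, temperature, rng):
        x, fx = candidate, f_candidate
    temperature -= delta_temperature                      # illustrative cooling schedule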
- niapy.algorithms.other.mts_ls1(current_x, current_fitness, best_x, best_fitness, improve, search_range, task, rng, bonus1=10, bonus2=1, sr_fix=0.4, **_kwargs)[source]¶
Multiple trajectory local search one.
- Parameters
current_x (numpy.ndarray) – Current solution.
current_fitness (float) – Current solutions fitness/function value.
best_x (numpy.ndarray) – Global best solution.
best_fitness (float) – Global best solutions fitness/function value.
improve (bool) – Has the solution been improved.
search_range (numpy.ndarray) – Search range.
task (Task) – Optimization task.
rng (numpy.random.Generator) – Random number generator.
bonus1 (int) – Bonus reward for improving global best solution.
bonus2 (int) – Bonus reward for improving solution.
sr_fix (numpy.ndarray) – Fix applied when the search range is too small.
- Returns
New solution.
New solutions fitness/function value.
Global best if found else old global best.
Global bests function/fitness value.
If solution has improved.
Search range.
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray, float, bool, numpy.ndarray]
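The local searches can also be called directly, following the parameter list above. The Task construction and the task.lower/task.upper/task.eval() attributes used here are assumptions about niapy's Task API; the trailing outputs are left in *rest because the documented return tuples differ slightly between listings.

import numpy as np
from niapy.algorithms.other import mts_ls1
from niapy.task import Task

task = Task(problem='sphere', dimension=10, max_evals=1000)
rng = np.random.default_rng(42)
x = rng.uniform(task.lower, task.upper)                   # random starting solution
fx = task.eval(x)
search_range = (task.upper - task.lower) / 2.0            # common MTS initialisation
new_x, new_fx, best_x, best_fx, improved, *rest = mts_ls1(
    x, fx, x.copy(), fx, False, search_range, task, rng)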
- niapy.algorithms.other.mts_ls1v1(current_x, current_fitness, best_x, best_fitness, improve, search_range, task, rng, bonus1=10, bonus2=1, sr_fix=0.4, **_kwargs)[source]¶
Multiple trajectory local search one version two.
- Parameters
current_x (numpy.ndarray) – Current solution.
current_fitness (float) – Current solutions fitness/function value.
best_x (numpy.ndarray) – Global best solution.
best_fitness (float) – Global best solutions fitness/function value.
improve (bool) – Has the solution been improved.
search_range (numpy.ndarray) – Search range.
task (Task) – Optimization task.
rng (numpy.random.Generator) – Random number generator.
bonus1 (int) – Bonus reward for improving global best solution.
bonus2 (int) – Bonus reward for improving solution.
sr_fix (numpy.ndarray) – Fix applied when the search range is too small.
- Returns
New solution.
New solutions fitness/function value.
Global best if found else old global best.
Global bests function/fitness value.
If solution has improved.
Search range.
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray, float, bool, numpy.ndarray]
- niapy.algorithms.other.mts_ls2(current_x, current_fitness, best_x, best_fitness, improve, search_range, task, rng, bonus1=10, bonus2=1, sr_fix=0.4, **_kwargs)[source]¶
Multiple trajectory local search two.
- Parameters
current_x (numpy.ndarray) – Current solution.
current_fitness (float) – Current solutions fitness/function value.
best_x (numpy.ndarray) – Global best solution.
best_fitness (float) – Global best solutions fitness/function value.
improve (bool) – Has the solution been improved.
search_range (numpy.ndarray) – Search range.
task (Task) – Optimization task.
rng (numpy.random.Generator) – Random number generator.
bonus1 (int) – Bonus reward for improving global best solution.
bonus2 (int) – Bonus reward for improving solution.
sr_fix (numpy.ndarray) – Fix applied when the search range is too small.
- Returns
New solution.
New solutions fitness/function value.
Global best if found else old global best.
Global bests function/fitness value.
If solution has improved.
Search range.
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray, float, bool, numpy.ndarray]
See also
niapy.algorithms.other.move_x()
- niapy.algorithms.other.mts_ls3(current_x, current_fitness, best_x, best_fitness, improve, search_range, task, rng, bonus1=10, bonus2=1, **_kwargs)[source]¶
Multiple trajectory local search three.
- Parameters
current_x (numpy.ndarray) – Current solution.
current_fitness (float) – Current solutions fitness/function value.
best_x (numpy.ndarray) – Global best solution.
best_fitness (float) – Global best solutions fitness/function value.
improve (bool) – Has the solution been improved.
search_range (numpy.ndarray) – Search range.
task (Task) – Optimization task.
rng (numpy.random.Generator) – Random number generator.
bonus1 (int) – Bonus reward for improving global best solution.
bonus2 (int) – Bonus reward for improving solution.
- Returns
New solution.
New solutions fitness/function value.
Global best if found else old global best.
Global bests function/fitness value.
If solution has improved.
Search range.
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray, float, bool, numpy.ndarray]
- niapy.algorithms.other.mts_ls3v1(current_x, current_fitness, best_x, best_fitness, improve, search_range, task, rng, bonus1=10, bonus2=1, phi=3, **_kwargs)[source]¶
Multiple trajectory local search three version one.
- Parameters
current_x (numpy.ndarray) – Current solution.
current_fitness (float) – Current solutions fitness/function value.
best_x (numpy.ndarray) – Global best solution.
best_fitness (float) – Global best solutions fitness/function value.
improve (bool) – Has the solution been improved.
search_range (numpy.ndarray) – Search range.
task (Task) – Optimization task.
rng (numpy.random.Generator) – Random number generator.
phi (int) – Number of new generated positions.
bonus1 (int) – Bonus reward for improving global best solution.
bonus2 (int) – Bonus reward for improving solution.
- Returns
New solution.
New solutions fitness/function value.
Global best if found else old global best.
Global bests function/fitness value.
If solution has improved.
Search range.
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray, float, bool, numpy.ndarray]