Contributed by: Tine Van Calster, Bart Baesens, Wilfried Lemahieu

*This article first appeared in Data Science Briefings, the **DataMiningApps** newsletter. **Subscribe now** for free if you want to be the first to receive our feature articles, or follow us **@DataMiningApps**. Do you also wish to contribute to Data Science Briefings? Shoot us an e-mail over at briefings@dataminingapps.com and let’s get in touch!*

Inspiration comes from everywhere. Researchers have used – and are still using – their surroundings and their knowledge of the natural world to come up with new methodologies to answer their research questions. These nature-inspired algorithms have been around for a long time in artificial intelligence, machine learning and optimization: Artificial Neural Networks emerged in the 1950s, Genetic Algorithms in the 1960s, Simulated Annealing in the late 1970s, and so on. Many of these algorithms remain extremely popular, even in recent applications. In this column, we will move away from the most well-known algorithms and take a look at more recently developed search heuristics. These optimization algorithms aim to maximize or minimize a given fitness function and are all in some way derived from nature, ranging from birds and fish to flowers and even gravity. For each technique, we will give a short overview of its inspiration and the general steps of the algorithm.

### Cuckoo search

In 2009, Cuckoo Search (Yang and Deb, 2009) was created by taking a closer look at the breeding habits of certain species of cuckoos. These birds lay their eggs in the nests of other birds, which can lead to abandonment of the egg, or even the entire nest, if the deception is discovered. It is therefore crucial that the cuckoo eggs closely resemble the host bird’s eggs. The optimization algorithm translates these eggs into candidate solutions, where only the best eggs (i.e. undetected eggs) survive to the next generation. It further assumes that a cuckoo lays one egg at a time and hides it in a random nest, with a pre-defined number of nests and a pre-defined number of solutions in each nest. New solutions are only accepted when they are better than (one of) the previous solutions in the nests according to the chosen fitness function. Each solution has a probability P_a of being discovered: at each iteration, the fraction P_a of worst nests is abandoned, while the remaining nests survive to the next generation. New random solutions are best generated by Lévy flights, as reported by Yang and Deb (2009). This cycle continues until a stopping criterion is met. The main advantage of Cuckoo Search is that only the number of nests and solutions and the discovery probability P_a need to be determined, so it is easy to apply to many optimization problems.
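The steps above can be sketched in a few lines of Python. This is a minimal illustration on the sphere function, not the reference implementation: the Lévy step scale of 0.01, the bounds and all parameter values are assumptions chosen for the example.

```python
import math
import random

def levy_step(beta=1.5):
    # Mantegna's algorithm for drawing a heavy-tailed Lévy step length
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    return random.gauss(0, sigma) / abs(random.gauss(0, 1)) ** (1 / beta)

def cuckoo_search(fitness, dim=2, n_nests=15, p_a=0.25, iters=200, bounds=(-5.0, 5.0)):
    lo, hi = bounds
    nests = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_nests)]
    scores = [fitness(n) for n in nests]
    for _ in range(iters):
        # A cuckoo lays one egg: a Lévy flight from a randomly chosen nest
        i = random.randrange(n_nests)
        egg = [min(hi, max(lo, x + 0.01 * levy_step())) for x in nests[i]]
        # The egg only survives if it beats the egg in a random host nest
        j = random.randrange(n_nests)
        if fitness(egg) < scores[j]:
            nests[j], scores[j] = egg, fitness(egg)
        # The fraction p_a of worst nests is discovered and abandoned
        n_drop = int(p_a * n_nests)
        if n_drop:
            for k in sorted(range(n_nests), key=scores.__getitem__)[-n_drop:]:
                nests[k] = [random.uniform(lo, hi) for _ in range(dim)]
                scores[k] = fitness(nests[k])
    best = min(range(n_nests), key=scores.__getitem__)
    return nests[best], scores[best]
```

Note how short the parameter list is: besides the population sizes, only the discovery probability p_a shapes the search, which is exactly the ease-of-use argument made above. A real application would replace the fixed iteration count with a proper stopping criterion.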

### Gravitational search algorithm

The Gravitational Search Algorithm is based on the law of gravity and the way it governs the interaction between different masses (Rashedi et al., 2009). For optimization purposes, each solution corresponds to an object in the solution space, and its mass is determined by the chosen fitness function. Solutions are therefore more attracted to the best solutions and move in their direction. Each solution is aware of the mass of the other solutions, so information is shared across the objects. At each iteration, both the position and the mass of an object are updated. This mass is further divided into the inertial mass, which makes good solutions move more slowly through the solution space, and the active and passive gravitational masses, which make good solutions more attractive to other solutions. In order to find a good balance between exploration of the solution space and exploitation of good solutions, the number of attracting solutions should be reduced over time. Furthermore, the gravitational constant, which determines the influence of one mass on another, is decreased over time to control the search accuracy, similar to the temperature in Simulated Annealing.
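A compressed sketch of that update loop follows, again minimizing a simple sphere objective. The exponential decay schedule for G and the constants are illustrative choices, not the paper's, and the gradual shrinking of the attracting set mentioned above is omitted for brevity.

```python
import math
import random

def gsa(fitness, dim=2, n_agents=20, iters=150, g0=100.0, bounds=(-5.0, 5.0)):
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_agents)]
    vel = [[0.0] * dim for _ in range(n_agents)]
    best_pos, best_f = None, float("inf")
    for t in range(iters):
        fit = [fitness(p) for p in pos]
        for i in range(n_agents):
            if fit[i] < best_f:
                best_pos, best_f = list(pos[i]), fit[i]
        # Map fitness to mass: better (lower) fitness -> heavier object
        b, w = min(fit), max(fit)
        m = [1.0] * n_agents if b == w else [(w - f) / (w - b) for f in fit]
        M = [mi / sum(m) for mi in m]
        # The gravitational constant decays over time, much like the
        # temperature in Simulated Annealing
        G = g0 * math.exp(-20.0 * t / iters)
        for i in range(n_agents):
            for d in range(dim):
                # Agent i's inertial mass cancels against its own gravitational
                # mass, so this weighted sum is directly the acceleration
                acc = sum(random.random() * G * M[j] * (pos[j][d] - pos[i][d]) /
                          (math.dist(pos[i], pos[j]) + 1e-9)
                          for j in range(n_agents) if j != i)
                vel[i][d] = random.random() * vel[i][d] + acc
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
    return best_pos, best_f
```

The randomized velocity term keeps some exploration alive even as G shrinks; once G is effectively zero, the swarm freezes around the heaviest (best) objects.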

### Flower pollination algorithm

A second algorithm developed by Yang is the Flower Pollination Algorithm (Yang, 2012). It is inspired by both biotic pollination, where pollen is carried by pollinators such as insects, and abiotic pollination, where no pollinators are involved and other factors, such as wind, carry the pollen. Furthermore, the author also distinguishes between self-pollination, where pollen is only fruitfully exchanged between flowers of the same plant, and cross-pollination, where pollination is carried out across different plants of the same species. For optimization purposes, self-pollination and abiotic pollination are considered local, while biotic and cross-pollination constitute the global pollination method. Each solution is represented by a plant, which for simplicity carries only one flower. The algorithm has two steps that represent global and local pollination respectively; the probability P_s of switching between them has to be pre-defined. Firstly, global pollination ensures that pollen can travel over a long distance, which translates into moving the chosen solution towards the current best solution. Secondly, in local pollination, the chosen flower is combined with randomly chosen flowers from the same population. These two steps are repeated until a stopping criterion is met. The advantage of combining local and global search is that the algorithm does not easily get stuck in local optima.
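The switch between the two pollination steps can be sketched as follows. This is an illustrative Python version on the sphere function; the Lévy-flight mechanics and all parameter values are assumptions for the example rather than the paper's exact settings.

```python
import math
import random

def levy_step(beta=1.5):
    # Mantegna's algorithm for a heavy-tailed Lévy-distributed step
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    return random.gauss(0, sigma) / abs(random.gauss(0, 1)) ** (1 / beta)

def flower_pollination(fitness, dim=2, n_flowers=20, p_s=0.8, iters=300, bounds=(-5.0, 5.0)):
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_flowers)]
    fit = [fitness(p) for p in pop]
    i_best = min(range(n_flowers), key=fit.__getitem__)
    best, best_f = list(pop[i_best]), fit[i_best]
    for _ in range(iters):
        for i in range(n_flowers):
            if random.random() < p_s:
                # Global (biotic, cross-) pollination: a Lévy flight that
                # pulls the flower towards the current best solution
                cand = [x + levy_step() * (best[d] - x) for d, x in enumerate(pop[i])]
            else:
                # Local (abiotic, self-) pollination: mix in the difference
                # of two random flowers from the same population
                j, k = random.sample(range(n_flowers), 2)
                eps = random.random()
                cand = [x + eps * (pop[j][d] - pop[k][d]) for d, x in enumerate(pop[i])]
            cand = [min(hi, max(lo, x)) for x in cand]
            f = fitness(cand)
            if f < fit[i]:  # greedy acceptance of the new pollen
                pop[i], fit[i] = cand, f
                if f < best_f:
                    best, best_f = list(cand), f
    return best, best_f
```

The single switch probability p_s is what balances the two regimes: values near 1 favour long-range Lévy jumps, values near 0 favour small local mixing.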

### Cuttlefish optimization algorithm

The last algorithm in this column was inspired by the ability of cuttlefish to camouflage themselves by reflecting light from different layers of cells in their skin. They can change colour and pattern by reflecting light from different combinations of these skin cells. The optimization algorithm by Eesa et al. (2013) incorporates both the reflection process and the visibility of the patterns to find the optimal solution. The initial population of cells is divided into four groups, based on the characteristics of the actual cell types in the skin of cuttlefish. These four groups independently generate new solutions from the population of cells, based on their respective reflection and visibility equations. The first and the fourth group both perform a global search and are used to explore the search space by taking random factors into account. The second and third groups represent the local search, as they consistently compare the new solution to the best solution and its surroundings. The reflection equations of these groups are based on either the best solution or the current solution, depending on whether the search strategy is local or global, while the visibility equations always compare the current solution to the best solution in some way. The algorithm therefore always tracks the best global solution and only replaces the solution in a cell if the new solution has a better fitness value than the old one.
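A heavily simplified sketch of the four-group idea is shown below. The actual reflection and visibility equations in Eesa et al. (2013) differ per group and use their own constants, so the ranges for the reflection factor r and visibility factor v here are illustrative assumptions only.

```python
import random

def cuttlefish(fitness, dim=2, n_cells=20, iters=200, bounds=(-5.0, 5.0)):
    lo, hi = bounds
    cells = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_cells)]
    best = list(min(cells, key=fitness))
    best_f = fitness(best)
    group_size = n_cells // 4
    for _ in range(iters):
        for i, cell in enumerate(cells):
            g = min(i // group_size, 3)  # split the population into four groups
            v = random.uniform(-0.5, 0.5)  # visibility factor (assumed range)
            if g in (0, 3):
                # Groups 1 and 4: global search -- a random reflection of the
                # current cell plus visibility of the gap to the best pattern
                r = random.uniform(-1.5, 1.5)  # reflection factor (assumed range)
                cand = [r * cell[d] + v * (best[d] - cell[d]) for d in range(dim)]
            else:
                # Groups 2 and 3: local search around the best solution
                cand = [best[d] + v * (best[d] - cell[d]) for d in range(dim)]
            cand = [min(hi, max(lo, x)) for x in cand]
            f = fitness(cand)
            # A cell is only replaced when the new pattern is fitter
            if f < fitness(cell):
                cells[i] = cand
                if f < best_f:
                    best, best_f = list(cand), f
    return best, best_f
```

As the cells cluster around the best solution, the local groups' step sizes shrink automatically, giving the exploitation phase described above, while the global groups keep sampling reflections across the whole range.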

### References:

- Yang, Xin-She, and Suash Deb. “Cuckoo search via Lévy flights.” 2009 World Congress on Nature & Biologically Inspired Computing (NaBIC). IEEE, 2009.
- Rashedi, Esmat, Hossein Nezamabadi-Pour, and Saeid Saryazdi. “GSA: a gravitational search algorithm.” Information sciences 179.13 (2009): 2232-2248.
- Yang, Xin-She. “Flower pollination algorithm for global optimization.” International Conference on Unconventional Computing and Natural Computation. Springer Berlin Heidelberg, 2012.
- Eesa, Adel Sabry, Adnan Mohsin Abdulazeez, and Zeynep Orman. “Cuttlefish algorithm–a novel bio-inspired optimization algorithm.” International Journal of Scientific and Engineering Research 4.9 (2013): 1978-1986.