Computación y Sistemas

On-line version ISSN 2007-9737; Print version ISSN 1405-5546

Comp. y Sist. vol. 27, no. 2, Ciudad de México, Apr./Jun. 2023; Epub Sep 18, 2023

https://doi.org/10.13053/cys-27-2-4622 

Articles

Comparative Analysis of the Bacterial Foraging Algorithm and Differential Evolution in Global Optimization Problems

Adrián García-López1  * 

Óscar Chávez-Bosquez1 

José Hernández-Torruco1 

Betania Hernández-Ocaña1 

1 Universidad Juárez Autónoma de Tabasco, División Académica de Ciencias y Tecnologías de la Información, Mexico. oscar.chavez@ujat.mx, jose.hernandezt@ujat.mx, betania.hernandez@ujat.mx.


Abstract:

There are bio-inspired metaheuristics that are rarely used in areas without domain knowledge of computational algorithms, such as medicine, finance, and administration. TS-MBFOA, a bacteria-based algorithm, and the Differential Evolution Algorithm (DEA) are metaheuristics proposed for the optimization of complex problems mathematically modeled as linear or non-linear problems. In this paper, these algorithms are implemented to analyze their performance in the search for better solutions to constrained optimization problems. Tests were conducted on four optimization problems known in the literature as benchmark problems. Both algorithms were run in 30 independent executions for each problem with the same number of generations and evaluations. Although the parameters of each algorithm are different, the number of evaluations was selected to allow a fair comparison. Results are similar for both algorithms; however, DEA obtains better results for the problem with the largest number of constraints. Additionally, DEA generates solutions in less time than TS-MBFOA. The nonparametric Wilcoxon Signed Rank Test indicates significant differences in only 3 problems. The convergence graph of both algorithms for each problem shows that after 50 generations, both algorithms are close to the best known solution in the state of the art.

Keywords: Bacterial foraging; differential evolution; global optimization; metaheuristics

1 Introduction

Bio-inspired algorithms are computational techniques inspired by nature, primarily the simulation or emulation of simple and intelligent processes of certain animals, insects, or bacteria in search of food or shelter.

These algorithms arose from the need to improve search algorithms for solving numerical and combinatorial optimization problems [9]. They are classified as metaheuristics and incorporate techniques and strategies for designing or improving mathematical procedures aimed at obtaining high performance [18].

Metaheuristics generate a set of solutions to a particular problem that are global optima or close approximations to them.

These algorithms are classified into two groups based on different natural phenomena: Evolutionary Algorithms (EAs) emulate the evolutionary process of the species [2] and Swarm Intelligence Algorithms (SIAs) emulate the collaborative behaviour of certain simple and intelligent species such as bacteria [14], bees [8], ants [1], among others [3]. Metaheuristics were created to solve unconstrained optimization problems.

However, to handle constrained problems, mechanisms such as feasibility rules, special operators, and decoders, among others, are implemented. The use of metaheuristics is an effective alternative for solving Constrained Numerical Optimization Problems (CNOPs) [12].
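As an illustration of the feasibility-rules mechanism mentioned above, the sketch below compares two candidate solutions with Deb's rules: a feasible solution beats an infeasible one, the lower objective wins between two feasible solutions, and the smaller total constraint violation wins between two infeasible solutions. This is a minimal, hypothetical helper written for this discussion; it is not the exact operator of either algorithm's reference implementation.

```java
/** Minimal sketch of feasibility rules (Deb's rules) for comparing two
 *  candidate solutions: fitness = objective value, violation = sum of
 *  constraint violations (0 means feasible). Hypothetical helper, not the
 *  operator from the TS-MBFOA or DEA reference implementations. */
public final class FeasibilityRules {

    /** Returns true if candidate A should be preferred over candidate B. */
    public static boolean isBetter(double fitnessA, double violationA,
                                   double fitnessB, double violationB) {
        boolean feasibleA = violationA <= 0.0;
        boolean feasibleB = violationB <= 0.0;
        if (feasibleA && feasibleB) {
            return fitnessA <= fitnessB;   // both feasible: lower objective wins
        }
        if (feasibleA != feasibleB) {
            return feasibleA;              // feasible beats infeasible
        }
        return violationA <= violationB;   // both infeasible: smaller violation wins
    }

    private FeasibilityRules() { }
}
```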

Generally, a CNOP is known as a general nonlinear programming problem and can be defined as:

minimize: f(x),

subject to:

g_i(x) ≤ 0, i = 1, 2, …, m, (1)

h_j(x) = 0, j = 1, 2, …, p, (2)

where x ∈ ℝ^n with n ≥ 1 is the solution vector x = [x1, x2, x3, …, xn]^T, and each xi is delimited by the lower and upper limits Li ≤ xi ≤ Ui, i = 1, 2, …, D (with D = n the number of design variables); m is the number of inequality constraints and p is the number of equality constraints (in both cases, the constraints can be linear or non-linear).

If we denote by F the feasible region (where all the solutions that satisfy the constraints of the problem are found) and by S the entire search space, then F ⊆ S [7].
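To make the definition concrete, a CNOP can be represented in code as an objective function plus inequality and equality constraints with box bounds. The interface below is a minimal sketch under our own naming (the paper does not prescribe any such structure); its violation sum is what the feasibility rules sketched above would compare.

```java
/** Minimal sketch of a constrained numerical optimization problem (CNOP):
 *  objective f(x), inequality constraints g_i(x) <= 0, equality constraints
 *  h_j(x) = 0, and box bounds L_k <= x_k <= U_k. Names are illustrative only.
 *  A small tolerance is often used for equalities; it is omitted here. */
public interface Cnop {

    /** Objective value f(x) to be minimized. */
    double objective(double[] x);

    /** Values of the inequality constraints g_i(x); each must be <= 0. */
    double[] inequalities(double[] x);

    /** Values of the equality constraints h_j(x); each must be = 0. */
    double[] equalities(double[] x);

    /** Lower bounds L_k and upper bounds U_k of the design variables. */
    double[] lowerBounds();
    double[] upperBounds();

    /** Sum of constraint violations (0 means the solution is feasible). */
    default double violation(double[] x) {
        double sum = 0.0;
        for (double g : inequalities(x)) {
            sum += Math.max(0.0, g);   // only violated inequalities contribute
        }
        for (double h : equalities(x)) {
            sum += Math.abs(h);        // any deviation from h_j(x) = 0 is penalized
        }
        return sum;
    }
}
```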

Different EA techniques are used to solve CNOPs, notably: Genetic Programming (GP), Genetic Algorithms (GA), Evolutionary Programming (EP), and the Differential Evolution Algorithm (DEA).

DEA is a simple and easy-to-implement technique that uses the basic operators of genetic algorithms: mutation, crossover, and selection. Despite its simplicity and small number of parameters, it generates good results in CNOPs.

DEA has been proven in competitions such as the IEEE International Contest on Evolutionary Optimization (ICEO) [15, 16] and in a wide variety of real-world applications, such as the optimization of the four-bar mechanism [19] or the global optimization of engineering and chemical processes [10].

The Bacterial Foraging Optimization Algorithm (BFOA) is a SIA based on the foraging behavior of Escherichia coli bacteria [14], which simulates the processes of chemotaxis (swim and tumble), swarming, reproduction, and elimination-dispersal.

In this process, the bacteria face several problems in their search for food [11]. Significant modifications were later made to this algorithm: the number of parameters was reduced, an operator for handling constraints was incorporated [13], and a mutation operator similar to that of an EA was added [6]; the resulting algorithm is called the Two-Swim Modified BFOA (TS-MBFOA).

With a good configuration of its parameters, this metaheuristic yields competitive and favourable results when solving CNOPs. This approach has been successfully used to solve engineering design problems, such as the well-known Tension Compression Spring [7], the generation of healthy menus [4], and the optimization of a smart grid [5].

TS-MBFOA and DEA have real-world implementations for solving optimization problems; however, these algorithms are not fully exploited in areas where researchers are not aware of how to adapt and implement them.

Therefore, this research is motivated to explore the capabilities of both algorithms in the solution of particular CNOPs known as: Pressure Vessel, Process Synthesis MINLP, Tension Compression Spring and Quadratically constrained quadratic program. These algorithms are implemented in a free and cross-platform programming language.

Results obtained were validated using basic statistics such as best value, mean, median, standard deviation and worst value.

Also, the nonparametric Wilcoxon Signed Rank Test (WSRT) was applied to measure the consistency of the results.

Finally, convergence graphs of the median number of executions for each algorithm in each problem are presented to notice the performance of the algorithms.

2 Two-Swim Modified Bacterial Foraging Optimization Algorithm (TS-MBFOA)

The TS-MBFOA is an algorithm proposed for solving CNOPs [7], where bacterium i is a potential solution, denoted θi(j,G), where j is the chemotaxis loop and G is the generational loop (chemotaxis, swarming, reproduction, and elimination-dispersal).

In each cycle, the chemotaxis process interleaves exploitation and exploration swims.

The process begins with the classic swim (exploration and mutation between bacteria), calculated with Eq. 3. A bacterium will not necessarily interleave exploration and exploitation swims, because if the new position of a given swim θi(j+1,G) has better fitness than the original position θi(j,G), then another swim in the same direction will occur in the next loop.

Otherwise, a new tumble will be calculated. The process stops after Nc attempts. The exploration swim uses the mutation between bacteria and is calculated by:

θi(j+1,G) = θi(j,G) + β (θr1(j,G) − θr2(j,G)), (3)

where θr1(j,G) and θr2(j,G) are two bacteria randomly selected from the population.

The swim operator is calculated with Eq. 4:

θi(j+1,G)=θi(j,G)+C(i,G)ϕ(i), (4)

where ϕ(i) is calculated with the original BFOA tumble operator defined in Eq. 5:

ϕ(i) = Δ(i) / sqrt(Δ(i)^T Δ(i)), (5)

where Δ(i) is a random vector with elements inside the interval [−1, 1] and Δ(i)^T is its transpose. C(i,G) is the random step size of each bacterium, updated with Eq. 6:

C(i,G) = R · Θ(i), (6)

where Θ(i) is a random vector of size n with elements within the range of each decision variable, [Lk, Uk], k = 1, …, n, and R is a user-defined parameter for scaling the step size.

In the middle cycle of the chemotactic process, the swarming operator is applied with Eq. 7:

θi(j+1,G) = θi(j,G) + β (θB(G) − θi(j,G)), (7)

where θB(G) is the best bacterium in the population at generation G.

In the reproduction process, the bacteria are sorted, the Sb − Sr worst bacteria are eliminated, and the best bacteria are duplicated every certain number of cycles.

In elimination-dispersal, the worst bacterium θw(j,G) is eliminated from the population and a new one is randomly generated. In this proposal, the original bias mechanism of the TS-MBFOA is not used, in order to reduce the computational cost. The TS-MBFOA pseudocode is presented in Algorithm 1.

Algorithm 1 TS-MBFOA pseudocode. Sb is the number of bacteria, Nc is the number of chemotaxis cycles, β is the scaling factor, R is the stepsize, Sr is the number of bacteria to reproduce, Repcycle is the reproduction frequency and GMAX is the number of generations. 
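Since the pseudocode listing itself is not reproduced in this extraction, the following Java sketch illustrates one chemotaxis cycle with the interleaved swims of Eqs. 3-5, reusing the illustrative Cnop and FeasibilityRules helpers introduced earlier. It is a simplified reading with our own class and method names; bound handling, swarming, reproduction, and elimination-dispersal are omitted, and it should not be taken as the authors' implementation.

```java
import java.util.Random;

/** Sketch of one TS-MBFOA chemotaxis cycle for bacterium i (Eqs. 3-5):
 *  exploration swims (mutation between bacteria) interleaved with
 *  exploitation swims (tumble with step size C(i,G)). Illustrative only. */
public class TwoSwimChemotaxis {
    private final Random rnd = new Random();

    double[] chemotaxis(double[][] population, int i, double beta,
                        double[] stepSize, int nc, Cnop problem) {
        double[] theta = population[i].clone();
        boolean explore = true;                      // start with the classic (exploration) swim
        for (int j = 0; j < nc; j++) {
            double[] candidate = new double[theta.length];
            if (explore) {                           // Eq. 3: mutation between two random bacteria
                double[] r1 = population[rnd.nextInt(population.length)];
                double[] r2 = population[rnd.nextInt(population.length)];
                for (int k = 0; k < theta.length; k++) {
                    candidate[k] = theta[k] + beta * (r1[k] - r2[k]);
                }
            } else {                                 // Eqs. 4-5: tumble-based exploitation swim
                double[] delta = new double[theta.length];
                double norm = 0.0;
                for (int k = 0; k < theta.length; k++) {
                    delta[k] = 2.0 * rnd.nextDouble() - 1.0;   // Δ(i) with elements in [-1, 1]
                    norm += delta[k] * delta[k];
                }
                norm = Math.sqrt(norm);
                for (int k = 0; k < theta.length; k++) {
                    candidate[k] = theta[k] + stepSize[k] * (delta[k] / norm);  // φ(i) = Δ/√(ΔᵀΔ)
                }
            }
            // Keep the better position (feasibility rules). If it improves, repeat
            // the same type of swim in the next loop; otherwise switch swim types.
            if (FeasibilityRules.isBetter(problem.objective(candidate), problem.violation(candidate),
                                          problem.objective(theta), problem.violation(theta))) {
                theta = candidate;
            } else {
                explore = !explore;
            }
        }
        return theta;
    }
}
```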

3 Differential Evolution Algorithm (DEA)

DEA, developed by Storn and Price in 1995, was proposed to solve numerical optimization problems [17].

This algorithm is competitive in global optimization problems; its strategy is based on population search. The algorithm starts from a population of NP D-dimensional individuals, also called parent vectors.

An individual of the NP population represents a solution to the problem and is computed as in Eq. 8:

xj,i,g, i = 1, 2, 3, …, NP, j = 1, …, D, g = 1, 2, …, GMAX, (8)

where xj,i,g is an individual, with j indexing its dimensions (the number of variables), i indexing the individual within the population NP, and g the current generation of the algorithm, up to the maximum number GMAX. The process of the DE algorithm is described below.

Initialization process: individuals are randomly generated within the search space delimited by the lower and upper limits of each problem variable, i.e., Lj ≤ xj ≤ Uj. In each generation, individuals mutate, recombine, and are selected to produce new offspring. If the descendant performs better than the parent, it is integrated into the next generation.

Mutation process: the search direction controls the magnitude of displacement in the search space and the speed of convergence to the optimal solution. The mutated vector is constructed from two vectors weighted by a scaling factor, as shown in Eq. 9:

vj,i,g = xj,r1,g + F (xj,r2,g − xj,r3,g), r1 ≠ r2 ≠ r3 ≠ i, (9)

where vj,i,g is the descendant generated by combining three individuals r1, r2, r3 from the population, all different from each other and from i, randomly selected with a uniform distribution.

Crossover process: this controls the recombination of the mutated vector with the parent to generate a new descendant, where CR is a user-defined parameter that can be set randomly or statically.

In the selection process: the descendant is evaluated in the problem function. If the descendant obtains a better result than the parent, then it replaces the parent in the next generation of the algorithm, otherwise the parent is kept (Eq. 10):

xi,g+1 = { vj,i,g if f(vj,i,g) ≤ f(xj,i,g); xj,i,g otherwise }, (10)

where f is the objective function of the problem to optimize. Algorithm 2 shows the classic DEA.

Algorithm 2 DEA pseudocode. The parameters are: the population of individuals, F is the mutation (scaling) factor, CR is the crossover rate, and GMAX is the number of generations. 
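The listing of Algorithm 2 is likewise not reproduced here, so the sketch below shows one textbook DE/rand/1/bin generation (Eqs. 8-10): mutation with scaling factor F, binomial crossover with rate CR, and greedy selection. Names and structure are ours, not the authors' code; it uses the plain objective comparison of Eq. 10, and for CNOPs a constraint-handling rule such as the feasibility rules sketched earlier would replace that comparison.

```java
import java.util.Random;

/** Sketch of one classic DE/rand/1/bin generation (Eqs. 8-10). Illustrative
 *  only; initialization and bound handling are omitted. */
public class DifferentialEvolutionStep {
    private final Random rnd = new Random();

    double[][] nextGeneration(double[][] pop, double f, double cr, Cnop problem) {
        int np = pop.length;
        int d = pop[0].length;
        double[][] next = new double[np][];
        for (int i = 0; i < np; i++) {
            // Mutation (Eq. 9): three distinct random individuals r1, r2, r3 != i
            int r1, r2, r3;
            do { r1 = rnd.nextInt(np); } while (r1 == i);
            do { r2 = rnd.nextInt(np); } while (r2 == i || r2 == r1);
            do { r3 = rnd.nextInt(np); } while (r3 == i || r3 == r1 || r3 == r2);

            // Binomial crossover with rate CR; jRand guarantees at least one mutated gene
            double[] trial = pop[i].clone();
            int jRand = rnd.nextInt(d);
            for (int j = 0; j < d; j++) {
                if (rnd.nextDouble() < cr || j == jRand) {
                    trial[j] = pop[r1][j] + f * (pop[r2][j] - pop[r3][j]);
                }
            }

            // Selection (Eq. 10): keep the trial vector only if it is not worse
            next[i] = problem.objective(trial) <= problem.objective(pop[i])
                    ? trial : pop[i].clone();
        }
        return next;
    }
}
```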

4 Experimentation and Results

TS-MBFOA and DEA were implemented in the Java programming language, a free and cross-platform language.

The CNOPs to be solved by both algorithms have their own characteristics, such as different numbers of variables, numbers and types of constraints, and ranges of variables, among others, as presented below in their mathematical models.

Problem 1: Pressure Vessel.

Minimize: f(x) = 0.6224 x1 x3 x4 + 1.7781 x2 x3² + 3.1661 x1² x4 + 19.84 x1² x3,

subject to:

g1(x) = −x1 + 0.0193 x3 ≤ 0, (11)

g2(x) = −x2 + 0.00954 x3 ≤ 0, (12)

g3(x) = −π x3² x4 − (4/3) π x3³ + 1,296,000 ≤ 0, (13)

g4(x) = x4 − 240 ≤ 0, (14)

where: 1 ≤ x1 ≤ 99, 1 ≤ x2 ≤ 99, 10 ≤ x3 ≤ 200 and 10 ≤ x4 ≤ 200:

f(x)* = 6059.946. (15)
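As an example of how a benchmark is plugged into either algorithm, the sketch below encodes Problem 1 with the illustrative Cnop interface introduced earlier. The formulas follow Eqs. 11-14 and the stated bounds; the class itself is our own construction, not part of the paper.

```java
/** Problem 1 (Pressure Vessel) encoded with the illustrative Cnop interface:
 *  x = [x1, x2, x3, x4], four inequality constraints, no equality constraints. */
public class PressureVessel implements Cnop {

    @Override
    public double objective(double[] x) {
        return 0.6224 * x[0] * x[2] * x[3]
             + 1.7781 * x[1] * x[2] * x[2]
             + 3.1661 * x[0] * x[0] * x[3]
             + 19.84  * x[0] * x[0] * x[2];
    }

    @Override
    public double[] inequalities(double[] x) {
        return new double[] {
            -x[0] + 0.0193 * x[2],                                        // g1, Eq. 11
            -x[1] + 0.00954 * x[2],                                       // g2, Eq. 12
            -Math.PI * x[2] * x[2] * x[3]
                - (4.0 / 3.0) * Math.PI * Math.pow(x[2], 3) + 1296000.0,  // g3, Eq. 13
            x[3] - 240.0                                                  // g4, Eq. 14
        };
    }

    @Override
    public double[] equalities(double[] x) { return new double[0]; }

    @Override
    public double[] lowerBounds() { return new double[] { 1, 1, 10, 10 }; }

    @Override
    public double[] upperBounds() { return new double[] { 99, 99, 200, 200 }; }
}
```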

Problem 2: Process Synthesis MINLP.

Minimize: f(x, y) = (y1 − 1)² + (y2 − 2)² + (y3 − 1)² − log(y4 + 1) + (x1 − 1)² + (x2 − 2)² + (x3 − 3)²,

subject to:

y1 + y2 + y3 + x1 + x2 + x3 ≤ 5, (16)

y3² + x1² + x2² + x3² ≤ 5.5, (17)

y1 + x1 ≤ 1.2, (18)

y2 + x2 ≤ 1.8, (19)

y3 + x3 ≤ 2.5, (20)

y4 + x1 ≤ 1.2, (21)

y2² + x2² ≤ 1.64, (22)

y3² + x3² ≤ 4.25, (23)

y2² + x3² ≤ 4.64, (24)

where: 0 ≤ x1 ≤ 1.2, 0 ≤ x2 ≤ 1.8, 0 ≤ x3 ≤ 2.5, and yi ∈ {0, 1}, i = 1, 2, 3, 4:

f(x)* = 4.579582. (25)

Problem 3: Tension Compression Spring. Minimize: f(x) = (N + 2) D d²,

subject to:

g1(x) = 1 − (D³ N) / (71785 d⁴) ≤ 0, (26)

g2(x) = (4D² − dD) / (12566 (D d³ − d⁴)) + 1 / (5108 d²) − 1 ≤ 0, (27)

g3(x) = 1 − (140.45 d) / (D² N) ≤ 0, (28)

g4(x) = (D + d) / 1.5 − 1 ≤ 0, (29)

where: 0.05 ≤ d ≤ 2, 0.25 ≤ D ≤ 1.3 and 2 ≤ N ≤ 15:

f(x)* = 0.012681. (31)

Problem 4: Quadratically constrained quadratic program.

Minimize: f(x) = x1⁴ − 14 x1² + 24 x1 − x2²,

subject to:

g1(x) = −x1 + x2 − 8 ≤ 0, (32)

g2(x) = x2 − x1² − 2 x1 + 2 ≤ 0, (33)

where: −8 ≤ x1 ≤ 10 and 0 ≤ x2 ≤ 10:

f(x)* = −118.7048. (34)

Each algorithm has parameters that must be calibrated to generate competitive results.

We previously performed tests in search of a good calibration before running each algorithm on the test problems.

Tab. 1 presents the parameter settings of TS-MBFOA and DEA, where a fixed number of 500 generations was established for each algorithm.

Table 1 TS-MBFOA and DEA parameters 

TS-MBFOA              DEA
Parameter   Value     Parameter    Value
Sb          15        Population   50
R           0.0005    F            0.7
Nc          8         CR           0.8
β           1.95      GMAX         500
Sr          1
Repcycle    100
GMAX        500

This yields around 60,500 evaluations for each algorithm, which allows a fair comparison between the results and the plotting of the convergence of the algorithms.

In the tests conducted with each algorithm, 30 independent runs were performed to measure the consistency of the results.

The results of the independent runs of each algorithm on the four CNOPs are presented in Tab. 2, where CNOP is the problem to solve, f(x)* is the best known value in the literature, Nv is the number of design variables, Alg is the algorithm used, BV is the best value found, and T is the time in seconds of the total runs.

Table 2 Best result found by TS-MBFOA and DEA in 30 independent executions for each CNOP 

CNOP                          Nv   Alg        BV            T
Problem 1, f(x)* = 6059.946    4   TS-MBFOA   8796.84951    44
                                   DEA        9375.75820    20
Problem 2, f(x)* = 4.579582    7   TS-MBFOA   4.26075       55
                                   DEA        3.73269       31
Problem 3, f(x)* = 0.012681    3   TS-MBFOA   0.012665      96
                                   DEA        0.012665      21
Problem 4, f(x)* = -118.704    2   TS-MBFOA   -118.70485    22
                                   DEA        -118.70485    27

TS-MBFOA obtained better results than DEA in most of the runs. However, the runtime of DEA was smaller in many cases.

In problem 1, the TS-MBFOA obtained a result of 8796.84951, better than the one obtained by DEA. However, this solution is feasible but not competitive with the best known optimal solution, which is f(x)* = 6059.946.

For problem 2, DEA finds the best solution, 3.73269. With TS-MBFOA, the best value found is 4.26075, a non-competitive solution. For problems 3 and 4, both algorithms find the global optimum of the problem, similar to the optimal solution known in the state of the art.

Generally speaking, both algorithms generate results in less than 60 seconds, except for problem 3, where the TS-MBFOA experiment took 96 seconds, still a reasonable time for a competitive result.

For each experiment performed with both metaheuristics, basic statistics were applied to check the consistency of the results obtained.
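As an illustration, these basic statistics can be computed from the 30 best values collected per algorithm and problem with a helper like the sketch below; the names are ours, and the authors' actual tooling is not specified in the paper.

```java
import java.util.Arrays;

/** Sketch: best, mean, median, standard deviation and worst value over the
 *  best results of the 30 independent runs (minimization problems). */
public final class RunStatistics {

    public static void report(double[] bestPerRun) {
        double[] sorted = bestPerRun.clone();
        Arrays.sort(sorted);
        int n = sorted.length;

        double best = sorted[0];                       // lowest value = best for minimization
        double worst = sorted[n - 1];
        double median = (n % 2 == 1) ? sorted[n / 2]
                                     : 0.5 * (sorted[n / 2 - 1] + sorted[n / 2]);
        double mean = Arrays.stream(sorted).average().orElse(Double.NaN);
        // Population variance; a sample estimate would divide by (n - 1) instead
        double var = Arrays.stream(sorted).map(v -> (v - mean) * (v - mean)).sum() / n;
        double std = Math.sqrt(var);

        System.out.printf("best=%.6f mean=%.6f median=%.6f std=%.2e worst=%.6f%n",
                          best, mean, median, std, worst);
    }

    private RunStatistics() { }
}
```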

Tab. 3 shows the basic statistics of the 30 runs performed for each CNOP with both algorithms. The TS-MBFOA obtains better results in problem 1. In problems 3 and 4, both algorithms obtain equal results.

Table 3 Basic statistics of the best results of the 30 iterations of the TS-MBFOA and DEA. The best values are highlighted in bold 

CNOP                          Measure     TS-MBFOA       DEA
Problem 1, f(x)* = 6059.946   Mean        8796.84982     9375.75820
                              Median      8796.84951     9375.75820
                              Std. dev.   0.00166        5.45696E-12
                              Worst       8796.85879     9375.75820
Problem 2, f(x)* = 4.579582   Mean        4.27758        3.73305
                              Median      4.27485        3.73305
                              Std. dev.   0.01223        2.01891E-4
                              Worst       4.32189        3.73348
Problem 3, f(x)* = 0.012681   Mean        0.012665       0.012665
                              Median      0.012665       0.012665
                              Std. dev.   0.0000013      7.110684E-12
                              Worst       0.012672       0.012665
Problem 4, f(x)* = -118.704   Mean        -118.7048      -118.70485
                              Median      -118.7048      -118.70485
                              Std. dev.   7.21821E-14    4.26325E-14
                              Worst       -118.7048      -118.7048

In problem 2, DEA obtains a better result. According to the standard deviation, the consistency of the solutions found by DEA is better than that of TS-MBFOA.

Figs. 1 and 2 present the behavior of both algorithms on each CNOP for run number 15 of the 30 independent runs (the median run).

Fig. 1 Typical convergence in CNOP 1 and 2 

Fig. 2 Typical convergence in CNOP 3 and 4 

The convergence of both algorithms is similar in 3 of the 4 CNOPs; within the first 50 generations, both algorithms already reach solutions close to the optimum.

Only in problem 2 do both algorithms require more than 200 generations to converge to an optimal solution. It is worth mentioning that CNOP 2 is a highly constrained problem. The non-parametric WSRT was applied with a 95% confidence level, i.e., a 5% significance level.

The result obtained in CNOPs 1, 2, and 3 is p-value = 0.00001, a value lower than the significance level.

This indicates a significant difference between the results of the two algorithms; the lower the p-value, the more significant the difference. For CNOP 4, the p-value = 0, which indicates that there is no significant difference.
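The test can be reproduced, for example, with the Apache Commons Math library (assuming version 3.x and its WilcoxonSignedRankTest class), taking as paired samples the 30 best values per algorithm for a given problem. This is an illustrative sketch; the statistical package actually used by the authors is not specified.

```java
import org.apache.commons.math3.stat.inference.WilcoxonSignedRankTest;

/** Sketch: Wilcoxon Signed Rank Test on the paired best values of the
 *  30 runs of TS-MBFOA and DEA for one CNOP (Apache Commons Math 3.x assumed). */
public final class WsrtComparison {

    /** Returns the p-value; a significant difference is declared when p < 0.05. */
    public static double pValue(double[] tsMbfoaBest, double[] deaBest) {
        WilcoxonSignedRankTest wsrt = new WilcoxonSignedRankTest();
        // 'false' selects the normal approximation of the p-value (suitable for 30 pairs)
        return wsrt.wilcoxonSignedRankTest(tsMbfoaBest, deaBest, false);
    }

    private WsrtComparison() { }
}
```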

5 Conclusion and Future Work

In this work, two metaheuristics were implemented to solve a set of constrained numerical optimization problems: TS-MBFOA, a swarm intelligence algorithm, and DEA, an evolutionary algorithm. Both algorithms, programmed in the Java language, were tested on four benchmark problems.

Thirty independent runs were performed by each algorithm on each test problem, with 500 generations and approximately 60,500 evaluations.

The parameters of each algorithm were adjusted to the number of evaluations allowed in this work. Basic statistics and the non-parametric Wilcoxon Signed Rank Test were applied to assess the quality and consistency of the results, and both algorithms obtained similar results.

DEA has better consistency of results according to the standard deviation obtained in each problem. TS-MBFOA and DEA each obtained better-quality results in 1 of the 4 problems. The non-parametric test indicates that there is no significant difference between the results of the two algorithms in problem 4.

In the remaining three problems (1, 2, and 3), there is a significant difference. With respect to execution times, both algorithms generate solutions in seconds; however, DEA produces results in less time than TS-MBFOA, owing to the simplicity of its processes and the small number of parameters of the evolutionary algorithm.

The convergence graphs show that TS-MBFOA and DEA begin to converge after 50 generations, except in problem 2, where both algorithms converge only after 200 generations.

As future work, a finer adjustment of the parameters of each algorithm is needed, as well as testing both algorithms on more benchmark problems.

Acknowledgments

We thank CONACYT (Ministry of Science in Mexico) for supporting the Doctoral program in Computer Science at the Universidad Juárez Autónoma de Tabasco.

References

1. Dorigo, M., Maniezzo, V., Colorni, A. (1996). The ant system: Optimization by a colony of cooperating agents. IEEE Transactions on Systems, Man, and Cybernetics, Part B, Vol. 26, No. 1, pp. 29–41. DOI: 10.1109/3477.484436.

2. Eiben, A. E., Smith, J. E. (2003). Introduction to evolutionary computing. Natural Computing Series.

3. Engelbrecht, A. (2005). Fundamentals of computational swarm intelligence. John Wiley & Sons.

4. Hernández-Ocaña, B., Chávez-Bosquez, O., Hernández-Torruco, J., Canul-Reich, J., Pozos-Parra, P. (2018). Bacterial foraging optimization algorithm for menu planning. IEEE Access, Vol. 6, pp. 8619–8629. DOI: 10.1109/ACCESS.2018.2794198.

5. Hernández-Ocaña, B., Hernández-Torruco, J., Chávez-Bosquez, O., Calva-Yáñez, M. B., Portilla-Flores, E. A. (2019). Bacterial foraging-based algorithm for optimizing the power generation of an isolated microgrid. Applied Sciences, Vol. 9, No. 6. DOI: 10.3390/app9061261.

6. Hernández-Ocaña, B., Pozos-Parra, M. D. P., Mezura-Montes, E. (2016). Improved modified bacterial foraging optimization algorithm to solve constrained numerical optimization problems. Applied Mathematics and Information Sciences, Vol. 10, No. 2, pp. 607–622. DOI: 10.18576/amis/100220.

7. Hernández-Ocaña, B., Pozos-Parra, M. P., Mezura-Montes, E., Portilla-Flores, E. A., Vega-Alvarado, E., Calva-Yáñez, M. B. (2016). Two-swim operators in the modified bacterial foraging algorithm for the optimal synthesis of four-bar mechanisms. Computational Intelligence and Neuroscience, Vol. 2016, pp. 1–18. DOI: 10.1155/2016/4525294.

8. Karaboga, D., Basturk, B. (2007). A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (ABC) algorithm. Journal of Global Optimization, Vol. 39, No. 3, pp. 459–471. DOI: 10.1007/s10898-007-9149-x.

9. León, J. A. (2009). Diseño e implementación en hardware de un algoritmo bio-inspirado. Ph.D. thesis, Instituto Politécnico Nacional, Centro de Investigación en Computación.

10. Martínez-Zecua, M. Y., Salamanca-Vázquez, L. A., Flores-Pulido, L., Portilla-Flores, E. A., Ortiz-Arroyo, A. (2019). Evolución diferencial para la optimización global de procesos de ingeniería química. Research in Computing Science, Vol. 148, No. 8. DOI: 10.13053/rcs-148-8-2.

11. Mezura-Montes, E., Cetina-Domínguez, O., Hernández-Ocaña, B. (2010). Nuevas heurísticas inspiradas en la naturaleza para optimización numérica, pp. 249–272.

12. Mezura-Montes, E., Coello-Coello, C. A. (2011). Constraint-handling in nature-inspired numerical optimization: Past, present and future. Swarm and Evolutionary Computation, Vol. 1, No. 4, pp. 173–194. DOI: 10.1016/j.swevo.2011.10.001.

13. Mezura-Montes, E., Hernández-Ocaña, B. (2008). Bacterial foraging for engineering design problems: Preliminary results. Memorias del 4o Congreso Nacional de Computación Evolutiva (COMCEV).

14. Passino, K. (2002). Biomimicry of bacterial foraging for distributed optimization and control. IEEE Control Systems Magazine, Vol. 22, No. 3, pp. 52–67. DOI: 10.1109/MCS.2002.1004010.

15. Price, K. (1997). Differential evolution vs. the functions of the 2nd ICEO. Proceedings of the 1997 IEEE International Conference on Evolutionary Computation (ICEC 97), pp. 153–157. DOI: 10.1109/ICEC.1997.592287.

16. Price, K., Storn, R. M., Lampinen, J. A. (2006). Differential evolution: A practical approach to global optimization. Springer Science & Business Media.

17. Storn, R., Price, K. (1997). Differential evolution: A simple and efficient heuristic for global optimization over continuous spaces. Journal of Global Optimization, Vol. 11, No. 4, pp. 341–359. DOI: 10.1023/A:1008202821328.

18. Suárez, O. (2011). Una aproximación a la heurística y metaheurísticas. INGE@UAN - Tendencias en la Ingeniería, Vol. 1, No. 2.

19. Zapata-Zapata, M. F., Mezura-Montes, E., Portilla-Flores, E. A. (2017). Evolución diferencial con memoria de parámetros para la optimización de mecanismos de cuatro barras. Research in Computing Science, Vol. 134, No. 1, pp. 9–22.

Received: October 02, 2022; Accepted: December 15, 2022

* Corresponding author: Adrián García-López, e-mail: 221H18002@alumno.ujat.mx

This is an open-access article distributed under the terms of the Creative Commons Attribution License.