
Polibits

On-line version ISSN 1870-9044

Polibits no. 46, México, Jul./Dec. 2012

 

Diseño Automático de Redes Neuronales Artificiales mediante el uso del Algoritmo de Evolución Diferencial (ED)

 

Automatic Design of Artificial Neural Networks by means of Differential Evolution (DE) Algorithm

 

Beatriz A. Garro1, Humberto Sossa1, Roberto A. Vázquez2

 

1 Centro de Investigación en Computación del Instituto Politécnico Nacional, CIC–IPN, Av. Juan de Dios Bátiz s/n, esquina con Miguel de Othon de Mendizábal, 07738, Ciudad de México, México. (Email: bgarrol@ipn.mx, hsossa@cic.ipn.mx).

2 Grupo de Sistemas Inteligentes, Facultad de Ingeniería, Universidad La Salle, Benjamín Franklin 47, Col. Hipódromo Condesa, 06140, Ciudad de México, México (Email: ravem@lasallistas.org.mx).

 

Manuscript received April 22, 2012.
Manuscript accepted for publication July 20, 2012.

 

Resumen

En el área de la Inteligencia Artificial, las Redes Neuronales Artificiales (RNA) han sido aplicadas para la solución de múltiples tareas. A pesar de su declive y del resurgimiento de su desarrollo y aplicación, su diseño se ha caracterizado por un mecanismo de prueba y error, el cual puede originar un desempeño bajo. Por otro lado, los algoritmos de aprendizaje que se utilizan, como el algoritmo de retropropagación y otros basados en el gradiente descendente, presentan una desventaja: no pueden resolver problemas no continuos ni problemas multimodales. Por esta razón surge la idea de aplicar algoritmos evolutivos para diseñar de manera automática una RNA. En esta investigación, el algoritmo de Evolución Diferencial (ED) encuentra los mejores elementos principales de una RNA: la arquitectura, los pesos sinápticos y las funciones de transferencia. Además, se proponen dos funciones de aptitud: el error cuadrático medio (MSE, por sus siglas en inglés) y el error de clasificación (CER), las cuales involucran la etapa de validación para garantizar un buen desempeño de la RNA. Primero se realizó un estudio de las diferentes configuraciones del algoritmo de ED y, al determinar la mejor configuración, se realizó una experimentación exhaustiva para medir el desempeño de la metodología propuesta al resolver problemas de clasificación de patrones. También se presenta una comparativa contra dos algoritmos clásicos de entrenamiento: gradiente descendente y Levenberg–Marquardt.

Palabras clave: Evolución diferencial, evolución de redes neuronales artificiales, clasificación de patrones.

 

Abstract

Artificial Neural Networks (ANN) have been applied to several tasks in the field of Artificial Intelligence. Despite their decline and subsequent resurgence, ANN design is still largely a trial-and-error process, which can become trapped in poor solutions. In addition, the learning algorithms commonly used, such as back-propagation and other gradient-descent-based algorithms, have a disadvantage: they cannot solve non-continuous or multimodal problems. For this reason, the application of evolutionary algorithms to automatically design ANNs is proposed. In this research, the Differential Evolution (DE) algorithm finds the best design for the main elements of an ANN: the architecture, the set of synaptic weights, and the set of transfer functions. Two fitness functions are also used (the mean square error, MSE, and the classification error, CER), both of which involve a validation stage to guarantee good ANN performance. First, a study of the best parameter configuration for the DE algorithm is conducted. The experimental results show the performance of the proposed methodology in solving pattern classification problems. Finally, a comparison with two classic learning algorithms, gradient descent and Levenberg–Marquardt, is presented.

Key words: Differential evolution, evolutionary neural networks, pattern classification.
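To make the approach described in the abstract concrete, the following is a minimal, self-contained sketch (not the authors' implementation) of evolving the synaptic weights and the per-neuron transfer functions of a small feed-forward network with the classic DE/rand/1/bin scheme, using the MSE on a held-out validation split as the fitness criterion and reporting the CER on that split. The toy data, the parameter values, and the fixed 4-5-1 topology are illustrative assumptions; unlike the paper's method, the architecture itself is not evolved here.

```python
# Minimal sketch: DE/rand/1/bin evolving ANN weights and transfer functions.
# All names, sizes, and parameter values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data set (hypothetical stand-in for a UCI benchmark).
X = rng.normal(size=(200, 4))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)
X_train, y_train = X[:150], y[:150]
X_val, y_val = X[150:], y[150:]          # validation split used for final scoring

# Pool of candidate transfer functions; each hidden neuron picks one by index.
TRANSFER = [np.tanh,
            lambda v: 1.0 / (1.0 + np.exp(-v)),   # logistic sigmoid
            lambda v: np.maximum(v, 0.0)]         # ReLU
N_IN, N_HID = 4, 5
N_W = N_IN * N_HID + N_HID     # hidden-layer weights + output weights
DIM = N_W + N_HID              # last N_HID genes select transfer functions

def decode(ind):
    """Split one real-valued individual into weights and transfer-function indices."""
    w_hid = ind[:N_IN * N_HID].reshape(N_IN, N_HID)
    w_out = ind[N_IN * N_HID:N_W]
    tf_idx = np.clip(ind[N_W:], 0, len(TRANSFER) - 1e-6).astype(int)
    return w_hid, w_out, tf_idx

def forward(ind, X):
    """One hidden layer; each neuron applies its own evolved transfer function."""
    w_hid, w_out, tf_idx = decode(ind)
    h = X @ w_hid
    h = np.column_stack([TRANSFER[k](h[:, j]) for j, k in enumerate(tf_idx)])
    return 1.0 / (1.0 + np.exp(-(h @ w_out)))     # sigmoid output unit

def mse(ind, X, y):
    return np.mean((forward(ind, X) - y) ** 2)

# DE/rand/1/bin main loop (illustrative parameter values).
NP, F, CR, GENS = 30, 0.5, 0.9, 200
pop = rng.uniform(-1.0, 3.0, size=(NP, DIM))
fit = np.array([mse(p, X_train, y_train) for p in pop])

for _ in range(GENS):
    for i in range(NP):
        idx = rng.choice([j for j in range(NP) if j != i], size=3, replace=False)
        a, b, c = pop[idx]
        mutant = a + F * (b - c)                  # differential mutation
        cross = rng.random(DIM) < CR
        cross[rng.integers(DIM)] = True           # guarantee at least one mutant gene
        trial = np.where(cross, mutant, pop[i])   # binomial crossover
        f_trial = mse(trial, X_train, y_train)
        if f_trial <= fit[i]:                     # greedy selection
            pop[i], fit[i] = trial, f_trial

best = pop[np.argmin(fit)]
print("validation MSE:", mse(best, X_val, y_val))
print("validation CER:", np.mean((forward(best, X_val) > 0.5) != y_val))
```

In the paper's methodology the architecture is also encoded in each individual; extending this sketch in that direction would mean adding genes that enable or disable connections and neurons, and letting the same DE loop optimize them jointly with the weights and transfer functions.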

 


 

ACKNOWLEDGMENTS

H. Sossa thanks SIP–IPN and the DAAD for financial support under grant number 20111016 and DAAD–PROALMEX J000.426/2009. B. A. Garro thanks CONACYT for the scholarship granted during her doctoral studies. H. Sossa also thanks the European Union and CONACYT for financial support through FONCICYT 93829. The content of this article is the sole responsibility of CIC–IPN and cannot be considered to reflect the position of the European Union.

 

