Computación y Sistemas

On-line version ISSN 2007-9737 · Print version ISSN 1405-5546

Abstract

LOPEZ-JUAREZ, Ismael et al. Fast Object Recognition for Grasping Tasks using Industrial Robots. Comp. y Sist. [online]. 2012, vol.16, n.4, pp.421-432. ISSN 2007-9737.

When working in unstructured robotic assembly environments, i.e. where part locations are unknown, the robot must not only locate each part accurately but also recognize it in readiness for grasping. The aim of this research is to develop a fast and robust approach to accomplish this task. We propose an approach to aid the on-line learning of assembly parts, based on artificial neural networks (ANN) and a reduced set of recurrent training patterns that speeds up the recognition task compared with our previous work. Experimental learning results using a fast camera are presented. Simple parts (i.e. circular, squared and radiused-square) were used to compare different connectionist models (Backpropagation, Perceptron and FuzzyARTMAP) and to select the most appropriate one. In later experiments, complex figures were learned using the chosen FuzzyARTMAP algorithm, achieving 93.8% overall efficiency and a 100% recognition rate. Recognition times were lower than 1 ms, which clearly indicates the suitability of the approach for real-world operations.
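The abstract does not reproduce the classifier itself, but the core of the FuzzyARTMAP family it refers to can be illustrated with a short sketch. The following is a minimal, hypothetical Python example of a simplified Fuzzy ARTMAP-style classifier (complement coding, category choice, vigilance test, match tracking, fast learning); the class name, parameter values, and the assumption that each object is described by a normalized feature vector in [0, 1] are illustrative only and are not taken from the article.

import numpy as np

class SimpleFuzzyARTMAP:
    """Simplified Fuzzy ARTMAP classifier (illustrative sketch only)."""

    def __init__(self, alpha=0.001, beta=1.0, rho=0.75):
        self.alpha = alpha    # choice parameter
        self.beta = beta      # learning rate (1.0 = fast learning)
        self.rho_base = rho   # baseline vigilance
        self.w = []           # one weight vector per committed category
        self.labels = []      # class label associated with each category

    @staticmethod
    def _complement_code(x):
        # Complement coding [x, 1 - x] keeps the input norm constant
        # and limits category proliferation.
        x = np.clip(np.asarray(x, dtype=float), 0.0, 1.0)
        return np.concatenate([x, 1.0 - x])

    def train(self, x, label):
        a = self._complement_code(x)
        if not self.w:                      # first pattern: commit a category
            self.w.append(a.copy())
            self.labels.append(label)
            return
        rho = self.rho_base
        choice = [np.minimum(a, w).sum() / (self.alpha + w.sum()) for w in self.w]
        for j in np.argsort(choice)[::-1]:  # search categories by choice value
            match = np.minimum(a, self.w[j]).sum() / a.sum()
            if match >= rho:                # vigilance test
                if self.labels[j] == label:
                    # resonance with the correct label: update the prototype
                    self.w[j] = (self.beta * np.minimum(a, self.w[j])
                                 + (1.0 - self.beta) * self.w[j])
                    return
                rho = match + 1e-6          # match tracking: raise vigilance
        # no category resonated with the correct label: commit a new one
        self.w.append(a.copy())
        self.labels.append(label)

    def predict(self, x):
        a = self._complement_code(x)
        choice = [np.minimum(a, w).sum() / (self.alpha + w.sum()) for w in self.w]
        return self.labels[int(np.argmax(choice))]

In such a scheme the input vector would typically be a rotation- and scale-invariant descriptor of the part's silhouette; the specific descriptor used by the authors is not given in the abstract.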

Keywords: Artificial neural networks; invariant object recognition; machine vision; robotics.

· Abstract in Spanish · Full text in English · English (PDF)

 

Creative Commons License: All the contents of this journal, except where otherwise noted, are licensed under a Creative Commons License.