Computación y Sistemas

On-line version ISSN 2007-9737 · Print version ISSN 1405-5546

Abstract

LOPEZ-JUAREZ, Ismael et al. Fast Object Recognition for Grasping Tasks using Industrial Robots. Comp. y Sist. [online]. 2012, vol.16, n.4, pp.421-432. ISSN 2007-9737.

When working in unstructured robotic assembly environments, i.e. with unknown part locations, the robot has not only to locate the part accurately but also to recognize it in readiness for grasping. The aim of this research is to develop a fast and robust approach to accomplish this task. We propose an approach to aid the on-line learning of assembly parts. The approach is based on an artificial neural network (ANN) and a reduced set of recurrent training patterns, which speeds up the recognition task compared with our previous work. Experimental learning results using a fast camera are presented. Some simple parts (i.e. circular, square, and radiused-square) were used to compare different connectionist models (Backpropagation, Perceptron, and FuzzyARTMAP) and to select the appropriate model. Later, during experiments, complex figures were learned using the chosen FuzzyARTMAP algorithm, showing a 93.8% overall efficiency and a 100% recognition rate. Recognition times were lower than 1 ms, which clearly indicates the suitability of the approach for implementation in real-world operations.

Keywords: Artificial neural networks; invariant object recognition; machine vision; robotics.
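
The abstract reports sub-millisecond recognition with a FuzzyARTMAP network. As a rough illustration of how that family of classifiers works, the following is a minimal Python sketch of a simplified Fuzzy ARTMAP (complement coding, choice function, vigilance test, and fast learning). The class name, parameter defaults, and toy feature vectors are assumptions for illustration and are not taken from the paper.

import numpy as np

class SimpleFuzzyARTMAP:
    """Simplified Fuzzy ARTMAP sketch (hypothetical, for illustration only)."""

    def __init__(self, rho=0.75, alpha=0.001, beta=1.0):
        self.rho = rho      # vigilance: minimum match required for resonance
        self.alpha = alpha  # choice parameter (small positive constant)
        self.beta = beta    # learning rate (1.0 = fast learning)
        self.w = []         # category weight vectors
        self.labels = []    # class label committed to each category

    @staticmethod
    def _complement_code(x):
        # Complement coding: append (1 - x) so the input norm stays constant
        x = np.clip(np.asarray(x, dtype=float), 0.0, 1.0)
        return np.concatenate([x, 1.0 - x])

    def train(self, x, label):
        I = self._complement_code(x)
        scores = [np.minimum(I, w).sum() / (self.alpha + w.sum()) for w in self.w]
        # Search existing categories from best to worst choice score
        for j in np.argsort(scores)[::-1]:
            match = np.minimum(I, self.w[j]).sum() / I.sum()
            if match >= self.rho and self.labels[j] == label:
                # Resonance: move the winning prototype toward the input
                self.w[j] = (self.beta * np.minimum(I, self.w[j])
                             + (1 - self.beta) * self.w[j])
                return
        # No suitable category: commit a new one to this label
        self.w.append(I.copy())
        self.labels.append(label)

    def predict(self, x):
        I = self._complement_code(x)
        scores = [np.minimum(I, w).sum() / (self.alpha + w.sum()) for w in self.w]
        return self.labels[int(np.argmax(scores))]

# Hypothetical usage with toy 2-D shape features (not the paper's descriptors)
net = SimpleFuzzyARTMAP(rho=0.8)
net.train([0.20, 0.90], "circle")
net.train([0.70, 0.10], "square")
print(net.predict([0.25, 0.85]))  # expected: "circle"

Because categories are only created when no existing prototype both matches and carries the correct label, training can proceed incrementally, which is consistent with the on-line learning setting described in the abstract.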

· abstract in Spanish · text in English · English (pdf)

 

All the contents of this journal, except where otherwise noted, are licensed under a Creative Commons Attribution License.