Computación y Sistemas

Print ISSN 1405-5546

Comp. y Sist. Vol. 19, No. 2, México, Apr./Jun. 2015

http://dx.doi.org/10.13053/CyS-19-2-1931 

Articles

 

Camera as Position Sensor for a Ball and Beam Control System

 

Alejandro-Israel Barranco-Gutiérrez1,3, Jesús Sandoval-Galarza2, Saúl Martínez-Díaz2

 

1 Tecnológico Nacional de México, Instituto Tecnológico de Celaya, México. israel.barranco@itcelaya.edu.mx

2 Tecnológico Nacional de México, Instituto Tecnológico de La Paz, México. jsandoval@itlp.edu.mx, saulmd@itlp.edu.mx

3 Cátedras CONACyT, México.

Corresponding author: Alejandro-Israel Barranco-Gutiérrez.

 

Article received on 22/11/2013.
Accepted on 09/03/2015.

 

Abstract

This paper describes a novel strategy for using a digital camera as a position sensor to control a ball and beam system. A linear control law positions the ball at the desired location on the beam. The experiments show how this method positions the ball at any location on the beam using a camera with a sampling rate of 30 frames per second (fps), and these results are compared with those obtained using an analog resistive sensor whose feedback signal is sampled at 1000 samples per second. The mechanical characteristics of the ball and beam system are exploited to simplify the calculation of the ball position by the vision system and to ease camera calibration with respect to the ball and beam system. Our proposal uses a circularity feature of blobs in a binary image, instead of the classic correlation or Hough transform techniques, for ball tracking. The main control system is implemented in Simulink with Real-Time Workshop (RTW), and the vision processing uses the OpenCV libraries.

Keywords: Computer vision, ball and beam system, linear control.
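The ball-tracking step described in the abstract relies on a circularity feature of blobs in a binary image rather than correlation or a Hough transform. A minimal sketch of such a metric is shown below, using the standard circularity measure 4πA/P² (equal to 1.0 for a perfect disc, smaller for elongated shapes). The blob representation, cutoff value, and function names here are illustrative assumptions, not the paper's actual implementation.

```python
import math

def circularity(area, perimeter):
    """Circularity = 4*pi*A / P**2: 1.0 for a perfect circle,
    strictly less for any other shape."""
    return 4 * math.pi * area / perimeter ** 2

def pick_ball(blobs, cutoff=0.8):
    """Among candidate blobs (dicts with 'area' and 'perimeter'),
    return the most circular one, or None if no blob passes the cutoff.
    The cutoff of 0.8 is an assumed value for illustration."""
    best = max(blobs, key=lambda b: circularity(b["area"], b["perimeter"]))
    return best if circularity(best["area"], best["perimeter"]) >= cutoff else None

# A disc of radius r has area pi*r**2 and perimeter 2*pi*r -> circularity 1.0
r = 10.0
disc = {"area": math.pi * r ** 2, "perimeter": 2 * math.pi * r}
# A 10x10 square: area 100, perimeter 40 -> circularity pi/4 ~ 0.785
square = {"area": 100.0, "perimeter": 40.0}

print(pick_ball([square, disc]) is disc)  # the disc wins
```

In practice the areas and perimeters would come from contours extracted from the thresholded camera frame; selecting the most circular blob makes the tracker robust to non-circular clutter without the cost of a Hough transform.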

 


 

Acknowledgements

The authors greatly appreciate the support of PROMEP and CONACyT under project 215435. The work of the second author was partially supported by CONACyT grant 166636 and by TecNM grant 5345.14-P. We would also like to thank Laura Heit for her valuable editorial help.

 


All content of this journal, except where otherwise identified, is licensed under a Creative Commons License.