<?xml version="1.0" encoding="ISO-8859-1"?><article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<front>
<journal-meta>
<journal-id>1405-7743</journal-id>
<journal-title><![CDATA[Ingeniería, investigación y tecnología]]></journal-title>
<abbrev-journal-title><![CDATA[Ing. invest. y tecnol.]]></abbrev-journal-title>
<issn>1405-7743</issn>
<publisher>
<publisher-name><![CDATA[Universidad Nacional Autónoma de México, Facultad de Ingeniería]]></publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id>S1405-77432011000200002</article-id>
<title-group>
<article-title xml:lang="en"><![CDATA[A LabVIEW-based Autonomous Vehicle Navigation System using Robot Vision and Fuzzy Control]]></article-title>
<article-title xml:lang="es"><![CDATA[Sistema de navegación autónoma de un vehículo usando visión robótica y control difuso en LabVIEW]]></article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname><![CDATA[Ramírez-Cortés]]></surname>
<given-names><![CDATA[J.M.]]></given-names>
</name>
<xref ref-type="aff" rid="A01"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname><![CDATA[Gómez-Gil]]></surname>
<given-names><![CDATA[P]]></given-names>
</name>
<xref ref-type="aff" rid="A02"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname><![CDATA[Martínez-Carballido]]></surname>
<given-names><![CDATA[J]]></given-names>
</name>
<xref ref-type="aff" rid="A03"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname><![CDATA[López-Larios]]></surname>
<given-names><![CDATA[F]]></given-names>
</name>
<xref ref-type="aff" rid="A04"/>
</contrib>
</contrib-group>
<aff id="A01">
<institution><![CDATA[Instituto Nacional de Astrofísica, Óptica y Electrónica, Coordinación de Electrónica]]></institution>
<addr-line><![CDATA[ ]]></addr-line>
</aff>
<aff id="A02">
<institution><![CDATA[Instituto Nacional de Astrofísica, Óptica y Electrónica, Coordinación de Electrónica]]></institution>
<addr-line><![CDATA[ ]]></addr-line>
</aff>
<aff id="A03">
<institution><![CDATA[Instituto Nacional de Astrofísica, Óptica y Electrónica, Coordinación de Electrónica]]></institution>
<addr-line><![CDATA[ ]]></addr-line>
</aff>
<aff id="A04">
<institution><![CDATA[Universidad de las Américas, Departamento de Ingeniería Electrónica]]></institution>
<addr-line><![CDATA[ ]]></addr-line>
</aff>
<pub-date pub-type="pub">
<day>00</day>
<month>06</month>
<year>2011</year>
</pub-date>
<pub-date pub-type="epub">
<day>00</day>
<month>06</month>
<year>2011</year>
</pub-date>
<volume>12</volume>
<numero>2</numero>
<fpage>129</fpage>
<lpage>136</lpage>
<copyright-statement/>
<copyright-year/>
<self-uri xlink:href="http://www.scielo.org.mx/scielo.php?script=sci_arttext&amp;pid=S1405-77432011000200002&amp;lng=en&amp;nrm=iso"></self-uri><self-uri xlink:href="http://www.scielo.org.mx/scielo.php?script=sci_abstract&amp;pid=S1405-77432011000200002&amp;lng=en&amp;nrm=iso"></self-uri><self-uri xlink:href="http://www.scielo.org.mx/scielo.php?script=sci_pdf&amp;pid=S1405-77432011000200002&amp;lng=en&amp;nrm=iso"></self-uri><abstract abstract-type="short" xml:lang="en"><p><![CDATA[This paper describes a navigation system for an autonomous vehicle using machine vision techniques applied to real-time captured images of the track for academic purposes. The experiment consists of the automatic navigation of a remote control car through a closed circuit. Computer vision techniques are used for the sensing of the environment through a wireless camera. The received images are captured into the computer through the acquisition card NI USB-6009, and processed in a system developed under the LabVIEW platform, taking advantage of the toolkit for acquisition and image processing. Fuzzy logic control techniques are incorporated for the intermediate control decisions required during the car navigation. An efficient approach based on logic machine-states is used as an optimal method to implement the changes required by the fuzzy logic control. Results and concluding remarks are presented.]]></p></abstract>
<abstract abstract-type="short" xml:lang="es"><p><![CDATA[En este artículo se presenta un sistema de navegación para un vehículo autónomo usando técnicas de visión robótica, desarrollado en LabVIEW con fines académicos. El sistema adquiere en tiempo real las imágenes del camino por recorrer. Estas imágenes son enviadas en forma inalámbrica a una computadora, en donde un sistema de control, basado en reglas de control difuso, toma las decisiones de movimiento correspondientes. La computadora envía en forma inalámbrica las señales adecuadas al vehículo de control remoto, cerrando de esta manera el lazo de control. Las imágenes son capturadas en la computadora a través de la tarjeta de adquisición NI USB-6009 y procesadas en un sistema desarrollado bajo la plataforma de LabVIEW y sus herramientas de adquisición, procesado de imágenes y control difuso. Se incorpora un eficiente esquema de diseño basado en máquinas de estados para la navegación por las diversas escenas detectadas por la cámara. Se presentan resultados y conclusiones de este trabajo.]]></p></abstract>
<kwd-group>
<kwd lng="en"><![CDATA[fuzzy]]></kwd>
<kwd lng="en"><![CDATA[control]]></kwd>
<kwd lng="en"><![CDATA[robot]]></kwd>
<kwd lng="en"><![CDATA[vision]]></kwd>
<kwd lng="en"><![CDATA[autonomous]]></kwd>
<kwd lng="en"><![CDATA[navigation]]></kwd>
<kwd lng="es"><![CDATA[difuso]]></kwd>
<kwd lng="es"><![CDATA[control]]></kwd>
<kwd lng="es"><![CDATA[visión]]></kwd>
<kwd lng="es"><![CDATA[navegación]]></kwd>
<kwd lng="es"><![CDATA[autónoma]]></kwd>
<kwd lng="es"><![CDATA[robot]]></kwd>
</kwd-group>
</article-meta>
</front><body><![CDATA[ <p align="center"><font face="verdana" size="4"><b>A LabVIEW-based Autonomous Vehicle Navigation System using Robot Vision and Fuzzy Control</b></font></p>     <p align="center"><font face="verdana" size="2">&nbsp;</font></p>     <p align="center"><font face="verdana" size="3"><b>Sistema de navegaci&oacute;n aut&oacute;noma de un veh&iacute;culo usando visi&oacute;n rob&oacute;tica y control difuso en LabVIEW</b></font></p>     <p align="center"><font face="verdana" size="2">&nbsp;</font></p>     <p align="center"><font face="verdana" size="2"><b>Ram&iacute;rez-Cort&eacute;s J.M.<sup>1</sup>, G&oacute;mez-Gil P.<sup>2</sup>, Mart&iacute;nez-Carballido J.<sup>3</sup> and L&oacute;pez-Larios F.<sup>4</sup></b></font></p>     <p align="justify"><font face="verdana" size="2">&nbsp;</font></p>     <p align="justify"><font face="verdana" size="2"><i><sup>1</sup> Coordinaci&oacute;n de Electr&oacute;nica, Instituto Nacional de Astrof&iacute;sica, &Oacute;ptica y Electr&oacute;nica. E-mail: </i><a href="mailto:jmram@inaoep.mx">jmram@inaoep.mx</a></font></p>     <p align="justify"><font face="verdana" size="2"><i><sup>2</sup> Coordinaci&oacute;n de Computaci&oacute;n, Instituto Nacional de Astrof&iacute;sica, &Oacute;ptica y Electr&oacute;nica. E-mail: </i><a href="mailto:pgomez@inaoep.mx">pgomez@inaoep.mx</a></font></p>     <p align="justify"><font face="verdana" size="2"><i><sup>3</sup> Coordinaci&oacute;n de Electr&oacute;nica, Instituto Nacional de Astrof&iacute;sica, &Oacute;ptica y Electr&oacute;nica. E-mail: </i><a href="mailto:jmc@inaoep.mx">jmc@inaoep.mx</a></font></p>     <p align="justify"><font face="verdana" size="2"><i><sup>4</sup> Departamento de Ingenier&iacute;a Electr&oacute;nica, Universidad de las Am&eacute;ricas, Puebla. E-mail: </i><a href="mailto:lopez.filiberto@gmail.com">lopez.filiberto@gmail.com</a></font></p>     ]]></body>
<body><![CDATA[<p align="justify"><font face="verdana" size="2">&nbsp;</font></p>     <p align="justify"><font face="verdana" size="2">Informaci&oacute;n del art&iacute;culo: recibido: octubre de 2007.    <br> Aceptado: octubre de 2010.</font></p>     <p align="justify"><font face="verdana" size="2">&nbsp;</font></p>     <p align="justify"><font face="verdana" size="2"><b>Abstract</b></font></p>     <p align="justify"><font face="verdana" size="2">This paper describes a navigation system for an autonomous vehicle using machine vision techniques applied to real-time captured images of the track for academic purposes. The experiment consists of the automatic navigation of a remote control car through a closed circuit. Computer vision techniques are used for the sensing of the environment through a wireless camera. The received images are captured into the computer through the acquisition card NI USB-6009, and processed in a system developed under the LabVIEW platform, taking advantage of the toolkit for acquisition and image processing. Fuzzy logic control techniques are incorporated for the intermediate control decisions required during the car navigation. An efficient approach based on logic machine-states is used as an optimal method to implement the changes required by the fuzzy logic control. Results and concluding remarks are presented.</font></p>     <p align="justify"><font face="verdana" size="2"><b>Keywords:</b> fuzzy, control, robot, vision, autonomous, navigation.</font></p>     <p align="justify"><font face="verdana" size="2">&nbsp;</font></p>     <p align="justify"><font face="verdana" size="2"><b>Resumen</b></font></p>     <p align="justify"><font face="verdana" size="2"><i>En este art&iacute;culo se presenta un sistema de navegaci&oacute;n para un veh&iacute;culo aut&oacute;nomo usando t&eacute;cnicas de visi&oacute;n rob&oacute;tica, desarrollado en LabVIEW con fines acad&eacute;micos. El sistema adquiere en tiempo real las im&aacute;genes del camino por recorrer. Estas im&aacute;genes son enviadas en forma inal&aacute;mbrica a una computadora, en donde un sistema de control, basado en reglas de control difuso, toma las decisiones de movimiento correspondientes. La computadora env&iacute;a en forma inal&aacute;mbrica las se&ntilde;ales adecuadas al veh&iacute;culo de control remoto, cerrando de esta manera el lazo de control. Las im&aacute;genes son capturadas en la computadora a trav&eacute;s de la tarjeta de adquisici&oacute;n NI USB-6009 y procesadas en un sistema desarrollado bajo la plataforma de LabVIEW y sus herramientas de adquisici&oacute;n, procesado de im&aacute;genes y control difuso. Se incorpora un eficiente esquema de dise&ntilde;o basado en m&aacute;quinas de estados para la navegaci&oacute;n por las diversas escenas detectadas por la c&aacute;mara. Se presentan resultados y conclusiones de este trabajo.</i></font></p>     ]]></body>
<body><![CDATA[<p align="justify"><font face="verdana" size="2"><b>Descriptores: </b>difuso, control, visi&oacute;n, navegaci&oacute;n, aut&oacute;noma, robot.</font></p>     <p align="justify"><font face="verdana" size="2">&nbsp;</font></p>     <p align="justify"><font face="verdana" size="2"><b>Introduction</b></font></p>     <p align="justify"><font face="verdana" size="2">An autonomous navigation system consists of a self-piloted vehicle that does not require an operator to navigate and accomplish its tasks. The aim of an autonomous vehicle is to have self-sufficiency and a decision-making heuristic installed within it, which allows it to move automatically in the corresponding environment, as well as to accomplish the required tasks (Armingol <i>et al., </i>2007). Some of the areas where autonomous vehicles have been successfully used are space rovers, rice planting and agricultural vehicles (Tunstel <i>et al., </i>2007, Nagasaka <i>et al., </i>2004, Zhao <i>et al., </i>2007), autonomous driving in urban areas (De la Escalera <i>et al., </i>2003), security and surveillance (Srini, 2006, Flan <i>et al., </i>2004, Micheloni <i>et al., </i>2007), and the exploration of any place where human life may be at risk, such as a mine with toxic gases or a nuclear plant during a disaster (Isozaki <i>et al., </i>2002). There is a wide variety of autonomous vehicles, with an extensive classification into categories depending on their characteristics (Bertozzi <i>et al., </i>2000). Some of the characteristics described in the literature are: autonomy level, methods of data acquisition, methods of localization, goal tasks, displacement techniques, control methods, and so on. The project presented in this paper is restricted to the autonomous navigation of a small remote control car through a closed circuit. Although it is a very specific task in a controlled environment, it is intended to provide a platform for academic purposes, in which several navigation control approaches can be tried and different image pre-processing schemes can be used in educational experiments. The software package LabVIEW and its available toolkits for image analysis, computer vision, and fuzzy logic control (NI 2002, 2005) have been found to be an excellent platform for experimentation, allowing quick design, implementation, and testing of prototypes. Experimentation with this prototype is expected to continue in order to explore further tasks, which would require the use of more sophisticated control heuristics in the field of artificial intelligence or neural network techniques.</font></p>     <p align="justify"><font face="verdana" size="2"><b>Hardware description</b></font></p>     <p align="justify"><font face="verdana" size="2">The implemented system basically consists of the wireless control of a small remote control car from a laptop computer, as shown in the block diagram of <a href="#f1">figure 1</a>. The vehicle is equipped with a wireless camera which sends in real time the video signal corresponding to the path. Once the streaming data corresponding to the video signal of the path enters the computer, it is processed by a LabVIEW application, which generates the control signals to be applied to the remote control of the vehicle. 
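The closed loop just described (camera, image processing, fuzzy control, remote-control output) can be sketched as follows. This is a minimal illustrative skeleton, not the actual LabVIEW block diagram: all function names, the frame representation, and the gains inside the placeholder controller are assumptions.

```python
# Sketch of the closed control loop described in the text (hypothetical names;
# the actual system is a LabVIEW block diagram, not Python).

def capture_frame(camera):
    """Grab the next frame from the wireless camera stream (stub)."""
    return camera.pop(0)

def extract_position(frame):
    """Image processing stage: lateral displacement (pixels) and curve angle (deg)."""
    return frame["displacement"], frame["angle"]

def fuzzy_control(displacement, angle):
    """Placeholder for the Mamdani controller: returns steering and speed.
    The linear gains here are purely illustrative."""
    steering = -0.5 * displacement - 0.8 * angle
    speed = 1.0 if abs(angle) < 10 else 0.5
    return steering, speed

def send_to_remote(steering, speed, log):
    """Stand-in for the DAQ output stage driving the remote control."""
    log.append((steering, speed))

def run(camera_frames):
    """One control iteration per captured frame, closing the loop each time."""
    log = []
    for _ in range(len(camera_frames)):
        frame = capture_frame(camera_frames)
        d, a = extract_position(frame)
        steering, speed = fuzzy_control(d, a)
        send_to_remote(steering, speed, log)
    return log
```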
The hardware used is as follows: data acquisition card (DAQ) NI USB-6009, wireless analog mini-camera JMK, Dazzle USB video capture card, and a small digital remote control car.</font></p>     <p align="center"><font face="verdana" size="2"><a name="f1"></a></font></p>     <p align="center"><font face="verdana" size="2"><img src="/img/revistas/iit/v12n2/a2f1.jpg"></font></p>     <p align="justify"><font face="verdana" size="2">The wireless camera transmits a video signal with a horizontal resolution of 380 TV lines at a frequency of 1.2 GHz on the ISM (Industrial, Scientific, and Medical) radio band. Due to the limitations on the maximum current provided by the output port of the I/O card, a simple optocoupler-based interface was included between the NI card and the remote control as a signal-conditioning stage. The digital signals obtained from the interface are applied to the remote control, which sends the movement commands to the 27 MHz radio-controlled car, closing the cycle.</font></p>     <p align="justify"><font face="verdana" size="2"><b>Fuzzy logic control in LabVIEW</b></font></p>     ]]></body>
<body><![CDATA[<p align="justify"><font face="verdana" size="2">Fuzzy logic control has been extensively used in many applications, including commercial products. There is a large number of references and books on the topic with a detailed revision of the basic theory (Kovacic and Bogdan, 2005; Jantsen, 2007), as well as some references on fuzzy control in the context of autonomous navigation (Hagras, 2004). For the purpose of the work presented, it is important to point out the two types of fuzzy inference systems most used currently: Mamdani-type and Sugeno-type (Zhang <i>et al., </i>2006; Ruano, 2005). These two types of inference systems differ in the way the outputs are determined. The method used in the control described in this paper is the Mamdani type, which includes fuzzification of input data based on membership functions, a database of inference rules, and defuzzification of the output signal. <i>Mamdani-type inference </i>expects the output membership functions to be fuzzy sets; in consequence, after aggregation of the rule outputs there is a fuzzy set for each output variable to be defuzzified. The LabVIEW toolbox on fuzzy logic control was found to be an excellent platform to support the design, implementation, and testing of a system control based on fuzzy logic techniques.</font></p>     <p align="justify"><font face="verdana" size="2">This feature, in conjunction with the image acquisition and analysis library IMAQ-VISION and Vision Builder from NI, was key to a quick and accurate implementation of this project. 
The LabVIEW toolbox on fuzzy logic control includes the editing of fuzzy variables and the corresponding triangular and trapezoidal membership functions, a friendly rule-base editor to enter the if-then rules associated with the fuzzy control, and several defuzzification methods such as centroid, center of maximum, and min-max, which are mathematical operations over a two-dimensional function obtained from the combination of the fuzzy outputs.</font></p>     <p align="justify"><font face="verdana" size="2">In this project, there are two output variables: the steering wheel angle and the speed. These two signals are connected to the remote control as the guidance mechanism of the vehicle. Each signal allows sixteen values coded in 4 bits. In the case of the speed, it covers positive and negative values for movements in reverse, and in the case of the steering wheel angle it covers a range of &plusmn;45&deg;. <a href="#f2">Figure 2</a> shows the partition of the variable &#945; "Driving wheel angle" into five fuzzy sets. In a similar way, the output variable "Speed" is partitioned into five fuzzy sets: BM: back medium, BS: back small, SS: straight small, SM: straight medium, SH: straight high. The input information needed to locate the relative position of the vehicle relies on the image sequence detected by the wireless camera, which is a stream of 30 images per second. The program automatically segments and marks the frontal part of the car as the image reference with a yellow box, and the path lanes are segmented and marked with two red lines. The marking is presented on the screen as a visual representation and simultaneously registered in the image file. The numerical information obtained from both markings is further used by the tracking algorithm. 
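As a concrete illustration of the Mamdani machinery just described (triangular membership functions, clipping each output set by its rule firing strength, max aggregation, and centroid defuzzification), the following sketch partitions a steering output into five fuzzy sets over &plusmn;45&deg;. The set names and breakpoints are illustrative assumptions, not the actual partition of figure 2.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Illustrative partition of the steering output into five fuzzy sets over +/-45 deg.
angle = np.linspace(-45.0, 45.0, 901)
sets = {
    "LB": tri(angle, -67.5, -45.0, -22.5),  # left big
    "LS": tri(angle, -45.0, -22.5, 0.0),    # left small
    "Z":  tri(angle, -22.5, 0.0, 22.5),     # zero
    "RS": tri(angle, 0.0, 22.5, 45.0),      # right small
    "RB": tri(angle, 22.5, 45.0, 67.5),     # right big
}

def defuzzify_centroid(firing):
    """Mamdani-style output: clip each set at its firing strength,
    aggregate by max, and return the centroid of the aggregated set."""
    agg = np.zeros_like(angle)
    for name, strength in firing.items():
        agg = np.maximum(agg, np.minimum(sets[name], strength))
    return float(np.sum(angle * agg) / np.sum(agg))
```

A fully fired "zero" set defuzzifies to 0&deg;, and a fully fired "right small" set to its 22.5&deg; peak, as expected from symmetric triangles.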
The fuzzy input variables used in this work are: the lateral distance to the nearest lane, and the inclination angle of the incoming curve, obtained as the average of the two angles detected from the lateral borders of the road. These variables are represented by membership functions derived from the partition of the variables into five fuzzy sets. <a href="#f3">Figure 3</a> shows the partition of the input variable x "Lateral displacement". In a similar way, the input variable <img src="/img/revistas/iit/v12n2/a2s1.jpg"> "Angle of the incoming curve" is partitioned as: LTC: left tight curve, LSC: left soft curve, S: straight, RSC: right soft curve, RTC: right tight curve. The inference rules of the fuzzy control system, in IF-THEN form, are located in a database that is accessed in each iteration. The database was constructed considering the actions that a human being would perform in every situation along the trajectory, with the restrictions of the range designated for each variable. This information is further analyzed in order to make a decision concerning the car's position with respect to the path and the action required to stay on track in the circuit. 
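A minimal sketch of how IF-THEN rules of this kind can be evaluated in each iteration follows (AND taken as minimum, rules sharing a consequent combined by maximum, as in Mamdani inference). The rule table and membership values are illustrative, not the actual database.

```python
# Minimal sketch of evaluating IF-THEN rules of the kind stored in the
# controller's database (rule wording and membership values are illustrative).

def fire_rules(mu_displacement, mu_curve):
    """mu_* map fuzzy-set names to membership degrees for the current frame.
    Each rule reads: IF displacement is X AND curve is Y THEN steering is Z.
    AND is taken as min; rules sharing a consequent are combined with max."""
    rules = [
        (("left", "straight"), "RS"),      # drifted left on a straight: steer right
        (("center", "straight"), "Z"),
        (("center", "right_soft"), "RS"),
        (("right", "straight"), "LS"),
    ]
    firing = {}
    for (d_set, c_set), out_set in rules:
        strength = min(mu_displacement.get(d_set, 0.0), mu_curve.get(c_set, 0.0))
        firing[out_set] = max(firing.get(out_set, 0.0), strength)
    return firing
```

The resulting firing strengths are exactly what a Mamdani defuzzification stage consumes.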
The numerical information regarding the angle of the lines representing the road path is coded as a fuzzy variable to be used in the control system, as described in the next sections.</font></p>     <p align="center"><font face="verdana" size="2"><a name="f2"></a></font></p>     <p align="center"><font face="verdana" size="2"><img src="/img/revistas/iit/v12n2/a2f2.jpg"></font></p>     <p align="center"><font face="verdana" size="2"><a name="f3"></a></font></p>     <p align="center"><font face="verdana" size="2"><img src="/img/revistas/iit/v12n2/a2f3.jpg"></font></p>     <p align="justify"><font face="verdana" size="2"><b>The LabVIEW IMAQ vision toolbox</b></font></p>     <p align="justify"><font face="verdana" size="2">IMAQ Vision is a virtual-instrument library aimed at the design and implementation of scientific computer vision and image analysis applications. It includes tools for vision controls using different types of images, and image processing operations such as binarization, histograms, filters, morphological operations, and so on. It also provides options for graphical and numerical image analysis through lines, circles, squares, or coordinate systems on the captured image. In addition, National Instruments has developed Vision Builder, an interactive software package for configuring and implementing complete machine vision applications using the same graphical philosophy as LabVIEW, with high-level operations such as classification or optical character recognition. In the project described in this paper, the image is captured from video, converted from color to grayscale, and binarized through the corresponding tools. The image of the road is analyzed in order to segment the lines defining the path. 
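In the actual system the grayscale conversion and binarization are performed with IMAQ Vision tools; the following plain-NumPy sketch illustrates the same two operations, with an assumed luminance weighting and threshold value.

```python
import numpy as np

# Sketch of the pre-processing chain described in the text
# (color image -> grayscale -> binary). The paper uses IMAQ Vision tools;
# the luminance weights and threshold below are conventional assumptions.

def to_grayscale(rgb):
    """Luminance-weighted grayscale conversion of an HxWx3 uint8 image."""
    weights = np.array([0.299, 0.587, 0.114])
    return (rgb @ weights).astype(np.uint8)

def binarize(gray, threshold=128):
    """Binary image: 1 where the pixel is at least `threshold`, else 0."""
    return (gray >= threshold).astype(np.uint8)
```

On the binary image, the lane lines can then be segmented and their angle and offset measured.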
Numerical information regarding the relative position of these lines with respect to the reference (the center of the car), as well as their angle, is obtained through the analysis of those lines and coded as fuzzy variables to be used in the control.</font></p>     <p align="justify"><font face="verdana" size="2"><b>Programming approach</b></font></p>     ]]></body>
<body><![CDATA[<p align="justify"><font face="verdana" size="2">The control program was developed in LabVIEW following a machine-state logic design approach. According to the scene detected, the vehicle can be located in one of six states, as described in the state diagram of <a href="#f4">figure 4</a>. Once the system is turned on, the car stays in the state 'init' until the user presses the Start button. Pressing this button initializes the route and moves the car to the state 'straight'. In each state, the system waits for the camera to sense the road, derives the numerical information from the relative position of the car, and makes a decision according to the fuzzy logic inference rules stored in the database. After defuzzification, the output signals are sent to the car through the wireless remote control, which closes the control loop by making the corresponding movement in the logic-state machine, completing an iteration.</font></p>     <p align="center"><font face="verdana" size="2"><a name="f4"></a></font></p>     <p align="center"><font face="verdana" size="2"><img src="/img/revistas/iit/v12n2/a2f4.jpg"></font></p>     <p align="justify"><font face="verdana" size="2"><a href="/img/revistas/iit/v12n2/a2f5.jpg" target="_blank">Figure 5</a> shows the code in the LabVIEW graphical language corresponding to the state 'straight'. In this diagram there is an input signal corresponding to the image of the road obtained from the wireless camera, and an output signal connected through the DAQ assistant, which generates the electric signal required by the remote control. The main operation relies on three sub-virtual instruments named 'straight', 'fuzzy', and 'wheel', which are designed to evaluate the video signal obtained from the camera, calculate the numerical information regarding the relative position of the car, and derive the corresponding action based on the fuzzy inference rules contained in the database. 
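The machine-state logic can be sketched as a transition table. Only the states 'init' and 'straight' are named in the text, so the remaining state and event names below are hypothetical placeholders for the six states of figure 4.

```python
# Minimal sketch of the machine-state logic of figure 4. Only 'init' and
# 'straight' are named in the text; the other state/event names are hypothetical.

TRANSITIONS = {
    ("init", "start_pressed"): "straight",
    ("straight", "left_curve_detected"): "left_curve",
    ("straight", "right_curve_detected"): "right_curve",
    ("left_curve", "road_straight"): "straight",
    ("right_curve", "road_straight"): "straight",
    ("straight", "stop_pressed"): "init",
}

def step(state, event):
    """One iteration of the state machine: follow a known (state, event)
    transition, otherwise remain in the current state."""
    return TRANSITIONS.get((state, event), state)
```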
The internal routines included in each machine state are basically the same, with small differences according to the corresponding position, so for the purposes of this paper only the state 'straight' will be explained.</font></p>     <p align="justify"><font face="verdana" size="2">Inside the main block in <a href="/img/revistas/iit/v12n2/a2f5.jpg" target="_blank">figure 5</a>, we can distinguish a sub-virtual instrument called 'straight', which has the purpose of obtaining the numerical representation derived from the visual information of the road. <a href="#f6">Figure 6</a> corresponds to the code used to derive two values named <i>'max' </i>and <i>'min' </i>with respect to the center of the vehicle, from the right and left lines obtained from the input image of the road.</font></p>     <p align="center"><font face="verdana" size="2"><a name="f6"></a></font></p>     <p align="center"><font face="verdana" size="2"><img src="/img/revistas/iit/v12n2/a2f6.jpg"></font></p>     <p align="justify"><font face="verdana" size="2">The obtained values max, min, and center are entered into the next stage, which applies the fuzzy logic rules to obtain the output value used to control the steering wheel angle and, in consequence, the displacement of the vehicle. The output value is converted to the 4-bit digital word required by the remote control through a table containing the corresponding codes, as shown in <a href="/img/revistas/iit/v12n2/a2f7.jpg" target="_blank">figure 7</a>. Once all the operations are completed, the process starts again in a new state depending on the position of the vehicle. 
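The conversion of the defuzzified output into the 4-bit word can be sketched as a simple quantization. The actual code table of figure 7 is not reproduced here, so a uniform mapping of the &plusmn;45&deg; steering range onto the sixteen codes is assumed for illustration.

```python
# Sketch of quantizing the defuzzified steering angle into the 4-bit word sent
# to the remote control. A uniform +/-45 deg -> 0..15 mapping is assumed; the
# actual code table (figure 7) may differ.

def angle_to_4bit(angle_deg, full_scale=45.0):
    """Clamp to +/-full_scale and map linearly onto the 16 codes 0..15."""
    clamped = max(-full_scale, min(full_scale, angle_deg))
    code = round((clamped + full_scale) / (2 * full_scale) * 15)
    return int(code)
```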
When the <i>'stop' </i>button in the main display is pressed, the program goes to the state <i>'init', </i>where it waits until the user decides to resume the car movement or to finish the process.</font></p>     <p align="justify"><font face="verdana" size="2">&nbsp;</font></p>     <p align="justify"><font face="verdana" size="2"><b>Results</b></font></p>     ]]></body>
<body><![CDATA[<p align="justify"><font face="verdana" size="2"><a href="/img/revistas/iit/v12n2/a2f8.jpg" target="_blank">Figure 8</a> shows the front panel of the program in two cases. The front panel consists of the start and stop buttons, the current machine state, visual indicators for the speed and the steering wheel angle, and a window showing the scene visualized from the car in real time. When the <i>'start' </i>button is pressed, it turns green, the machine state changes to <i>'straight', </i>and the vehicle starts moving. The first case shown in <a href="/img/revistas/iit/v12n2/a2f8.jpg" target="_blank">figure 8</a> corresponds to the initial state, once the system is turned on. The second one shows the case in which the vehicle detects a curve to the right and the steering wheel is conditioned to make the turn.</font></p>     <p align="justify"><font face="verdana" size="2">A simple experiment aimed at testing the response of the vehicle was implemented. The car was placed on a 4-meter linear track, with an initial position 70 pixels off center. The car was expected to correct its position until the center was reached. The experiment was carried out several times using partitions of the variables into three and five fuzzy sets. After averaging the trajectories, the curves shown in <a href="/img/revistas/iit/v12n2/a2f9.jpg" target="_blank">figure 9</a> were obtained. It can be seen that the vehicle stabilizes after some oscillations in approximately 1.5 meters. Additional experiments and results can be found in the complete project report (Lopez, 2007).</font></p>     <p align="justify"><font face="verdana" size="2">&nbsp;</font></p>     <p align="justify"><font face="verdana" size="2"><b>Conclusions</b></font></p>     <p align="justify"><font face="verdana" size="2">An autonomous vehicle navigation system based on fuzzy logic control techniques with robot vision capabilities has been presented. 
This experiment was designed for academic purposes on a LabVIEW platform, taking advantage of the fuzzy logic control toolbox and the IMAQ-VISION acquisition library. These resources proved to be an excellent toolset for the design, implementation, and testing of a control system such as the one described in this paper, in a short time and with very good flexibility and performance. In particular, the control paradigm based on machine states represents an interesting approach for automatic vehicle navigation, as well as a didactic case study. This prototype is ready to support further experimentation in different tasks, including different control heuristics using artificial intelligence techniques in different environments. It is also worth pointing out that a rigid camera like the one used in this project is a considerable limitation. A more sophisticated camera with options like pan, tilt, or zoom would be a very good improvement, making it possible to anticipate trajectories and plan a movement strategy in advance, at a reasonable increase in the cost of the prototype.</font></p>     <p align="justify"><font face="verdana" size="2">&nbsp;</font></p>     <p align="justify"><font face="verdana" size="2"><b>Acknowledgments</b></font></p>     <p align="justify"><font face="verdana" size="2">The authors would like to thank the anonymous reviewers for their detailed and helpful comments.</font></p>     <p align="justify"><font face="verdana" size="2">&nbsp;</font></p>     <p align="justify"><font face="verdana" size="2"><b>References</b></font></p>     ]]></body>
<body><![CDATA[<!-- ref --><p align="justify"><font face="verdana" size="2">Armingol J.M., De la Escalera A., Hilario C., Collado J.M., Carrasco J.P., Flores M.J., Pastor J.M., Rodr&iacute;guez J. IVVI: Intelligent Vehicle Based on Visual Information. <i>Robotics and Autonomous Systems, </i>55(12):904&#150;916. December 2007. ISSN: 0921&#150;8890.<!-- end-ref --></font></p>
<!-- ref --><p align="justify"><font face="verdana" size="2">Bertozzi M., Broggi A., Fascioli A. Vision&#150;Based Intelligent Vehicle: State of the Art and Perspectives. <i>Robotics and Autonomous Systems, </i>32(1):1&#150;16. January 2000. ISSN: 0921&#150;8890.<!-- end-ref --></font></p>
<!-- ref --><p align="justify"><font face="verdana" size="2">De la Escalera A., Mata M. Traffic Sign Recognition and Analysis for Intelligent Vehicle. <i>Image and Vision Computing, </i>21(3):247&#150;258. July 2003. ISSN: 02<!-- end-ref --></font></p>
<!-- ref --><p align="justify"><font face="verdana" size="2">Flan N.N., Moore K.L. A Small Mobile Robot for Security and Inspection. <i>Control Engineering Practice, </i>10(11):1265&#150;1270. 2004. ISSN: 0967&#150;0661.<!-- end-ref --></font></p>
<!-- ref --><p align="justify"><font face="verdana" size="2">Hagras H.A. A Hierarchical Type&#150;2 Fuzzy Logic Control Architecture for Autonomous Mobile Robots. <i>IEEE Transactions on Fuzzy Systems, </i>12(4):524&#150;539. August 2004. ISSN: 1063&#150;6706.<!-- end-ref --></font></p>
<!-- ref --><p align="justify"><font face="verdana" size="2">Isozaki Y., Nakai K. Development of a Work Robot with a Manipulator and a Transport Robot for Nuclear Facility Emergency Preparedness. <i>Advanced Robotics, </i>16(6):489&#150;492. 2002. ISSN: 0169&#150;1864.<!-- end-ref --></font></p>
<!-- ref --><p align="justify"><font face="verdana" size="2">Jantzen J. <i>Foundations of Fuzzy Control. </i>West Sussex, England. John Wiley &amp; Sons, 2007. Pp. 13&#150;69.<!-- end-ref --></font></p>
<!-- ref --><p align="justify"><font face="verdana" size="2">Kovacic Z., Bogdan S. <i>Fuzzy Controller Design: Theory and Applications, Control Engineering Series. </i>Boca Raton, Florida, U.S. CRC Press, Taylor and Francis Group. 2005. Pp. 9&#150;40.<!-- end-ref --></font></p>
<!-- ref --><p align="justify"><font face="verdana" size="2">Lopez L.F. Navegaci&oacute;n de un veh&iacute;culo guiado por tratamiento y an&aacute;lisis de im&aacute;genes con control difuso. Tesis (Maestr&iacute;a en ciencias en electr&oacute;nica). M&eacute;xico. Universidad de las Am&eacute;ricas, Puebla. 2007. 132 p.<!-- end-ref --></font></p>
<!-- ref --><p align="justify"><font face="verdana" size="2">Micheloni C., Foresti G.L., Piciarelli C., Cinque L. An Autonomous Vehicle for Video Surveillance of Indoor Environments. <i>IEEE Transactions on Vehicular Technology, </i>56(2):487&#150;498. 2007. ISSN: 0018&#150;9545.<!-- end-ref --></font></p>
<!-- ref --><p align="justify"><font face="verdana" size="2">Nagasaka Y., Umeda N., Kanetia Y., Taniwaki K., Sasaki Y. Autonomous Guidance for Rice Transplanting Using Global Positioning and Gyroscopes. <i>Computers and Electronics in Agriculture, </i>43(3):223&#150;234. 2004. ISSN: 0168&#150;1699.<!-- end-ref --></font></p>
<!-- ref --><p align="justify"><font face="verdana" size="2">NI Fuzzy Logic Control Toolbox. Application Notes, National Instruments, 2002.<!-- end-ref --></font></p>
<!-- ref --><p align="justify"><font face="verdana" size="2">NI Vision for LabVIEW User Manual, National Instruments, 2005.<!-- end-ref --></font></p>
<!-- ref --><p align="justify"><font face="verdana" size="2">Ruano A.E. <i>Intelligent Control Systems using Computational Intelligence Techniques. </i>London, United Kingdom. The Institution of Engineering and Technology. 2005. Pp. 3&#150;34.<!-- end-ref --></font></p>
<!-- ref --><p align="justify"><font face="verdana" size="2">Srini V.P. A Vision for Supporting Autonomous Navigation in Urban Environments. <i>Computer, </i>39(12):68&#150;77. 2006. ISSN: 0018&#150;9162.<!-- end-ref --></font></p>
<!-- ref --><p align="justify"><font face="verdana" size="2">Tunstel E., Anderson G.T., Wilson E.W. Autonomous Mobile Surveying for Science Rovers Using in Situ Distributed Remote Sensing. In: IEEE International Conference on Systems, Man and Cybernetics. Montreal, Canada. October 2007, pp. 2348&#150;2353.<!-- end-ref --></font></p>
<!-- ref --><p align="justify"><font face="verdana" size="2">Zhang H., Liu D. <i>Fuzzy Modeling and Fuzzy Control. </i>Boston. Birkhauser. 2006. Pp. 33&#150;75.<!-- end-ref --></font></p>
<!-- ref --><p align="justify"><font face="verdana" size="2">Zhao B., Zhu Z., Mao E.R., Song Z.H. Vision System Calibration of Agricultural Wheeled&#150;Mobile Robot Based on BP Neural Network. In: International Conference on Machine Learning and Cybernetics. Hong Kong, China. August 2007, pp. 340&#150;344.<!-- end-ref --></font></p>
<p align="justify"><font face="verdana" size="2"><b>About the authors</b></font></p>
<p align="justify"><font face="verdana" size="2"><i>Juan Manuel Ram&iacute;rez&#150;Cort&eacute;s. </i>He was born in Puebla, Mexico. He received the B.Sc. degree from the National Polytechnic Institute, Mexico, the M.Sc. degree from the National Institute of Astrophysics, Optics, and Electronics (INAOE), Mexico, and the Ph.D. degree from Texas Tech University, all in electrical engineering. He is currently a Titular Researcher at the Electronics Department, INAOE, Mexico. He is a member of the National Research System, level 1. His research interests include signal and image processing, computer vision, neural networks, fuzzy control, and digital systems.</font></p>
<p align="justify"><font face="verdana" size="2"><i>Pilar G&oacute;mez&#150;Gil. </i>She was born in Puebla, Mexico. She received the B.Sc. degree from <i>Universidad de las Americas A.C., </i>Mexico, and the M.Sc. and Ph.D. degrees from Texas Tech University, USA, all in computer science. She is currently an Associate Researcher in computer science at INAOE, Mexico. She is a member of the National Research System, level 1. Her research interests include neural networks, image processing, fuzzy logic, pattern recognition, and software engineering. She is a senior member of the IEEE and a member of the ACM.</font></p>
<p align="justify"><font face="verdana" size="2"><i>Jorge Mart&iacute;nez&#150;Carballido. </i>He received the B.Sc. degree in electrical engineering from <i>Universidad de las Americas, </i>Mexico, and the M.Sc. and Ph.D. degrees in electrical engineering, both from Oregon State University. He is currently a Titular Researcher at the National Institute of Astrophysics, Optics, and Electronics (INAOE), Mexico. His research interests include digital systems, reconfigurable hardware, signal and image processing, and instrumentation.</font></p>
<p align="justify"><font face="verdana" size="2"><i>Filiberto L&oacute;pez&#150;Larios. </i>He received the B.Sc. degree in electronics and communications engineering from <i>Universidad La Salle Baj&iacute;o, </i>and the M.Sc. degree in electrical engineering from <i>Universidad de las Am&eacute;ricas Puebla. </i>His research interests include digital signal processing, fuzzy control systems, and the design of applications based on virtual instrumentation.</font></p>]]></body><back>
<ref-list>
<ref id="B1">
<nlm-citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Armingol]]></surname>
<given-names><![CDATA[J.M.]]></given-names>
</name>
<name>
<surname><![CDATA[De la Escalera]]></surname>
<given-names><![CDATA[A]]></given-names>
</name>
<name>
<surname><![CDATA[Hilario]]></surname>
<given-names><![CDATA[C]]></given-names>
</name>
<name>
<surname><![CDATA[Collado]]></surname>
<given-names><![CDATA[J.M.]]></given-names>
</name>
<name>
<surname><![CDATA[Carrasco]]></surname>
<given-names><![CDATA[J.P.]]></given-names>
</name>
<name>
<surname><![CDATA[Flores]]></surname>
<given-names><![CDATA[M.J.]]></given-names>
</name>
<name>
<surname><![CDATA[Pastor]]></surname>
<given-names><![CDATA[J.M.]]></given-names>
</name>
<name>
<surname><![CDATA[Rodríguez]]></surname>
<given-names><![CDATA[J]]></given-names>
</name>
</person-group>
<article-title xml:lang="en"><![CDATA[IVVI: Intelligent Vehicle Based on Visual Information]]></article-title>
<source><![CDATA[Robotics and Autonomous Systems]]></source>
<year>2007</year>
<volume>55</volume>
<numero>12</numero>
<issue>12</issue>
<page-range>904-916</page-range></nlm-citation>
</ref>
<ref id="B2">
<nlm-citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Bertozzi]]></surname>
<given-names><![CDATA[M]]></given-names>
</name>
<name>
<surname><![CDATA[Broggi]]></surname>
<given-names><![CDATA[A]]></given-names>
</name>
<name>
<surname><![CDATA[Fascioli]]></surname>
<given-names><![CDATA[A]]></given-names>
</name>
</person-group>
<article-title xml:lang="en"><![CDATA[Vision-Based Intelligent Vehicle: State of the Art and Perspectives]]></article-title>
<source><![CDATA[Robotics and Autonomous Systems]]></source>
<year>2000</year>
<volume>32</volume>
<numero>1</numero>
<issue>1</issue>
<page-range>1-16</page-range></nlm-citation>
</ref>
<ref id="B3">
<nlm-citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname><![CDATA[De la Escalera]]></surname>
<given-names><![CDATA[A]]></given-names>
</name>
<name>
<surname><![CDATA[Mata]]></surname>
<given-names><![CDATA[M]]></given-names>
</name>
</person-group>
<article-title xml:lang="en"><![CDATA[Traffic Sign Recognition and Analysis for Intelligent Vehicle]]></article-title>
<source><![CDATA[Image and Vision Computing]]></source>
<year>2003</year>
<volume>21</volume>
<numero>3</numero>
<issue>3</issue>
<page-range>247-258</page-range></nlm-citation>
</ref>
<ref id="B4">
<nlm-citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Flan]]></surname>
<given-names><![CDATA[N.N.]]></given-names>
</name>
<name>
<surname><![CDATA[Moore]]></surname>
<given-names><![CDATA[K.L.]]></given-names>
</name>
</person-group>
<article-title xml:lang="en"><![CDATA[A Small Mobile Robot for Security and Inspection]]></article-title>
<source><![CDATA[Control Engineering Practice]]></source>
<year>2004</year>
<volume>10</volume>
<numero>11</numero>
<issue>11</issue>
<page-range>1265-1270</page-range></nlm-citation>
</ref>
<ref id="B5">
<nlm-citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Hagras]]></surname>
<given-names><![CDATA[H.A.]]></given-names>
</name>
</person-group>
<article-title xml:lang="en"><![CDATA[A Hierarchical Type-2 Fuzzy Logic Control Architecture for Autonomous Mobile Robots]]></article-title>
<source><![CDATA[IEEE Transactions on Fuzzy Systems]]></source>
<year>2004</year>
<volume>12</volume>
<numero>4</numero>
<issue>4</issue>
<page-range>524-539</page-range></nlm-citation>
</ref>
<ref id="B6">
<nlm-citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Isozaki]]></surname>
<given-names><![CDATA[Y]]></given-names>
</name>
<name>
<surname><![CDATA[Nakai]]></surname>
<given-names><![CDATA[K]]></given-names>
</name>
</person-group>
<article-title xml:lang="en"><![CDATA[Development of a Work Robot with a Manipulator and a Transport Robot for Nuclear Facility Emergency Preparedness]]></article-title>
<source><![CDATA[Advanced Robotics]]></source>
<year>2002</year>
<volume>16</volume>
<numero>6</numero>
<issue>6</issue>
<page-range>489-492</page-range></nlm-citation>
</ref>
<ref id="B7">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Jantzen]]></surname>
<given-names><![CDATA[J]]></given-names>
</name>
</person-group>
<source><![CDATA[Foundations of Fuzzy Control]]></source>
<year>2007</year>
<page-range>13-69</page-range><publisher-loc><![CDATA[West Sussex ]]></publisher-loc>
<publisher-name><![CDATA[John Wiley & Sons]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B8">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Kovacic]]></surname>
<given-names><![CDATA[Z]]></given-names>
</name>
<name>
<surname><![CDATA[Bogdan]]></surname>
<given-names><![CDATA[S]]></given-names>
</name>
</person-group>
<source><![CDATA[Fuzzy Controller Design: Theory and Applications, Control Engineering Series]]></source>
<year>2005</year>
<page-range>9-40</page-range><publisher-loc><![CDATA[Boca Raton, Florida]]></publisher-loc>
<publisher-name><![CDATA[CRC Press]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B9">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Lopez]]></surname>
<given-names><![CDATA[L.F.]]></given-names>
</name>
</person-group>
<source><![CDATA[Navegación de un vehículo guiado por tratamiento y análisis de imágenes con control difuso]]></source>
<year>2007</year>
<page-range>132</page-range></nlm-citation>
</ref>
<ref id="B10">
<nlm-citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Micheloni]]></surname>
<given-names><![CDATA[C]]></given-names>
</name>
<name>
<surname><![CDATA[Foresti]]></surname>
<given-names><![CDATA[G.L.]]></given-names>
</name>
<name>
<surname><![CDATA[Piciarelli]]></surname>
<given-names><![CDATA[C]]></given-names>
</name>
<name>
<surname><![CDATA[Cinque]]></surname>
<given-names><![CDATA[L]]></given-names>
</name>
</person-group>
<article-title xml:lang="en"><![CDATA[An Autonomous Vehicle for Video Surveillance of Indoor Environments]]></article-title>
<source><![CDATA[IEEE Transactions on Vehicular Technology]]></source>
<year>2007</year>
<volume>56</volume>
<numero>2</numero>
<issue>2</issue>
<page-range>487-498</page-range></nlm-citation>
</ref>
<ref id="B11">
<nlm-citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Nagasaka]]></surname>
<given-names><![CDATA[Y]]></given-names>
</name>
<name>
<surname><![CDATA[Umeda]]></surname>
<given-names><![CDATA[N]]></given-names>
</name>
<name>
<surname><![CDATA[Kanetia]]></surname>
<given-names><![CDATA[Y]]></given-names>
</name>
<name>
<surname><![CDATA[Taniwaki]]></surname>
<given-names><![CDATA[K]]></given-names>
</name>
<name>
<surname><![CDATA[Sasaki]]></surname>
<given-names><![CDATA[Y]]></given-names>
</name>
</person-group>
<article-title xml:lang="en"><![CDATA[Autonomous Guidance for Rice Transplanting Using Global Positioning and Gyroscopes]]></article-title>
<source><![CDATA[Computers and Electronics in Agriculture]]></source>
<year>2004</year>
<volume>43</volume>
<numero>3</numero>
<issue>3</issue>
<page-range>223-234</page-range></nlm-citation>
</ref>
<ref id="B12">
<nlm-citation citation-type="book">
<source><![CDATA[NI Fuzzy Logic Control Toolbox. Application Notes]]></source>
<year>2002</year>
<publisher-name><![CDATA[National Instruments]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B13">
<nlm-citation citation-type="book">
<source><![CDATA[NI Vision for LabVIEW User Manual]]></source>
<year>2005</year>
<publisher-name><![CDATA[National Instruments]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B14">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Ruano]]></surname>
<given-names><![CDATA[A.E.]]></given-names>
</name>
</person-group>
<source><![CDATA[Intelligent Control Systems using Computational Intelligence Techniques]]></source>
<year>2005</year>
<page-range>3-34</page-range><publisher-loc><![CDATA[London ]]></publisher-loc>
<publisher-name><![CDATA[The Institution of Engineering and Technology]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B15">
<nlm-citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Srini]]></surname>
<given-names><![CDATA[V.P.]]></given-names>
</name>
</person-group>
<article-title xml:lang="en"><![CDATA[A Vision for Supporting Autonomous Navigation in Urban Environments]]></article-title>
<source><![CDATA[Computer]]></source>
<year>2006</year>
<volume>39</volume>
<numero>12</numero>
<issue>12</issue>
<page-range>68-77</page-range></nlm-citation>
</ref>
<ref id="B16">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Tunstel]]></surname>
<given-names><![CDATA[E]]></given-names>
</name>
<name>
<surname><![CDATA[Anderson]]></surname>
<given-names><![CDATA[G.T.]]></given-names>
</name>
<name>
<surname><![CDATA[Wilson]]></surname>
<given-names><![CDATA[E.W.]]></given-names>
</name>
</person-group>
<article-title xml:lang="en"><![CDATA[Autonomous Mobile Surveying for Science Rovers Using in Situ Distributed Remote Sensing]]></article-title>
<source><![CDATA[IEEE International Conference on Systems, Man and Cybernetics]]></source>
<year>2007</year>
<page-range>2348-2353</page-range><publisher-loc><![CDATA[Montreal ]]></publisher-loc>
</nlm-citation>
</ref>
<ref id="B17">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Zhang]]></surname>
<given-names><![CDATA[H]]></given-names>
</name>
<name>
<surname><![CDATA[Liu]]></surname>
<given-names><![CDATA[D]]></given-names>
</name>
</person-group>
<source><![CDATA[Fuzzy Modeling and Fuzzy Control]]></source>
<year>2006</year>
<page-range>33-75</page-range><publisher-loc><![CDATA[Boston ]]></publisher-loc>
<publisher-name><![CDATA[Birkhauser]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B18">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Zhao]]></surname>
<given-names><![CDATA[B]]></given-names>
</name>
<name>
<surname><![CDATA[Zhu]]></surname>
<given-names><![CDATA[Z]]></given-names>
</name>
<name>
<surname><![CDATA[Mao]]></surname>
<given-names><![CDATA[E.R.]]></given-names>
</name>
<name>
<surname><![CDATA[Song]]></surname>
<given-names><![CDATA[Z.H.]]></given-names>
</name>
</person-group>
<article-title xml:lang="en"><![CDATA[Vision System Calibration of Agricultural Wheeled-Mobile Robot Based on BP Neural Network]]></article-title>
<source><![CDATA[International Conference on Machine Learning and Cybernetics]]></source>
<year>2007</year>
<page-range>340-344</page-range><publisher-loc><![CDATA[Hong Kong ]]></publisher-loc>
</nlm-citation>
</ref>
</ref-list>
</back>
</article>
