Latin American Applied Research

Print ISSN 0327-0793 · Online ISSN 1851-8796

Lat. Am. Appl. Res. vol. 43, no. 3, Bahía Blanca, July 2013

 

Homography-based pose estimation to guide a miniature helicopter during 3d-trajectory tracking

A.S. Brandão, J.A. Sarapura, M. Sarcinelli-Filho and R. Carelli

Graduate Program on Electrical Engineering, Federal University of Espírito Santo, Vitória -ES, Brazil
alexandre.brandao@ufv.br, mario.sarcinelli@ele.ufes.br

Institute of Automatics, National University of San Juan, San Juan, Argentina
jsarapura@inaut.unsj.edu.ar, rcarelli@inaut.unsj.edu.ar

Abstract — This work proposes a pose-based visual servoing control scheme that uses planar homography to estimate the position and orientation of a miniature helicopter relative to a known pattern. Given the current flight information, the nonlinear underactuated controller presented in one of our previous works, which covers all flight phases, is used to guide the rotorcraft during a 3D-trajectory tracking task. The simulation framework and the results obtained with it are then presented and discussed, validating the proposed controller when a visual system provides the helicopter pose information.

Keywords — Aerial Vehicles; Planar Homography; Underactuated Machines; Nonlinear Control.

I. INTRODUCTION

In the last decades the research effort related to Unmanned Aerial Vehicles (UAVs) has grown substantially, aiming at both military and civil applications, such as inspection of large areas in public safety applications, natural risk management, intervention in hostile environments, infrastructure maintenance and precision agriculture (Kendoul et al., 2010). In such cases, the use of a UAV is extremely advantageous, compared to the use of one or even several Unmanned Ground Vehicles (UGVs), due to its 3D mobility.

Consider, for instance, a miniature helicopter. Like its full-scale counterpart, this UAV is one of the most complex flying machines, due to its high maneuverability: it can take off and land vertically, hover, rotate, move sideways or forward while keeping its height, change the direction of its movement very quickly, and come to a complete stop (Castillo et al., 2005). These capabilities make it quite useful for many of the aforementioned tasks. Moreover, the complexity of the dynamic model of such a rotorcraft makes nonlinear flight controllers a good option, taking into account that such vehicles are inherently unstable, nonlinear, multi-variable and underactuated systems, with complex and highly coupled dynamics.

Indeed, a meaningful research effort has been devoted to designing flight controllers to guide miniature helicopters autonomously. Traditionally, such controllers involve inner and outer loops, which are responsible, respectively, for stabilizing the vehicle dynamics and for controlling its navigation based on its kinematic model (Antunes et al., 2010). However, guaranteeing the stability and performance of the two control systems when working independently is not enough to guarantee the stability and performance of the whole control system, due to the highly coupled dynamics. Another research line considers an integrated solution for the dynamic and kinematic systems, commonly based on nonlinear control techniques.

Dzul et al. (2003), for instance, propose a robust controller based on classical and adaptive pole placement techniques to control the yaw angle and the height of a mini-helicopter, whose dynamic model was obtained using the Euler-Lagrange formulation. Palomino et al. (2003) propose a system to control the pose of a PVTOL (Planar Vertical Takeoff and Landing) aircraft, based on the linearization of a simplified dynamic model (the simplification being associated with the planar movement); the stability of the system is also analyzed, using Lyapunov theory for linear systems. Kahn and Foch (2003) and Buskey et al. (2003) implement the pose control of a mini-helicopter using an adaptive neural controller and a group of nested PID controllers, respectively. Marconi and Naldi (2006) design a robust controller for reference tracking, considering longitudinal, lateral, vertical and yaw movements, as well as the parametric uncertainties associated with the aircraft model. Budiyono and Wibowo (2007) discuss trajectory tracking control based on optimal control techniques for a UAV, using a linearized complete dynamic model of the aircraft. Martini et al. (2008) apply robust control with a state observer to a nonlinear Lagrangian model of a helicopter to control it under vertical wind gusts. Antunes et al. (2010) propose a trajectory tracking controller based on gain scheduling, applied to the linearized dynamic model of a UAV, considering the multiple sampling rates of the sensors used during navigation. Kendoul et al. (2010) design a nonlinear model-based controller for a quad-rotor, using an inner-outer loop control scheme. Finally, a multi-variable PD controller is proposed in Lara et al. (2010) to stabilize the attitude of a quad-rotor, considering the data-transmission delay in the stability analysis.

It is known that one of the major problems in UAV navigation is the difficulty of determining the vehicle pose and linear velocities, i.e., its non-inertial variables. An approach to overcome such difficulty is to use vision-based sensors, due to their ever-growing capability to capture information, which makes them suitable for estimating such variables (Soria et al., 2008). According to Gonçalves et al. (2010), pose estimation and automatic landing based on vision systems are not new in aircraft control. In this case, concepts related to visual servo control should be considered, i.e., the problem becomes using an artificial vision system to provide sensorial information to control the motion of a robot (an autonomous aircraft, in this case). Such a problem can be classified as position-based visual servoing (PBVS) or image-based visual servoing (IBVS) (Hutchinson et al., 1996). PBVS involves reconstructing the pose of a target from the vision data to control the robot; therefore, an accurate geometric model and a careful calibration process are required. In turn, IBVS uses the vision data directly in the control loop, aiming at controlling the dynamics of the features in the image plane. Many recent works have proposed different solutions to these control problems. In Hamel and Mahony (2007), for instance, planar homography (a non-singular linear relationship between points on planes) is applied to estimate the position of targets used to control, in an IBVS manner, an underactuated quad-rotor capable of vertical takeoff and landing and quasi-stationary hover. In the same line, Gonçalves et al. (2010) propose an IBVS controller to guide an airplane in automatic approaching and landing tasks. A vision-based flight scheme is proposed in Kendoul et al. (2010) to execute landing tasks over a ground target, after tracking it autonomously. A self-positioning system based on SLAM concepts is developed in Caballero et al. (2009), using homography-based techniques and natural landmarks to define the pose of a helicopter in outdoor applications.

According to Kendoul et al. (2010), a UAV equipped with an onboard camera can be regarded as an eye in the sky. Therefore, it can be used as an extra sensor for a ground formation or a land station, for example, providing them with 2½D or 3D visual information, depending on whether only the UAV sees downwards or the UAV both sees and is seen by the land station, respectively. As an example in such a context, in Chaimowicz and Kumar (2004) a UAV formation composed of blimps takes aerial images and uses them to monitor and to command a heterogeneous ground formation during urban surveillance tasks, such as searching for targets and mapping the environment. In Michael et al. (2007), in turn, a decentralized control scheme is implemented to guide ground robots while they are followed by an aerial vehicle (a 6-DOF cable-controlled robot), which delivers images used to define the pose and shape of the formation, as well as to aid it during obstacle avoidance. In Brandao et al. (2010a), a planar homography-based system is proposed to estimate the pose of the helicopter and the shape and position of a UGV formation, during approaching and tracking tasks.

In such a context, the contribution of this work is the proposal of a homography-based pose estimation subsystem for a miniature helicopter under 3D-trajectory tracking control. Applying IBVS techniques, a set of image features associated with a landmark of known size and shape is extracted and used to estimate the rotorcraft pose. The nonlinear controller adopted to guide the helicopter, as well as the nonlinear underactuated dynamic model from which it is derived, has already been presented in our previous work (Brandao et al., 2010b).

To discuss such topics, the paper is hereinafter organized as follows. Section II presents the high-level flight controller used to guide the UAV during trajectory tracking, along with the reference profile necessary to guide the aircraft maneuvers. In the sequel, Section III describes the artificial vision system used onboard the aircraft and proposes the visual servoing system used to determine the pose of the helicopter relative to the landmark (whose pose and dimensions are known). Following, Section IV describes the simulation framework used to validate the whole control system, and presents and discusses the results obtained for a simulated flight corresponding to a 3D-trajectory tracking task. Finally, Section V highlights the main conclusions of the work.

II. THE HIGH-LEVEL NONLINEAR UNDERACTUATED CONTROLLER

The complete model of a miniature helicopter embeds four subsystems: the actuator dynamics, the rotary-wing dynamics, the force and moment generation, and the rigid-body dynamics (John and Sastry, 1999; Ahmed et al., 2010), as shown in Fig. 1. The first three subsystems are associated with the low-level controller of the rotorcraft. If a small vertical take-off and landing (VTOL) vehicle is considered (a vehicle weighing less than 20 kg), then the relationship between the actuator dynamics and the generation of aerodynamic forces and torques can be approximated by a linear function (Kondak et al., 2007). Taking this into account, the controller proposed by Brandao et al. (2010b) focuses on the high-level control (the last block), determining the force and torque inputs (called abstract system inputs) necessary to control the miniature helicopter, i.e.,

M(q)q̈ + C(q, q̇)q̇ + G(q) + D = F,    (1)

or simply Mq̈ + Cq̇ + G = F. In such a case, wind gusts, ground effect, air resistance and so on, represented by the vector D of disturbances and friction forces acting on the rotorcraft, are disregarded.


Figure 1: The full model of the helicopter dynamics.

In the sequel, in order to propose a partial feedback linearization controller (Spong, 1994), it is necessary to represent (1) in the underactuated format

M_aa q̈_a + M_ap q̈_p + C_a(q, q̇)q̇ + G_a(q) = f_a
M_pa q̈_a + M_pp q̈_p + C_p(q, q̇)q̇ + G_p(q) = 0,    (2)

for which a suitable control signal to guide the rotorcraft is the partial feedback linearization law

f_a = M̄_aa ν + C̄_a q̇ + Ḡ_a,  with  M̄_aa = M_aa − M_ap M_pp⁻¹ M_pa,

where ν = q̈_ad + K_d(q̇_ad − q̇_a) + K_p(q_ad − q_a), q_ad − q_a being the tracking error and K_p, K_d diagonal positive gain matrices. It is important to mention that q_a and q_p are the adopted actuated/active and unactuated/passive variables, respectively.
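As an illustration of the outer-loop part of this law, the sketch below applies ν = q̈_d + K_d(q̇_d − q̇) + K_p(q_d − q) to a toy, fully actuated double integrator (the actual controller of Brandao et al. (2010b) handles the coupled underactuated dynamics); the gains and the sinusoidal reference are hypothetical choices for this example only:

```python
import numpy as np

# Toy illustration of the tracking law nu = qdd_d + Kd*(qd_d - qd) + Kp*(q_d - q)
# applied to a double integrator qdd = u. With this law the error dynamics
# become e_dd + Kd*e_d + Kp*e = 0, so the tracking error vanishes.

def simulate(T=20.0, dt=0.001, kp=4.0, kd=4.0):
    q, qd = 0.0, 0.0                   # initial position and velocity
    t = 0.0
    while t < T:
        q_ref = np.sin(t)              # desired position (hypothetical profile)
        qd_ref = np.cos(t)             # desired velocity
        qdd_ref = -np.sin(t)           # desired acceleration (feedforward)
        e, ed = q_ref - q, qd_ref - qd  # tracking errors
        u = qdd_ref + kd * ed + kp * e  # linearizing + PD law
        qd += u * dt                   # semi-implicit Euler integration
        q += qd * dt
        t += dt
    return abs(np.sin(T) - q)          # final tracking error

err = simulate()
print(err)
```

With critically damped gains (K_d² = 4K_p) the error decays without oscillation, which is the behavior reported later for the simulated flights.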

As discussed in Brandao et al. (2010b), while executing trajectory tracking, path following or even positioning, the aircraft needs a flight reference profile in the 3D space. In other words, to reach a position (x, y, z), the helicopter should change its orientation during navigation, so that an attitude reference profile (φ_d, θ_d, ψ_d) should be defined. In this case, the roll and pitch reference angles can be obtained through the underactuated system constraints, considering that the longitudinal and lateral displacements are decoupled. Thus, one has

θ_d = tan⁻¹(ν_x/(ν_z + g))    (3)
φ_d = tan⁻¹(−ν_y cos θ_d/(ν_z + g))    (4)

knowing that ν = [ν_x ν_y ν_z]ᵀ = ẍ_d + K_d(ẋ_d − ẋ) + K_p(x_d − x) is the desired acceleration in the 3D workspace, and assuming that K_p and K_d are diagonal gain matrices, with g denoting the gravitational acceleration.

Finally, the yaw reference angle is commonly obtained from the velocity of the reference trajectory in the xy plane, that is,

ψ_d = tan⁻¹(ẏ_d/ẋ_d).    (5)

Thus, given the reference signals in the 3D workspace, the helicopter is capable of following a trajectory, tracking a path or even reaching a pose, using the controller proposed in Brandao et al. (2010b).

III. THE VISION SYSTEM USED TO ESTIMATE THE POSE OF THE MINIATURE HELICOPTER

The estimated pose (position and orientation) is important for autonomous helicopters in applications such as surveillance, autonomous navigation and cooperative tasks with other UAVs or ground robots. The goal is to use visual information to determine the pose of the helicopter instead of using heavy and complex navigation systems or GPS.

This section presents a strategy for estimating the pose of the helicopter using visual information (from a known reference pattern) and planar homography. Through a planar homography, the images of points on a plane in one view are related to the corresponding image points in another view, using a homogeneous representation. In contrast to other pose estimation methods (Altug and Taylor, 2004; Amidi et al., 1999; Nordberg et al., 2002), this proposal uses only a camera onboard the helicopter. The camera is responsible for capturing images of a flat pattern or set of known reference marks (the adopted pattern, of known geometric size, is shown in Fig. 2). To determine the image features, which are the centroids of the squares at the extremities of the landmark, the color segmentation algorithm described in Roberti et al. (2009) is used.
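For illustration, a minimal color-segmentation sketch in the spirit of this centroid-extraction step (not the actual algorithm of Roberti et al. (2009)) could look as follows; the synthetic image, target color and tolerance are hypothetical:

```python
import numpy as np

def centroid_of_color(img_rgb, target, tol=30):
    """Return the (u, v) centroid of pixels within `tol` of `target` color.
    A generic color-thresholding sketch, not the algorithm of Roberti et al."""
    diff = np.abs(img_rgb.astype(int) - np.array(target, dtype=int))
    mask = np.all(diff <= tol, axis=-1)   # pixels close to the target color
    vs, us = np.nonzero(mask)             # row (v) and column (u) indices
    if us.size == 0:
        return None
    return float(us.mean()), float(vs.mean())

# Synthetic 240x320 image with a red 10x10 square (hypothetical landmark blob)
img = np.zeros((240, 320, 3), dtype=np.uint8)
img[55:65, 45:55] = (255, 0, 0)
print(centroid_of_color(img, (255, 0, 0)))   # prints (49.5, 59.5)
```

In the real system one such centroid is extracted per colored square, yielding the four feature points used below.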


Figure 2: The landmark H with its dimensions.

The 2D projective transformation between the points of the pattern and the points in the image plane is commonly called the homography induced by the plane (the pattern plane), and is used to retrieve motion information with respect to the camera onboard the helicopter. As the camera is fixed to the vehicle, the pose of the UAV can be obtained through just a constant Euclidean transformation. According to Ma et al. (2003), the minimum number of points needed to determine the homography is four. However, more than four points can be used to reduce the effect of noise on the feature detection.
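A minimal sketch of how such a homography can be estimated from four (or more) point correspondences, using the standard Direct Linear Transform (DLT); the numerical homography used to generate the correspondences here is hypothetical:

```python
import numpy as np

def homography_dlt(pts_src, pts_dst):
    """Estimate H (up to scale) such that pts_dst ~ H @ pts_src,
    from n >= 4 point correspondences, via the standard DLT."""
    rows = []
    for (x, y), (u, v) in zip(pts_src, pts_dst):
        # Each correspondence contributes two linear equations in the
        # entries of H (from u = h1.p / h3.p and v = h2.p / h3.p).
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.asarray(rows, dtype=float)
    _, _, Vt = np.linalg.svd(A)
    H = Vt[-1].reshape(3, 3)          # null vector of A, reshaped to 3x3
    return H / H[2, 2]                # fix the free scale

# Corners of a unit square mapped through a known (hypothetical) homography
H_true = np.array([[1.2, 0.1, 5.0], [-0.05, 0.9, 3.0], [1e-4, 2e-4, 1.0]])
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = []
for x, y in src:
    p = H_true @ np.array([x, y, 1.0])
    dst.append((p[0] / p[2], p[1] / p[2]))

H_est = homography_dlt(src, dst)
print(np.allclose(H_est, H_true, atol=1e-6))
```

With noisy detections, more than four correspondences make the same least-squares problem overdetermined, which is the noise-reduction effect mentioned above.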

In this paper, four points were taken to define the homography: the centers of mass of the four squares located at the ends of the H-shaped mark. Note that the center of the pattern H represents the origin of the coordinate system associated with the inertial frame <G>. The camera frame <C> has a fixed position relative to the helicopter frame <h>, whose pose at time t with respect to <G> is given by the pair (η, ξ), where η and ξ represent the orientation and position vectors of the UAV, which define the Euclidean transformation that takes the coordinates of points in <h> to points in <G>.

According to Ma et al. (2003), the model of the imaging camera can be written as

λx = K P₀ g x₀,    (6)

where λ is the unknown depth, or distance to the camera, of the point x₀ expressed in the frame <G>, x contains the homogeneous coordinates of the image point, K is the matrix of intrinsic parameters (assumed known from prior calibration), P₀ = [I 0] is the canonical projection matrix, and g is the Euclidean transformation that takes the coordinates of points in <G> to points in <C>.
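A small numeric sketch of this imaging model, with hypothetical intrinsic parameters and camera pose:

```python
import numpy as np

# Sketch of the imaging model (6): lambda * x = K @ P0 @ g @ X0, with
# hypothetical intrinsics K and camera pose g = (R, t).
K = np.array([[400.0, 0.0, 160.0],     # fx, skew, cx
              [0.0, 400.0, 120.0],     # fy, cy
              [0.0, 0.0, 1.0]])
P0 = np.hstack([np.eye(3), np.zeros((3, 1))])  # canonical projection [I | 0]

g = np.eye(4)                          # Euclidean transformation <G> -> <C>
g[:3, :3] = np.eye(3)                  # camera axes aligned with <G>
g[:3, 3] = [0.0, 0.0, 2.0]            # camera 2 m away from the pattern plane

X0 = np.array([0.1, -0.05, 0.0, 1.0])  # pattern point, on the plane z = 0
p = K @ P0 @ g @ X0                    # homogeneous image point, scaled by depth
lam = p[2]                             # the unknown depth lambda
x = p / lam                            # pixel coordinates (u, v, 1)
print(lam, x[:2])                      # prints 2.0 [180. 110.]
```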

Let us call x_v the image point captured in a virtual view at an initial time and x the image point taken at time t from the current pose of the UAV, both being images, in homogeneous coordinates in <C>, of the same point x₀ of the pattern H expressed in <G>. These image coordinates are related by x ∼ H x_v, where H is the homography matrix and the symbol ∼ expresses a proportional (up-to-scale) relationship. If we call N the normal vector, expressed in <C>, of the pattern plane at the initial time (virtual image) and d the distance from this plane to the optical center of the first view, then, by the decomposition algorithm presented in Faugeras and Lustman (1988) and Zhang and Hanson (1995), H takes the form

H = R + (1/d) t Nᵀ,    (7)

where R and t are the rotation matrix and translation vector such that g = (R, t) is the transformation taking the coordinates of the pattern points in the virtual camera frame to their coordinates in the camera frame at time t. The matrix H can be estimated from the four points of the pattern H and then decomposed to retrieve R and t, which contain the camera motion information, using a four-point algorithm for planar scenes. The decomposition of the homography matrix H using the four-point algorithm (Ma et al., 2003) returns four possible solutions of the form {R, t/d, N}, two of which can be eliminated by using the positive depth constraint, i.e., the requirement that the points of the pattern H be in front of the camera.

From the two remaining solutions, the one whose normal vector N to the plane of the pattern H is closer to e₃ = [0 0 1]ᵀ in the Euclidean norm is selected. Since this vector is expressed in the frame associated with the camera that takes the virtual image, the selection is immediate. Finally, it should be emphasized that, due to the inherent scale ambiguity of (7), only the ratio t/d can be obtained from the decomposition of H. To solve for the scale factor, the known physical dimensions of the pattern H are used. Then, the pose of the helicopter at time t is given by

(8)
(9)

where the resulting transformation is the Euclidean transformation from <h> to <G>, the virtual image being taken at a known reference instant, and g_c = (R_c, t_c) is the constant Euclidean transformation from <C> to <h>; without loss of generality, t_c = 0 in (8).
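The structure of (7) can be checked numerically: building H from an assumed motion (R, t), plane normal N and distance d, the relation x ∼ H x_v must hold for any point on the pattern plane. All numerical values below are hypothetical:

```python
import numpy as np

def rot_z(a):
    """Rotation about the z axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Hypothetical motion between the virtual view and the view at time t
R = rot_z(0.2)                    # rotation between the two camera frames
t = np.array([0.3, -0.1, 0.4])    # translation (same units as d)
N = np.array([0.0, 0.0, 1.0])     # plane normal in the virtual camera frame
d = 2.0                           # distance from the plane to the optical center

H = R + np.outer(t, N) / d        # planar homography of Eq. (7)

# A point on the pattern plane (N.T @ X = d) in the virtual camera frame:
X = np.array([0.25, 0.15, d])
x_v = X / X[2]                    # normalized image coordinates, virtual view
x_t = R @ X + t                   # same point in the camera frame at time t
x_t = x_t / x_t[2]                # normalized image coordinates at time t

p = H @ x_v                       # H x_v must be proportional to x_t
print(np.allclose(p / p[2], x_t))
```

Note that scaling t and d by the same factor leaves H unchanged, which is exactly the t/d ambiguity resolved above through the known pattern dimensions.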

Once the pose of the helicopter is computed through the homography-based PBVS system, such variables are used in the control loop of the vehicle.

IV. THE SIMULATION FRAMEWORK, RESULTS AND DISCUSSION

This section presents the simulation framework developed to validate the visual servoing system proposed here. Figure 3 illustrates its interface, composed of three windows. Figure 3(a), labeled External Vision, shows the whole scene seen by a viewer located at a pose defined by the four sliding bars. Figure 3(b), labeled Internal Vision, illustrates the camera view, whereas Figure 3(c) presents the segmentation of the image corresponding to the Internal Vision.

Figure 3:
The simulation framework developed. (a) An external view of the system. (b) The image captured by the camera onboard the helicopter. (c) The segmentation of the camera frame detecting the corners and the centroids of the four colored extremities of the landmark.

The dynamic model of the helicopter and the nonlinear underactuated controller proposed by Brandao et al. (2010b) are adopted to guide the UAV during a 3D trajectory-tracking flight simulation. The reference profile is a sloped one, given by {x_d = cos(t/4), y_d = sin(t/4), z_d = 3 − cos(t/4)}. To reach a desired pose, the attitude references are given by (3), (4) and (5), for the roll, pitch and yaw angles, respectively. It is worth mentioning that, before starting the tracking task itself, the helicopter should take off vertically until reaching 2.0 m of altitude. Then, after tracking the sloped trajectory, the UAV should reach (0.0, 0.0, 1.5) m (in a positioning task), and finally it should accomplish a landing task, thus concluding its mission.
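The reference profile and the yaw reference of (5) can be sampled as in the sketch below, using the analytic derivatives of the trajectory:

```python
import numpy as np

# Sampled reference profile of the simulated flight and the yaw reference
# psi_d = atan2(yd_dot, xd_dot) along the sloped trajectory.
t = np.linspace(0.0, 8.0 * np.pi, 200)   # t/4 sweeps one full revolution
xd = np.cos(t / 4.0)
yd = np.sin(t / 4.0)
zd = 3.0 - np.cos(t / 4.0)

xd_dot = -np.sin(t / 4.0) / 4.0          # analytic time derivatives
yd_dot = np.cos(t / 4.0) / 4.0
psi_d = np.arctan2(yd_dot, xd_dot)       # yaw reference along the path

print(xd[0], yd[0], zd[0], psi_d[0])     # start: (1, 0, 2), psi_d = pi/2
```

Using `arctan2` rather than a plain arctangent keeps ψ_d well defined over all four quadrants of the xy plane.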

Figures 4(a) and 4(b) illustrate the time evolution of the position and attitude of the aircraft, while Fig. 4(c) illustrates the path traveled by the aircraft during the simulation. Indeed, one can notice that the reference trajectory is effectively tracked by the aircraft, using the controller proposed in our previous work, without any oscillation around the reference values or steady-state delay. Thus, such a controller is effective in guiding the autonomous navigation of the miniature helicopter when tracking a reference trajectory. It is also important to highlight that the controller proposed in Brandao et al. (2010b) is capable of guiding the helicopter in positioning and tracking tasks without demanding any gain scheduling or control-law switching for specific parts of the planned flight (take-off, hovering, trajectory tracking, positioning before landing, hovering and landing itself). In other words, the controller covers all the phases of a UAV flight and is therefore suitable to guide the UAV during any mission involving positioning, trajectory tracking or path following (a sequence of positioning tasks, without time constraints).


Figure 4: Simulation: sloped-plane trajectory reference.

Figure 5 shows the evolution of the characteristics (the centroids of the squares in the extremities of the landmark) in the image plane, with 320x240 pixel resolution. The takeoff task, which occurs during the first five seconds, is shown in Fig. 5(a), while Fig. 5(b) illustrates the following five seconds, which represents the beginning of the trajectory tracking task. In both figures the starting point corresponds to the square marks, whereas the ending point corresponds to circles. The black-solid line is used to connect the centroids at the start and end instants.


Figure 5: Time evolution of the characteristics (centroids of the corner squares) in the image plane. a) Takeoff task, 0 ≤ t ≤ 5 s; b) beginning of the trajectory tracking task, 5 ≤ t ≤ 10 s.

To conclude the comments about the simulation presented, it is important to stress that, when defining the gains of the high-level controller, a higher priority (a larger gain) is given to the altitude control, followed by the control of the yaw angle, with the lowest priority assigned to the other two active state variables (identical priorities). This is done to guarantee that the helicopter reaches a safe altitude before moving forward or sideways.

V. CONCLUDING REMARKS

A pose-based visual servoing scheme is implemented here to provide 3D pose estimates to the navigation system of the helicopter, through a planar homography algorithm. The results presented show that, using visual information, the controller proposed in Brandao et al. (2010b) is able to cover all flight phases when the UAV has to accomplish a planned mission. This includes take-off, hovering and landing tasks, in positioning or trajectory/path tracking missions, as illustrated through the simulation discussed in this paper.

ACKNOWLEDGEMENTS
The authors thank CNPq for the financial support granted to this work (grant 474.467/2010-4). They also thank CAPES and SPU, which have supported the stay of Mr. Brandão in San Juan, Argentina, and of Mr. Sarapura in Vitória-ES, Brazil. Mr. Brandão also thanks the Federal University of Viçosa, Brazil, for supporting his participation in this work. Mr. Sarapura also thanks CONICET/Argentina for his doctoral scholarship.

REFERENCES
1. Ahmed, B., H.R. Pota and M. Garratt, "Flight control of a rotary wing UAV using backstepping," Int. J. of Robust and Nonlinear Control, 20, 639-658 (2010).
2. Altug, E. and C. Taylor, "Vision-based pose estimation and control of a model helicopter," IEEE Int. Conference on Mechatronics, 316-321 (2004).
3. Amidi, O., T. Kanade and K. Fujita, "A visual odometer for autonomous helicopter flight," Journal of Robotics and Auton. Systems, 28, 185-193 (1999).
4. Antunes, D., C. Silvestre and R. Cunha, "On the design of multi-rate tracking controllers: application to rotorcraft guidance and control," Int. J. of Robust and Nonlinear Control, 20, 1879-1902 (2010).
5. Brandao, A.S., J.A. Sarapura, E.M.O. Caldeira, M. Sarcinelli-Filho and R. Carelli, "Decentralized control of a formation involving a miniature helicopter and a team of ground robots based on artificial vision," Latin American Robotics Symposium and Intelligent Robotics Meeting, 126-131 (2010a).
6. Brandao, A.S., M. Sarcinelli-Filho and R. Carelli, "A nonlinear underactuated controller for 3D-trajectory tracking with a miniature helicopter," IEEE Int. Conf. on Industrial Technology, 1421-1426 (2010b).
7. Budiyono, A. and S.S. Wibowo, "Optimal tracking controller design for a small scale helicopter," Journal of Bionic Engineering, 4, 271-280 (2007).
8. Buskey, G., J. Roberts, P. Corke and G. Wyeth, "Helicopter automation using a low-cost sensing system," Australasian Conference on Robotics and Automation, Brisbane, Australia (2003).
9. Caballero, F., L. Merino, J. Ferruz and A. Ollero, "Vision-based odometry and SLAM for medium and high altitude flying UAVs," J. of Intelligent and Robotic Systems, 54, 137-161 (2009).
10. Castillo, P., R. Lozano and A. Dzul, Modelling and Control of Mini-Flying Machines, Springer, USA (2005).
11. Chaimowicz, L. and V. Kumar, "Aerial shepherds: Coordination among UAVs and swarms of robots," 7th International Symposium on Distributed Autonomous Robotic Systems, Toulouse, France (2004).
12. Dzul, A., R. Lozano and P. Castillo, "Adaptive altitude control for a small helicopter in a vertical flying stand," 42nd IEEE Conference on Decision and Control (2003).
13. Faugeras, O.D. and F. Lustman, "Motion and structure from motion in a piecewise planar environment," International Journal of Pattern Recognition and Artificial Intelligence, 3, 485-508 (1988).
14. Gonçalves, T., J. Azinheira and P. Rives, "Homography-based visual servoing of an aircraft for automatic approach and landing," IEEE Int. Conference on Robotics and Automation, 9-14 (2010).
15. Hamel, T. and R. Mahony, "Image based visual servo control for a class of aerial robotic systems," Automatica, 43, 1975-1983 (2007).
16. Hutchinson, S., G.D. Hager and P.I. Corke, "A tutorial on visual servo control," IEEE Transactions on Robotics and Automation, 12, 651-670 (1996).
17. John, T. and S. Sastry, "Differential flatness based full authority helicopter control design," 38th Conference on Decision & Control, Phoenix, Arizona, USA, 1982-1987 (1999).
18. Kahn, A.D. and R.J. Foch, "Attitude command attitude hold and stability augmentation system for a small-scale helicopter UAV," 22nd Digital Avionics Systems Conference, 8.A.4.1-8.A.4.10 (2003).
19. Kendoul, F., Z. Yu and K. Nonami, "Guidance and nonlinear control system for autonomous flight of minirotorcraft unmanned aerial vehicles," J. of Field Robotics, 27, 311-334 (2010).
20. Kondak, K., M. Bernard, N. Meyer and G. Hommel, "Autonomously flying VTOL-robots: Modeling and control," IEEE International Conference on Robotics and Automation, Rome, Italy, 736-741 (2007).
21. Lara, D., G. Romero, A. Sanchez, R. Lozano and A. Guerreo, "Robustness margin for attitude control of a four rotor mini-rotorcraft: Case of study," Journal of Field Robotics, 20, 143-152 (2010).
22. Ma, Y., S. Soatto, J. Kosecka and S.S. Sastry, An Invitation to 3-D Vision: From Images to Geometric Models, Springer (2003).
23. Marconi, L. and R. Naldi, "Robust nonlinear control of a miniature helicopter for aerobatic maneuvers," 32nd Rotorcraft Forum (2006).
24. Martini, A., F. Léonard and G. Abba, "Robust nonlinear control and stability analysis of 7DOF model-scale helicopter under vertical wind gust," International Conference on Intelligent Robots and Systems, Nice, France, 354-359 (2008).
25. Michael, N., J. Fink and V. Kumar, "Controlling a team of ground robots via an aerial robot," International Conference on Intelligent Robots and Systems, San Diego, CA, USA, 965-970 (2007).
26. Nordberg, K., P. Doherty, G. Farneback, P. Forssen, G. Granlund, A. Moe and J. Wiklund, "Vision for a UAV helicopter," IROS'02 Workshop on Aerial Robotics, Lausanne, Switzerland (2002).
27. Palomino, A., P. Castillo, I. Fantoni, R. Lozano and C. Pegard, "Control strategy using vision for the stabilization of an experimental PVTOL aircraft setup," 42nd IEEE Conference on Decision and Control, 5, 4487-4492 (2003).
28. Roberti, F., J. Sarapura, M. Toibero and R. Carelli, "Passivity-based visual servoing for 3D moving object tracking," XIII Reunión de Trabajo en Procesamiento de la Información y Control, Santa Fe, Argentina (2009).
29. Soria, C., L. Pari, R. Carelli, J. Sebastián and A. Traslosheros, "Homography-based tracking control for mobile robots," Image Analysis and Recognition, Vol. 5112 of Lecture Notes in Computer Science, Springer, Berlin-Heidelberg, 718-728 (2008).
30. Spong, M., "Partial feedback linearization of underactuated mechanical systems," International Conference on Intelligent Robots and Systems, and Advanced Robotic Systems and the Real World, 1, 314-321 (1994).
31. Zhang, Z. and A.R. Hanson, "Scaled Euclidean 3D reconstruction based on externally uncalibrated cameras," International Symposium on Computer Vision, Florida, USA, 37-42 (1995).

Received: April 12, 2012
Accepted: April 19, 2012
Recommended by Subject Editor: Gastón Schlotthauer, María Eugenia Torres and José Luis Figueroa.

Creative Commons License: all the contents of this journal, except where otherwise noted, are licensed under a Creative Commons License.