iKinGazeCtrl

Gaze controller based on iKin.


Copyright (C) 2010 RobotCub Consortium

Authors: Ugo Pattacini, Alessandro Roncone

CopyPolicy: Released under the terms of the GNU GPL v2.0.

Description

This module provides a controller for the iCub gaze that steers the neck and the eyes independently, performing saccades, pursuit, vergence, OCR (oculo-collic reflex) and VOR (vestibulo-ocular reflex, which relies on inertial data).

The controller can be seen as a Cartesian gaze controller, since it receives as input a 3D position in the task space. Nonetheless, further command modalities are available: 1) the coordinates (u,v) of a single pixel in one image plane can be provided along with a guessed z component in the eye's reference frame, or alternatively with the vergence angle; 2) the position of the target within the two image planes can be converted into the 3D task space using the monocular approach coupled with a PID on the z component; 3) the head-centered azimuth and elevation angles along with the vergence angle can be given to the module, both in absolute and relative mode.
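As a concrete illustration of these modalities, the following is a minimal sketch based on YARP's IGazeControl interface (obtaining the igaze pointer is sketched after the next paragraph); all target values are illustrative:

#include <yarp/dev/GazeControl.h>
#include <yarp/sig/Vector.h>

// Minimal sketch: the command modalities exposed through IGazeControl.
// igaze is assumed to be a valid interface pointer (see the client sketch below).
void commandExamples(yarp::dev::IGazeControl *igaze)
{
    // basic modality: 3D fixation point in the task space [m]
    yarp::sig::Vector fp(3);
    fp[0] = -0.50; fp[1] = 0.00; fp[2] = 0.35;
    igaze->lookAtFixationPoint(fp);

    // 1) pixel (u,v) in one image plane plus a guessed z component [m]
    yarp::sig::Vector px(2);
    px[0] = 160.0; px[1] = 120.0;
    igaze->lookAtMonoPixel(0, px, 1.0);   // 0 selects the left camera

    // 2) pixel in both image planes (the controller recovers the 3D point)
    yarp::sig::Vector pxl(2), pxr(2);
    pxl[0] = 160.0; pxl[1] = 120.0;
    pxr[0] = 170.0; pxr[1] = 120.0;
    igaze->lookAtStereoPixels(pxl, pxr);

    // 3) absolute azimuth/elevation/vergence angles [deg]
    yarp::sig::Vector ang(3);
    ang[0] = 10.0; ang[1] = -5.0; ang[2] = 20.0;
    igaze->lookAtAbsAngles(ang);
}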

Moreover, this module also implements the server part of the Gaze Control Interface. A tutorial on how to use the interface is available in the online documentation.
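A minimal client-side sketch (assuming the controller is running under its default name; the local port prefix /client/gaze is a hypothetical choice):

#include <yarp/os/Network.h>
#include <yarp/os/Property.h>
#include <yarp/dev/PolyDriver.h>
#include <yarp/dev/GazeControl.h>

int main()
{
    yarp::os::Network yarp;

    // open the client device that talks to the iKinGazeCtrl server
    yarp::os::Property option;
    option.put("device", "gazecontrollerclient");
    option.put("remote", "/iKinGazeCtrl");   // server side (default module name)
    option.put("local", "/client/gaze");     // hypothetical client port prefix

    yarp::dev::PolyDriver clientGaze(option);
    if (!clientGaze.isValid())
        return 1;

    yarp::dev::IGazeControl *igaze = nullptr;
    clientGaze.view(igaze);

    // igaze can now be used as in the sketch above
    return 0;
}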

Note
If the torso is not detected as running, the module will keep on working with just the head part.
If you're going to use this controller for your work, please quote it within any resulting publication: Roncone A., Pattacini U., Metta G. & Natale L., "A Cartesian 6-DoF Gaze Controller for Humanoid Robots", Proceedings of Robotics: Science and Systems, Ann Arbor, MI, June 18-22, 2016.

Reminder
If you experience slow motion, check the shift factor settings in the low-level configuration file of the head part: they should be properly tuned. Usually a value of 8 is enough.

Rule: a lower shift factor yields a higher attainable joint speed, but at the same time it raises the minimum speed that can be executed.

Example: look in the file icub_head_torso.ini of your robot setup; you should find something similar to:

[VELOCITY]
Shifts 8 8 8 8 8 8 ...
Note
A video of iCub gazing is available online.

Libraries

- YARP libraries.
- iKin library.

Parameters

--context dir

--from file

--name ctrlName

--robot name

--head name

--torso name

--trajectory_time::neck time

--trajectory_time::eyes time

--cameras::context dir

--cameras::file file

--saccades switch

--neck_position_control switch

--imu::mode switch

--imu::source_port_name name

--imu::stabilization_gain gain

--imu::gyro_noise_threshold thres

--imu::vor gain

--ocr gain

--ping_robot_tmo tmo

--eye_tilt::min min

--eye_tilt::max max

--min_abs_vel vel

--head_version ver

--verbose

--tweak::file file

--tweak::overwrite switch
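As an illustration only, a typical invocation could look like this (parameter values are examples, not recommended defaults):

iKinGazeCtrl --robot icub --name iKinGazeCtrl --trajectory_time::neck 0.75 --trajectory_time::eyes 0.33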

Ports Accessed

The ports the module is connected to: e.g. /icub/head/command:i and so on.

Ports Created

The module creates the usual ports required for the communication with the robot (through interfaces), along with a set of module ports offering the different ways of commanding a new target fixation point described above (3D point, image-plane pixels, azimuth/elevation/vergence angles).
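For instance, a new 3D fixation point can be streamed to the controller's Cartesian input port; a minimal sketch, assuming the default module name so that the port is /iKinGazeCtrl/xd:i (the client port name is hypothetical):

#include <yarp/os/Network.h>
#include <yarp/os/BufferedPort.h>
#include <yarp/os/Bottle.h>

int main()
{
    yarp::os::Network yarp;

    yarp::os::BufferedPort<yarp::os::Bottle> out;
    out.open("/example/xd:o");   // hypothetical client port
    yarp::os::Network::connect("/example/xd:o", "/iKinGazeCtrl/xd:i");

    yarp::os::Bottle &target = out.prepare();
    target.clear();
    target.addFloat64(-0.50);   // x [m] in the root frame
    target.addFloat64(0.00);    // y [m]
    target.addFloat64(0.35);    // z [m]
    out.write();

    return 0;
}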

Note
When the tracking mode is active and the controller has reached the target, it keeps on sending velocities to the head in order to compensate for any movements induced by the torso. If tracking mode is switched off, the controller automatically disconnects once the target is attained and reconnects at the next requested target. The controller starts by default in non-tracking mode.
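Tracking mode can be toggled programmatically; a minimal sketch, with igaze obtained as in the client example above:

// keep compensating for torso-induced motion after the target is reached
igaze->setTrackingMode(true);

yarp::sig::Vector fp(3);
fp[0] = -0.50; fp[1] = 0.00; fp[2] = 0.35;   // illustrative target [m]
igaze->lookAtFixationPoint(fp);
igaze->waitMotionDone();                     // poll until the movement is over

// restore the default non-tracking behavior
igaze->setTrackingMode(false);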

Coordinate System

Positions (meters) refer to the root reference frame attached to the waist as in the official documentation.
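For instance, in the iCub root frame the x-axis points backward and the z-axis upward, so a target half a meter in front of the robot could be expressed as follows (a sketch; values illustrative, igaze as above):

yarp::sig::Vector fp(3);
fp[0] = -0.50;   // x [m]: 0.5 m in front of the robot (x-axis points backward)
fp[1] =  0.00;   // y [m]: on the robot's sagittal plane
fp[2] =  0.35;   // z [m]: above the root frame origin
igaze->lookAtFixationPoint(fp);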

Input Data Files

None.

Output Data Files

None.

Configuration Files

A configuration file passed through --cameras::file contains the fields required to specify the cameras' intrinsic parameters, along with a roto-translation matrix appended to the eye kinematics (see the iKinChain::setHN method) to achieve alignment with the optical axes, compensating for possible unknown offsets.

The final roto-translation matrix is meaningful only as the result of calibrating the cameras' extrinsic parameters, which can be obtained for instance through the stereoCalib module.

Example:

[CAMERA_CALIBRATION_RIGHT]
fx 225.904
fy 227.041
cx 157.858
cy 113.51
[CAMERA_CALIBRATION_LEFT]
fx 219.057
fy 219.028
cx 174.742
cy 102.874
[ALIGN_KIN_LEFT]
HN (0.0 1.0 2.0 ... 15.0) // 4x4 matrix given as a list of 16 doubles, row by row
[ALIGN_KIN_RIGHT]
HN (0.0 1.0 2.0 ... 15.0) // 4x4 matrix given as a list of 16 doubles, row by row
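The HN entry is the 4x4 alignment matrix flattened row by row. As a sketch of how such a list could be loaded and applied to the eye kinematics (toHN is a hypothetical helper, not part of the module):

#include <yarp/os/Bottle.h>
#include <yarp/sig/Matrix.h>
#include <iCub/iKin/iKinFwd.h>

// Hypothetical helper: unpack the 16-element HN list (row-major) into a 4x4 matrix.
yarp::sig::Matrix toHN(const yarp::os::Bottle &list)
{
    yarp::sig::Matrix HN(4, 4);
    for (int r = 0; r < 4; r++)
        for (int c = 0; c < 4; c++)
            HN(r, c) = list.get(4 * r + c).asFloat64();
    return HN;
}

// Usage sketch: append the alignment matrix to the left eye chain.
// iCub::iKin::iCubEye eye("left_v2");
// eye.asChain()->setHN(toHN(*alignGroup.find("HN").asList()));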

Tested OS

Windows, Linux

Author
Ugo Pattacini, Alessandro Roncone