Objects recognition and position calculation (webcam)
(Created page with " < Go back to main <!-- ==Detección de objetos== Para esta tarea se ha empleado el ''package'' [http://code.google.com/p/find-object/ find_object_2...") |
|||
(75 intermediate revisions by 2 users not shown) | |||
Line 1: | Line 1: | ||
− | [[Mobile manipulation | < | + | [[Mobile manipulation | < go back to main]] |
− | + | ---- | |
− | == | + | {| style="border: solid 0px white; width: 100%" |
− | + | !Related articles | |
+ | ---- | ||
+ | !Other articles | ||
+ | ---- | ||
+ | |- | ||
+ | | valign="top" width="50%" | | ||
+ | |||
+ | [[MYRAbot's arm control (bioloid+arduino)]]<br/> | ||
+ | [[MYRAbot's arm model for simulation (urdf+gazebo)]]<br/> | ||
+ | [[MYRAbot model for simulation (urdf+gazebo)]]<br/> | ||
+ | [[Integration of MYRAbot in moveIt! (gazebo+moveIt!)]]<br/> | ||
+ | [[Voice control (sphinx+festival)]]<br/> | ||
+ | [[MYRAbot's Teleoperation with xbox360 wireless controller (xbox360 controller+joy)]] | ||
+ | |||
+ | | valign="top" | | ||
+ | |||
+ | [[CeRVaNTeS' arm and base control (maxon+epos2)]]<br/> | ||
+ | [[CeRVaNTeS' arm model for simulation (urdf+gazebo)]]<br/> | ||
+ | [[CeRVaNTeS model for simulation (urdf+gazebo)]]<br/> | ||
+ | [[Integration of CeRVaNTeS in moveIt! (gazebo+moveIt!)]]<br/> | ||
+ | [[CeRVaNTeS' Teleoperation with xbox360 wireless controller (xbox360 controller+joy)]] | ||
+ | |||
+ | |} | ||
+ | ---- | ||
+ | |||
+ | |||
+ | =Objects recognition= | ||
+ | |||
+ | We have used the ''package'' [http://code.google.com/p/find-object/ find_object_2d] which has been develop by [https://introlab.3it.usherbrooke.ca/mediawiki-introlab/index.php/Mathieu_Labb%C3%A9 Mathieu Labbé] (Université de Sherbrooke). The program can recognizes objects in the scene using an image of the object. The program has a graphical user interface which allow to capture an image of the scene and save it. | ||
− | [[file:find_object_2d.png|thumb|500px|center| | + | [[file:find_object_2d.png|thumb|500px|center|Screenshot [http://code.google.com/p/find-object/ find_object_2d]]] |
− | + | In the screenshot that is shown above, we can see how the recognized object is within a boundary box which coincide with the corners of the object image. | |
− | + | The program provides several data of each found object in the scene. We will use the next data: | |
− | * | + | * Width of the object image (number of pixels). |
− | * | + | * Height of the object image (number of pixels). |
− | * | + | * Position of the corners of the object image in the scene. |
− | == | + | ==Utilization of a new OpenCV version in ROS Electric== |
− | + | The previous ''package'' needs in order to compile the [http://opencv.org/ OpenCV] version 2.4.3 or higher, but [http://wiki.ros.org/ ROS] Electric has the [http://opencv.org/ OpenCV] version 2.3.1. We will install the [http://opencv.org/ OpenCV] version 2.4.3, modify the ''package'' [http://wiki.ros.org/cv_bridge?distro=electric cv_bridge] and modify the ''package'' [http://code.google.com/p/find-object/ find_object_2d]. | |
− | === | + | ===Installation of OpenCV=== |
− | + | First we will [http://opencv.org/downloads.html download] the [http://opencv.org/ OpenCV] version 2.4.3 by the official web. We will extract the content of the downloaded file in the personal folder. We will execute the next commands in a terminal in order to update the [http://www.ubuntu.com/ ubuntu] ''packages'': | |
<syntaxhighlight> | <syntaxhighlight> | ||
Line 29: | Line 57: | ||
</syntaxhighlight> | </syntaxhighlight> | ||
− | + | When finish the update/upgrade, we will place us in the [http://opencv.org/ OpenCV] folder in a terminal and execute the next commands in the terminal in order to compile: | |
<syntaxhighlight enclose="div"> | <syntaxhighlight enclose="div"> | ||
Line 39: | Line 67: | ||
</syntaxhighlight> | </syntaxhighlight> | ||
− | + | We will execute the next command in the terminal in order to configure [http://opencv.org/ OpenCV]: | |
<syntaxhighlight>sudo gedit /etc/ld.so.conf.d/opencv.conf</syntaxhighlight> | <syntaxhighlight>sudo gedit /etc/ld.so.conf.d/opencv.conf</syntaxhighlight> | ||
− | + | We add the next code line to the opened file and save the changes: | |
<syntaxhighlight>/usr/local/lib</syntaxhighlight> | <syntaxhighlight>/usr/local/lib</syntaxhighlight> | ||
− | + | Now, we will execute the next commands in a terminal: | |
<syntaxhighlight> | <syntaxhighlight> | ||
Line 54: | Line 82: | ||
</syntaxhighlight> | </syntaxhighlight> | ||
− | + | We add the next code lines at the end of the opened file and save the changes: | |
<syntaxhighlight> | <syntaxhighlight> | ||
Line 61: | Line 89: | ||
</syntaxhighlight> | </syntaxhighlight> | ||
− | + | We will restart the PC to finish the installation. | |
− | === | + | ===Modification of the ''package'' cv_bridge=== |
− | + | We must modify the file "manifest.xml" of the ''package'' [http://wiki.ros.org/cv_bridge?distro=electric cv_bridge] in order to avoid the use of the previous version of [http://opencv.org/ OpenCV]. We will execute the next commands in a terminal: | |
<syntaxhighlight> | <syntaxhighlight> | ||
Line 72: | Line 100: | ||
</syntaxhighlight> | </syntaxhighlight> | ||
− | + | We will delete or comment the next code lines of the opened file: | |
<syntaxhighlight lang="xml"> | <syntaxhighlight lang="xml"> | ||
Line 80: | Line 108: | ||
</syntaxhighlight> | </syntaxhighlight> | ||
− | === | + | ===Modification of the ''package'' find_object_2d=== |
− | + | We must modify the file "CMakeLists.txt" of the package [http://code.google.com/p/find-object/ find_object_2d] in order to avoid the use of the previous version of OpenCV for the compilation. We will execute the next commands in a terminal: | |
<syntaxhighlight> | <syntaxhighlight> | ||
Line 89: | Line 117: | ||
</syntaxhighlight> | </syntaxhighlight> | ||
− | + | We will modify the next code line of the opened file: | |
<syntaxhighlight>find_package(OpenCV REQUIRED)</syntaxhighlight> | <syntaxhighlight>find_package(OpenCV REQUIRED)</syntaxhighlight> | ||
− | + | for: | |
<syntaxhighlight>find_package(OpenCV 2.4 REQUIRED)</syntaxhighlight> | <syntaxhighlight>find_package(OpenCV 2.4 REQUIRED)</syntaxhighlight> | ||
− | + | We have to add the next code lines to the file "CMakeLists.txt" in order to use a external [http://opencv.org/ OpenCV] library to [http://wiki.ros.org/ ROS], if the file don't have this lines. | |
− | + | ||
<syntaxhighlight> | <syntaxhighlight> | ||
− | find_package(OpenCV " | + | find_package(OpenCV "VERSION" REQUIRED) |
include_directories(${OpenCV_INCLUDE_DIRS}) | include_directories(${OpenCV_INCLUDE_DIRS}) | ||
... | ... | ||
− | target_link_libraries(" | + | target_link_libraries("PROGRAM_NAME" ${OpenCV_LIBS}) |
</syntaxhighlight> | </syntaxhighlight> | ||
− | == | + | =Position calculation= |
− | + | ||
+ | We will use the data provided by the ''package'' [http://code.google.com/p/find-object/ find_object_2d] in order to calculate the approximate position of the objects towards the [[Control brazo MYRAbot (bioloid+arduino) |MYRAbot's arm]] coordinates. We have to know the next physical parameters: | ||
− | * | + | * Position of the webcam towards the arm (h, d). |
− | * | + | * Tilt angle of the webcam towards the horizontal (α). |
− | * | + | * Perpendicular distance to the webcam lens of the objects when we take the capture (''dZ0''). |
− | [[file:webcam_MYRAbot_esquema.jpg|thumb|500px|center| | + | [[file:webcam_MYRAbot_esquema.jpg|thumb|500px|center|Sketch position calculation]] |
− | + | We have used the variation of area of the boundary box in order to calculate the distance between the recognized object and the webcam lens. The obtained equations are shown below, we have calculated the parameters using different known positions of an object: | |
− | [[file: | + | [[file:Webcam MYRAbot ecuaciones-en.jpg]] |
− | == | + | ==Program to publish the position of the objects== |
− | + | First [[MYRAbot's arm control (bioloid+arduino)#Creating custom messages in ROS|we will create some custom messages]] within the ''package'' "[http://code.google.com/p/find-object/ find_object_2d]" in order to publish the position of the recognized objects in the scene. We will create the basis message that contains the identifier and the placement point of the recognized object. We will create a file named "Point_id.msg" with the content that is shown below: | |
<syntaxhighlight>int16 id | <syntaxhighlight>int16 id | ||
geometry_msgs/Point punto</syntaxhighlight> | geometry_msgs/Point punto</syntaxhighlight> | ||
− | + | ||
+ | We will create a file named "PointObjects.msg" with the content that is shown below, in order to contain an array with the previous messages for each recognized object: | ||
<syntaxhighlight>Point_id[] objeto</syntaxhighlight> | <syntaxhighlight>Point_id[] objeto</syntaxhighlight> | ||
− | + | The program subscribes the ''topic'' "objects" (published by "find_object_2d_node") and publishes the ''topic'' "point". We will create a file named "objects_detected.cpp" within the ''package'' "[http://code.google.com/p/find-object/ find_object_2d]" with the content that is shown below: | |
<syntaxhighlight lang="cpp" enclose="div"> | <syntaxhighlight lang="cpp" enclose="div"> | ||
Line 251: | Line 281: | ||
</syntaxhighlight> | </syntaxhighlight> | ||
− | === | + | ===Compiling the program=== |
− | + | We have to add the next code lines at the end to the file "CMakeLists.txt" of the ''package'' "[http://code.google.com/p/find-object/ find_object_2d]", in order to compile and create the executable file: | |
<syntaxhighlight> | <syntaxhighlight> | ||
Line 260: | Line 290: | ||
</syntaxhighlight> | </syntaxhighlight> | ||
− | + | We will execute the next commands in a terminal: | |
<syntaxhighlight> | <syntaxhighlight> | ||
Line 267: | Line 297: | ||
</syntaxhighlight> | </syntaxhighlight> | ||
− | === | + | ===Executing the program=== |
− | + | We have to install the ''package'' "[http://wiki.ros.org/camera_umd?distro=electric camera_umd]" in order to launch our [http://en.wikipedia.org/wiki/USB USB] webcam. We will execute the next command in a terminal in order to launch the webcam: | |
<syntaxhighlight>rosrun uvc_camera camera_node</syntaxhighlight> | <syntaxhighlight>rosrun uvc_camera camera_node</syntaxhighlight> | ||
− | * '' | + | * ''note:'' If we have plugged several [http://en.wikipedia.org/wiki/USB USB] webcams, we have to set the webcam that we want to use (default webcam is "/dev/video0"). We will execute the next command in a terminal in order to change the used device: |
− | <syntaxhighlight>rosparam set uvc_camera/device /dev/ | + | <syntaxhighlight>rosparam set uvc_camera/device /dev/video"DEVICE_NUMBER"</syntaxhighlight> |
− | + | We will create a file named "encontrar_objetos.launch" within the folder "launch" of the ''package'' "[http://code.google.com/p/find-object/ find_object_2d]" with the content that is shown below, in order to launch the webcam, the [http://code.google.com/p/find-object/ find_object_2d] program and our program: | |
<syntaxhighlight lang="xml"> | <syntaxhighlight lang="xml"> | ||
Line 295: | Line 325: | ||
</syntaxhighlight> | </syntaxhighlight> | ||
− | + | We will execute the next command in a terminal in order to start the ''launcher'': | |
<syntaxhighlight>roslaunch find_object_2d encontrar_objetos.launch</syntaxhighlight> | <syntaxhighlight>roslaunch find_object_2d encontrar_objetos.launch</syntaxhighlight> | ||
− | == | + | =To point objects (Choose)= |
+ | |||
+ | We will use the previous program in order to control the [[MYRAbot's arm control (bioloid+arduino)|arm]] to point the selected object. The program subscribes the ''topics'' "point" (published by the previous program), "selected_object" (where we publish the identifier of the selected object) and "pose_arm" (where the states of the [[MYRAbot's arm control (bioloid+arduino)|arm]] are published). Also the program publishes the ''topics'' "move_arm" and "hand_arm" in order to control the [[MYRAbot's arm control (bioloid+arduino)|arm]]. When we start the program, the [[MYRAbot's arm control (bioloid+arduino)|arm]] goes to the initial position. When we publish the identifier of the selected object in the ''topic'' "selected_object" (we use the identifier 100 to return the arm to the initial position), the arm moves to point the object if it is present in the scene. Passed a short time, the [[MYRAbot's arm control (bioloid+arduino)|arm]] returns to the initial position. We will create a file named "escoja.cpp" within the folder "src" of the ''package'' "[[MYRAbot's arm control (bioloid+arduino)#Control programs|brazo_fer]]" with the content that is shown below: | ||
− | |||
<syntaxhighlight lang="cpp" enclose="div"> | <syntaxhighlight lang="cpp" enclose="div"> | ||
Line 343: | Line 374: | ||
double y = destino.y; | double y = destino.y; | ||
− | y = y + z*tan(atan2(45,250)); | + | y = y + z*tan(atan2(45,250)); |
int coordenadas_correctas = 1; | int coordenadas_correctas = 1; | ||
Line 385: | Line 416: | ||
− | if (isnan(gamma)) | + | if (isnan(gamma)) |
{ | { | ||
coordenadas_correctas = 0; | coordenadas_correctas = 0; | ||
Line 407: | Line 438: | ||
} | } | ||
else { | else { | ||
− | std::cout<<" | + | std::cout<<"Error, invalid coordinates or unattainable point"<<std::endl; |
::nuevo_punto = 0; | ::nuevo_punto = 0; | ||
Line 503: | Line 534: | ||
ros::Time ahora; | ros::Time ahora; | ||
− | ahora = ros::Time::now(); | + | ahora = ros::Time::now(); |
while (ros::Time::now() < (ahora + ros::Duration(1))) | while (ros::Time::now() < (ahora + ros::Duration(1))) | ||
Line 561: | Line 592: | ||
if (::cont_obj == 0) | if (::cont_obj == 0) | ||
{ | { | ||
− | std::cout<<" | + | std::cout<<"Object is not in the scene"<<std::endl; |
::nuevo_punto = 0; | ::nuevo_punto = 0; | ||
} | } | ||
Line 613: | Line 644: | ||
</syntaxhighlight> | </syntaxhighlight> | ||
− | + | We have to add the next line to the file "CMakeLists.txt" of the ''package'' "[[MYRAbot's arm control (bioloid+arduino)#Control programs|brazo_fer]]" in order to compile and create the executable file: | |
+ | |||
+ | <syntaxhighlight lang=cmake>rosbuild_add_executable(escoja src/escoja.cpp)</syntaxhighlight> | ||
+ | |||
+ | We will execute the next commands in a terminal in order to compile and create the executable file: | ||
+ | |||
+ | <syntaxhighlight>roscd brazo_fer | ||
+ | make</syntaxhighlight> | ||
+ | |||
+ | We will create a file named "escoja.launch" within the folder "launch" of the package "[[MYRAbot's arm control (bioloid+arduino)#Control programs|brazo_fer]]" with the content that is shown below, in order to launch the communication node with[http://www.arduino.cc/es/ arduino], the previous ''launcher'' "encontrar_objetos.launch" and our program: | ||
<syntaxhighlight lang="xml"> | <syntaxhighlight lang="xml"> | ||
Line 629: | Line 669: | ||
</syntaxhighlight> | </syntaxhighlight> | ||
− | + | We will execute the next command in a terminal in order to start the ''launcher'': | |
<syntaxhighlight>roslaunch brazo_fer escoja.launch</syntaxhighlight> | <syntaxhighlight>roslaunch brazo_fer escoja.launch</syntaxhighlight> | ||
+ | <!-- | ||
En la sección "[[Modelo para simulación MYRAbot (urdf+gazebo)#Ejecución del programa (entre la cerveza y la cola, la plataforma) escoja|Ejecución del programa (entre la cerveza y la cola, la plataforma) escoja]]" del apartado dedicado al [[Modelo para simulación MYRAbot (urdf+gazebo)|modelo para simulación MYRAbot]], puede verse un vídeo de la ejecución del programa en el simulador [http://wiki.ros.org/simulator_gazebo?distro=electric gazebo]. | En la sección "[[Modelo para simulación MYRAbot (urdf+gazebo)#Ejecución del programa (entre la cerveza y la cola, la plataforma) escoja|Ejecución del programa (entre la cerveza y la cola, la plataforma) escoja]]" del apartado dedicado al [[Modelo para simulación MYRAbot (urdf+gazebo)|modelo para simulación MYRAbot]], puede verse un vídeo de la ejecución del programa en el simulador [http://wiki.ros.org/simulator_gazebo?distro=electric gazebo]. | ||
+ | --> | ||
− | = | + | =Tracking objects= |
− | + | The program subscribes the ''topics'' "selected_object" (where we publish the identifier of the selected object) and "pose_arm" (where the states of the [[MYRAbot's arm control (bioloid+arduino)|arm]] are published). Also the program publishes the ''topics'' "move_arm" and "hand_arm" in order to control the [[MYRAbot's arm control (bioloid+arduino)|arm]]. When we start the program, the [[MYRAbot's arm control (bioloid+arduino)|arm]] goes to the initial position. When we publish the identifier of the selected object in the ''topic'' "selected_object" (we use the identifier 100 to return the arm to the initial position), the arm moves to track the object if it is present in the scene. We will create a file named "seguir_objetos.cpp" within the folder "src" of the ''package'' "[[MYRAbot's arm control (bioloid+arduino)#Control programs|brazo_fer]]" with the content that is shown below: | |
<syntaxhighlight lang="cpp" enclose="div"> | <syntaxhighlight lang="cpp" enclose="div"> | ||
Line 897: | Line 939: | ||
</syntaxhighlight> | </syntaxhighlight> | ||
− | + | We have to add the next line to the file "CMakeLists.txt" of the ''package'' "[[MYRAbot's arm control (bioloid+arduino)#Control programs|brazo_fer]]" in order to compile and create the executable file: | |
+ | |||
+ | <syntaxhighlight lang=cmake>rosbuild_add_executable(seguir src/seguir.cpp)</syntaxhighlight> | ||
+ | |||
+ | We will execute the next commands in a terminal in order to compile and create the executable file: | ||
+ | |||
+ | <syntaxhighlight>roscd brazo_fer | ||
+ | make</syntaxhighlight> | ||
+ | |||
+ | We will create a file named "seguir.launch" within the folder "launch" of the package "[[MYRAbot's arm control (bioloid+arduino)#Control programs|brazo_fer]]" with the content that is shown below, in order to launch the communication node with[http://www.arduino.cc/es/ arduino], the previous ''launcher'' "encontrar_objetos.launch" and our program: | ||
<syntaxhighlight lang="xml"> | <syntaxhighlight lang="xml"> | ||
Line 913: | Line 964: | ||
</syntaxhighlight> | </syntaxhighlight> | ||
− | + | We will execute the next command in a terminal in order to start the ''launcher'': | |
<syntaxhighlight>roslaunch brazo_fer seguir.launch</syntaxhighlight> | <syntaxhighlight>roslaunch brazo_fer seguir.launch</syntaxhighlight> | ||
− | + | We can see in this [[MYRAbot model for simulation (urdf+gazebo)#Execution_2|video]] the execution of this program in [http://wiki.ros.org/simulator_gazebo?distro=electric gazebo] simlator. | |
− | --> | + | |
+ | |||
+ | ---- | ||
+ | {| style="border: solid 0px white; width: 100%" | ||
+ | !Related articles | ||
+ | ---- | ||
+ | !Other articles | ||
+ | ---- | ||
+ | |- | ||
+ | | valign="top" width="50%" | | ||
+ | |||
+ | [[MYRAbot's arm control (bioloid+arduino)]]<br/> | ||
+ | [[MYRAbot's arm model for simulation (urdf+gazebo)]]<br/> | ||
+ | [[MYRAbot model for simulation (urdf+gazebo)]]<br/> | ||
+ | [[Integration of MYRAbot in moveIt! (gazebo+moveIt!)]]<br/> | ||
+ | [[Voice control (sphinx+festival)]]<br/> | ||
+ | [[MYRAbot's Teleoperation with xbox360 wireless controller (xbox360 controller+joy)]] | ||
+ | |||
+ | | valign="top" | | ||
+ | |||
+ | [[CeRVaNTeS' arm and base control (maxon+epos2)]]<br/> | ||
+ | [[CeRVaNTeS' arm model for simulation (urdf+gazebo)]]<br/> | ||
+ | [[CeRVaNTeS model for simulation (urdf+gazebo)]]<br/> | ||
+ | [[Integration of CeRVaNTeS in moveIt! (gazebo+moveIt!)]]<br/> | ||
+ | [[CeRVaNTeS' Teleoperation with xbox360 wireless controller (xbox360 controller+joy)]] | ||
+ | |||
+ | |} | ||
+ | ---- | ||
+ | |||
− | [[Mobile manipulation | < | + | [[Mobile manipulation | < go back to main]] |
Latest revision as of 11:08, 29 September 2014
Contents
Objects recognition
We have used the package find_object_2d, developed by Mathieu Labbé (Université de Sherbrooke). The program recognizes objects in the scene using a previously saved image of each object. It has a graphical user interface that allows us to capture an image of the scene and save it.
In the find_object_2d screenshot, we can see that the recognized object is enclosed in a bounding box whose corners coincide with the projected corners of the stored object image.
The program provides several pieces of data for each object found in the scene. We will use the following:
- Width of the object image (in pixels).
- Height of the object image (in pixels).
- Position of the corners of the object image in the scene.
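These values arrive on the topic "objects", which find_object_2d publishes as a std_msgs/Float32MultiArray containing twelve values per detected object: the identifier, the width and height of the object image, and the nine entries of the homography that places the object image in the scene. This is the layout that the position program presented later relies on. Assuming the node runs in the default namespace, the raw data can be inspected while the recognizer is running with:
rostopic echo /objects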
Utilization of a new OpenCV version in ROS Electric
The above package requires OpenCV version 2.4.3 or higher in order to compile, but ROS Electric ships with OpenCV version 2.3.1. We will install OpenCV 2.4.3 and then modify the packages cv_bridge and find_object_2d.
Installation of OpenCV
First we will download the OpenCV 2.4.3 sources from the official website and extract the contents of the downloaded file into our home folder. We will then execute the following commands in a terminal to update the Ubuntu packages:
sudo apt-get update
sudo apt-get upgrade
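The extraction mentioned above can be done, for example, as follows (run from the folder that contains the archive; the file name is an assumption and depends on the downloaded release):
tar -xjf OpenCV-2.4.3.tar.bz2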
When the update/upgrade finishes, we will change into the OpenCV folder in a terminal and execute the following commands to compile and install it:
mkdir build
cd build
cmake -D WITH_TBB=ON -D BUILD_NEW_PYTHON_SUPPORT=ON -D WITH_V4L=ON -D INSTALL_C_EXAMPLES=ON -D INSTALL_PYTHON_EXAMPLES=ON -D BUILD_EXAMPLES=ON -D WITH_QT=ON -D WITH_OPENGL=ON ..
make
sudo make install
We will execute the next command in the terminal in order to configure OpenCV:
sudo gedit /etc/ld.so.conf.d/opencv.conf
We add the next code line to the opened file and save the changes:
/usr/local/lib
Now, we will execute the next commands in a terminal:
sudo ldconfig
sudo gedit /etc/bash.bashrc
We add the next code lines at the end of the opened file and save the changes:
PKG_CONFIG_PATH=$PKG_CONFIG_PATH:/usr/local/lib/pkgconfig
export PKG_CONFIG_PATH
We will restart the PC to finish the installation.
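After the restart, we can check that the new version is the one that pkg-config finds (this relies on the PKG_CONFIG_PATH exported above); it should report 2.4.3:
pkg-config --modversion opencv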
Modification of the package cv_bridge
We must modify the file "manifest.xml" of the package cv_bridge in order to avoid the use of the previous version of OpenCV. We will execute the next commands in a terminal:
roscd cv_bridge
sudo gedit manifest.xml
We will delete or comment the next code lines of the opened file:
<depend package="opencv2" />
...
<rosdep name="opencv2.3"/>
Modification of the package find_object_2d
We must modify the file "CMakeLists.txt" of the package find_object_2d in order to avoid the use of the previous version of OpenCV for the compilation. We will execute the next commands in a terminal:
roscd find_object_2d
sudo gedit CMakeLists.txt
We will replace the following line of the opened file:
find_package(OpenCV REQUIRED)
with:
find_package(OpenCV 2.4 REQUIRED)
We have to add the following lines to the file "CMakeLists.txt" in order to use an OpenCV library external to ROS, in case the file does not already contain them:
find_package(OpenCV "VERSION" REQUIRED)
include_directories(${OpenCV_INCLUDE_DIRS})
...
target_link_libraries("PROGRAM_NAME" ${OpenCV_LIBS})
Position calculation
We will use the data provided by the package find_object_2d to calculate the approximate position of the objects relative to the MYRAbot's arm coordinate frame. We have to know the following physical parameters:
- Position of the webcam relative to the arm (h, d).
- Tilt angle of the webcam relative to the horizontal (α).
- Perpendicular distance from the webcam lens to the objects when the reference capture is taken (dZ0).
We use the variation of the area of the bounding box to estimate the distance between the recognized object and the webcam lens. The parameters of the resulting equations were calibrated using several known positions of an object.
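As a sketch, the relations implemented in the program of the next section are the following (585 appears to play the role of the webcam focal length in pixels for a 640x480 image, 38 is the empirically fitted area-to-distance factor, and angles are in radians):

\Delta A = W_0 H_0 - W H, \qquad d_Z = d_{Z0} + \frac{\Delta A}{38}

d_X = \frac{(320 - u_c)\, d_Z}{585}, \qquad d_Y = \frac{(240 - v_c)\, d_Z}{585}, \qquad \beta = \operatorname{atan2}(d_Y, d_Z)

x = d_X, \qquad y = h - \frac{d_Z}{\cos\beta}\,\sin(\alpha - \beta), \qquad z = \frac{d_Z}{\cos\beta}\,\cos(\alpha - \beta) - d

where W0, H0 are the dimensions of the stored object image, W, H the dimensions measured in the scene (corrected for the webcam tilt), (uc, vc) the centre of the bounding box in the image, and (x, y, z) the resulting point in the arm frame.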
Program to publish the position of the objects
First we will create some custom messages within the package "find_object_2d" in order to publish the position of the recognized objects in the scene. We will create the base message, which contains the identifier and the placement point of a recognized object, in a file named "Point_id.msg" (inside the package's "msg" folder) with the content shown below:
int16 id
geometry_msgs/Point punto
We will create a file named "PointObjects.msg" with the content that is shown below, in order to contain an array with the previous messages for each recognized object:
Point_id[] objeto
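Since "find_object_2d" is a rosbuild package under ROS Electric, message generation has to be enabled for these files to be built. If it is not already there, the standard rosbuild macro below has to be present and uncommented in the package's "CMakeLists.txt" (this is a general rosbuild requirement, not something specific to find_object_2d):
rosbuild_genmsg()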
The program subscribes the topic "objects" (published by "find_object_2d_node") and publishes the topic "point". We will create a file named "objects_detected.cpp" within the package "find_object_2d" with the content that is shown below:
#include <ros/ros.h>
#include <std_msgs/Float32MultiArray.h>
#include <opencv2/opencv.hpp>
#include <QTransform>
#include <geometry_msgs/Point.h>
#include <std_msgs/Int16.h>
#include <find_object_2d/PointObjects.h>
#include <find_object_2d/Point_id.h>
#define dZ0 450
#define alfa 40
#define h 310
#define d 50
#define PI 3.14159265
void objectsDetectedCallback(const std_msgs::Float32MultiArray& msg)
{
ros::NodeHandle nh;
ros::Publisher position_pub_=nh.advertise<find_object_2d::PointObjects>("point", 1);
find_object_2d::PointObjects p_objects;
find_object_2d::Point_id objeto;
p_objects.objeto = std::vector<find_object_2d::Point_id>(msg.data.size()/12);
for(unsigned int i=0; i<msg.data.size(); i+=12)
{
// get data
int id = (int)msg.data[i];
float objectWidth = msg.data[i+1];
float objectHeight = msg.data[i+2];
// Find corners Qt
QTransform qtHomography(msg.data[i+3], msg.data[i+4], msg.data[i+5],
msg.data[i+6], msg.data[i+7], msg.data[i+8],
msg.data[i+9], msg.data[i+10], msg.data[i+11]);
QPointF qtTopLeft = qtHomography.map(QPointF(0,0));
QPointF qtTopRight = qtHomography.map(QPointF(objectWidth,0));
QPointF qtBottomLeft = qtHomography.map(QPointF(0,objectHeight));
QPointF qtBottomRight = qtHomography.map(QPointF(objectWidth,objectHeight));
geometry_msgs::Point punto;
float widthTop = sqrt(pow(qtTopRight.x() - qtTopLeft.x(),2) + pow(qtTopRight.y() - qtTopLeft.y(),2));
float widthBottom = sqrt(pow(qtBottomRight.x() - qtBottomLeft.x(),2) + pow(qtBottomRight.y() - qtBottomLeft.y(),2));
float heightLeft = sqrt(pow(qtBottomLeft.x() - qtTopLeft.x(),2) + pow(qtBottomLeft.y() - qtTopLeft.y(),2));
float heightRight = sqrt(pow(qtBottomRight.x() - qtTopRight.x(),2) + pow(qtBottomRight.y() - qtTopRight.y(),2));
float dArea_0 = (objectHeight*objectWidth) - (((widthTop + widthBottom)/2) * ((heightLeft + heightRight)/2));
float dZ_0 = dZ0 + (dArea_0/10);
float dY_0 = (((480/2) - (((qtTopLeft.y() + qtTopRight.y())/2) + ((heightLeft + heightRight)/4)))*dZ_0)/585;
float beta_0 = atan2(dY_0,dZ_0);
objectHeight = objectHeight/cos((alfa*PI)/180);
float height = ((heightLeft + heightRight)/2)/cos(((alfa*PI)/180)-beta_0);
float dArea = (objectHeight*objectWidth) - (((widthTop + widthBottom)/2) * height);
float dZ = dZ0 + (dArea/38);
float dX = (((640/2) - (((qtTopLeft.x() + qtBottomLeft.x())/2) + ((widthTop + widthBottom)/4)))*dZ)/585;
float dY = (((480/2) - (((qtTopLeft.y() + qtTopRight.y())/2) + ((heightLeft + heightRight)/4)))*dZ)/585;
float beta = atan2(dY,dZ);
punto.x = dX;
punto.y = h-((dZ/cos(beta))*sin(((alfa*PI)/180)-beta));
punto.z = ((dZ/cos(beta))*cos(((alfa*PI)/180)-beta))-d;
//Validate detection
int paralelepipedo;
if (abs(widthTop - widthBottom) < 20 && abs(heightLeft - heightRight) < 15)
{
paralelepipedo = 1;
}
else
{
paralelepipedo = 0;
}
if (paralelepipedo == 1)
{
objeto.punto = punto;
objeto.id = id;
p_objects.objeto[i/12] = objeto;
}
}
position_pub_.publish(p_objects);
}
int main(int argc, char** argv)
{
ros::init(argc, argv, "objects_detected");
ros::NodeHandle nh;
ros::Subscriber subs = nh.subscribe("objects", 1, objectsDetectedCallback);
ros::Publisher position_pub_=nh.advertise<find_object_2d::PointObjects>("point", 1);
ros::spin();
return 0;
}
Compiling the program
We have to add the following lines at the end of the file "CMakeLists.txt" of the package "find_object_2d", in order to compile and create the executable file:
rosbuild_add_executable(objects_detected src/objects_detected.cpp)
target_link_libraries(objects_detected ${LIBRARIES})
We will execute the next commands in a terminal:
roscd find_object_2d
make
Executing the program
We have to install the package "camera_umd" in order to launch our USB webcam. We will execute the next command in a terminal in order to launch the webcam:
rosrun uvc_camera camera_node
- note: If several USB webcams are plugged in, we have to set which one to use (the default is "/dev/video0"). We will execute the following command in a terminal to change the device used:
rosparam set uvc_camera/device /dev/video"DEVICE_NUMBER"
We will create a file named "encontrar_objetos.launch" within the folder "launch" of the package "find_object_2d" with the content that is shown below, in order to launch the webcam, the find_object_2d program and our program:
<launch>
<node name="camera_node" pkg="uvc_camera" type="camera_node">
<param name="/device" value="/dev/videoNÚMERO_DE_NUESTRO_DISPOSITIVO"/>
</node>
<node name="find_object_2d" pkg="find_object_2d" type="find_object_2d">
<remap from="image" to="image_raw"/>
</node>
<node name="objects_detected" pkg="find_object_2d" type="objects_detected" />
</launch>
We will execute the next command in a terminal in order to start the launcher:
roslaunch find_object_2d encontrar_objetos.launch
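Once the launcher is running, we can check that object positions are being calculated by listening to the topic created by our node ("point", as defined in "objects_detected.cpp"):
rostopic echo /point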
Pointing at objects (Choose)
We will use the previous program to control the arm so that it points at the selected object. The program subscribes to the topics "point" (published by the previous program), "selected_object" (where we publish the identifier of the selected object) and "pose_arm" (where the states of the arm are published). It also publishes the topics "move_arm" and "hand_arm" in order to control the arm. When we start the program, the arm goes to the initial position. When we publish the identifier of the selected object on the topic "selected_object" (the identifier 100 returns the arm to the initial position), the arm moves to point at the object if it is present in the scene. After a short time, the arm returns to the initial position. We will create a file named "escoja.cpp" within the folder "src" of the package "brazo_fer" with the content shown below:
#include "ros/ros.h"
#include "brazo_fer/Servos.h"
#include "brazo_fer/WriteServos.h"
#include "brazo_fer/ReadServos.h"
#include "geometry_msgs/Point.h"
#include "find_object_2d/PointObjects.h"
#include "find_object_2d/Point_id.h"
#include "std_msgs/Int16.h"
#include "math.h"
#define PI 3.14159265
#define L1 104
#define L2 104
#define Lp 60
brazo_fer::Servos move, vel, pg, eg, cg;
find_object_2d::PointObjects p_objects;
int cont = 0;
int cont_obj = 0;
int nuevo_punto = 0;
int id_0 = 0;
int id_1 = 0;
double x = 0;
double y = 0;
double z = 0;
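// Inverse kinematics: converts a Cartesian target point (in mm, arm frame) into servo goal positions and speeds.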
brazo_fer::Servos inversa(geometry_msgs::Point destino)
{
double z = destino.z;
double x = destino.x;
double y = destino.y;
y = y + z*tan(atan2(45,250));
int coordenadas_correctas = 1;
double alfa, beta, beta_a, beta_p, gamma, delta, delta_a, delta_p;
double z_p;
double L_a, L;
alfa = (atan2(x,z)*180)/PI;
z_p = sqrt(pow(z,2)+pow(x,2));
L = sqrt(pow(z_p,2)+pow(y,2));
L_a = sqrt(pow(y,2)+pow(z_p-Lp,2));
beta_p = atan2(y,z_p-Lp);
beta_a = acos((pow(L1,2)+pow(L_a,2)-pow(L2,2))/(2*L1*L_a));
beta = ((beta_p+beta_a)*180)/PI;
gamma = acos((pow(L1,2)+pow(L2,2)-pow(L_a,2))/(2*L1*L2));
delta_a = PI-(beta_a+gamma);
gamma = (gamma*180)/PI;
delta_p = acos((pow(L_a,2)+pow(Lp,2)-pow(L,2))/(2*L_a*Lp));
if (y<0) {
delta = ((2*PI-(delta_p-delta_a))*180)/PI;
}
else {
delta = ((delta_p+delta_a)*180)/PI;
if (isnan(delta)) {
delta = ((PI+delta_a)*180)/PI;
}
}
if (isnan(gamma))
{
coordenadas_correctas = 0;
}
::move.base = ((alfa+150)*1023)/300;
::move.arti1 = ((beta+60)*1023)/300;
::move.arti2 = ((gamma-30)*1023)/300;
::move.arti3 = ((delta-30)*1023)/300;
::move.pinza = 511;
::vel.base = abs(::move.base - ::pg.base)/5;
::vel.arti1 = abs(::move.arti1 - ::pg.arti1)/5;
::vel.arti2 = abs(::move.arti2 - ::pg.arti2)/5;
::vel.arti3 = abs(::move.arti3 - ::pg.arti3)/5;
::vel.pinza = abs(::move.pinza - ::pg.pinza);
if (coordenadas_correctas == 1 && (205 <= ::move.base && ::move.base <= 818) && (120 <= ::move.arti1 && ::move.arti1 <= 920) && ::move.arti2 >= 50 && ::move.arti3 <= 828) {
return ::move;
}
else {
std::cout<<"Error, invalid coordinates or unattainable point"<<std::endl;
::nuevo_punto = 0;
::id_1 = ::id_0;
return ::pg;
}
}
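// Point at the object: command the arm to a point 100 mm short of the target along the horizontal line of sight.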
void indicar(geometry_msgs::Point indicado)
{
ros::NodeHandle n;
ros::Publisher move_pub_=n.advertise<brazo_fer::WriteServos>("move_arm", 1);
ros::Publisher hand_pub_=n.advertise<brazo_fer::WriteServos>("hand_arm", 1);
brazo_fer::Servos mover;
double x0 = indicado.x;
double z0 = indicado.z;
indicado.z = indicado.z - 100*cos(atan2(x0,z0));
indicado.x = indicado.x - 100*sin(atan2(x0,z0));
mover = inversa(indicado);
brazo_fer::WriteServos indicar;
indicar.posicion = mover;
indicar.velocidad = ::vel;
indicar.par.base = 1;
indicar.par.arti1 = 1;
indicar.par.arti2 = 1;
indicar.par.arti3 = 1;
indicar.par.pinza = 1;
move_pub_.publish(indicar);
hand_pub_.publish(indicar);
}
void home()
{
ros::NodeHandle n;
ros::Publisher move_pub_=n.advertise<brazo_fer::WriteServos>("move_arm", 1);
ros::Publisher hand_pub_=n.advertise<brazo_fer::WriteServos>("hand_arm", 1);
brazo_fer::Servos mover;
geometry_msgs::Point home;
home.x = 0;
home.y = 80;
home.z = 50;
mover = inversa(home);
brazo_fer::WriteServos inicio;
inicio.posicion = mover;
inicio.velocidad = ::vel;
inicio.par.base = 1;
inicio.par.arti1 = 1;
inicio.par.arti2 = 1;
inicio.par.arti3 = 1;
inicio.par.pinza = 1;
move_pub_.publish(inicio);
hand_pub_.publish(inicio);
}
void posicion(const brazo_fer::ReadServos& pose)
{
brazo_fer::Servos p = pose.posicion;
::pg = pose.posicion;
::eg = pose.estado;
::cg = pose.corriente;
if (::cont == 0)
{
home();
::cont = 1;
}
else if (::nuevo_punto > 8)
{
if (((p.base-15) < ::move.base && ::move.base < (p.base+15)) && ((p.arti1-15) < ::move.arti1 && ::move.arti1 < (p.arti1+15)) && ((p.arti2-15) < ::move.arti2 && ::move.arti2 < (p.arti2+15)) && ((p.arti3-15) < ::move.arti3 && ::move.arti3 < (p.arti3+15)))
{
ros::Time ahora;
ahora = ros::Time::now();
while (ros::Time::now() < (ahora + ros::Duration(1)))
{}
home();
::nuevo_punto = 0;
::id_1 = ::id_0;
}
}
}
void punto(const find_object_2d::PointObjects& point)
{
::p_objects = point;
if (::id_0 == ::id_1)
{
::cont_obj = 0;
::x = 0;
::y = 0;
::z = 0;
}
else
{
if (::id_1 == 100)
{
home();
::nuevo_punto = 0;
::id_1 = ::id_0;
}
else
{
int max = ::p_objects.objeto.size();
if (::nuevo_punto < 8)
{
for (int i=0; i < max; i++)
{
if (::p_objects.objeto[i].id == ::id_1)
{
::x = ::x + ::p_objects.objeto[i].punto.x;
::y = ::y + ::p_objects.objeto[i].punto.y;
::z = ::z + ::p_objects.objeto[i].punto.z;
::cont_obj = ::cont_obj + 1;
}
}
::nuevo_punto = ::nuevo_punto + 1;
}
else
{
if (::cont_obj == 0)
{
std::cout<<"Object is not in the scene"<<std::endl;
::nuevo_punto = 0;
}
else
{
geometry_msgs::Point punto_medio;
punto_medio.x = ::x/::cont_obj;
punto_medio.y = ::y/::cont_obj;
punto_medio.z = ::z/::cont_obj;
indicar(punto_medio);
::nuevo_punto = ::nuevo_punto + 1;
}
::id_1 = ::id_0;
}
}
}
}
void objeto(const std_msgs::Int16& id)
{
::id_1 = id.data;
}
int main(int argc, char **argv)
{
ros::init(argc, argv, "indicar_objetos");
ros::NodeHandle n;
ros::Subscriber pose_sub_= n.subscribe("pose_arm", 1, posicion);
ros::Subscriber point_sub_= n.subscribe("point", 1, punto);
ros::Subscriber object_sub_= n.subscribe("selected_object", 1, objeto);
ros::Publisher move_pub_=n.advertise<brazo_fer::WriteServos>("move_arm", 1);
ros::Publisher hand_pub_=n.advertise<brazo_fer::WriteServos>("hand_arm", 1);
ros::spin();
return 0;
}
We have to add the next line to the file "CMakeLists.txt" of the package "brazo_fer" in order to compile and create the executable file:
rosbuild_add_executable(escoja src/escoja.cpp)
We will execute the next commands in a terminal in order to compile and create the executable file:
roscd brazo_fer
make
We will create a file named "escoja.launch" within the folder "launch" of the package "brazo_fer" with the content that is shown below, in order to launch the communication node witharduino, the previous launcher "encontrar_objetos.launch" and our program:
<launch>
<node name="serial_node" pkg="rosserial_python" type="serial_node.py" args="/dev/ttyACM0" />
<node name="indicar" pkg="brazo_fer" type="escoja" />
<include file="$(find find_object_2d)/launch/encontrar_objetos.launch"/>
</launch>
We will execute the next command in a terminal in order to start the launcher:
roslaunch brazo_fer escoja.launch
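With the launcher running, we select an object by publishing its identifier (the number that find_object_2d assigned to the stored image) on the topic "selected_object", for example from another terminal:
rostopic pub -1 /selected_object std_msgs/Int16 1
Publishing the identifier 100 returns the arm to the initial position.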
Tracking objects
The program subscribes the topics "selected_object" (where we publish the identifier of the selected object) and "pose_arm" (where the states of the arm are published). Also the program publishes the topics "move_arm" and "hand_arm" in order to control the arm. When we start the program, the arm goes to the initial position. When we publish the identifier of the selected object in the topic "selected_object" (we use the identifier 100 to return the arm to the initial position), the arm moves to track the object if it is present in the scene. We will create a file named "seguir_objetos.cpp" within the folder "src" of the package "brazo_fer" with the content that is shown below:
#include "ros/ros.h"
#include "brazo_fer/Servos.h"
#include "brazo_fer/WriteServos.h"
#include "brazo_fer/ReadServos.h"
#include "geometry_msgs/Point.h"
#include "find_object_2d/PointObjects.h"
#include "find_object_2d/Point_id.h"
#include "std_msgs/Int16.h"
#include "math.h"
#define PI 3.14159265
#define L1 104
#define L2 104
#define Lp 60
brazo_fer::Servos move_0, move_1, vel, pg, eg, cg;
find_object_2d::PointObjects p_objects;
geometry_msgs::Point punto_obj_0;
int cont = 0;
int cont_obj = 0;
int id;
brazo_fer::Servos inversa(geometry_msgs::Point destino)
{
double z = destino.z;
double x = destino.x;
double y = destino.y;
y = y + z*tan(atan2(45,250)); // Compensation to reach the requested "y", since the real value ends up lower due to the weight of the assembly.
int coordenadas_correctas = 1;
double alfa, beta, beta_a, beta_p, gamma, delta, delta_a, delta_p;
double z_p;
double L_a, L;
alfa = (atan2(x,z)*180)/PI;
z_p = sqrt(pow(z,2)+pow(x,2));
L = sqrt(pow(z_p,2)+pow(y,2));
L_a = sqrt(pow(y,2)+pow(z_p-Lp,2));
beta_p = atan2(y,z_p-Lp);
beta_a = acos((pow(L1,2)+pow(L_a,2)-pow(L2,2))/(2*L1*L_a));
beta = ((beta_p+beta_a)*180)/PI;
gamma = acos((pow(L1,2)+pow(L2,2)-pow(L_a,2))/(2*L1*L2));
delta_a = PI-(beta_a+gamma);
gamma = (gamma*180)/PI;
delta_p = acos((pow(L_a,2)+pow(Lp,2)-pow(L,2))/(2*L_a*Lp));
if (y<0) {
delta = ((2*PI-(delta_p-delta_a))*180)/PI;
}
else {
delta = ((delta_p+delta_a)*180)/PI;
if (isnan(delta)) {
delta = ((PI+delta_a)*180)/PI;
}
}
if (isnan(gamma)) // if there is no solution
{
coordenadas_correctas = 0;
}
::move_1.base = ((alfa+150)*1023)/300;
::move_1.arti1 = ((beta+60)*1023)/300;
::move_1.arti2 = ((gamma-30)*1023)/300;
::move_1.arti3 = ((delta-30)*1023)/300;
::move_1.pinza = 511;
::vel.base = abs(::move_1.base - ::pg.base)/5;
::vel.arti1 = abs(::move_1.arti1 - ::pg.arti1)/5;
::vel.arti2 = abs(::move_1.arti2 - ::pg.arti2)/5;
::vel.arti3 = abs(::move_1.arti3 - ::pg.arti3)/5;
::vel.pinza = abs(::move_1.pinza - ::pg.pinza);
if (coordenadas_correctas == 1 && (205 <= ::move_1.base && ::move_1.base <= 818) && (120 <= ::move_1.arti1 && ::move_1.arti1 <= 920) && ::move_1.arti2 >= 50 && ::move_1.arti3 <= 828) {
move_0 = move_1;
return ::move_0;
}
else {
std::cout<<"error coordenadas no validas o punto fuera del alcance"<<std::endl;
return ::move_0;
}
}
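// Point at the object: command the arm to a point 150 mm short of the target along the horizontal line of sight.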
void indicar(geometry_msgs::Point indicado)
{
ros::NodeHandle n;
ros::Publisher move_pub_=n.advertise<brazo_fer::WriteServos>("move_arm", 1);
ros::Publisher hand_pub_=n.advertise<brazo_fer::WriteServos>("hand_arm", 1);
brazo_fer::Servos mover;
double x0 = indicado.x;
double z0 = indicado.z;
indicado.z = indicado.z - 150*cos(atan2(x0,z0));
indicado.x = indicado.x - 150*sin(atan2(x0,z0));
mover = inversa(indicado);
brazo_fer::WriteServos indicar;
indicar.posicion = mover;
indicar.velocidad = ::vel;
indicar.par.base = 1;
indicar.par.arti1 = 1;
indicar.par.arti2 = 1;
indicar.par.arti3 = 1;
indicar.par.pinza = 1;
move_pub_.publish(indicar);
hand_pub_.publish(indicar);
}
void home()
{
ros::NodeHandle n;
ros::Publisher move_pub_=n.advertise<brazo_fer::WriteServos>("move_arm", 1);
ros::Publisher hand_pub_=n.advertise<brazo_fer::WriteServos>("hand_arm", 1);
brazo_fer::Servos mover;
geometry_msgs::Point home;
home.x = 0;
home.y = 80;
home.z = 50;
mover = inversa(home);
brazo_fer::WriteServos inicio;
inicio.posicion = mover;
inicio.velocidad = ::vel;
inicio.par.base = 1;
inicio.par.arti1 = 1;
inicio.par.arti2 = 1;
inicio.par.arti3 = 1;
inicio.par.pinza = 1;
move_pub_.publish(inicio);
hand_pub_.publish(inicio);
}
void posicion(const brazo_fer::ReadServos& pose)
{
brazo_fer::Servos p = pose.posicion;
::pg = pose.posicion;
::eg = pose.estado;
::cg = pose.corriente;
}
void punto(const find_object_2d::PointObjects& point)
{
::p_objects = point;
geometry_msgs::Point punto_obj_1;
if (::id == 100 || ::cont == 0)
{
home();
::cont = ::cont + 1;
}
int max = ::p_objects.objeto.size();
for (int i=0; i < max; i++)
{
if (::p_objects.objeto[i].id == ::id)
{
punto_obj_1 = ::p_objects.objeto[i].punto;
::cont_obj = 1;
}
}
if (::cont_obj == 0)
{
std::cout<<"Objeto no presente en la escena"<<std::endl;
}
else if (abs(punto_obj_1.x - ::punto_obj_0.x) > 20 || abs(punto_obj_1.y - ::punto_obj_0.y) > 20 || abs(punto_obj_1.z - ::punto_obj_0.z) > 20)
{
indicar(punto_obj_1);
::punto_obj_0 = punto_obj_1;
}
::cont_obj = 0;
}
void objeto(const std_msgs::Int16& id)
{
::id = id.data;
}
int main(int argc, char **argv)
{
ros::init(argc, argv, "seguir_objetos");
ros::NodeHandle n;
ros::Subscriber pose_sub_= n.subscribe("pose_arm", 1, posicion);
ros::Subscriber point_sub_= n.subscribe("point", 1, punto);
ros::Subscriber object_sub_= n.subscribe("selected_object", 1, objeto);
ros::Publisher move_pub_=n.advertise<brazo_fer::WriteServos>("move_arm", 1);
ros::Publisher hand_pub_=n.advertise<brazo_fer::WriteServos>("hand_arm", 1);
ros::spin();
return 0;
}
We have to add the next line to the file "CMakeLists.txt" of the package "brazo_fer" in order to compile and create the executable file:
rosbuild_add_executable(seguir_objetos src/seguir_objetos.cpp)
We will execute the next commands in a terminal in order to compile and create the executable file:
roscd brazo_fer
make
We will create a file named "seguir.launch" within the folder "launch" of the package "brazo_fer" with the content that is shown below, in order to launch the communication node witharduino, the previous launcher "encontrar_objetos.launch" and our program:
<launch>
<node name="serial_node" pkg="rosserial_python" type="serial_node.py" args="/dev/ttyACM0" />
<node name="seguir" pkg="brazo_fer" type="seguir_objetos" />
<include file="$(find find_object_2d)/launch/encontrar_objetos.launch"/>
</launch>
We will execute the next command in a terminal in order to start the launcher:
roslaunch brazo_fer seguir.launch
We can see in this video the execution of this program in the gazebo simulator.