Revision as of 09:50, 20 October 2013
RoCKIn Camp 2014
- Project Name:
- Official Web Page
RoCKIn@home
- Project Codename
Watermelon :D
- Advisor:
Vicente Matellán Olivera
- Staff:
- Technical software: Fernando Casado
- Technical software: Víctor Rodríguez
- Technical software: Francisco Lera
- Technical hardware: Carlos Rodríguez
- Other Information:
- Academic Year: 2013-2014
- SVN Repositories: soon ...
- Tags: Augmented Reality, Elderly people, Tele-Assistance
- Technology: ROS, PCL, C++, SVN, OpenCV, CMake, OpenGL, Qt, ArUco
- State: Development
Project Summary
This challenge focuses on domestic service robots. The project aims to develop robots with enhanced networking and cognitive abilities, able to perform socially useful tasks such as supporting the impaired and the elderly (one of the main goals of our group).
In the initial stages of the competition, individual robots will begin by completing basic tasks, such as navigating through the rooms of a house, manipulating objects, or recognizing faces, and will then coordinate to handle house-keeping tasks simultaneously, some of them in natural interaction with humans.
Robot
We want to take part in RoCKIn with the platform developed during the last two years in the Catedra Telefónica-ule.
Robot Hardware
- iRobot Roomba 520
- Dynamixel Arm (5x12a)
- Wooden frame (yes, it is made of wood)
- Notebook (Atom processor; display and computer are separate)
- Kinect
- Arduino Mega
Robot Software
- ROS (robot control)
- MYRA (C/C++, ArUco, Qt, OpenCV)
Proposal
We want to deploy on this robot the minimal functional abilities needed to take part in RoCKIn 2014.
- Navigation
- Mapping
- Person recognition
- Person tracking
- Object recognition
- Object manipulation
- Speech recognition
- Gesture recognition
- Cognition
We are going to separate the development into four phases, plus two challenge events:
- Phase I: Initial Setup
- Phase II: Integration and architecture
- Phase III: Platform test
- Phase IV: Improvements and complex tasks
- Technical Challenge: Furniture-type Object perception
- Open Challenge: Present and demonstrate most important (scientific) achievements
Phase I: Initial Setup
Hardware Preparation
We need to carry out some initial tasks to complete the basic hardware setup, configuration, and customization of the robot.
- Task 1
- Get power from the Roomba brush connector (it will be used to power the arm)
- Task 2
- Emergency Stop Button
- Task 3
- Start Button
ToDo
Software Preparation
Environment setup
We are going to define the basis of the system to be deployed.
- Operating System: Ubuntu 12.04 LTS
- Software Restriction : ROS Fuerte
- Core drivers for Roomba : How to install roomba package
Packages search
Since we use ROS, we can find at least one package for each ability that is ready to deploy on a robot. This task therefore involves searching for and testing each package to evaluate whether we can deploy it on our platform.
- Navigation: 2D navigation stack, Turtlebot Navigation
- Mapping: SLAM
- Object recognition: find-object stack (a simple Qt interface to try OpenCV implementations of SIFT, SURF, FAST, BRIEF and other feature detectors and descriptors)
- Speech recognition: pocketsphinx and Festival packages (Speech Recognition and Text-to-Speech (TTS) in the π robot)
- Cognition: to be done during stack integration
- Person recognition:
- Person tracking:
- Object manipulation:
- Gesture recognition:
ROS: Debugging Techniques
Debugging in ROS can be done in two ways:
- Launch file
Following the Roslaunch techniques
- launch-prefix="xterm -e gdb --args" : run your node under gdb in a separate xterm window; manually type run to start it
- launch-prefix="gdb -ex run --args" : run your node under gdb in the same xterm as your launch, without having to type run to start it
- launch-prefix="valgrind" : run your node under valgrind
- launch-prefix="xterm -e" : run your node in a separate xterm window
- launch-prefix="nice" : nice your process to lower its CPU usage
- launch-prefix="screen -d -m gdb --args" : useful if the node is being run on another machine; you can then ssh to that machine and do screen -D -R to see the gdb session
- launch-prefix="xterm -e python -m pdb" : run your Python node under pdb in a separate xterm window; manually type run to start it
Then you only have to run:
roslaunch <package> <launch>
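As an illustrative sketch (my_package and my_node are placeholder names, not from this project), a launch file entry using one of the prefixes above could look like:

```xml
<launch>
  <!-- Run the node under gdb in a separate xterm window;
       type "run" at the gdb prompt to start it. -->
  <node pkg="my_package" type="my_node" name="my_node"
        launch-prefix="xterm -e gdb --args" />
</launch>
```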
- Running a single node
Following the Commandline techniques
Instead of
rosrun <package> <node>
use
roscd <package>
valgrind bin/<node>
or
gdb bin/<node>
or run gdb interactively:
gdb
GNU gdb (Ubuntu/Linaro 7.4-2012.04-0ubuntu2.1) 7.4-2012.04
Copyright (C) 2012 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law. Type "show copying" and "show warranty" for details.
This GDB was configured as "x86_64-linux-gnu".
For bug reporting instructions, please see:
<http://bugs.launchpad.net/gdb-linaro/>.
(gdb) file <route to node>
(gdb) run
Don't forget to set the Debug build type in the CMakeLists.txt:
cmake_minimum_required(VERSION 2.4.6)
include($ENV{ROS_ROOT}/core/rosbuild/rosbuild.cmake)
set(ROS_BUILD_TYPE Debug)
#set(ROS_BUILD_TYPE Release)
rosbuild_init()
set(EXECUTABLE_OUTPUT_PATH ${PROJECT_SOURCE_DIR}/bin)
rosbuild_gensrv()
rosbuild_add_boost_directories()
add_subdirectory(src)
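To confirm that the Debug build actually produced debug symbols, you can look for a .debug_info section in the resulting binary. This is a generic ELF check, not ROS-specific; the trivial C file below is only an illustration of the same check you would run on bin/&lt;node&gt;:

```shell
# Build a trivial C file with and without -g, then check each binary
# for a .debug_info section with readelf.
cat > /tmp/demo.c <<'EOF'
int main(void) { return 0; }
EOF
gcc -g -o /tmp/demo_debug /tmp/demo.c
gcc -o /tmp/demo_release /tmp/demo.c

readelf -S /tmp/demo_debug   | grep -q debug_info && echo "debug build: symbols present"
readelf -S /tmp/demo_release | grep -q debug_info || echo "release build: no symbols"
```

If the section is missing after a supposedly Debug build, the ROS_BUILD_TYPE line above was probably not picked up and the package needs a clean rebuild.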
Phase II: Integration and Architecture
Non-Critical (but to-do)
- Android/iOS Teleoperation
- Desktop Qt interface
- Create robot model for Gazebo
- Create robot model for rviz (the same as Gazebo?)
Wishlist
- Computer: i7 processor, 8 GB RAM, Nvidia GPU (1-2 GB)
- ASUS Xtion Pro Live Color RGB Sensor