=RoCKIn Project Description=
  
 
* '''Project Name:'''

[[Image:LogoRockin.png|130px]]


* '''Next Challenge'''

[http://rockinrobotchallenge.eu/rockin2014.php RoCKIn@home Toulouse]


* '''Official Web Page'''

 [http://rockinrobotchallenge.eu/home.php RoCKIn@home]


=Watermelon Project: Team Description=


* '''Project Codename'''

 Watermelon Project


* '''Advisor:'''

 Vicente Matellán Olivera


* '''Staff:'''

 Technical - Manipulation/Grasping, Simulation: Fernando Casado
 Technical - Navigation: Rubén Rodríguez
 Technical - SW Integration, Middleware, Perception: Francisco Martín Rico
 Technical - SW Integration, HRI Dialogue, Team Leader: Francisco Lera
 Technical - Hardware: Carlos Rodríguez


* '''Former Staff:'''

 Technical - Perception: Víctor Rodríguez


* '''Other Information:'''

'''Academic Year:''' 2013-2014

'''SVN Repositories:''' soon...

'''Tags:''' Augmented Reality, Elderly people, Remote-Assistance

'''Technology:''' ROS, PCL, C++, SVN, OpenCV, CMake, OpenGL, Qt, ArUco

'''State:''' Development
  
=Project Summary=


This challenge focuses on domestic service robots. The project aims to create robots with enhanced networking and cognitive abilities. They should be able to perform useful tasks, such as helping the impaired and the elderly (one of the main goals of our group).

In the initial stages of the competition, individual robots will begin by tackling basic individual tasks, such as navigating through the rooms of a house, manipulating objects, or recognizing faces, and will then coordinate to handle housekeeping tasks simultaneously, some of them in natural interaction with humans.
  
=RoCKIn Evolution=

[[RoCKIn2014Camp | Rome - RoCKIn Camp 2014]]


=Robot=

We want to take part in RoCKIn with the platform developed during the last two years at the Cátedra Telefónica-ULE: the MYRABot, a Roomba-based robot controlled with ROS and our own MYRA software (C/C++, ArUco, Qt, OpenCV).

[[Image:RoCKIn_robot.JPG|thumb|230px|MYRABot robot.]]
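MYRA is built with ArUco markers, Qt, and OpenCV. As a rough illustration of the marker-detection side (not an excerpt from MYRA), a minimal sketch against the standalone ArUco 1.x / OpenCV 2.x API of that time could look like the following; the image path is invented:

<pre>
// Hedged sketch: detect ArUco markers in a single image.
// Assumes the standalone ArUco 1.x + OpenCV 2.x API; this is NOT MYRA code.
#include <iostream>
#include <vector>
#include <aruco/aruco.h>
#include <opencv2/highgui/highgui.hpp>

int main() {
  cv::Mat image = cv::imread("living_room.jpg");   // invented example path
  if (image.empty()) return 1;

  aruco::MarkerDetector detector;
  std::vector<aruco::Marker> markers;
  detector.detect(image, markers);                  // detection without camera calibration

  for (size_t i = 0; i < markers.size(); ++i) {
    std::cout << "Found marker id " << markers[i].id << std::endl;
    markers[i].draw(image, cv::Scalar(0, 0, 255), 2);  // overlay the detected border
  }

  cv::imshow("markers", image);
  cv::waitKey(0);
  return 0;
}
</pre>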
==Robot Hardware==
+
{| style="color:black; background-color:#ffffcc;" cellpadding="8" cellspacing="0" border="1"
! Component
! Model
! Description
|- style="color:black; background-color:#ffffff;"
| Frame
| n/a
| Poplar laminated wood (yes, it is made of wood)
|- style="color:black; background-color:#ffffff;"
| Computer
| LG X110
| Netbook (Atom processor); the display has been detached from the main body
|- style="color:black; background-color:#ffffff;"
| Controllers
| (a) Arduino Mega 2560, (b) USB2serial
| (a) arm and range sensors, (b) Roomba base
|- style="color:black; background-color:#ffffff;"
| Base
| iRobot Roomba 520
| Vacuum-cleaner base
|- style="color:black; background-color:#ffffff;"
| Ultrasound sensors
| MaxBotix MaxSonar MB1220 (x5)
| Range: up to 7 meters
|- style="color:black; background-color:#ffffff;"
| RGB sensor
| Logitech
| Webcam
|- style="color:black; background-color:#ffffff;"
| RGBD sensors
| Kinect, Asus Xtion
|
|- style="color:black; background-color:#ffffff;"
| Battery
| Standard
| 12 V, 7 Ah
|- style="color:black; background-color:#ffffff;"
| Arm (actuator)
| Dynamixel AX-12A servos (x5)
| Joints and servomotors from Bioloid
|}
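
As a hedged illustration of how the Arduino Mega 2560 in the table might read one of the MB1220 ultrasound sensors and stream the range over serial, consider the sketch below; the pin wiring, baud rate, and output format are assumptions, not the robot's actual firmware.

<pre>
// Hypothetical Arduino sketch: read one XL-MaxSonar MB1220 analog output
// and print the range over serial. Wiring (A0), baud rate and format are assumed.
const int SONAR_PIN = A0;   // AN pin of the MB1220 (assumed wiring)

void setup() {
  Serial.begin(115200);     // assumed baud rate
}

void loop() {
  // The XL-MaxSonar analog output scales at about (Vcc/1024) per cm, so with
  // the ADC referenced to the same Vcc the raw reading is roughly centimetres.
  int range_cm = analogRead(SONAR_PIN);
  Serial.print("range_cm:");
  Serial.println(range_cm);
  delay(100);               // the sensor updates at roughly 10 Hz
}
</pre>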
  
==Robot Software==

{| style="color:black; background-color:#ffffcc;" cellpadding="8" cellspacing="0" border="1"
! Option
! Control Software
! Version
! Description
|- style="color:black; background-color:#ffffff;"
| Robot
| ROS
| Fuerte
|
|- style="color:black; background-color:#ffffff;"
| Simulator
| Gazebo and ROS
| Gazebo 1.19, ROS Hydro
|
|}


=Proposal=


We want to develop and deploy a minimal set of functional abilities in order to take part in RoCKIn 2014:

* Navigation
* Mapping
* People recognition
* Person tracking
* Object recognition
* Object manipulation
* Speech recognition
* Gesture recognition
* Cognition


We are going to split the development into four phases:

# Phase I: Initial Setup
# Phase II: Integration and Architecture
# Phase III: Platform test
# Phase IV: Improvements and complex tasks
## Technical Challenge: Furniture-type object perception
## Open Challenge: Exhibit and demonstrate the most important (scientific) achievements


==Phase I: Initial Setup==

[[RoCKIn2014PhaseI | Outline]]: Tasks developed in this phase


==Phase II: Integration and Architecture==

===Multiple Kinect cameras on one PC===

In this task we tested how to launch two different [http://wiki.ros.org/openni_camera openni cameras] on the same computer. It is an easy task once you know the shortcuts.

Problems outline:

====Problem 1====

We have two different cameras, an Asus Xtion Pro and a Kinect. The typical problem with the Xtion Pro:

We launch

<pre>
$ roslaunch openni_launch openni.launch
</pre>

and we get

<pre>
[ INFO] [1383245066.923739155]: Number devices connected: 1
[ INFO] [1383245066.923896787]: 1. device on bus 002:05 is a PrimeSense Device (601) from PrimeSense (1d27) with serial id ''
[ INFO] [1383245066.925020672]: Searching for device with index = 1
[ INFO] [1383245067.026459550]: No matching device found.... waiting for devices. Reason: openni_wrapper::OpenNIDevice::OpenNIDevice(xn::Context&, const xn::NodeInfo&, const xn::NodeInfo&, const xn::NodeInfo&, const xn::NodeInfo&) @ /tmp/buildd/ros-fuerte-openni-camera-1.8.6/debian/ros-fuerte-openni-camera/opt/ros/fuerte/stacks/openni_camera/src/openni_device.cpp @ 61 : creating depth generator failed. Reason: USB interface is not supported!
</pre>
  
SOLUTION:

We have to modify the global OpenNI parameters:

<pre>
$ sudo vi /etc/openni/GlobalDefaults.ini
</pre>

We find and uncomment the '''UsbInterface''' parameter:

<pre>
; USB interface to be used. 0 - FW Default, 1 - ISO endpoints, 2 - BULK endpoints. Default: Arm - 2, other platforms - 1
UsbInterface=2
</pre>
 
  
Reason:

See [http://answers.ros.org/question/61211/problem-with-xtion-pro-live-and-openni_camera/?answer=61212#post-id-61212 here] and [http://answers.ros.org/question/77651/asus-xtion-on-usb-30-ros-hydro-ubuntu-1210/ here], but I think the real reason is related to [http://www.makelinux.net/ldd3/chp-13-sect-1 USB port management].

From my point of view, this is the outline (copied from [http://www.makelinux.net/ldd3/chp-13-sect-1 USB port management]):

ENDPOINT -> The most basic form of USB communication is through something called an endpoint. A USB endpoint can carry data in only one direction, either from the host computer to the device (called an OUT endpoint) or from the device to the host computer (called an IN endpoint). Endpoints can be thought of as unidirectional pipes.

1 ISO -> Isochronous endpoints also transfer large amounts of data, but the data is not always guaranteed to make it through. These endpoints are used in devices that can handle loss of data, and rely more on keeping a constant stream of data flowing. Real-time data collections, such as audio and video devices, almost always use these endpoints.

2 BULK -> Bulk endpoints transfer large amounts of data. These endpoints are usually much larger (they can hold more characters at once) than interrupt endpoints. They are common for devices that need to transfer any data that must get through with no data loss. These transfers are not guaranteed by the USB protocol to always make it through in a specific amount of time. If there is not enough room on the bus to send the whole BULK packet, it is split up across multiple transfers to or from the device. These endpoints are common on printers, storage, and network devices.

0 FW [WARNING, I'm not sure] -> USB interfaces are themselves bundled up into configurations. A USB device can have multiple configurations and might switch between them in order to change the state of the device. For example, some devices that allow firmware to be downloaded to them contain multiple configurations to accomplish this. A single configuration can be enabled only at one point in time. Linux does not handle multiple-configuration USB devices very well, but, thankfully, they are rare.

If someone finds a better answer, I will be happy to hear about it.
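A quick way to check which endpoint types each sensor actually exposes (and therefore which UsbInterface value can work) is to dump its USB descriptors. This is a diagnostic suggestion added here, reusing the vendor IDs that appear in the logs above (1d27 = PrimeSense, 045e = Microsoft):

<pre>
$ lsusb
$ sudo lsusb -v -d 1d27: | grep -E "bInterfaceNumber|Transfer Type"
$ sudo lsusb -v -d 045e: | grep -E "bInterfaceNumber|Transfer Type"
</pre>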
====Problem 2====

This problem is related to the previous configuration: the change makes the Xtion work, but now we have problems with the Kinect:

<pre>
[ INFO] [1383246210.089689756]: Number devices connected: 1
[ INFO] [1383246210.089891318]: 1. device on bus 001:17 is a Xbox NUI Camera (2ae) from Microsoft (45e) with serial id 'A00366A15277050A'
[ INFO] [1383246210.091390182]: Searching for device with index = 1
[ INFO] [1383246210.193031599]: No matching device found.... waiting for devices. Reason: openni_wrapper::OpenNIDevice::OpenNIDevice(xn::Context&, const xn::NodeInfo&, const xn::NodeInfo&, const xn::NodeInfo&, const xn::NodeInfo&) @ /tmp/buildd/ros-fuerte-openni-camera-1.8.6/debian/ros-fuerte-openni-camera/opt/ros/fuerte/stacks/openni_camera/src/openni_device.cpp @ 61 : creating depth generator failed. Reason: USB interface is not supported!
</pre>
 
  
Solution:

<pre>
; USB interface to be used. 0 - FW Default, 1 - ISO endpoints, 2 - BULK endpoints. Default: Arm - 2, other platforms - 1
UsbInterface=0
</pre>
 
  
[[Image:Screenshot1.png|center|500px]]
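With the UsbInterface issue sorted out and both devices enumerating, a plausible way to bring the two cameras up side by side is to start openni.launch twice under different namespaces, selecting each device by index. This assumes the camera and device_id arguments of openni_launch; the namespace names are only an example:

<pre>
$ roslaunch openni_launch openni.launch camera:=camera1 device_id:=#1
$ roslaunch openni_launch openni.launch camera:=camera2 device_id:=#2
</pre>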

==Non-Critical (but to-do)==

# Android/iOS teleoperation
# Desktop Qt interface
# Create a robot model for Gazebo
# Create a robot model for rviz (the same as for Gazebo?)


=Videos TDP2014=

==MoveIt!==

Test 1:

<videoflash>Y9dv5mEiNK0</videoflash>

<videoflash>ODUJIuZn5ew</videoflash>


==Arm Control==

Arm control with an Xbox joystick:

<videoflash>00As38JDjLA</videoflash>
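To give an idea of what sits behind the arm-teleoperation video above, here is a hedged sketch of a ROS node that maps two gamepad axes to two arm joints. It is not the team's code: the controller topics follow the dynamixel_controllers convention (std_msgs/Float64 on <joint>_controller/command) and the axis indices and scaling are assumptions.

<pre>
// Hypothetical teleoperation node: map two joystick axes to two arm joints.
// Topic names, axis indices and scaling are illustrative assumptions.
#include <ros/ros.h>
#include <sensor_msgs/Joy.h>
#include <std_msgs/Float64.h>

class ArmTeleop {
public:
  explicit ArmTeleop(ros::NodeHandle& nh) {
    shoulder_pub_ = nh.advertise<std_msgs::Float64>("shoulder_controller/command", 1);
    elbow_pub_    = nh.advertise<std_msgs::Float64>("elbow_controller/command", 1);
    joy_sub_      = nh.subscribe("joy", 1, &ArmTeleop::joyCallback, this);
  }

private:
  void joyCallback(const sensor_msgs::Joy::ConstPtr& joy) {
    // Left stick vertical -> shoulder, right stick vertical -> elbow (assumed mapping).
    std_msgs::Float64 shoulder, elbow;
    shoulder.data = joy->axes[1] * 1.5;   // target position in radians (scaled)
    elbow.data    = joy->axes[4] * 1.5;
    shoulder_pub_.publish(shoulder);
    elbow_pub_.publish(elbow);
  }

  ros::Publisher shoulder_pub_, elbow_pub_;
  ros::Subscriber joy_sub_;
};

int main(int argc, char** argv) {
  ros::init(argc, argv, "arm_teleop_sketch");
  ros::NodeHandle nh;
  ArmTeleop teleop(nh);
  ros::spin();
  return 0;
}
</pre>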

=Acknowledgments=

'''Organizations:'''

[http://www.fablableon.org/ Fab-Lab León]

[http://catedratelefonica.unileon.es/ Cátedra Telefónica-Unileon]

'''People:'''

Alvaro Botas (MYRA software)

Joaquín Olmo (first design of the MYRABot prototype)

Julián Orfo (navigation in the simulation environment)


=Sponsorship=

[[Image:Sponsor1.png|830px]]

If you want to contribute financially to the project, please contact us.


=Wishlist=

<span style="color:#009000"> &#9745; </span> Roomba battery

<span style="color:#ff0000"> &#9744; </span> Arduino Mega (x2)

<span style="color:#ff0000"> &#9744; </span> Roomba base (520, 560)

<span style="color:#ff0000"> &#9744; </span> Ultrabook
 
