=RoCKIn Project Description=
  
 
* '''Project Name:'''

[[Image:LogoRockin.png|130px]]


* '''Next Challenge'''

[http://rockinrobotchallenge.eu/rockin2014.php RoCKIn@home Toulouse]


* '''Official Web Page'''

 [http://rockinrobotchallenge.eu/home.php RoCKIn@home]

=Watermelon Project: Team Description=
  
 
* '''Project Codename'''
 Watermelon Project


* '''Advisor:'''
 Vicente Matellán Olivera
  
 
* '''Staff:'''
 Technical - Manipulation/Grasping, Simulation: Fernando Casado
 Technical - Navigation: Rubén Rodríguez
 Technical - SW Integration, Middleware, Perception: Francisco Martín Rico
 Technical - SW Integration, HRI Dialogue, Team Leader: Francisco Lera
 Technical - Hardware: Carlos Rodríguez


* '''Former Staff:'''
 Technical - Perception: Víctor Rodríguez
  
 
* '''Other Information:'''
'''Academic Year:''' 2013-2014
'''SVN Repositories:''' soon...
'''Tags:''' Augmented Reality, Elderly people, Remote-Assistance
'''Technology:''' ROS, PCL, C++, SVN, OpenCV, CMake, OpenGL, Qt, ArUco
'''State:''' Development
  
=Project Summary=
  
 
This challenge focuses on domestic service robots. The project aims to create robots with enhanced networking and cognitive abilities, able to perform useful tasks such as helping the impaired and the elderly (one of the main goals of our group).

In the initial stages of the competition, individual robots will first tackle basic individual tasks, such as navigating through the rooms of a house, manipulating objects or recognizing faces, and will then coordinate to handle several house-keeping tasks simultaneously, some of them involving natural interaction with humans.
  
=RoCKIn Evolution=

[[RoCKIn2014Camp | Rome - RoCKIn Camp 2014]]

=Robot=

We want to take part in RoCKIn with the platform developed during the last two years in the Cátedra Telefónica-ULE.
 
 
[[Image:RoCKIn_robot.JPG|thumb|230px|MYRABot robot.]]

==Robot Hardware==

{| style="color:black; background-color:#ffffcc;" cellpadding="8" cellspacing="0" border="1"
| Component
| Model
| Description
|- style="color:black; background-color:#ffffff;" cellpadding="8" cellspacing="0" border="1"
| Frame
| n/a
| Poplar laminated wood
|- style="color:black; background-color:#ffffff;" cellpadding="8" cellspacing="0" border="1"
| Computer
| LG X110
| Notebook with an Atom processor
|- style="color:black; background-color:#ffffff;" cellpadding="8" cellspacing="0" border="1"
| Controllers
| (a) Arduino Mega 2560, (b) USB2serial
| (a) arm and range sensors, (b) Roomba base
|- style="color:black; background-color:#ffffff;" cellpadding="8" cellspacing="0" border="1"
| Base
| iRobot Roomba 520
| Vacuum-cleaning robot used as mobile base
|- style="color:black; background-color:#ffffff;" cellpadding="8" cellspacing="0" border="1"
| Ultrasound sensors
| MaxSonar MB1220 (x5)
| Range: up to 7 meters
|- style="color:black; background-color:#ffffff;" cellpadding="8" cellspacing="0" border="1"
| RGB sensor
| Logitech
| Webcam
|- style="color:black; background-color:#ffffff;" cellpadding="8" cellspacing="0" border="1"
| RGBD sensors
| Kinect, Asus Xtion
|
|- style="color:black; background-color:#ffffff;" cellpadding="8" cellspacing="0" border="1"
| Battery
| Standard
| 12 V, 7 Ah
|- style="color:black; background-color:#ffffff;" cellpadding="8" cellspacing="0" border="1"
| Arm (actuator)
| Dynamixel AX-12 servos (x5)
| Joints and servomotors from Bioloid
|}
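
To illustrate how the Arduino Mega 2560 can read the five MaxSonar MB1220 range sensors listed above, the sketch below polls their analog outputs and streams the distances over the serial link to the notebook. It is only a minimal example, not the firmware actually running on MYRABot: the analog pins A0-A4 and the 10 Hz update rate are assumptions, and the scale (about one ADC count per centimetre at 5 V) follows the XL-MaxSonar datasheet.

<pre>
// Minimal sketch (assumed wiring): five XL-MaxSonar MB1220 analog outputs on A0..A4.
// With a 5 V supply and the Mega's 10-bit ADC, one ADC count is roughly 1 cm.

const int NUM_SONARS = 5;
const int sonarPins[NUM_SONARS] = {A0, A1, A2, A3, A4};

void setup() {
  Serial.begin(115200);           // stream readings to the notebook over USB serial
}

void loop() {
  for (int i = 0; i < NUM_SONARS; ++i) {
    int range_cm = analogRead(sonarPins[i]);   // ~1 count per centimetre
    Serial.print(range_cm);
    Serial.print(i < NUM_SONARS - 1 ? ' ' : '\n');
  }
  delay(100);                     // 10 Hz is enough for obstacle avoidance
}
</pre>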
  
==Robot Software==

{| style="color:black; background-color:#ffffcc;" cellpadding="8" cellspacing="0" border="1"
| Platform
| Control Software
| Version
| Description
|- style="color:black; background-color:#ffffff;" cellpadding="8" cellspacing="0" border="1"
| Robot
| ROS
| Fuerte
|
|- style="color:black; background-color:#ffffff;" cellpadding="8" cellspacing="0" border="1"
| Simulator
| Gazebo and ROS
| Gazebo 1.9, ROS Hydro
|
|}
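
Because the same ROS code has to drive both the real base (Fuerte) and the Gazebo model (Hydro), a quick way to check either installation is a small smoke-test node. The sketch below simply publishes a slow forward velocity; it assumes the base driver (or the simulated robot) listens for geometry_msgs/Twist messages on a topic called cmd_vel, which is the usual convention but not necessarily the project's exact interface.

<pre>
// cmd_vel smoke test: drive the base (real or simulated) slowly forward.
// Assumes the base driver subscribes to geometry_msgs/Twist on "cmd_vel".
#include <ros/ros.h>
#include <geometry_msgs/Twist.h>

int main(int argc, char** argv) {
  ros::init(argc, argv, "base_smoke_test");
  ros::NodeHandle nh;
  ros::Publisher cmd_pub = nh.advertise<geometry_msgs::Twist>("cmd_vel", 1);

  ros::Rate rate(10);                       // publish at 10 Hz
  geometry_msgs::Twist cmd;
  cmd.linear.x = 0.1;                       // 0.1 m/s forward, no rotation

  while (ros::ok()) {
    cmd_pub.publish(cmd);
    rate.sleep();
  }
  return 0;
}
</pre>

Built in a rosbuild or catkin package as appropriate for the ROS version, the same node can be run against either setup without changes.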
  
=Videos TDP2014=

==MoveIt!==

Test 1:

<videoflash>Y9dv5mEiNK0</videoflash>

<videoflash>ODUJIuZn5ew</videoflash>
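
For reference, the kind of pose-goal test shown in these videos can be reproduced with the Hydro-era MoveIt! C++ interface roughly as in the sketch below. The planning group name "arm" and the target pose are assumptions for illustration, not values taken from the actual MYRABot MoveIt! configuration.

<pre>
// Minimal MoveIt! pose-goal test (ROS Hydro era API).
// The planning group "arm" and the target pose are assumed values.
#include <ros/ros.h>
#include <geometry_msgs/Pose.h>
#include <moveit/move_group_interface/move_group.h>

int main(int argc, char** argv) {
  ros::init(argc, argv, "moveit_pose_test");
  ros::AsyncSpinner spinner(1);             // MoveGroup needs a running spinner
  spinner.start();

  moveit::planning_interface::MoveGroup group("arm");

  geometry_msgs::Pose target;               // pose expressed in the planning frame
  target.orientation.w = 1.0;
  target.position.x = 0.25;
  target.position.y = 0.0;
  target.position.z = 0.20;
  group.setPoseTarget(target);

  moveit::planning_interface::MoveGroup::Plan plan;
  if (group.plan(plan)) {                   // plan first, then execute
    group.move();
  } else {
    ROS_WARN("No motion plan found for the requested pose");
  }

  ros::shutdown();
  return 0;
}
</pre>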
  
==Arm Control==

Arm control with an Xbox joystick:

<videoflash>00As38JDjLA</videoflash>
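
A stripped-down version of this kind of teleoperation can be built from the standard joy node plus a small relay node like the one sketched below. The command topic (a std_msgs/Float64 position command, as used by the common dynamixel_controllers setup), the axis index and the joint range are illustrative assumptions, not the actual MYRABot arm interface.

<pre>
// Joystick-to-arm relay: maps one joystick axis to a Dynamixel joint position.
// Assumed interface: sensor_msgs/Joy on "joy" (from the joy node) in, and a
// std_msgs/Float64 position command on "arm/shoulder_controller/command" out.
#include <ros/ros.h>
#include <sensor_msgs/Joy.h>
#include <std_msgs/Float64.h>

ros::Publisher joint_pub;

void joyCallback(const sensor_msgs::Joy::ConstPtr& joy) {
  if (joy->axes.size() < 2) return;         // ignore malformed messages
  // Axis 1 (assumed: left stick, vertical) in [-1, 1] mapped to +/- 1.5 rad.
  std_msgs::Float64 cmd;
  cmd.data = 1.5 * joy->axes[1];
  joint_pub.publish(cmd);
}

int main(int argc, char** argv) {
  ros::init(argc, argv, "arm_joy_teleop");
  ros::NodeHandle nh;
  joint_pub = nh.advertise<std_msgs::Float64>("arm/shoulder_controller/command", 1);
  ros::Subscriber joy_sub = nh.subscribe("joy", 10, joyCallback);
  ros::spin();
  return 0;
}
</pre>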

=Acknowledgments=

'''Organizations:'''

[http://www.fablableon.org/ Fab-Lab León]

[http://catedratelefonica.unileon.es/ Cátedra Telefónica-Unileon]

'''People:'''

Alvaro Botas (MYRA software)

Joaquín Olmo (first design of the MYRABot prototype)

Julián Orfo (navigation in the simulation environment)

=Sponsorship=

[[Image:Sponsor1.png|830px]]

If you want to contribute financially to the project, please contact us.

=Wishlist=

<span style="color:#009000"> &#9745; </span> Roomba battery

<span style="color:#ff0000"> &#9744; </span> Arduino Mega (x2)

<span style="color:#ff0000"> &#9744; </span> Roomba base (520, 560)

<span style="color:#ff0000"> &#9744; </span> Ultrabook
