

We want to develop and deploy a minimal set of functional abilities in order to take part in RoCKIn 2014.

  • Navigation
  • Mapping
  • People recognition
  • Person tracking
  • Object recognition
  • Object manipulation
  • Speech recognition
  • Gesture recognition
  • Cognition


We want to take part in RoCKIn with the platform developed over the last two years at the Cátedra Telefónica-ULE.

MYRABot robot.

Robot Hardware

  1. iRobot Roomba 520
  2. Dynamixel arm (5x AX-12A)
  3. Wooden frame (yes, it is made of wood)
  4. Notebook with an Atom processor (the display has been detached from the main body)
  5. Kinect
  6. Arduino Mega

Robot Software

  1. ROS (robot control)
  2. MYRA (C/C++, ArUco, Qt, OpenCV)
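
Since the robot is controlled through ROS, the hardware above would typically be brought up together from a single launch file. The sketch below is purely illustrative: the package and node names (`create_driver`, `myrabot_arm`, `arm_controller`) are assumptions, not the team's actual names; only `openni_launch` is a real Kinect driver package from that ROS era.

```xml
<!-- Hypothetical myrabot.launch: package and node names are illustrative -->
<launch>
  <!-- Roomba base driver (assumed package name) -->
  <node pkg="create_driver" type="create_driver" name="base" />

  <!-- Kinect driver (openni_launch was the standard Kinect stack at the time) -->
  <include file="$(find openni_launch)/launch/openni.launch" />

  <!-- Arm controller talking to the Dynamixel servos through the Arduino (assumed) -->
  <node pkg="myrabot_arm" type="arm_controller" name="arm" />
</launch>
```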

Project setup

We divided the development into four phases:

  1. Phase I: Initial Setup
  2. Phase II: Integration and architecture
  3. Phase III: Platform test
  4. Phase IV: Improvements and complex tasks
    1. Technical Challenge: Furniture-type Object perception
    2. Open Challenge: Exhibit and demonstrate the most important (scientific) achievements

Phase I: Initial Setup

Outline: Tasks developed in this phase

Phase II: Integration and Architecture

Outline: Tasks developed in this phase

Phase III: Platform test


WARNING: This part is still in Spanish; please feel free to ask if you have any questions.

Platform for Gazebo


<wikiflv width="300" height="250" logo="true">/videos/Test1.flv</wikiflv>

<wikiflv width="300" height="250" logo="true">/videos/Test2.flv</wikiflv>

(Images: ImageTidy2.jpg, ImageTidy.jpg)

Non-Critical (but to-do)

  1. Android/iOS Teleoperation
  2. Desktop Qt interface (WIP)
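
For the planned teleoperation clients, the core logic is mapping user input to base velocity commands. A minimal sketch of that mapping, with assumed keys and speed values (in a real ROS node these pairs would be published as `geometry_msgs/Twist` messages):

```python
# Hypothetical teleoperation key mapping: keys and speeds are illustrative,
# not the project's actual bindings.

def key_to_twist(key, linear_speed=0.2, angular_speed=0.5):
    """Map a single key press to a (linear, angular) velocity pair.

    Unknown keys stop the robot (zero velocities).
    """
    mapping = {
        'w': (linear_speed, 0.0),    # forward
        's': (-linear_speed, 0.0),   # backward
        'a': (0.0, angular_speed),   # turn left
        'd': (0.0, -angular_speed),  # turn right
    }
    return mapping.get(key, (0.0, 0.0))
```

On the robot side, these pairs would be copied into the `linear.x` and `angular.z` fields of a Twist message and published to the base driver.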

RoCKIn Camp 2014 - Summary

These were the highlights of the camp:

Migration to ROS Hydro
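
The main task in a Hydro migration was usually converting rosbuild packages to catkin. A hedged sketch of what a catkin-ized MYRABot package manifest could look like (the package name, maintainer, and dependencies are placeholders, not the team's actual values):

```xml
<!-- Hypothetical package.xml (catkin format 1, as used in ROS Hydro) -->
<package>
  <name>myrabot_control</name>
  <version>0.1.0</version>
  <description>MYRABot base and arm control (illustrative)</description>
  <maintainer email="user@example.com">MYRABot team</maintainer>
  <license>TODO</license>

  <buildtool_depend>catkin</buildtool_depend>
  <build_depend>roscpp</build_depend>
  <run_depend>roscpp</run_depend>
</package>
```

Alongside the manifest, each package's `CMakeLists.txt` has to be rewritten against `catkin` instead of `rosbuild`, and the code is built in a catkin workspace with `catkin_make` rather than `rosmake`.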


Object Recognition