<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
		<id>https://robotica.unileon.es/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Am6</id>
		<title>robotica.unileon.es - User contributions [en]</title>
		<link rel="self" type="application/atom+xml" href="https://robotica.unileon.es/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Am6"/>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Special:Contributions/Am6"/>
		<updated>2026-05-02T06:40:54Z</updated>
		<subtitle>User contributions</subtitle>
		<generator>MediaWiki 1.27.0</generator>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Tools&amp;diff=5352</id>
		<title>Tools</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Tools&amp;diff=5352"/>
				<updated>2019-11-12T11:26:44Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This site summarizes the different applications and tools developed by the Robotics Group in the course of its research.&lt;br /&gt;
&lt;br /&gt;
__FORCETOC__&lt;br /&gt;
&lt;br /&gt;
== Available Tools ==&lt;br /&gt;
&lt;br /&gt;
Currently the following tools are available:&lt;br /&gt;
&lt;br /&gt;
=== Laboratorio de simulación robótica ===&lt;br /&gt;
&lt;br /&gt;
The aim of this tool is to bring the world of robotics, and the chance to interact with a real robot, closer to anyone who wishes. By registering with the tool you will gain access to several services, such as:&lt;br /&gt;
* Access to the robotic simulation laboratory through a web interface.&lt;br /&gt;
* Access to a machine via SSH where you can store the data from your lab assignments and tests.&lt;br /&gt;
* The option of posting questions in the forum on this very page.&lt;br /&gt;
&lt;br /&gt;
[http://www.roboticalabs.unileon.es Go to laboratorio de simulación robótica]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Tools&amp;diff=5351</id>
		<title>Tools</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Tools&amp;diff=5351"/>
				<updated>2019-11-12T11:21:49Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This site summarizes the different applications and tools developed by the Robotics Group in the course of its research.&lt;br /&gt;
&lt;br /&gt;
__FORCETOC__&lt;br /&gt;
&lt;br /&gt;
== Available Tools ==&lt;br /&gt;
&lt;br /&gt;
Currently the following tools are available:&lt;br /&gt;
&lt;br /&gt;
=== Laboratorio de simulación robótica ===&lt;br /&gt;
&lt;br /&gt;
The aim of this tool is to bring the world of robotics, and the chance to interact with a real robot, closer to anyone who wishes.&lt;br /&gt;
&lt;br /&gt;
[http://www.roboticalabs.unileon.es Go to laboratorio de simulación robótica]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Tools&amp;diff=5350</id>
		<title>Tools</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Tools&amp;diff=5350"/>
				<updated>2019-11-12T11:21:14Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This site summarizes the different applications and tools developed by the Robotics Group in the course of its research.&lt;br /&gt;
&lt;br /&gt;
__FORCETOC__&lt;br /&gt;
&lt;br /&gt;
== Available Tools ==&lt;br /&gt;
&lt;br /&gt;
Currently the following tools are available:&lt;br /&gt;
&lt;br /&gt;
=== Laboratorio de simulación robótica ===&lt;br /&gt;
&lt;br /&gt;
The aim of this tool is to bring the world of robotics, and the chance to interact with a real robot, closer to anyone who wishes.&lt;br /&gt;
&lt;br /&gt;
[http://www.roboticalabs.unileon.es Go to laboratorio de simulación robótica]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Tools&amp;diff=5349</id>
		<title>Tools</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Tools&amp;diff=5349"/>
				<updated>2019-11-12T11:18:42Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This site summarizes the different applications and tools developed by the Robotics Group in the course of its research.&lt;br /&gt;
&lt;br /&gt;
__FORCETOC__&lt;br /&gt;
&lt;br /&gt;
== Available Tools ==&lt;br /&gt;
&lt;br /&gt;
=== Laboratorio de simulación robótica ===&lt;br /&gt;
&lt;br /&gt;
The aim of this tool is to bring the world of robotics, and the chance to interact with a real robot, closer to anyone who wishes.&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Tools&amp;diff=5348</id>
		<title>Tools</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Tools&amp;diff=5348"/>
				<updated>2019-11-12T11:15:18Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: Created page with &amp;quot;Herramientas&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Herramientas&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=MediaWiki:Sidebar&amp;diff=5347</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=MediaWiki:Sidebar&amp;diff=5347"/>
				<updated>2019-11-12T11:13:25Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
* Resources&lt;br /&gt;
** Home|Home&lt;br /&gt;
** People|People&lt;br /&gt;
** Publications|Publications&lt;br /&gt;
** https://www.youtube.com/channel/UCos8JoR4tXnEDAAwVonYpdw | Videos&lt;br /&gt;
** Software|Software&lt;br /&gt;
** Datasets|Datasets&lt;br /&gt;
** Tools|Tools&lt;br /&gt;
* Activities&lt;br /&gt;
** Activities|Events&lt;br /&gt;
** Projects|Projects&lt;br /&gt;
** Testbed|Leon@Home Testbed&lt;br /&gt;
** RoboCup|RoboCup@Home Team&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5225</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5225"/>
				<updated>2018-02-13T18:07:20Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: /* v1.0 [Nov-2017] */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset that can be used to evaluate the performance of different approaches to detecting and tracking people with LIDAR sensors. The information contained in the dataset is especially suitable for use as training data for neural-network-based classifiers.&lt;br /&gt;
&lt;br /&gt;
Further information can be found in [https://www.frontiersin.org/articles/10.3389/fnbot.2017.00072/full Álvarez-Aparicio et al. (2017)].&lt;br /&gt;
&lt;br /&gt;
== Materials and methods ==&lt;br /&gt;
&lt;br /&gt;
The following section describes the materials (shown in Figure 1) used to gather data, which include: a certified study area, an autonomous robot with an on-board LIDAR sensor, and a real-time location system (RTLS) to obtain ground-truth data about person location. Recorded data include location estimates calculated by two people trackers, LD and PeTra, also described below. Finally, the recording procedure used to build the dataset is explained.&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.jpg|frame|center|'''Fig. 1''': Materials used in the experiments. (A) Orbi-One robot. (B) Leon@Home Testbed plan. (C) Floor plan of the apartment. Red dots show the location of KIO anchors. Black numbered dots denote Orbi-One locations during the data gathering. (D) KIO anchors. (E) General view of the apartment. (F) Furniture in mock-up apartment.]]&lt;br /&gt;
&lt;br /&gt;
=== Leon@Home Testbed ===&lt;br /&gt;
&lt;br /&gt;
Data were gathered at Leon@Home Testbed, a Certified Testbed of the European Robotics League (ERL). Its main purpose is to benchmark service robots in a realistic home environment. Our testbed is made up of four parts, shown in Figure 1B: a mock-up apartment, a control zone with direct vision (through a glass wall) into the apartment, a small workshop, and a larger development zone where researchers work.&lt;br /&gt;
&lt;br /&gt;
Leon@Home Testbed is located on the second floor of the Módulo de Investigación en Cibernética (Building for Research in Cybernetics) on the Vegazana Campus of the University of León (Spain). The apartment is a single-bedroom mock-up home built in an 8 m × 7 m space. Figure 1C shows a plan of the apartment. Walls 60 cm high divide it into a kitchen, living room, bathroom, and bedroom. The furniture (Figures 1E,F) has been chosen to test different robot abilities. For instance, the kitchen cabinets all have different types of handles.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One (Figure 1A) is an assistant robot manufactured by Robotnik. It has several sensors, among them an RGBD camera, a LIDAR sensor, and an inertial unit. It can operate a manipulator arm attached to its torso and has a wheeled base for moving around the room. Orbi-One includes a wireless access point, which allows WiFi communication with other robots and computers.&lt;br /&gt;
&lt;br /&gt;
The software that controls the robot hardware is based on the ROS framework. ROS is essentially a set of robotics libraries providing operating-system-like services: hardware abstraction for sensors and actuators, low-level device control, and inter-process communication. Computation takes place in processes called Nodes, which can send and receive Messages. Nodes publish Messages into information buffers called Topics.&lt;br /&gt;
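The publish/subscribe flow described above can be illustrated with a toy in-memory model. This is a sketch in plain Python, not the actual ROS client library; the `TopicBus` class, the `/scan` topic name, and the message payload are made up for illustration:

```python
from collections import defaultdict

class TopicBus:
    """Toy stand-in for ROS topic routing: publishers push messages to a
    named topic, and every callback subscribed to that topic receives them."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, msg):
        for cb in self._subs[topic]:
            cb(msg)

bus = TopicBus()
received = []
bus.subscribe("/scan", received.append)       # a node listening on /scan
bus.publish("/scan", {"ranges": [1.2, 1.3]})  # a sensor node publishing
```

In real ROS the bus is distributed across processes and machines, but the decoupling is the same: the publisher never knows which nodes, if any, are listening.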
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
In order to acquire ground-truth data about person location in the study area, we need an RTLS for indoor environments. The commercial KIO RTLS solution by Eliko has been used. KIO is a precise RTLS for tracking any object in two- or three-dimensional space. Its Ultra-Wideband technology makes it possible to micro-position objects through obstructions, and KIO works in non-line-of-sight conditions both indoors and outdoors.&lt;br /&gt;
&lt;br /&gt;
KIO comes in two main configurations. The Regular Cell configuration guarantees a reliable accuracy of ±30 cm, according to the manufacturer's specifications. The Small Cell configuration is designed for location-critical applications and provides a reliable ±5 cm accuracy. Calibration carried out by the authors on the mock-up apartment shows that the error is higher in some areas and lower in others, but that on average the manufacturer's claims are correct.&lt;br /&gt;
&lt;br /&gt;
KIO calculates the position of a mobile transceiver, called a Tag. To do so, KIO uses radio beacons, called Anchors, distributed at known positions in the surroundings. Figure 1D shows a KIO anchor. KIO tags are the same size as the anchors and must be carried on-board the tracked subject, in our case people. The red dots in Figure 1C show the locations of the six anchors used in these experiments; they are placed on the ceiling. The distribution of the anchors was chosen following the method described in Guerrero-Higueras et al. (2017).&lt;br /&gt;
&lt;br /&gt;
=== Leg Detector (LD) ===&lt;br /&gt;
&lt;br /&gt;
LD is a ROS package that takes the messages published by a LIDAR sensor as input and uses a machine-learning classifier to detect groups of laser readings as possible legs. The code is available in a public repository, but is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
LD publishes the location for the individual legs. It can also attempt to pair the legs and publish their average as an estimate of where the center of a person is. LD may optionally also publish visualization marker messages to indicate where detections happened.&lt;br /&gt;
&lt;br /&gt;
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a people-detection and tracking tool developed by the Robotics Group at the University of León. The system is based on a Convolutional Neural Network (CNN) whose configuration follows the U-Net architecture of Ronneberger et al. (2015).&lt;br /&gt;
&lt;br /&gt;
The system performs the following steps in real time:&lt;br /&gt;
&lt;br /&gt;
1. First, the data provided by the LIDAR sensor are processed to build a two-dimensional occupancy map centered on the robot. This occupancy map is represented as a binary matrix, where 1s denote positions where the LIDAR scan found an obstacle, and 0s denote positions that the scan either passed through without detecting an obstacle or did not reach.&lt;br /&gt;
&lt;br /&gt;
2. Then, the occupancy map is relayed to the network as input data. The network produces a second occupancy map representing the zones where legs have been detected.&lt;br /&gt;
&lt;br /&gt;
3. Finally, center-of-mass calculations return the locations of the people detected. PeTra also publishes the locations of the individual legs and Marker messages for visualization.&lt;br /&gt;
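Steps 1 and 3 above can be sketched in plain Python. This is a toy illustration, not PeTra's code: the grid size, the resolution, and the single global centroid are simplifications (PeTra computes one center of mass per detected person, and step 2, the CNN, is omitted):

```python
import math

def occupancy_map(ranges, angle_min, angle_increment, size=8, resolution=0.5):
    """Binary occupancy grid centered on the robot: cells hit by a
    laser return are set to 1, every other cell stays 0."""
    half = size // 2
    grid = [[0] * size for _ in range(size)]
    for i, r in enumerate(ranges):
        if not math.isfinite(r):
            continue  # skip out-of-range readings
        a = angle_min + i * angle_increment
        # Convert the polar reading to grid indices, robot at the grid center.
        gx = half + int(round(r * math.cos(a) / resolution))
        gy = half + int(round(r * math.sin(a) / resolution))
        if min(gx, gy) >= 0 and size > max(gx, gy):
            grid[gy][gx] = 1
    return grid

def center_of_mass(grid):
    """Mean (x, y) of the occupied cells; None if no cell is occupied."""
    ones = [(x, y) for y, row in enumerate(grid)
                   for x, v in enumerate(row) if v == 1]
    if not ones:
        return None
    n = len(ones)
    return (sum(x for x, _ in ones) / n, sum(y for _, y in ones) / n)
```

For example, two 1 m readings at 0° and 90° produce two occupied cells whose centroid lies midway between them.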
&lt;br /&gt;
The PeTra system has been trained using the following data: [http://robotica.unileon.es/~datasets/LegTracking/PeTra_training_dataset/npy_train_test_globales.tar.gz npy_train_test_globales.tar.gz]&lt;br /&gt;
&lt;br /&gt;
=== Recording procedure ===&lt;br /&gt;
&lt;br /&gt;
The data were gathered in 14 different situations. In all of them, Orbi-One stood still as one or more people, carrying a KIO tag, moved around it. Three different locations for Orbi-One were defined (see Figure 1C) resulting in 42 scenarios (14 situations × 3 Orbi-One locations). Figure 2 shows the 14 different recognition scenarios recorded. These scenarios have been chosen according to different situations that may occur in robotics competitions such as ERL or RoboCup.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': Recognition scenarios recorded, as shown in Álvarez-Aparicio et al. (2017).]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
A rosbag file was created for each scenario (except for situations 3, 12, and 13, where three rosbag files were recorded; situation 4, where four were recorded; and situation 9, where five were recorded), containing LIDAR sensor measurements, location estimates from PeTra and LD, locations from the KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* LIDAR sensor data. Data from LIDAR sensors are provided as ROS LaserScan Messages, which include, among other information, the following: acquisition time of the first ray in the scan, start/end angle of the scan, angular distance between measurements, and range data.&lt;br /&gt;
* PeTra location estimates provided as ROS PointStamped Messages, which include a position [x, y, z] and a timestamp.&lt;br /&gt;
* Location estimates calculated by LD. It publishes data for the individual legs (as ROS PositionMeasurementArray Messages). It also attempts to pair the legs and publishes their average, as a ROS PositionMeasurement Message, as an estimate of where the center of a person is.&lt;br /&gt;
* Locations provided by the KIO RTLS, also published as ROS PointStamped Messages.&lt;br /&gt;
* Messages from /map, /odom, and /tf ROS topics, which include map information, odometry of the robot base, and transformation information, respectively.&lt;br /&gt;
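As a rough illustration of the message layout referred to above, the following is a simplified plain-Python stand-in for a ROS PointStamped message; the real geometry_msgs definition uses separate sec/nsec integer timestamps and includes a sequence number, and the example values are made up:

```python
from dataclasses import dataclass

@dataclass
class Header:
    stamp: float    # acquisition time in seconds (ROS splits this into sec/nsec)
    frame_id: str   # coordinate frame the point is expressed in

@dataclass
class Point:
    x: float
    y: float
    z: float

@dataclass
class PointStamped:
    header: Header  # timestamp plus reference frame
    point: Point    # the [x, y, z] position itself

# e.g. a (hypothetical) PeTra estimate of a person's position in the map frame
estimate = PointStamped(Header(stamp=1500545361.16, frame_id="map"),
                        Point(x=1.2, y=0.4, z=0.0))
```

Pairing each position with a timestamp and a frame is what makes the estimates from PeTra, LD, and the KIO RTLS directly comparable when replaying a bag.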
&lt;br /&gt;
The different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Nov-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
{|class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align:center;&amp;quot;| Location 1. Kitchen&lt;br /&gt;
! Location 2. Living room&lt;br /&gt;
! Location 3. Bedroom&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
* Situation 1 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Situation 2  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Situation 3  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Situation 4  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_4.bag test_04_4.bag]: '''duration:''' 1:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Situation 5  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Situation 6  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Situation 7  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Situation 8  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Situation 9  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Situation 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Situation 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Situation 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:'''  9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Situation 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Situation 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;br /&gt;
|&lt;br /&gt;
* Situation 1 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_01.bag test_01.bag]: '''duration:''' 15:37s, '''size:''' 239.7 MB, '''start date/time:''' Nov 16, 2017 12:20:19.65&lt;br /&gt;
&lt;br /&gt;
* Situation 2  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_02.bag test_02.bag]: '''duration:''' 15:05s, '''size:''' 231.4 MB, '''start date/time:''' Nov 27, 2017 18:32:24.35 &lt;br /&gt;
&lt;br /&gt;
* Situation 3  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_1.bag test_03_1.bag]: '''duration:''' 55.2s, '''size:''' 14.5 MB, '''start date/time:''' Nov 16, 2017 12:43:58.29 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_2.bag test_03_2.bag]: '''duration:''' 44.4s, '''size:''' 10.2 MB, '''start date/time:''' Nov 16, 2017 12:46:31.99&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_3.bag test_03_3.bag]: '''duration:''' 51.1s, '''size:''' 13.4 MB, '''start date/time:''' Nov 16, 2017 12:47:29.66&lt;br /&gt;
&lt;br /&gt;
* Situation 4  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_1.bag test_04_1.bag]: '''duration:''' 01:23s, '''size:''' 21.6 MB, '''start date/time:''' Nov 16, 2017 12:50:22.18 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_2.bag test_04_2.bag]: '''duration:''' 01:20s, '''size:''' 21.0 MB, '''start date/time:''' Nov 16, 2017 12:52:18.44&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_3.bag test_04_3.bag]: '''duration:''' 01:09s, '''size:''' 18.1 MB, '''start date/time:''' Nov 16, 2017 12:53:51.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_4.bag test_04_4.bag]: '''duration:''' 1:21s, '''size:''' 21.3 MB, '''start date/time:''' Nov 16, 2017 12:55:31.81&lt;br /&gt;
&lt;br /&gt;
* Situation 5  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_05.bag test_05.bag]: '''duration:''' 15:07s, '''size:''' 232.4 MB, '''start date/time:''' Nov 28 2017 18:33:16.24&lt;br /&gt;
&lt;br /&gt;
* Situation 6  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_06.bag test_06.bag]: '''duration:''' 09:14s, '''size:''' 136.2 MB, '''start date/time:''' Nov 29 2017 13:12:42.61 &lt;br /&gt;
&lt;br /&gt;
* Situation 7  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_07.bag test_07.bag]: '''duration:''' 4:39s, '''size:''' 71.5 MB, '''start date/time:''' Nov 22 2017 12:48:33.86 &lt;br /&gt;
&lt;br /&gt;
* Situation 8  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_08.bag test_08.bag]: '''duration:''' 03:04s, '''size:''' 47.2 MB, '''start date/time:''' Nov 22 2017 12:59:08.58&lt;br /&gt;
&lt;br /&gt;
* Situation 9  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_1.bag test_09_1.bag]: '''duration:''' 28.8s, '''size:''' 7.7 MB, '''start date/time:''' Nov 28, 2017 17:17:46.49 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_2.bag test_09_2.bag]: '''duration:''' 31.9s, '''size:''' 8.4 MB, '''start date/time:''' Nov 28, 2017 17:20:48.26&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Nov 28, 2017 17:20:05.96&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_4.bag test_09_4.bag]: '''duration:''' 29.9s, '''size:''' 7.9 MB, '''start date/time:''' Nov 28, 2017 17:19:26.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_5.bag test_09_5.bag]: '''duration:''' 30.8s, '''size:''' 8.1 MB, '''start date/time:''' Nov 28, 2017 17:21:20.55&lt;br /&gt;
&lt;br /&gt;
* Situation 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_10.bag test_10.bag]: '''duration:''' 16:05s, '''size:''' 246.6 MB, '''start date/time:''' Nov 22 2017 11:55:02.39&lt;br /&gt;
&lt;br /&gt;
* Situation 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_11.bag test_11.bag]: '''duration:''' 03:25s, '''size:''' 52.8 MB, '''start date/time:''' Nov 22 2017 13:08:11.99&lt;br /&gt;
&lt;br /&gt;
* Situation 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_1.bag test_12_1.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:55:26.25&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_2.bag test_12_2.bag]: '''duration:''' 1:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:56:52.47&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_3.bag test_12_3.bag]: '''duration:''' 50.2s, '''size:'''  13.1 MB, '''start date/time:''' Nov 16, 2017 11:44:00.10&lt;br /&gt;
&lt;br /&gt;
* Situation 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_1.bag test_13_1.bag]: '''duration:''' 58.2s, '''size:''' 15.1 MB, '''start date/time:''' Nov 22, 2017 11:48:35.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_2.bag test_13_2.bag]: '''duration:''' 01:05s, '''size:''' 16.9 MB, '''start date/time:''' Nov 22, 2017 11:49:48.18&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_3.bag test_13_3.bag]: '''duration:''' 01:14s, '''size:''' 19.2 MB, '''start date/time:''' Nov 22, 2017 11:51:22.68&lt;br /&gt;
&lt;br /&gt;
* Situation 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_14.bag test_14.bag]: '''duration:''' 06:17s, '''size:''' 96.3 MB, '''start date/time:''' Nov 22 2017 12:23:07.92&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
* Situation 1 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_01.bag test_01.bag]: '''duration:''' 15:17s, '''size:''' 234.6 MB, '''start date/time:''' Nov 21, 2017 18:56:54.87&lt;br /&gt;
&lt;br /&gt;
* Situation 2  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 17:37:34.08&lt;br /&gt;
&lt;br /&gt;
* Situation 3  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_1.bag test_03_1.bag]: '''duration:''' 44.7s, '''size:''' 11.7 MB, '''start date/time:''' Nov 21, 2017 13:16:51.65&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_2.bag test_03_2.bag]: '''duration:''' 39.9s, '''size:''' 10.5 MB, '''start date/time:''' Nov 21, 2017 19:19:06.27&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_3.bag test_03_3.bag]: '''duration:''' 41.2s, '''size:''' 10.8 MB, '''start date/time:''' Nov 21, 2017 19:18:06.86&lt;br /&gt;
&lt;br /&gt;
* Situation 4  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_1.bag test_04_1.bag]: '''duration:''' 01:01s, '''size:''' 16.0 MB, '''start date/time:''' Nov 21, 2017 19:21:30.99 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_2.bag test_04_2.bag]: '''duration:''' 01:01s, '''size:''' 15.9 MB, '''start date/time:''' Nov 21, 2017 19:25:38.00&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_3.bag test_04_3.bag]: '''duration:''' 58.0s, '''size:''' 15.1 MB, '''start date/time:''' Nov 21, 2017 19:27:16.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_4.bag test_04_4.bag]: '''duration:''' 1:00s, '''size:''' 15.8 MB, '''start date/time:''' Nov 21, 2017 19:29:39.28&lt;br /&gt;
&lt;br /&gt;
* Situation 5  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_05.bag test_05.bag]: '''duration:''' 15:11s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 19:05:20.19&lt;br /&gt;
&lt;br /&gt;
* Situation 6  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_06.bag test_06.bag]: '''duration:''' 09:03s, '''size:''' 133.6 MB, '''start date/time:''' Nov 26, 2017 13:24:25.64&lt;br /&gt;
&lt;br /&gt;
* Situation 7  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Nov 27, 2017 17:34:50.22&lt;br /&gt;
&lt;br /&gt;
* Situation 8  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_08.bag test_08.bag]: '''duration:''' 03:47s, '''size:''' 58.3 MB, '''start date/time:''' Nov 27, 2017 17:41:48.32&lt;br /&gt;
&lt;br /&gt;
* Situation 9  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_1.bag test_09_1.bag]: '''duration:''' 31.3s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:23:58.31 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_2.bag test_09_2.bag]: '''duration:''' 30.4s, '''size:''' 8.0 MB, '''start date/time:''' Nov 28, 2017 17:24:41.09&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_3.bag test_09_3.bag]: '''duration:''' 31.2s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:25:22.22&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_4.bag test_09_4.bag]: '''duration:''' 31.4s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:26:06.98&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_5.bag test_09_5.bag]: '''duration:''' 33.2s, '''size:''' 8.8 MB, '''start date/time:''' Nov 28, 2017 17:26:51.32&lt;br /&gt;
&lt;br /&gt;
* Situation 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_10.bag test_10.bag]: '''duration:''' 10:00s, '''size:''' 153.7 MB, '''start date/time:''' Nov 22, 2017 18:15:29.53&lt;br /&gt;
&lt;br /&gt;
* Situation 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 59.1 MB, '''start date/time:''' Nov 22, 2017 18:25:54.55&lt;br /&gt;
&lt;br /&gt;
* Situation 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_1.bag test_12_1.bag]: '''duration:''' 41.3s, '''size:''' 10.8 MB, '''start date/time:''' Nov 22, 2017 18:35:11.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_2.bag test_12_2.bag]: '''duration:''' 41.8s, '''size:''' 10.9 MB, '''start date/time:''' Nov 22, 2017 18:36:12.11&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_3.bag test_12_3.bag]: '''duration:''' 39.8s, '''size:'''  10.5 MB, '''start date/time:''' Nov 22, 2017 18:37:05.06&lt;br /&gt;
&lt;br /&gt;
* Situation 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_1.bag test_13_1.bag]: '''duration:''' 01:01s, '''size:''' 14.9 MB, '''start date/time:''' Nov 22, 2017 18:40:13.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_2.bag test_13_2.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:41:28.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_3.bag test_13_3.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:42:39.52&lt;br /&gt;
&lt;br /&gt;
* Situation 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_14.bag test_14.bag]: '''duration:''' 06:12s, '''size:''' 95.1 MB, '''start date/time:''' Nov 23, 2017 12:53:06.48&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== v2.0 [Feb-2018] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a second version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
{|class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align:center;&amp;quot;| Location 1. Kitchen&lt;br /&gt;
! Location 2. Living room&lt;br /&gt;
! Location 3. Bedroom&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_01.bag cocina_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_02.bag cocina_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_03.bag cocina_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_04.bag cocina_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_05.bag cocina_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_06.bag cocina_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_07.bag cocina_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_08.bag cocina_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_09.bag cocina_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_10.bag cocina_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_11.bag cocina_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_12.bag cocina_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_13.bag cocina_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_14.bag cocina_14.bag]&lt;br /&gt;
|&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_01.bag salon_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_02.bag salon_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_03.bag salon_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_04.bag salon_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_05.bag salon_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_06.bag salon_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_07.bag salon_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_08.bag salon_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_09.bag salon_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_10.bag salon_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_11.bag salon_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_12.bag salon_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_13.bag salon_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_14.bag salon_14.bag]&lt;br /&gt;
|&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_01.bag habitacion_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_02.bag habitacion_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_03.bag habitacion_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_04.bag habitacion_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_05.bag habitacion_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_06.bag habitacion_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_07.bag habitacion_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_08.bag habitacion_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_09.bag habitacion_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_10.bag habitacion_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_11.bag habitacion_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_12.bag habitacion_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_13.bag habitacion_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_14.bag habitacion_14.bag]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5224</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5224"/>
				<updated>2018-02-13T18:06:00Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset that can be used to evaluate the performance of different approaches to detecting and tracking people using LIDAR sensors. The information contained in the dataset is especially suitable as training data for neural-network-based classifiers.&lt;br /&gt;
&lt;br /&gt;
Further information is available in [https://www.frontiersin.org/articles/10.3389/fnbot.2017.00072/full Álvarez-Aparicio et al. (2017)].&lt;br /&gt;
&lt;br /&gt;
== Materials and methods ==&lt;br /&gt;
&lt;br /&gt;
The following section describes the materials used to gather the data (shown in Figure 1), which include a certified study area, an autonomous robot with an on-board LIDAR sensor, and a real-time location system (RTLS) used to obtain ground-truth data about person location. The recorded data include location estimates calculated by two people trackers, LD and PeTra, also described below. Finally, the recording procedure used to build the dataset is explained.&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.jpg|frame|center|'''Fig. 1''': Materials used in the experiments. (A) Orbi-One robot. (B) Leon@Home Testbed plan. (C) Floor plan of the apartment. Red dots show the location of KIO anchors. Black numbered dots denote Orbi-One locations during the data gathering. (D) KIO anchors. (E) General view of the apartment. (F) Furniture in mock-up apartment.]]&lt;br /&gt;
&lt;br /&gt;
=== Leon@Home Testbed ===&lt;br /&gt;
&lt;br /&gt;
Data have been gathered at Leon@Home Testbed, a Certified Testbed of the European Robotics League (ERL). Its main purpose is to benchmark service robots in a realistic home environment. Our testbed is made up of four parts, shown in Figure 1B: a mock-up apartment, a control zone with direct vision (through a glass wall) into the apartment, a small workshop, and a larger development zone where researchers work.&lt;br /&gt;
&lt;br /&gt;
Leon@Home Testbed is located on the second floor of the Módulo de Investigación en Cibernética (Building for Research in Cybernetics) on the Vegazana Campus of the University of León (Spain). The apartment is a single bedroom mock-up home built in an 8 m × 7 m space. Figure 1C shows a plan of the apartment. 60 cm high walls divide it into a kitchen, living room, bathroom, and bedroom. The furniture (Figures 1E,F) has been chosen to test different robot abilities. For instance, the kitchen cabinets all have different types of handles.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One (Figure 1A) is an assistant robot manufactured by Robotnik. It has several sensors, among them an RGBD camera, a LIDAR sensor, and an inertial unit. It can operate a manipulator arm attached to its torso and has a wheeled base for moving around the room. Orbi-One includes a wireless access point, which allows WiFi communication with other robots and computers.&lt;br /&gt;
&lt;br /&gt;
The software controlling the robot hardware is based on the ROS framework. ROS is essentially a set of libraries for robotics that provides operating-system-like services: hardware abstraction for sensors and actuators, low-level device control, and inter-process communication. Computation takes place in processes named Nodes, which can receive and send Messages. Nodes publish Messages into information buffers called Topics.&lt;br /&gt;
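&lt;br /&gt;
The Node/Topic mechanism can be illustrated with a toy publish/subscribe sketch in plain Python. This is only an analogy: real ROS Nodes are separate processes and Topics are managed by the ROS middleware, whereas here everything runs in one process.&lt;br /&gt;

```python
# Toy in-process sketch of the ROS publish/subscribe pattern.
# Real ROS Nodes run as separate processes; this only illustrates
# the idea of Nodes publishing Messages into named Topics.
from collections import defaultdict

class TopicBus:
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic name -> callbacks

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self._subscribers[topic]:
            callback(message)

bus = TopicBus()
received = []
bus.subscribe("/scan", received.append)            # a listening "node"
bus.publish("/scan", {"ranges": [1.2, 1.3, 0.9]})  # a publishing "node"
```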
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
In order to acquire ground-truth data about person location in the study area, we need an RTLS for indoor environments. The commercial KIO RTLS solution by Eliko has been used. KIO is a precise RTLS for tracking any object in two- or three-dimensional space. Its Ultra-Wideband technology makes it possible to position objects precisely even through obstructions, so KIO works in non-line-of-sight conditions, both indoors and outdoors.&lt;br /&gt;
&lt;br /&gt;
KIO comes in two main configurations. The Regular Cell configuration guarantees a reliable accuracy of ±30 cm, according to the manufacturer’s specifications. The Small Cell configuration is designed for location-critical applications and provides a reliable accuracy of ±5 cm. A calibration carried out by the authors in the mock-up apartment shows that the error is higher in some areas and lower in others, but that, on average, the manufacturer’s claims hold.&lt;br /&gt;
&lt;br /&gt;
KIO calculates the position of a mobile transceiver, called a Tag. To do so, KIO uses radio beacons, called Anchors, placed at known positions in the surroundings. Figure 1D shows a KIO anchor. KIO tags are the same size as the anchors and are carried by the tracked subject, in our case a person. The red dots in Figure 1C show the locations of the six anchors used in these experiments; they are placed on the ceiling. The distribution of the anchors was chosen following the method described in Guerrero-Higueras et al. (2017).&lt;br /&gt;
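&lt;br /&gt;
The underlying positioning principle, estimating a tag position from measured distances to anchors at known locations, can be sketched as follows. KIO's actual algorithms are proprietary, and the anchor coordinates below are made up for illustration; this is only a minimal noise-free multilateration sketch.&lt;br /&gt;

```python
# Sketch of 2D position estimation from distances to fixed anchors
# (multilateration). Anchor layout and tag position are illustrative.
import math

def trilaterate(anchors, dists):
    """Solve for (x, y) from three anchor positions and measured
    distances by linearizing the three circle equations."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    # Subtracting the first circle equation from the other two yields
    # a 2x2 linear system A [x, y]^T = b, solved here by Cramer's rule.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    b2 = d1**2 - d3**2 - x1**2 + x3**2 - y1**2 + y3**2
    det = a11 * a22 - a12 * a21  # non-zero if anchors are not collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

anchors = [(0.0, 0.0), (8.0, 0.0), (0.0, 7.0)]  # apartment is ~8 m x 7 m
tag = (3.0, 2.0)                                # true tag position
dists = [math.dist(tag, a) for a in anchors]    # ideal, noise-free ranges
print(trilaterate(anchors, dists))              # recovers ~ (3.0, 2.0)
```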
&lt;br /&gt;
=== Leg Detector (LD) ===&lt;br /&gt;
&lt;br /&gt;
LD is a ROS package that takes the messages published by a LIDAR sensor as input and uses a machine-learning-trained classifier to detect groups of laser readings as possible legs. The code is available in a public repository, but it is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
LD publishes the locations of the individual legs. It can also attempt to pair the legs and publish their average as an estimate of where the center of a person is. Optionally, LD can publish visualization Marker messages to indicate where detections occurred.&lt;br /&gt;
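&lt;br /&gt;
The pairing idea can be sketched as follows. This is a simplified illustration, not LD's actual code, and the 0.5 m separation threshold is an assumed value: detected legs closer than a typical stance width are greedily paired, and each pair's midpoint is reported as a person estimate.&lt;br /&gt;

```python
# Simplified sketch of leg pairing (not the actual leg_detector logic):
# greedily pair nearby leg detections and report pair midpoints.
import math

MAX_LEG_SEPARATION = 0.5  # meters; assumed threshold, not LD's value

def pair_legs(legs):
    legs = list(legs)
    people = []
    while len(legs) >= 2:
        leg = legs.pop(0)
        nearest = min(legs, key=lambda other: math.dist(leg, other))
        if math.dist(leg, nearest) <= MAX_LEG_SEPARATION:
            legs.remove(nearest)  # consume the paired leg
            people.append(((leg[0] + nearest[0]) / 2,
                           (leg[1] + nearest[1]) / 2))
    return people

legs = [(1.0, 0.0), (1.0, 0.3), (3.0, 2.0), (3.2, 2.0)]
print(pair_legs(legs))  # [(1.0, 0.15), (3.1, 2.0)]
```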
&lt;br /&gt;
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a tool for detecting and tracking people, developed by the Robotics Group at the University of León. The system is based on a Convolutional Neural Network (CNN) with a configuration based on the U-Net architecture by Ronneberger et al. (2015).&lt;br /&gt;
&lt;br /&gt;
The system performs the following steps in real time:&lt;br /&gt;
&lt;br /&gt;
1. First, the data provided by the LIDAR sensor are processed to build a two-dimensional occupancy map centered on the robot. This occupancy map is represented as a binary matrix, where 1s denote positions where the LIDAR scan found an obstacle, and 0s denote positions that the scan either passed through without detecting an obstacle or did not reach.&lt;br /&gt;
&lt;br /&gt;
2. Then, the occupancy map is relayed to the network as input data. The network produces a second occupancy map representing the zones where legs have been detected.&lt;br /&gt;
&lt;br /&gt;
3. Finally, a center-of-mass calculation returns the location of each person. PeTra also publishes locations for the individual legs and Marker messages for visualization.&lt;br /&gt;
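&lt;br /&gt;
Steps 1 and 3 can be sketched as follows. The map size, resolution, and robot-centering below are assumed values for illustration, not PeTra's actual parameters, and the sketch skips the CNN of step 2.&lt;br /&gt;

```python
# Sketch of the pre-processing (step 1) and localization (step 3)
# described above. Map size and resolution are assumed values.
import math

SIZE, RES = 40, 0.1   # 40 x 40 cells of 0.1 m -> a 4 m x 4 m map

def occupancy_map(angles, ranges):
    """Bin polar LIDAR readings into a binary matrix centered on the
    robot: 1 = cell where the scan found an obstacle, 0 otherwise."""
    grid = [[0] * SIZE for _ in range(SIZE)]
    for theta, r in zip(angles, ranges):
        col = int(r * math.cos(theta) / RES) + SIZE // 2
        row = int(r * math.sin(theta) / RES) + SIZE // 2
        if 0 <= row < SIZE and 0 <= col < SIZE:
            grid[row][col] = 1
    return grid

def center_of_mass(grid):
    """Average the coordinates of the occupied cells (step 3)."""
    cells = [(r, c) for r in range(SIZE) for c in range(SIZE) if grid[r][c]]
    return (sum(r for r, _ in cells) / len(cells),
            sum(c for _, c in cells) / len(cells))

grid = occupancy_map([0.0, 0.1], [1.0, 1.0])  # two nearby leg-like hits
print(center_of_mass(grid))
```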
&lt;br /&gt;
The PeTra system has been trained using the following data: [http://robotica.unileon.es/~datasets/LegTracking/PeTra_training_dataset/npy_train_test_globales.tar.gz npy_train_test_globales.tar.gz]&lt;br /&gt;
&lt;br /&gt;
=== Recording procedure ===&lt;br /&gt;
&lt;br /&gt;
The data were gathered in 14 different situations. In all of them, Orbi-One stood still while one or more people, carrying a KIO tag, moved around it. Three different locations for Orbi-One were defined (see Figure 1C), resulting in 42 scenarios (14 situations × 3 Orbi-One locations). Figure 2 shows the 14 recognition scenarios recorded. These scenarios were chosen to reflect situations that may occur in robotics competitions such as ERL or RoboCup.&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': Recognition scenarios recorded, as shown in Álvarez-Aparicio et al. (2017).]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
A rosbag file was created for each scenario (except for situations 3, 12, and 13, where 3 rosbag files were recorded; situation 4, where 4 rosbag files were recorded; and situation 9, where 5 rosbag files were recorded), recording LIDAR sensor measurements, location estimates from PeTra and LD, locations from the KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* LIDAR sensor data. Data from LIDAR sensors are provided as ROS LaserScan Messages, which include, among other information, the following: acquisition time of the first ray in the scan, start/end angle of the scan, angular distance between measurements, and range data.&lt;br /&gt;
* PeTra location estimates provided as ROS PointStamped Messages, which include a position [x, y, z] and a timestamp.&lt;br /&gt;
* Location estimates calculated by LD, which publishes data for individual legs as ROS PositionMeasurementArray Messages. LD also attempts to pair the legs and publishes their average, as a ROS PositionMeasurement Message, as an estimate of where the center of a person is.&lt;br /&gt;
* Locations from the KIO RTLS, also provided as ROS PointStamped Messages.&lt;br /&gt;
* Messages from /map, /odom, and /tf ROS topics, which include map information, odometry of the robot base, and transformation information, respectively.&lt;br /&gt;
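&lt;br /&gt;
As an illustration of how the LaserScan fields listed above fit together, the i-th range reading can be converted to a Cartesian point in the robot frame from the start angle and the angular distance between measurements. This is a generic sketch, not code from the dataset, and the 25 m default maximum range is an assumed value.&lt;br /&gt;

```python
# Generic sketch: the i-th ray of a LaserScan has bearing
# angle_min + i * angle_increment and length ranges[i].
import math

def scan_to_points(angle_min, angle_increment, ranges, range_max=25.0):
    points = []
    for i, r in enumerate(ranges):
        if 0.0 < r < range_max:  # discard invalid or out-of-range returns
            theta = angle_min + i * angle_increment
            points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# Three rays at -90, 0, and +90 degrees, each returning 1 m:
pts = scan_to_points(-math.pi / 2, math.pi / 2, [1.0, 1.0, 1.0])
print(pts)  # approximately [(0.0, -1.0), (1.0, 0.0), (0.0, 1.0)]
```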
&lt;br /&gt;
The different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Nov-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
{|class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align:center;&amp;quot;| Location 1. Kitchen&lt;br /&gt;
! Location 2. Living room&lt;br /&gt;
! Location 3. Bedroom&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_4.bag test_04_4.bag]: '''duration:''' 01:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_07.bag test_07.bag]: '''duration:''' 04:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:'''  9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;br /&gt;
|&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_01.bag test_01.bag]: '''duration:''' 15:37s, '''size:''' 239.7 MB, '''start date/time:''' Nov 16, 2017 12:20:19.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_02.bag test_02.bag]: '''duration:''' 15:05s, '''size:''' 231.4 MB, '''start date/time:''' Nov 27, 2017 18:32:24.35 &lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_1.bag test_03_1.bag]: '''duration:''' 55.2s, '''size:''' 14.5 MB, '''start date/time:''' Nov 16, 2017 12:43:58.29 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_2.bag test_03_2.bag]: '''duration:''' 44.4s, '''size:''' 10.2 MB, '''start date/time:''' Nov 16, 2017 12:46:31.99&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_3.bag test_03_3.bag]: '''duration:''' 51.1s, '''size:''' 13.4 MB, '''start date/time:''' Nov 16, 2017 12:47:29.66&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_1.bag test_04_1.bag]: '''duration:''' 01:23s, '''size:''' 21.6 MB, '''start date/time:''' Nov 16, 2017 12:50:22.18 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_2.bag test_04_2.bag]: '''duration:''' 01:20s, '''size:''' 21.0 MB, '''start date/time:''' Nov 16, 2017 12:52:18.44&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_3.bag test_04_3.bag]: '''duration:''' 01:09s, '''size:''' 18.1 MB, '''start date/time:''' Nov 16, 2017 12:53:51.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_4.bag test_04_4.bag]: '''duration:''' 01:21s, '''size:''' 21.3 MB, '''start date/time:''' Nov 16, 2017 12:55:31.81&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_05.bag test_05.bag]: '''duration:''' 15:07s, '''size:''' 232.4 MB, '''start date/time:''' Nov 28, 2017 18:33:16.24&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_06.bag test_06.bag]: '''duration:''' 09:14s, '''size:''' 136.2 MB, '''start date/time:''' Nov 29, 2017 13:12:42.61 &lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_07.bag test_07.bag]: '''duration:''' 04:39s, '''size:''' 71.5 MB, '''start date/time:''' Nov 22, 2017 12:48:33.86 &lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_08.bag test_08.bag]: '''duration:''' 03:04s, '''size:''' 47.2 MB, '''start date/time:''' Nov 22, 2017 12:59:08.58&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_1.bag test_09_1.bag]: '''duration:''' 28.8s, '''size:''' 7.7 MB, '''start date/time:''' Nov 28, 2017 17:17:46.49 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_2.bag test_09_2.bag]: '''duration:''' 31.9s, '''size:''' 8.4 MB, '''start date/time:''' Nov 28, 2017 17:20:48.26&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Nov 28, 2017 17:20:05.96&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_4.bag test_09_4.bag]: '''duration:''' 29.9s, '''size:''' 7.9 MB, '''start date/time:''' Nov 28, 2017 17:19:26.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_5.bag test_09_5.bag]: '''duration:''' 30.8s, '''size:''' 8.1 MB, '''start date/time:''' Nov 28, 2017 17:21:20.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_10.bag test_10.bag]: '''duration:''' 16:05s, '''size:''' 246.6 MB, '''start date/time:''' Nov 22, 2017 11:55:02.39&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_11.bag test_11.bag]: '''duration:''' 03:25s, '''size:''' 52.8 MB, '''start date/time:''' Nov 22, 2017 13:08:11.99&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_1.bag test_12_1.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:55:26.25&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_2.bag test_12_2.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:56:52.47&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_3.bag test_12_3.bag]: '''duration:''' 50.2s, '''size:'''  13.1 MB, '''start date/time:''' Nov 16, 2017 11:44:00.10&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_1.bag test_13_1.bag]: '''duration:''' 58.2s, '''size:''' 15.1 MB, '''start date/time:''' Nov 22, 2017 11:48:35.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_2.bag test_13_2.bag]: '''duration:''' 01:05s, '''size:''' 16.9 MB, '''start date/time:''' Nov 22, 2017 11:49:48.18&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_3.bag test_13_3.bag]: '''duration:''' 01:14s, '''size:''' 19.2 MB, '''start date/time:''' Nov 22, 2017 11:51:22.68&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_14.bag test_14.bag]: '''duration:''' 06:17s, '''size:''' 96.3 MB, '''start date/time:''' Nov 22, 2017 12:23:07.92&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_01.bag test_01.bag]: '''duration:''' 15:17s, '''size:''' 234.6 MB, '''start date/time:''' Nov 21, 2017 18:56:54.87&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 17:37:34.08&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_1.bag test_03_1.bag]: '''duration:''' 44.7s, '''size:''' 11.7 MB, '''start date/time:''' Nov 21, 2017 13:16:51.65&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_2.bag test_03_2.bag]: '''duration:''' 39.9s, '''size:''' 10.5 MB, '''start date/time:''' Nov 21, 2017 19:19:06.27&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_3.bag test_03_3.bag]: '''duration:''' 41.2s, '''size:''' 10.8 MB, '''start date/time:''' Nov 21, 2017 19:18:06.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_1.bag test_04_1.bag]: '''duration:''' 01:01s, '''size:''' 16.0 MB, '''start date/time:''' Nov 21, 2017 19:21:30.99 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_2.bag test_04_2.bag]: '''duration:''' 01:01s, '''size:''' 15.9 MB, '''start date/time:''' Nov 21, 2017 19:25:38.00&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_3.bag test_04_3.bag]: '''duration:''' 58.0s, '''size:''' 15.1 MB, '''start date/time:''' Nov 21, 2017 19:27:16.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_4.bag test_04_4.bag]: '''duration:''' 01:00s, '''size:''' 15.8 MB, '''start date/time:''' Nov 21, 2017 19:29:39.28&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_05.bag test_05.bag]: '''duration:''' 15:11s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 19:05:20.19&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_06.bag test_06.bag]: '''duration:''' 09:03s, '''size:''' 133.6 MB, '''start date/time:''' Nov 26, 2017 13:24:25.64&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_07.bag test_07.bag]: '''duration:''' 04:16s, '''size:''' 65.1 MB, '''start date/time:''' Nov 27, 2017 17:34:50.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_08.bag test_08.bag]: '''duration:''' 03:47s, '''size:''' 58.3 MB, '''start date/time:''' Nov 27, 2017 17:41:48.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_1.bag test_09_1.bag]: '''duration:''' 31.3s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:23:58.31 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_2.bag test_09_2.bag]: '''duration:''' 30.4s, '''size:''' 8.0 MB, '''start date/time:''' Nov 28, 2017 17:24:41.09&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_3.bag test_09_3.bag]: '''duration:''' 31.2s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:25:22.22&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_4.bag test_09_4.bag]: '''duration:''' 31.4s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:26:06.98&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_5.bag test_09_5.bag]: '''duration:''' 33.2s, '''size:''' 8.8 MB, '''start date/time:''' Nov 28, 2017 17:26:51.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_10.bag test_10.bag]: '''duration:''' 10:00s, '''size:''' 153.7 MB, '''start date/time:''' Nov 22, 2017 18:15:29.53&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 59.1 MB, '''start date/time:''' Nov 22, 2017 18:25:54.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_1.bag test_12_1.bag]: '''duration:''' 41.3s, '''size:''' 10.8 MB, '''start date/time:''' Nov 22, 2017 18:35:11.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_2.bag test_12_2.bag]: '''duration:''' 41.8s, '''size:''' 10.9 MB, '''start date/time:''' Nov 22, 2017 18:36:12.11&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_3.bag test_12_3.bag]: '''duration:''' 39.8s, '''size:''' 10.5 MB, '''start date/time:''' Nov 22, 2017 18:37:05.06&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_1.bag test_13_1.bag]: '''duration:''' 01:01s, '''size:''' 14.9 MB, '''start date/time:''' Nov 22, 2017 18:40:13.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_2.bag test_13_2.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:41:28.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_3.bag test_13_3.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:42:39.52&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_14.bag test_14.bag]: '''duration:''' 06:12s, '''size:''' 95.1 MB, '''start date/time:''' Nov 23, 2017 12:53:06.48&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== v2.0 [Feb-2018] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a second version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
{|class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align:center;&amp;quot;| Location 1. Kitchen&lt;br /&gt;
! Location 2. Living room&lt;br /&gt;
! Location 3. Bedroom&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_01.bag cocina_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_02.bag cocina_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_03.bag cocina_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_04.bag cocina_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_05.bag cocina_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_06.bag cocina_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_07.bag cocina_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_08.bag cocina_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_09.bag cocina_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_10.bag cocina_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_11.bag cocina_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_12.bag cocina_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_13.bag cocina_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_14.bag cocina_14.bag]&lt;br /&gt;
|&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_01.bag salon_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_02.bag salon_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_03.bag salon_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_04.bag salon_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_05.bag salon_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_06.bag salon_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_07.bag salon_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_08.bag salon_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_09.bag salon_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_10.bag salon_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_11.bag salon_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_12.bag salon_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_13.bag salon_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_14.bag salon_14.bag]&lt;br /&gt;
|&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_01.bag habitacion_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_02.bag habitacion_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_03.bag habitacion_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_04.bag habitacion_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_05.bag habitacion_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_06.bag habitacion_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_07.bag habitacion_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_08.bag habitacion_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_09.bag habitacion_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_10.bag habitacion_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_11.bag habitacion_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_12.bag habitacion_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_13.bag habitacion_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_14.bag habitacion_14.bag]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5223</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5223"/>
				<updated>2018-02-13T18:03:41Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: /* Location 1. Kitchen */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset that can be used to evaluate the performance of different approaches for detecting and tracking people using LIDAR sensors. The information contained in the dataset is especially suitable for use as training data for neural-network-based classifiers. &lt;br /&gt;
&lt;br /&gt;
Further information is available in [https://www.frontiersin.org/articles/10.3389/fnbot.2017.00072/full Álvarez-Aparicio et al. (2017)].&lt;br /&gt;
&lt;br /&gt;
== Materials and methods ==&lt;br /&gt;
&lt;br /&gt;
The following section describes the materials (shown in Figure 1) used to gather data, which include: a certified study area, an autonomous robot with an on-board LIDAR sensor, and a real-time location system (RTLS) to obtain ground-truth data about person location. Recorded data include location estimates calculated by two people trackers, LD and PeTra, also described below. Finally, the recording procedure used to build the dataset is explained.&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.jpg|frame|center|'''Fig. 1''': Materials used in the experiments. (A) Orbi-One robot. (B) Leon@Home Testbed plan. (C) Floor plan of the apartment. Red dots show the location of KIO anchors. Black numbered dots denote Orbi-One locations during the data gathering. (D) KIO anchors. (E) General view of the apartment. (F) Furniture in mock-up apartment.]]&lt;br /&gt;
&lt;br /&gt;
=== Leon@Home Testbed ===&lt;br /&gt;
&lt;br /&gt;
Data have been gathered at Leon@Home Testbed, a Certified Testbed of the European Robotics League (ERL). Its main purpose is to benchmark service robots in a realistic home environment. Our testbed is made up of four parts, shown in Figure 1B: a mock-up apartment, a control zone with direct vision (through a glass wall) into the apartment, a small workshop, and a larger development zone where researchers work.&lt;br /&gt;
&lt;br /&gt;
Leon@Home Testbed is located on the second floor of the Módulo de Investigación en Cibernética (Building for Research in Cybernetics) on the Vegazana Campus of the University of León (Spain). The apartment is a single-bedroom mock-up home built in an 8 m × 7 m space. Figure 1C shows a plan of the apartment. Walls 60 cm high divide it into a kitchen, living room, bathroom, and bedroom. The furniture (Figures 1E,F) has been chosen to test different robot abilities. For instance, the kitchen cabinets all have different types of handles.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One (Figure 1A) is an assistant robot manufactured by Robotnik. It has several sensors, among them an RGBD camera, a LIDAR sensor, and an inertial unit. It can operate a manipulator arm attached to its torso and has a wheeled base for moving around the room. Orbi-One includes a wireless access point, which allows WiFi communication with other robots and computers.&lt;br /&gt;
&lt;br /&gt;
The software controlling the robot hardware is based on the ROS framework. ROS is essentially a set of libraries for robotics that provides operating-system-like services: hardware abstraction for sensors and actuators, low-level device control, and inter-process communication. Computation takes place in processes named Nodes, which can receive and send Messages. Nodes publish Messages into information buffers called Topics.&lt;br /&gt;
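The Node/Topic/Message pattern can be sketched in plain Python. This is a conceptual sketch only; the real rospy API differs, and the class names below are illustrative, not part of ROS:&lt;br /&gt;

```python
# Conceptual sketch of ROS-style publish/subscribe.
# Topic and Node here are illustrative names, not the rospy API.

class Topic:
    """An information buffer that relays Messages to its subscribers."""
    def __init__(self, name):
        self.name = name
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def publish(self, message):
        # Deliver the Message to every subscribed Node callback.
        for callback in self._subscribers:
            callback(message)


class Node:
    """A process-like unit of computation that sends and receives Messages."""
    def __init__(self, name):
        self.name = name
        self.received = []

    def on_message(self, message):
        self.received.append(message)


# A LIDAR node would publish LaserScan-like Messages on a /scan Topic,
# and a tracker node (such as LD or PeTra) would subscribe to it.
scan_topic = Topic("/scan")
tracker = Node("tracker")
scan_topic.subscribe(tracker.on_message)
scan_topic.publish({"ranges": [1.2, 1.1, 0.9], "angle_min": -1.57})
```

In actual ROS, publisher and subscriber live in separate processes and Messages are typed (e.g., LaserScan); the in-process callback list above only illustrates the data flow.&lt;br /&gt;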
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
In order to acquire ground-truth data about person location in the study area, we need an RTLS for indoor environments. The commercial KIO RTLS solution by Eliko has been used. KIO is a precise RTLS for tracking any object in 2- or 3-dimensional space. Its Ultra-Wideband technology enables micro-positioning of objects even through obstructions, so KIO works in non-line-of-sight conditions, both indoors and outdoors.&lt;br /&gt;
&lt;br /&gt;
KIO comes in two main configurations. According to the manufacturer’s specifications, the Regular Cell configuration guarantees a reliable accuracy of ±30 cm, while the Small Cell configuration, designed for location-critical applications, provides a reliable accuracy of ±5 cm. Calibration performed by the authors of this paper in the mock-up apartment shows that the error is higher in some areas and lower in others, but that on average the manufacturer’s claims are correct.&lt;br /&gt;
&lt;br /&gt;
KIO calculates the position of a mobile transceiver, called a Tag, using radio beacons, called Anchors, distributed in known positions in the surroundings. Figure 1D shows a KIO anchor. KIO tags are the same size and must be carried by the tracked subject, in our case people. The red dots in Figure 1C show the locations of the six anchors used in these experiments; they are placed on the ceiling. The distribution of the anchors was chosen following the method described in Guerrero-Higueras et al. (2017).&lt;br /&gt;
&lt;br /&gt;
=== Leg Detector (LD) ===&lt;br /&gt;
&lt;br /&gt;
LD is a ROS package that takes the messages published by a LIDAR sensor as input and uses a machine-learning classifier to detect groups of laser readings as possible legs. The code is available in a public repository but is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
LD publishes the locations of the individual legs. It can also attempt to pair the legs and publish their average as an estimate of where the center of a person is. LD may optionally publish visualization Marker messages to indicate where detections happened.&lt;br /&gt;
&lt;br /&gt;
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a people detection and tracking tool developed by the Robotics Group at the University of León. The system is based on a Convolutional Neural Network (CNN) with a configuration based on the U-Net architecture of Ronneberger et al. (2015).&lt;br /&gt;
&lt;br /&gt;
The system performs the following steps in real time:&lt;br /&gt;
&lt;br /&gt;
1. First, the data provided by the LIDAR sensor are processed to build a two-dimensional occupancy map centered on the robot. This occupancy map is represented as a binary matrix, where 1s denote positions where the LIDAR scan found an obstacle, and 0s denote positions where the scan either passed through without detecting an obstacle or did not reach.&lt;br /&gt;
&lt;br /&gt;
2. Then, the occupancy map is relayed to the network as input data. The network produces a second occupancy map representing the zones where legs have been detected.&lt;br /&gt;
&lt;br /&gt;
3. Finally, center-of-mass calculations return the locations of people. PeTra also publishes locations for the individual legs, as well as Marker messages for visualization.&lt;br /&gt;
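The occupancy-map and center-of-mass steps above can be sketched as follows. The grid size and resolution are illustrative assumptions, not PeTra’s actual parameters, and the CNN step is omitted:&lt;br /&gt;

```python
import math

def occupancy_map(ranges, angle_min, angle_inc, size=8, resolution=0.5):
    """Build a binary occupancy grid centered on the robot from LIDAR ranges.

    1s mark cells where a scan ray hit an obstacle; all other cells are 0.
    size and resolution (meters per cell) are illustrative values only.
    """
    grid = [[0] * size for _ in range(size)]
    half = size // 2  # robot sits at the grid center
    for i, r in enumerate(ranges):
        if r is None or math.isinf(r):
            continue  # no return for this ray
        angle = angle_min + i * angle_inc
        col = half + int(r * math.cos(angle) / resolution)
        row = half + int(r * math.sin(angle) / resolution)
        if 0 <= row < size and 0 <= col < size:
            grid[row][col] = 1
    return grid

def center_of_mass(grid):
    """Average the coordinates of the occupied cells (step 3 above)."""
    cells = [(r, c) for r, row in enumerate(grid)
             for c, v in enumerate(row) if v]
    n = len(cells)
    return (sum(r for r, _ in cells) / n, sum(c for _, c in cells) / n)
```

For example, a single ray of 1.0 m at angle 0 marks one cell two columns to the right of the grid center, and the center of mass of that grid is simply that cell.&lt;br /&gt;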
&lt;br /&gt;
The PeTra system has been trained using the following data: [http://robotica.unileon.es/~datasets/LegTracking/PeTra_training_dataset/npy_train_test_globales.tar.gz npy_train_test_globales.tar.gz]&lt;br /&gt;
&lt;br /&gt;
=== Recording procedure ===&lt;br /&gt;
&lt;br /&gt;
The data were gathered in 14 different situations. In all of them, Orbi-One stood still as one or more people, carrying a KIO tag, moved around it. Three different locations for Orbi-One were defined (see Figure 1C) resulting in 42 scenarios (14 situations × 3 Orbi-One locations). Figure 2 shows the 14 different recognition scenarios recorded. These scenarios have been chosen according to different situations that may occur in robotics competitions such as ERL or RoboCup.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': recognition scenarios recorded, as shown in Álvarez-Aparicio et al. (2017).]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
A rosbag file was created for each scenario (except for situations 3, 12, and 13, where 3 rosbag files were recorded; situation 4, where 4 rosbag files were recorded; and situation 9, where 5 rosbag files were recorded), recording LIDAR sensor measurements, location estimates from PeTra and LD, locations from KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* LIDAR sensor data. Data from LIDAR sensors are provided as ROS LaserScan Messages, which include, among other information, the following: acquisition time of the first ray in the scan, start/end angle of the scan, angular distance between measurements, and range data.&lt;br /&gt;
* PeTra location estimates provided as ROS PointStamped Messages, which include a position [x, y, z] and a timestamp.&lt;br /&gt;
* Location estimates calculated by LD. It publishes data for individual legs as ROS PositionMeasurementArray Messages. It also attempts to pair the legs and publishes their average, an estimate of where the center of a person is, as a ROS PositionMeasurement Message.&lt;br /&gt;
* Locations from KIO RTLS, also provided as ROS PointStamped Messages.&lt;br /&gt;
* Messages from /map, /odom, and /tf ROS topics, which include map information, odometry of the robot base, and transformation information, respectively.&lt;br /&gt;
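The rosbag API itself is ROS-specific, but the kind of evaluation this dataset enables can be sketched in plain Python: pair each tracker estimate with the ground-truth KIO reading nearest in time, then measure the Euclidean error. The record format and sample values below are illustrative assumptions, not data from the bags:&lt;br /&gt;

```python
import math

# Illustrative records: (timestamp_sec, x, y) tuples, as one might extract
# from the ROS PointStamped messages stored in the bag files.
petra = [(0.00, 1.00, 2.00), (0.50, 1.10, 2.05)]  # tracker estimates
kio   = [(0.01, 1.02, 1.98), (0.49, 1.08, 2.10)]  # ground truth

def nearest(records, t):
    """Record whose timestamp is closest to t (time-matching step)."""
    return min(records, key=lambda rec: abs(rec[0] - t))

def mean_error(estimates, ground_truth):
    """Mean Euclidean distance between estimates and time-matched truth."""
    errors = []
    for t, x, y in estimates:
        _, gx, gy = nearest(ground_truth, t)
        errors.append(math.hypot(x - gx, y - gy))
    return sum(errors) / len(errors)
```

With the sample values above, `mean_error(petra, kio)` comes out at roughly 0.04 m; a real evaluation would iterate over the full message streams of a bag and might also interpolate between ground-truth readings rather than snapping to the nearest one.&lt;br /&gt;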
&lt;br /&gt;
The different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Nov-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
{|class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align:center;&amp;quot;| Location 1. Kitchen&lt;br /&gt;
! Location 2. Living room&lt;br /&gt;
! Location 3. Bedroom&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_4.bag test_04_4.bag]: '''duration:''' 01:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_07.bag test_07.bag]: '''duration:''' 04:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:''' 9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;br /&gt;
|&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_01.bag salon_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_02.bag salon_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_03.bag salon_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_04.bag salon_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_05.bag salon_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_06.bag salon_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_07.bag salon_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_08.bag salon_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_09.bag salon_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_10.bag salon_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_11.bag salon_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_12.bag salon_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_13.bag salon_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_14.bag salon_14.bag]&lt;br /&gt;
|&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_01.bag habitacion_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_02.bag habitacion_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_03.bag habitacion_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_04.bag habitacion_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_05.bag habitacion_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_06.bag habitacion_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_07.bag habitacion_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_08.bag habitacion_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_09.bag habitacion_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_10.bag habitacion_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_11.bag habitacion_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_12.bag habitacion_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_13.bag habitacion_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_14.bag habitacion_14.bag]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_4.bag test_04_4.bag]: '''duration:''' 1:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:'''  9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;br /&gt;
&lt;br /&gt;
====Location 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_01.bag test_01.bag]: '''duration:''' 15:37s, '''size:''' 239.7 MB, '''start date/time:''' Nov 16, 2017 12:20:19.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_02.bag test_02.bag]: '''duration:''' 15:05s, '''size:''' 231.4 MB, '''start date/time:''' Nov 27, 2017 18:32:24.35 &lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_1.bag test_03_1.bag]: '''duration:''' 55.2s, '''size:''' 14.5 MB, '''start date/time:''' Nov 16, 2017 12:43:58.29 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_2.bag test_03_2.bag]: '''duration:''' 44.4s, '''size:''' 10.2 MB, '''start date/time:''' Nov 16, 2017 12:46:31.99&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_3.bag test_03_3.bag]: '''duration:''' 51.1s, '''size:''' 13.4 MB, '''start date/time:''' Nov 16, 2017 12:47:29.66&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_1.bag test_04_1.bag]: '''duration:''' 01:23s, '''size:''' 21.6 MB, '''start date/time:''' Nov 16, 2017 1:50:22.18 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_2.bag test_04_2.bag]: '''duration:''' 01:20s, '''size:''' 21.0 MB, '''start date/time:''' Nov 16, 2017 12:52:18.44&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_3.bag test_04_3.bag]: '''duration:''' 01:09s, '''size:''' 18.1 MB, '''start date/time:''' Nov 16, 2017 12:53:51.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_4.bag test_04_4.bag]: '''duration:''' 1:21s, '''size:''' 21.3 MB, '''start date/time:''' Nov 16, 2017 12:55:31.81&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_05.bag test_05.bag]: '''duration:''' 15:07s, '''size:''' 232.4 MB, '''start date/time:''' Nov 28, 2017 18:33:16.24&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_06.bag test_06.bag]: '''duration:''' 09:14s, '''size:''' 136.2 MB, '''start date/time:''' Nov 29, 2017 13:12:42.61&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_07.bag test_07.bag]: '''duration:''' 4:39s, '''size:''' 71.5 MB, '''start date/time:''' Nov 22, 2017 12:48:33.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_08.bag test_08.bag]: '''duration:''' 03:04s, '''size:''' 47.2 MB, '''start date/time:''' Nov 22, 2017 12:59:08.58&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_1.bag test_09_1.bag]: '''duration:''' 28.8s, '''size:''' 7.7 MB, '''start date/time:''' Nov 28, 2017 17:17:46.49 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_2.bag test_09_2.bag]: '''duration:''' 31.9s, '''size:''' 8.4 MB, '''start date/time:''' Nov 28, 2017 17:20:48.26&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Nov 28, 2017 17:20:05.96&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_4.bag test_09_4.bag]: '''duration:''' 29.9s, '''size:''' 7.9 MB, '''start date/time:''' Nov 28, 2017 17:19:26.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_5.bag test_09_5.bag]: '''duration:''' 30.8s, '''size:''' 8.1 MB, '''start date/time:''' Nov 28, 2017 17:21:20.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_10.bag test_10.bag]: '''duration:''' 16:05s, '''size:''' 246.6 MB, '''start date/time:''' Nov 22, 2017 11:55:02.39&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_11.bag test_11.bag]: '''duration:''' 03:25s, '''size:''' 52.8 MB, '''start date/time:''' Nov 22, 2017 13:08:11.99&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_1.bag test_12_1.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:55:26.25&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_2.bag test_12_2.bag]: '''duration:''' 1:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:56:52.47&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_3.bag test_12_3.bag]: '''duration:''' 50.2s, '''size:'''  13.1 MB, '''start date/time:''' Nov 16, 2017 11:44:00.10&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_1.bag test_13_1.bag]: '''duration:''' 58.2s, '''size:''' 15.1 MB, '''start date/time:''' Nov 22, 2017 11:48:35.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_2.bag test_13_2.bag]: '''duration:''' 01:05s, '''size:''' 16.9 MB, '''start date/time:''' Nov 22, 2017 11:49:48.18&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_3.bag test_13_3.bag]: '''duration:''' 01:14s, '''size:''' 19.2 MB, '''start date/time:''' Nov 22, 2017 11:51:22.68&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_14.bag test_14.bag]: '''duration:''' 06:17s, '''size:''' 96.3 MB, '''start date/time:''' Nov 22, 2017 12:23:07.92&lt;br /&gt;
&lt;br /&gt;
====Location 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_01.bag test_01.bag]: '''duration:''' 15:17s, '''size:''' 234.6 MB, '''start date/time:''' Nov 21, 2017 18:56:54.87&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 17:37:34.08&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_1.bag test_03_1.bag]: '''duration:''' 44.7s, '''size:''' 11.7 MB, '''start date/time:''' Nov 21, 2017 13:16:51.65&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_2.bag test_03_2.bag]: '''duration:''' 39.9s, '''size:''' 10.5 MB, '''start date/time:''' Nov 21, 2017 19:19:06.27&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_3.bag test_03_3.bag]: '''duration:''' 41.2s, '''size:''' 10.8 MB, '''start date/time:''' Nov 21, 2017 19:18:06.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_1.bag test_04_1.bag]: '''duration:''' 01:01s, '''size:''' 16.0 MB, '''start date/time:''' Nov 21, 2017 19:21:30.99 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_2.bag test_04_2.bag]: '''duration:''' 01:01s, '''size:''' 15.9 MB, '''start date/time:''' Nov 21, 2017 19:25:38.00&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_3.bag test_04_3.bag]: '''duration:''' 58.0s, '''size:''' 15.1 MB, '''start date/time:''' Nov 21, 2017 19:27:16.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_4.bag test_04_4.bag]: '''duration:''' 1:00s, '''size:''' 15.8 MB, '''start date/time:''' Nov 21, 2017 19:29:39.28&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_05.bag test_05.bag]: '''duration:''' 15:11s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 19:05:20.19&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_06.bag test_06.bag]: '''duration:''' 09:03s, '''size:''' 133.6 MB, '''start date/time:''' Nov 26, 2017 13:24:25.64&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Nov 27, 2017 17:34:50.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_08.bag test_08.bag]: '''duration:''' 03:47s, '''size:''' 58.3 MB, '''start date/time:''' Nov 27, 2017 17:41:48.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_1.bag test_09_1.bag]: '''duration:''' 31.3s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:23:58.31 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_2.bag test_09_2.bag]: '''duration:''' 30.4s, '''size:''' 8.0 MB, '''start date/time:''' Nov 28, 2017 17:24:41.09&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_3.bag test_09_3.bag]: '''duration:''' 31.2s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:25:22.22&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_4.bag test_09_4.bag]: '''duration:''' 31.4s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:26:06.98&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_5.bag test_09_5.bag]: '''duration:''' 33.2s, '''size:''' 8.8 MB, '''start date/time:''' Nov 28, 2017 17:26:51.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_10.bag test_10.bag]: '''duration:''' 10:00s, '''size:''' 153.7 MB, '''start date/time:''' Nov 22, 2017 18:15:29.53&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 59.1 MB, '''start date/time:''' Nov 22, 2017 18:25:54.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_1.bag test_12_1.bag]: '''duration:''' 41.3s, '''size:''' 10.8 MB, '''start date/time:''' Nov 22, 2017 18:35:11.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_2.bag test_12_2.bag]: '''duration:''' 41.8s, '''size:''' 10.9 MB, '''start date/time:''' Nov 22, 2017 18:36:12.11&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_3.bag test_12_3.bag]: '''duration:''' 39.8s, '''size:'''  10.5 MB, '''start date/time:''' Nov 22, 2017 18:37:05.06&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_1.bag test_13_1.bag]: '''duration:''' 01:01s, '''size:''' 14.9 MB, '''start date/time:''' Nov 22, 2017 18:40:13.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_2.bag test_13_2.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:41:28.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_3.bag test_13_3.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:42:39.52&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_14.bag test_14.bag]: '''duration:''' 06:12s, '''size:''' 95.1 MB, '''start date/time:''' Nov 23, 2017 12:53:06.48&lt;br /&gt;
&lt;br /&gt;
=== v2.0 [Feb-2018] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a second version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
{|class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align:center;&amp;quot;| Location 1. Kitchen&lt;br /&gt;
! Location 2. Living room&lt;br /&gt;
! Location 3. Bedroom&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_01.bag cocina_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_02.bag cocina_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_03.bag cocina_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_04.bag cocina_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_05.bag cocina_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_06.bag cocina_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_07.bag cocina_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_08.bag cocina_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_09.bag cocina_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_10.bag cocina_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_11.bag cocina_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_12.bag cocina_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_13.bag cocina_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_14.bag cocina_14.bag]&lt;br /&gt;
|&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_01.bag salon_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_02.bag salon_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_03.bag salon_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_04.bag salon_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_05.bag salon_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_06.bag salon_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_07.bag salon_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_08.bag salon_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_09.bag salon_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_10.bag salon_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_11.bag salon_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_12.bag salon_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_13.bag salon_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_14.bag salon_14.bag]&lt;br /&gt;
|&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_01.bag habitacion_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_02.bag habitacion_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_03.bag habitacion_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_04.bag habitacion_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_05.bag habitacion_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_06.bag habitacion_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_07.bag habitacion_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_08.bag habitacion_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_09.bag habitacion_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_10.bag habitacion_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_11.bag habitacion_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_12.bag habitacion_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_13.bag habitacion_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_14.bag habitacion_14.bag]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5222</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5222"/>
				<updated>2018-02-13T18:00:27Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: /* Leg Detector (D) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset that can be used to evaluate the performance of different approaches for detecting and tracking people using LIDAR sensors. The information contained in the dataset is especially suitable as training data for neural-network-based classifiers.&lt;br /&gt;
&lt;br /&gt;
Further information is available in [https://www.frontiersin.org/articles/10.3389/fnbot.2017.00072/full Álvarez-Aparicio et al. (2017)].&lt;br /&gt;
&lt;br /&gt;
== Materials and methods ==&lt;br /&gt;
&lt;br /&gt;
The following section describes the materials (shown in Figure 1) used to gather data: a certified study area, an autonomous robot with an on-board LIDAR sensor, and a real-time location system (RTLS) providing ground-truth data on person location. Recorded data include location estimates calculated by two people trackers, Leg Detector (LD) and PeTra, both described below. Finally, the recording procedure used to build the dataset is explained.&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.jpg|frame|center|'''Fig. 1''': Materials used in the experiments. (A) Orbi-One robot. (B) Leon@Home Testbed plan. (C) Floor plan of the apartment. Red dots show the location of KIO anchors. Black numbered dots denote Orbi-One locations during the data gathering. (D) KIO anchors. (E) General view of the apartment. (F) Furniture in mock-up apartment.]]&lt;br /&gt;
&lt;br /&gt;
=== Leon@Home Testbed ===&lt;br /&gt;
&lt;br /&gt;
Data have been gathered at Leon@Home Testbed, a Certified Testbed of the European Robotics League (ERL). Its main purpose is to benchmark service robots in a realistic home environment. Our testbed is made up of four parts, shown in Figure 1B: a mock-up apartment, a control zone with direct vision (through a glass wall) into the apartment, a small workshop, and a larger development zone where researchers work.&lt;br /&gt;
&lt;br /&gt;
Leon@Home Testbed is located on the second floor of the Módulo de Investigación en Cibernética (Building for Research in Cybernetics) on the Vegazana Campus of the University of León (Spain). The apartment is a single bedroom mock-up home built in an 8 m × 7 m space. Figure 1C shows a plan of the apartment. 60 cm high walls divide it into a kitchen, living room, bathroom, and bedroom. The furniture (Figures 1E,F) has been chosen to test different robot abilities. For instance, the kitchen cabinets all have different types of handles.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One (Figure 1A) is an assistant robot manufactured by Robotnik. It has several sensors, among them an RGBD camera, a LIDAR sensor, and an inertial unit. It can operate a manipulator arm attached to its torso and has a wheeled base for moving around the room. Orbi-One includes a wireless access point, which allows WiFi communication with other robots and computers.&lt;br /&gt;
&lt;br /&gt;
The software that controls the robot hardware is based on the ROS framework. ROS is essentially a set of robotics libraries providing operating-system-like services: hardware abstraction for sensors and actuators, low-level device control, and inter-process communication. Computation takes place in processes named Nodes, which can receive and send Messages. Nodes publish Messages into information buffers called Topics.&lt;br /&gt;
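&lt;br /&gt;
As an illustration only (this is not the actual ROS API), the publish/subscribe pattern behind Nodes, Topics, and Messages can be sketched in a few lines of plain Python:&lt;br /&gt;

```python
from collections import defaultdict

class TopicBus:
    """Toy publish/subscribe bus illustrating the Node/Topic/Message model."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback, as a subscriber Node would."""
        self._subs[topic].append(callback)

    def publish(self, topic, message):
        """Deliver a Message to every callback subscribed to the Topic."""
        for cb in self._subs[topic]:
            cb(message)

bus = TopicBus()
received = []
bus.subscribe("/scan", received.append)       # a consumer node
bus.publish("/scan", {"ranges": [1.2, 1.3]})  # the LIDAR driver node
print(received)  # [{'ranges': [1.2, 1.3]}]
```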
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
In order to acquire ground-truth data on person location in the study area, an RTLS for indoor environments is needed. The commercial KIO RTLS by Eliko has been used. KIO is a precise RTLS for tracking any object in 2- or 3-dimensional space. Its Ultra-Wideband technology enables micro-positioning of objects through obstructions, and KIO works in non-line-of-sight conditions, both indoors and outdoors.&lt;br /&gt;
&lt;br /&gt;
KIO comes in two main configurations. According to the manufacturer’s specifications, the Regular Cell configuration guarantees a reliable accuracy of ±30 cm, while the Small Cell configuration, designed for location-critical applications, provides a reliable ±5 cm accuracy. Calibration performed by the authors in the mock-up apartment shows that the error is higher in some areas and lower in others, but that on average the manufacturer’s claims hold.&lt;br /&gt;
&lt;br /&gt;
KIO calculates the position of a mobile transceiver, called a Tag, using radio beacons, called Anchors, placed at known positions in the surroundings. Figure 1D shows a KIO anchor. KIO tags are the same size as the anchors and must be carried by the tracked subject, in our case a person. The red dots in Figure 1C show the locations of the six anchors used in these experiments; they are mounted on the ceiling. The anchor distribution was chosen following the method described in Guerrero-Higueras et al. (2017).&lt;br /&gt;
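&lt;br /&gt;
An RTLS of this kind typically estimates the tag position by multilateration from the anchor distances. The following sketch (not Eliko’s actual algorithm; the anchor layout and tag position below are hypothetical) shows a least-squares 2-D multilateration in plain Python:&lt;br /&gt;

```python
import math

def trilaterate(anchors, dists):
    """Least-squares 2-D position from three or more anchor/distance pairs.
    Subtracting the first range equation from the others linearizes the
    problem: 2*(a_i - a_0) . p = d_0^2 - d_i^2 + |a_i|^2 - |a_0|^2."""
    (x0, y0), d0 = anchors[0], dists[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(anchors[1:], dists[1:]):
        rows.append((2 * (xi - x0), 2 * (yi - y0)))
        rhs.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    # Solve the 2x2 normal equations (A^T A) p = A^T b by hand
    axx = sum(r[0] * r[0] for r in rows)
    axy = sum(r[0] * r[1] for r in rows)
    ayy = sum(r[1] * r[1] for r in rows)
    bx = sum(r[0] * v for r, v in zip(rows, rhs))
    by = sum(r[1] * v for r, v in zip(rows, rhs))
    det = axx * ayy - axy * axy
    return ((ayy * bx - axy * by) / det, (axx * by - axy * bx) / det)

# Hypothetical anchors in the corners of an 8 m x 7 m area
anchors = [(0.0, 0.0), (8.0, 0.0), (0.0, 7.0), (8.0, 7.0)]
tag = (3.0, 2.0)
dists = [math.hypot(tag[0] - x, tag[1] - y) for x, y in anchors]
print(trilaterate(anchors, dists))  # approximately (3.0, 2.0)
```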
&lt;br /&gt;
=== Leg Detector (LD) ===&lt;br /&gt;
&lt;br /&gt;
LD is a ROS package that takes the messages published by a LIDAR sensor as input and uses a machine-learning-trained classifier to detect groups of laser readings as possible legs. The code is available in a public repository but is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
LD publishes the locations of the individual legs. It can also attempt to pair the legs and publish their average as an estimate of where the center of a person is. Optionally, LD can publish visualization Marker messages indicating where detections occurred.&lt;br /&gt;
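&lt;br /&gt;
As a minimal illustration (not LD’s actual code), once two detections are assumed to be the legs of one person, the person-center estimate is simply their midpoint:&lt;br /&gt;

```python
def person_centre(leg_a, leg_b):
    """Estimate a person's position as the midpoint of two paired legs,
    each given as an (x, y) point in the sensor frame."""
    return ((leg_a[0] + leg_b[0]) / 2, (leg_a[1] + leg_b[1]) / 2)

# Two legs detected 1 m ahead, 0.4 m apart
print(person_centre((1.0, 0.2), (1.0, -0.2)))  # (1.0, 0.0)
```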
&lt;br /&gt;
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a people detection and tracking tool developed by the Robotics Group at the University of León. The system is based on a Convolutional Neural Network (CNN) whose configuration follows the U-Net architecture by Ronneberger et al. (2015).&lt;br /&gt;
&lt;br /&gt;
The system performs the following steps in real time:&lt;br /&gt;
&lt;br /&gt;
1. First, the data provided by the LIDAR sensor are processed to build a two-dimensional occupancy map centered on the robot. This occupancy map is represented as a binary matrix, where 1s denote positions where the LIDAR scan found an obstacle, and 0s denote positions that the scan either passed through without detecting an obstacle or did not reach.&lt;br /&gt;
&lt;br /&gt;
2. Then, the occupancy map is fed to the network as input. The network produces a second occupancy map representing the zones where legs have been detected.&lt;br /&gt;
&lt;br /&gt;
3. Finally, center-of-mass calculations return the locations of persons. PeTra also publishes locations for the individual legs and Marker messages for visualization.&lt;br /&gt;
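&lt;br /&gt;
Steps 1 and 3 above can be sketched in simplified Python (the grid size, the resolution, and the omission of the network stage in between are assumptions made purely for illustration):&lt;br /&gt;

```python
import math

def occupancy_map(ranges, angle_min, angle_inc, size=64, resolution=0.25):
    """Step 1: binary grid centred on the robot; 1 marks a LIDAR hit.
    Readings are assumed finite and within the grid for simplicity."""
    grid = [[0] * size for _ in range(size)]
    half = size // 2
    for i, r in enumerate(ranges):
        a = angle_min + i * angle_inc  # beam angle in the robot frame
        col = half + int(r * math.cos(a) / resolution)
        row = half + int(r * math.sin(a) / resolution)
        if row in range(size) and col in range(size):
            grid[row][col] = 1
    return grid

def centre_of_mass(grid, resolution=0.25):
    """Step 3: mean metric position (robot frame) of the occupied cells."""
    half = len(grid) // 2
    cells = [(r, c) for r, row in enumerate(grid)
             for c, v in enumerate(row) if v]
    x = sum((c - half) * resolution for _, c in cells) / len(cells)
    y = sum((r - half) * resolution for r, _ in cells) / len(cells)
    return x, y

# Two adjacent beams hitting an obstacle about 2 m straight ahead
grid = occupancy_map([2.0, 2.0], angle_min=0.0, angle_inc=0.01)
print(centre_of_mass(grid))
```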
&lt;br /&gt;
The PeTra system has been trained using the following data: [http://robotica.unileon.es/~datasets/LegTracking/PeTra_training_dataset/npy_train_test_globales.tar.gz npy_train_test_globales.tar.gz]&lt;br /&gt;
&lt;br /&gt;
=== Recording procedure ===&lt;br /&gt;
&lt;br /&gt;
The data were gathered in 14 different situations. In all of them, Orbi-One stood still as one or more people, each carrying a KIO tag, moved around it. Three different locations for Orbi-One were defined (see Figure 1C), resulting in 42 scenarios (14 situations × 3 Orbi-One locations). Figure 2 shows the 14 recognition scenarios recorded. These scenarios were chosen to reflect situations that may occur in robotics competitions such as ERL or RoboCup.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': recognition scenarios recorded, as shown in Álvarez-Aparicio et al. (2017).]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
A rosbag file was created for each scenario (except for situations 3, 12, and 13, where 3 rosbag files were recorded; situation 4, where 4 rosbag files were recorded; and situation 9, where 5 rosbag files were recorded), recording LIDAR sensor measurements, location estimates from PeTra and LD, locations from KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* LIDAR sensor data. Data from LIDAR sensors are provided as ROS LaserScan Messages, which include, among other information, the following: acquisition time of the first ray in the scan, start/end angle of the scan, angular distance between measurements, and range data.&lt;br /&gt;
* PeTra location estimates provided as ROS PointStamped Messages, which include a position [x, y, z] and a timestamp.&lt;br /&gt;
* Location estimates calculated by LD, published for individual legs as ROS PositionMeasurementArray Messages. LD also attempts to pair the legs and publishes their average (an estimate of the center of a person) as a ROS PositionMeasurement Message.&lt;br /&gt;
* Locations provided by KIO RTLS, also as ROS PointStamped Messages.&lt;br /&gt;
* Messages from /map, /odom, and /tf ROS topics, which include map information, odometry of the robot base, and transformation information, respectively.&lt;br /&gt;
&lt;br /&gt;
The different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Nov-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_4.bag test_04_4.bag]: '''duration:''' 1:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:'''  9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;br /&gt;
&lt;br /&gt;
====Location 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_01.bag test_01.bag]: '''duration:''' 15:37s, '''size:''' 239.7 MB, '''start date/time:''' Nov 16, 2017 12:20:19.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_02.bag test_02.bag]: '''duration:''' 15:05s, '''size:''' 231.4 MB, '''start date/time:''' Nov 27, 2017 18:32:24.35 &lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_1.bag test_03_1.bag]: '''duration:''' 55.2s, '''size:''' 14.5 MB, '''start date/time:''' Nov 16, 2017 12:43:58.29 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_2.bag test_03_2.bag]: '''duration:''' 44.4s, '''size:''' 10.2 MB, '''start date/time:''' Nov 16, 2017 12:46:31.99&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_3.bag test_03_3.bag]: '''duration:''' 51.1s, '''size:''' 13.4 MB, '''start date/time:''' Nov 16, 2017 12:47:29.66&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_1.bag test_04_1.bag]: '''duration:''' 01:23s, '''size:''' 21.6 MB, '''start date/time:''' Nov 16, 2017 12:50:22.18 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_2.bag test_04_2.bag]: '''duration:''' 01:20s, '''size:''' 21.0 MB, '''start date/time:''' Nov 16, 2017 12:52:18.44&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_3.bag test_04_3.bag]: '''duration:''' 01:09s, '''size:''' 18.1 MB, '''start date/time:''' Nov 16, 2017 12:53:51.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_4.bag test_04_4.bag]: '''duration:''' 1:21s, '''size:''' 21.3 MB, '''start date/time:''' Nov 16, 2017 12:55:31.81&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_05.bag test_05.bag]: '''duration:''' 15:07s, '''size:''' 232.4 MB, '''start date/time:''' Nov 28 2017 18:33:16.24&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_06.bag test_06.bag]: '''duration:''' 09:14s, '''size:''' 136.2 MB, '''start date/time:''' Nov 29 2017 13:12:42.61 &lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_07.bag test_07.bag]: '''duration:''' 4:39s, '''size:''' 71.5 MB, '''start date/time:''' Nov 22 2017 12:48:33.86 &lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_08.bag test_08.bag]: '''duration:''' 03:04s, '''size:''' 47.2 MB, '''start date/time:''' Nov 22 2017 12:59:08.58&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_1.bag test_09_1.bag]: '''duration:''' 28.8s, '''size:''' 7.7 MB, '''start date/time:''' Nov 28, 2017 17:17:46.49 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_2.bag test_09_2.bag]: '''duration:''' 31.9s, '''size:''' 8.4 MB, '''start date/time:''' Nov 28, 2017 17:20:48.26&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Nov 28, 2017 17:20:05.96&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_4.bag test_09_4.bag]: '''duration:''' 29.9s, '''size:''' 7.9 MB, '''start date/time:''' Nov 28, 2017 17:19:26.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_5.bag test_09_5.bag]: '''duration:''' 30.8s, '''size:''' 8.1 MB, '''start date/time:''' Nov 28, 2017 17:21:20.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_10.bag test_10.bag]: '''duration:''' 16:05s, '''size:''' 246.6 MB, '''start date/time:''' Nov 22 2017 11:55:02.39&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_11.bag test_11.bag]: '''duration:''' 03:25s, '''size:''' 52.8 MB, '''start date/time:''' Nov 22 2017 13:08:11.99&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_1.bag test_12_1.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:55:26.25&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_2.bag test_12_2.bag]: '''duration:''' 1:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:56:52.47&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_3.bag test_12_3.bag]: '''duration:''' 50.2s, '''size:'''  13.1 MB, '''start date/time:''' Nov 16, 2017 11:44:00.10&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_1.bag test_13_1.bag]: '''duration:''' 58.2s, '''size:''' 15.1 MB, '''start date/time:''' Nov 22, 2017 11:48:35.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_2.bag test_13_2.bag]: '''duration:''' 01:05s, '''size:''' 16.9 MB, '''start date/time:''' Nov 22, 2017 11:49:48.18&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_3.bag test_13_3.bag]: '''duration:''' 01:14s, '''size:''' 19.2 MB, '''start date/time:''' Nov 22, 2017 11:51:22.68&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_14.bag test_14.bag]: '''duration:''' 06:17s, '''size:''' 96.3 MB, '''start date/time:''' Nov 22 2017 12:23:07.92&lt;br /&gt;
&lt;br /&gt;
====Location 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_01.bag test_01.bag]: '''duration:''' 15:17s, '''size:''' 234.6 MB, '''start date/time:''' Nov 21, 2017 18:56:54.87&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 17:37:34.08&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_1.bag test_03_1.bag]: '''duration:''' 44.7s, '''size:''' 11.7 MB, '''start date/time:''' Nov 21, 2017 13:16:51.65&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_2.bag test_03_2.bag]: '''duration:''' 39.9s, '''size:''' 10.5 MB, '''start date/time:''' Nov 21, 2017 19:19:06.27&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_3.bag test_03_3.bag]: '''duration:''' 41.2s, '''size:''' 10.8 MB, '''start date/time:''' Nov 21, 2017 19:18:06.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_1.bag test_04_1.bag]: '''duration:''' 01:01s, '''size:''' 16.0 MB, '''start date/time:''' Nov 21, 2017 19:21:30.99 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_2.bag test_04_2.bag]: '''duration:''' 01:01s, '''size:''' 15.9 MB, '''start date/time:''' Nov 21, 2017 19:25:38.00&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_3.bag test_04_3.bag]: '''duration:''' 58.0s, '''size:''' 15.1 MB, '''start date/time:''' Nov 21, 2017 19:27:16.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_4.bag test_04_4.bag]: '''duration:''' 1:00s, '''size:''' 15.8 MB, '''start date/time:''' Nov 21, 2017 19:29:39.28&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_05.bag test_05.bag]: '''duration:''' 15:11s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 19:05:20.19&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_06.bag test_06.bag]: '''duration:''' 09:03s, '''size:''' 133.6 MB, '''start date/time:''' Nov 26, 2017 13:24:25.64&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Nov 27, 2017 17:34:50.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_08.bag test_08.bag]: '''duration:''' 03:47s, '''size:''' 58.3 MB, '''start date/time:''' Nov 27, 2017 17:41:48.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_1.bag test_09_1.bag]: '''duration:''' 31.3s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:23:58.31 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_2.bag test_09_2.bag]: '''duration:''' 30.4s, '''size:''' 8.0 MB, '''start date/time:''' Nov 28, 2017 17:24:41.09&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_3.bag test_09_3.bag]: '''duration:''' 31.2s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:25:22.22&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_4.bag test_09_4.bag]: '''duration:''' 31.4s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:26:06.98&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_5.bag test_09_5.bag]: '''duration:''' 33.2s, '''size:''' 8.8 MB, '''start date/time:''' Nov 28, 2017 17:26:51.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_10.bag test_10.bag]: '''duration:''' 10:00s, '''size:''' 153.7 MB, '''start date/time:''' Nov 22, 2017 18:15:29.53&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 59.1 MB, '''start date/time:''' Nov 22, 2017 18:25:54.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_1.bag test_12_1.bag]: '''duration:''' 41.3s, '''size:''' 10.8 MB, '''start date/time:''' Nov 22, 2017 18:35:11.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_2.bag test_12_2.bag]: '''duration:''' 41.8s, '''size:''' 10.9 MB, '''start date/time:''' Nov 22, 2017 18:36:12.11&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_3.bag test_12_3.bag]: '''duration:''' 39.8s, '''size:'''  10.5 MB, '''start date/time:''' Nov 22, 2017 18:37:05.06&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_1.bag test_13_1.bag]: '''duration:''' 01:01s, '''size:''' 14.9 MB, '''start date/time:''' Nov 22, 2017 18:40:13.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_2.bag test_13_2.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:41:28.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_3.bag test_13_3.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:42:39.52&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_14.bag test_14.bag]: '''duration:''' 06:12s, '''size:''' 95.1 MB, '''start date/time:''' Nov 23, 2017 12:53:06.48&lt;br /&gt;
&lt;br /&gt;
=== v2.0 [Feb-2018] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a second version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
{|class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align:center;&amp;quot;| Location 1. Kitchen&lt;br /&gt;
! Location 2. Living room&lt;br /&gt;
! Location 3. Bedroom&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_01.bag cocina_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_02.bag cocina_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_03.bag cocina_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_04.bag cocina_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_05.bag cocina_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_06.bag cocina_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_07.bag cocina_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_08.bag cocina_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_09.bag cocina_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_10.bag cocina_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_11.bag cocina_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_12.bag cocina_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_13.bag cocina_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_14.bag cocina_14.bag]&lt;br /&gt;
|&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_01.bag salon_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_02.bag salon_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_03.bag salon_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_04.bag salon_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_05.bag salon_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_06.bag salon_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_07.bag salon_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_08.bag salon_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_09.bag salon_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_10.bag salon_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_11.bag salon_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_12.bag salon_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_13.bag salon_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_14.bag salon_14.bag]&lt;br /&gt;
|&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_01.bag habitacion_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_02.bag habitacion_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_03.bag habitacion_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_04.bag habitacion_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_05.bag habitacion_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_06.bag habitacion_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_07.bag habitacion_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_08.bag habitacion_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_09.bag habitacion_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_10.bag habitacion_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_11.bag habitacion_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_12.bag habitacion_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_13.bag habitacion_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_14.bag habitacion_14.bag]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5221</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5221"/>
				<updated>2018-02-13T17:59:49Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: /* v2.0 [Feb-2018] */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset that can be used to evaluate the performance of different approaches to detecting and tracking people using LIDAR sensors. The information contained in the dataset is especially suitable for use as training data for neural-network-based classifiers. &lt;br /&gt;
&lt;br /&gt;
Further information is available in [https://www.frontiersin.org/articles/10.3389/fnbot.2017.00072/full Álvarez-Aparicio et al. (2017)].&lt;br /&gt;
&lt;br /&gt;
== Materials and methods ==&lt;br /&gt;
&lt;br /&gt;
The following section describes the materials (shown in Figure 1) used to gather the data: a certified study area, an autonomous robot with an on-board LIDAR sensor, and a real-time location system (RTLS) providing ground-truth data on person location. Recorded data include location estimates calculated by two people trackers, LD and PeTra, which are also described below. Finally, the recording procedure used to build the dataset is explained.&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.jpg|frame|center|'''Fig. 1''': Materials used in the experiments. (A) Orbi-One robot. (B) Leon@Home Testbed plan. (C) Floor plan of the apartment. Red dots show the location of KIO anchors. Black numbered dots denote Orbi-One locations during the data gathering. (D) KIO anchors. (E) General view of the apartment. (F) Furniture in mock-up apartment.]]&lt;br /&gt;
&lt;br /&gt;
=== Leon@Home Testbed ===&lt;br /&gt;
&lt;br /&gt;
Data have been gathered at Leon@Home Testbed, a Certified Testbed of the European Robotics League (ERL). Its main purpose is to benchmark service robots in a realistic home environment. Our testbed is made up of four parts, shown in Figure 1B: a mock-up apartment, a control zone with direct vision (through a glass wall) into the apartment, a small workshop, and a larger development zone where researchers work.&lt;br /&gt;
&lt;br /&gt;
Leon@Home Testbed is located on the second floor of the Módulo de Investigación en Cibernética (Building for Research in Cybernetics) on the Vegazana Campus of the University of León (Spain). The apartment is a single-bedroom mock-up home built in an 8 m × 7 m space. Figure 1C shows a plan of the apartment. Walls 60 cm high divide it into a kitchen, living room, bathroom, and bedroom. The furniture (Figures 1E,F) has been chosen to test different robot abilities. For instance, the kitchen cabinets all have different types of handles.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-one ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One (Figure 1A) is an assistant robot manufactured by Robotnik. It has several sensors, among them an RGBD camera, a LIDAR sensor, and an inertial unit. It can operate a manipulator arm attached to its torso and has a wheeled base for moving around the room. Orbi-One includes a wireless access point, which allows WiFi communication with other robots and computers.&lt;br /&gt;
&lt;br /&gt;
The software controlling the robot hardware is based on the ROS framework. ROS is essentially a set of robotics libraries providing operating-system-like services: hardware abstraction for sensors and actuators, low-level device control, and inter-process communication. Computation takes place in processes named Nodes, which can receive and send Messages. Nodes publish Messages into information buffers called Topics.&lt;br /&gt;
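The publish/subscribe pattern behind Nodes, Messages, and Topics can be illustrated with a toy dispatcher in plain Python (this is not ROS code; `TopicBus` and the `/scan` topic name are purely illustrative):&lt;br /&gt;

```python
from collections import defaultdict

class TopicBus:
    """Toy publish/subscribe bus mimicking how ROS Nodes exchange
    Messages through named Topics."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        # A subscribing node registers a callback for a topic.
        self._subs[topic].append(callback)

    def publish(self, topic, message):
        # A publishing node delivers the message to every subscriber.
        for cb in self._subs[topic]:
            cb(message)

bus = TopicBus()
received = []
bus.subscribe("/scan", received.append)           # a node listening for LIDAR data
bus.publish("/scan", {"ranges": [1.2, 1.1, 0.9]})  # a node publishing a scan
```

In ROS itself, the middleware delivers Messages between separate processes (possibly on different machines); the toy above only shows the decoupling between publishers and subscribers.&lt;br /&gt;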
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
In order to acquire ground-truth data about person location in the study area, an RTLS for indoor environments is needed. The KIO RTLS, a commercial solution by Eliko, has been used. KIO is a precise RTLS for tracking any object in 2- or 3-dimensional space. Its Ultra-Wideband technology enables micro-positioning of objects through obstructions; KIO also works in non-line-of-sight conditions, both indoors and outdoors.&lt;br /&gt;
&lt;br /&gt;
KIO comes in two main configurations. The Regular Cell configuration guarantees a reliable accuracy of ±30 cm, according to the manufacturer’s specifications. The Small Cell configuration is designed for location-critical applications and provides a reliable ±5 cm accuracy, again according to the manufacturer’s specifications. A calibration performed by the authors in the mock-up apartment shows that the error is higher in some areas and lower in others but, on average, matches the manufacturer’s claims.&lt;br /&gt;
&lt;br /&gt;
KIO calculates the position of a mobile transceiver, called a Tag, using radio beacons, called Anchors, placed at known positions in the surroundings. Figure 1D shows a KIO anchor. KIO tags are the same size as the anchors and must be carried by the tracked subject, in our case people. The red dots in Figure 1C show the locations of the six anchors used in these experiments; they are placed on the ceiling. The distribution of the anchors was chosen following the method described in Guerrero-Higueras et al. (2017).&lt;br /&gt;
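As an illustration of how a tag position can be recovered from ranges to known anchors, the sketch below solves the 2-D multilateration problem by linearizing the range equations, a standard least-squares textbook approach. KIO’s actual positioning algorithm is proprietary and may differ, and the anchor coordinates and ranges here are invented for the example:&lt;br /&gt;

```python
import math

def multilaterate_2d(anchors, dists):
    """Estimate (x, y) from ranges to three anchors at known positions.

    Linearizes the circle equations by subtracting the first one,
    then solves the resulting 2x2 linear system with Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    # Linear system A [x, y]^T = b obtained from the subtraction.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det,
            (a11 * b2 - a21 * b1) / det)

# Hypothetical anchors at three corners of an 8 m x 7 m room;
# the tag is actually at (3, 2), and the ranges are exact.
anchors = [(0.0, 0.0), (8.0, 0.0), (0.0, 7.0)]
dists = [math.sqrt(13), math.sqrt(29), math.sqrt(34)]
x, y = multilaterate_2d(anchors, dists)
```

With three anchors the linearized system is exactly determined; with more anchors the same equations would be solved in a least-squares sense.&lt;br /&gt;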
&lt;br /&gt;
=== Leg Detector (LD) ===&lt;br /&gt;
&lt;br /&gt;
LD is a ROS package that takes messages published by a LIDAR sensor as input and uses a machine-learning classifier to detect groups of laser readings as possible legs. The code is available in a public repository but is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
LD publishes the locations of the individual legs. It can also attempt to pair the legs and publish their average as an estimate of where the center of a person is. LD may optionally publish visualization Marker messages to indicate where detections occurred.&lt;br /&gt;
&lt;br /&gt;
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a tool for detecting and tracking people, developed by the Robotics Group at the University of León. The system is based on a Convolutional Neural Network (CNN) whose configuration follows the U-Net architecture by Ronneberger et al. (2015).&lt;br /&gt;
&lt;br /&gt;
The system performs the following steps in real time:&lt;br /&gt;
&lt;br /&gt;
1. First, the data provided by the LIDAR sensor are processed to build a two-dimensional occupancy map centered on the robot. This occupancy map is represented as a binary matrix, where 1s denote positions where the LIDAR scan found an obstacle, and 0s denote positions that the scan either passed through without detecting an obstacle or did not reach.&lt;br /&gt;
&lt;br /&gt;
2. Then, the occupancy map is relayed to the network as input data. The network produces a second occupancy map representing the zones where legs have been detected.&lt;br /&gt;
&lt;br /&gt;
3. Finally, a center-of-mass calculation over the detected zones returns the location of each person. PeTra also publishes locations for the individual legs and Marker messages for visualization.&lt;br /&gt;
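Steps 1 and 3 above can be sketched in plain Python. The grid size and resolution below are illustrative choices, and step 2 (the CNN) is omitted, so the sketch feeds the raw occupancy map straight into the center-of-mass computation:&lt;br /&gt;

```python
import math

def occupancy_map(ranges, angle_min, angle_inc, size=8.0, res=0.1):
    """Step 1: rasterize a LIDAR scan into a binary occupancy grid
    centered on the robot (1 = obstacle hit, 0 = free or unknown).
    Grid size (m) and resolution (m/cell) are illustrative values."""
    n = int(size / res)
    grid = [[0] * n for _ in range(n)]
    for i, r in enumerate(ranges):
        a = angle_min + i * angle_inc
        col = int(round((r * math.cos(a) + size / 2) / res))
        row = int(round((r * math.sin(a) + size / 2) / res))
        if 0 <= row < n and 0 <= col < n:
            grid[row][col] = 1
    return grid

def center_of_mass(mask, size=8.0, res=0.1):
    """Step 3: the person location is the centroid of the cells
    marked as legs, converted back to robot-centered metric coords."""
    cells = [(r, c) for r, row in enumerate(mask)
             for c, v in enumerate(row) if v]
    cr = sum(r for r, _ in cells) / len(cells)
    cc = sum(c for _, c in cells) / len(cells)
    return cc * res - size / 2, cr * res - size / 2

# Single ray straight ahead, 1 m away:
grid = occupancy_map([1.0], 0.0, 0.0)
x, y = center_of_mass(grid)
```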
&lt;br /&gt;
The PeTra system has been trained using the following data: [http://robotica.unileon.es/~datasets/LegTracking/PeTra_training_dataset/npy_train_test_globales.tar.gz npy_train_test_globales.tar.gz]&lt;br /&gt;
&lt;br /&gt;
=== Recording procedure ===&lt;br /&gt;
&lt;br /&gt;
The data were gathered in 14 different situations. In all of them, Orbi-One stood still while one or more people, carrying a KIO tag, moved around it. Three different locations for Orbi-One were defined (see Figure 1C), resulting in 42 scenarios (14 situations × 3 Orbi-One locations). Figure 2 shows the 14 recognition scenarios recorded. These scenarios were chosen to reflect situations that may occur in robotics competitions such as ERL or RoboCup.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': Recognition scenarios recorded, as shown in Álvarez-Aparicio et al. (2017).]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
A rosbag file was created for each scenario (except for situations 3, 12, and 13, where 3 rosbag files were recorded, situation 4, where 4 rosbag files were recorded, and situation 9, where 5 rosbag files were recorded), recording LIDAR sensor measurements, location estimates from PeTra and LD, locations from KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* LIDAR sensor data. Data from LIDAR sensors are provided as ROS LaserScan Messages, which include, among other information, the following: acquisition time of the first ray in the scan, start/end angle of the scan, angular distance between measurements, and range data.&lt;br /&gt;
* PeTra location estimates provided as ROS PointStamped Messages, which include a position [x, y, z] and a timestamp.&lt;br /&gt;
* Location estimates calculated by LD, which publishes data for individual legs as ROS PositionMeasurementArray Messages, and the paired-leg average, an estimate of where the center of a person is, as a ROS PositionMeasurement Message.&lt;br /&gt;
* Locations provided by KIO RTLS, also as ROS PointStamped Messages.&lt;br /&gt;
* Messages from /map, /odom, and /tf ROS topics, which include map information, odometry of the robot base, and transformation information, respectively.&lt;br /&gt;
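Because the dataset is intended for benchmarking, a typical use of these messages is to compare a tracker’s location estimates against the time-nearest KIO ground-truth locations. The sketch below shows that comparison with messages reduced to plain dicts; the field names loosely mirror the ROS PointStamped layout, and the numbers are invented for the example:&lt;br /&gt;

```python
def nearest(msgs, stamp):
    """Pick the message whose timestamp is closest to `stamp`."""
    return min(msgs, key=lambda m: abs(m["stamp"] - stamp))

def tracking_error(estimates, ground_truth):
    """Mean Euclidean distance between each tracker estimate and the
    time-nearest ground-truth location (2-D, x/y only)."""
    errors = []
    for est in estimates:
        gt = nearest(ground_truth, est["stamp"])
        errors.append(((est["x"] - gt["x"]) ** 2 +
                       (est["y"] - gt["y"]) ** 2) ** 0.5)
    return sum(errors) / len(errors)

# Hypothetical PeTra estimates vs. KIO ground-truth samples:
petra = [{"stamp": 0.0, "x": 1.0, "y": 2.0},
         {"stamp": 1.0, "x": 1.5, "y": 2.0}]
kio = [{"stamp": 0.05, "x": 1.0, "y": 2.1},
       {"stamp": 1.02, "x": 1.5, "y": 2.2}]
err = tracking_error(petra, kio)
```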
&lt;br /&gt;
The different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Nov-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_4.bag test_04_4.bag]: '''duration:''' 1:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:'''  9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;br /&gt;
&lt;br /&gt;
====Location 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_01.bag test_01.bag]: '''duration:''' 15:37s, '''size:''' 239.7 MB, '''start date/time:''' Nov 16, 2017 12:20:19.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_02.bag test_02.bag]: '''duration:''' 15:05s, '''size:''' 231.4 MB, '''start date/time:''' Nov 27, 2017 18:32:24.35 &lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_1.bag test_03_1.bag]: '''duration:''' 55.2s, '''size:''' 14.5 MB, '''start date/time:''' Nov 16, 2017 12:43:58.29 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_2.bag test_03_2.bag]: '''duration:''' 44.4s, '''size:''' 10.2 MB, '''start date/time:''' Nov 16, 2017 12:46:31.99&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_3.bag test_03_3.bag]: '''duration:''' 51.1s, '''size:''' 13.4 MB, '''start date/time:''' Nov 16, 2017 12:47:29.66&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_1.bag test_04_1.bag]: '''duration:''' 01:23s, '''size:''' 21.6 MB, '''start date/time:''' Nov 16, 2017 12:50:22.18 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_2.bag test_04_2.bag]: '''duration:''' 01:20s, '''size:''' 21.0 MB, '''start date/time:''' Nov 16, 2017 12:52:18.44&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_3.bag test_04_3.bag]: '''duration:''' 01:09s, '''size:''' 18.1 MB, '''start date/time:''' Nov 16, 2017 12:53:51.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_4.bag test_04_4.bag]: '''duration:''' 1:21s, '''size:''' 21.3 MB, '''start date/time:''' Nov 16, 2017 12:55:31.81&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_05.bag test_05.bag]: '''duration:''' 15:07s, '''size:''' 232.4 MB, '''start date/time:''' Nov 28 2017 18:33:16.24&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_06.bag test_06.bag]: '''duration:''' 09:14s, '''size:''' 136.2 MB, '''start date/time:''' Nov 29 2017 13:12:42.61 &lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_07.bag test_07.bag]: '''duration:''' 4:39s, '''size:''' 71.5 MB, '''start date/time:''' Nov 22 2017 12:48:33.86 &lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_08.bag test_08.bag]: '''duration:''' 03:04s, '''size:''' 47.2 MB, '''start date/time:''' Nov 22 2017 12:59:08.58&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_1.bag test_09_1.bag]: '''duration:''' 28.8s, '''size:''' 7.7 MB, '''start date/time:''' Nov 28, 2017 17:17:46.49 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_2.bag test_09_2.bag]: '''duration:''' 31.9s, '''size:''' 8.4 MB, '''start date/time:''' Nov 28, 2017 17:20:48.26&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Nov 28, 2017 17:20:05.96&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_4.bag test_09_4.bag]: '''duration:''' 29.9s, '''size:''' 7.9 MB, '''start date/time:''' Nov 28, 2017 17:19:26.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_5.bag test_09_5.bag]: '''duration:''' 30.8s, '''size:''' 8.1 MB, '''start date/time:''' Nov 28, 2017 17:21:20.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_10.bag test_10.bag]: '''duration:''' 16:05s, '''size:''' 246.6 MB, '''start date/time:''' Nov 22 2017 11:55:02.39&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_11.bag test_11.bag]: '''duration:''' 03:25s, '''size:''' 52.8 MB, '''start date/time:''' Nov 22 2017 13:08:11.99&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_1.bag test_12_1.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:55:26.25&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_2.bag test_12_2.bag]: '''duration:''' 1:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:56:52.47&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_3.bag test_12_3.bag]: '''duration:''' 50.2s, '''size:'''  13.1 MB, '''start date/time:''' Nov 16, 2017 11:44:00.10&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_1.bag test_13_1.bag]: '''duration:''' 58.2s, '''size:''' 15.1 MB, '''start date/time:''' Nov 22, 2017 11:48:35.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_2.bag test_13_2.bag]: '''duration:''' 01:05s, '''size:''' 16.9 MB, '''start date/time:''' Nov 22, 2017 11:49:48.18&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_3.bag test_13_3.bag]: '''duration:''' 01:14s, '''size:''' 19.2 MB, '''start date/time:''' Nov 22, 2017 11:51:22.68&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_14.bag test_14.bag]: '''duration:''' 06:17s, '''size:''' 96.3 MB, '''start date/time:''' Nov 22 2017 12:23:07.92&lt;br /&gt;
&lt;br /&gt;
====Location 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_01.bag test_01.bag]: '''duration:''' 15:17s, '''size:''' 234.6 MB, '''start date/time:''' Nov 21, 2017 18:56:54.87&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 17:37:34.08&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_1.bag test_03_1.bag]: '''duration:''' 44.7s, '''size:''' 11.7 MB, '''start date/time:''' Nov 21, 2017 13:16:51.65&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_2.bag test_03_2.bag]: '''duration:''' 39.9s, '''size:''' 10.5 MB, '''start date/time:''' Nov 21, 2017 19:19:06.27&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_3.bag test_03_3.bag]: '''duration:''' 41.2s, '''size:''' 10.8 MB, '''start date/time:''' Nov 21, 2017 19:18:06.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_1.bag test_04_1.bag]: '''duration:''' 01:01s, '''size:''' 16.0 MB, '''start date/time:''' Nov 21, 2017 19:21:30.99 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_2.bag test_04_2.bag]: '''duration:''' 01:01s, '''size:''' 15.9 MB, '''start date/time:''' Nov 21, 2017 19:25:38.00&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_3.bag test_04_3.bag]: '''duration:''' 58.0s, '''size:''' 15.1 MB, '''start date/time:''' Nov 21, 2017 19:27:16.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_4.bag test_04_4.bag]: '''duration:''' 1:00s, '''size:''' 15.8 MB, '''start date/time:''' Nov 21, 2017 19:29:39.28&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_05.bag test_05.bag]: '''duration:''' 15:11s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 19:05:20.19&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_06.bag test_06.bag]: '''duration:''' 09:03s, '''size:''' 133.6 MB, '''start date/time:''' Nov 26, 2017 13:24:25.64&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Nov 27, 2017 17:34:50.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_08.bag test_08.bag]: '''duration:''' 03:47s, '''size:''' 58.3 MB, '''start date/time:''' Nov 27, 2017 17:41:48.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_1.bag test_09_1.bag]: '''duration:''' 31.3s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:23:58.31 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_2.bag test_09_2.bag]: '''duration:''' 30.4s, '''size:''' 8.0 MB, '''start date/time:''' Nov 28, 2017 17:24:41.09&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_3.bag test_09_3.bag]: '''duration:''' 31.2s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:25:22.22&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_4.bag test_09_4.bag]: '''duration:''' 31.4s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:26:06.98&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_5.bag test_09_5.bag]: '''duration:''' 33.2s, '''size:''' 8.8 MB, '''start date/time:''' Nov 28, 2017 17:26:51.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_10.bag test_10.bag]: '''duration:''' 10:00s, '''size:''' 153.7 MB, '''start date/time:''' Nov 22, 2017 18:15:29.53&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 59.1 MB, '''start date/time:''' Nov 22, 2017 18:25:54.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_1.bag test_12_1.bag]: '''duration:''' 41.3s, '''size:''' 10.8 MB, '''start date/time:''' Nov 22, 2017 18:35:11.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_2.bag test_12_2.bag]: '''duration:''' 41.8s, '''size:''' 10.9 MB, '''start date/time:''' Nov 22, 2017 18:36:12.11&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_3.bag test_12_3.bag]: '''duration:''' 39.8s, '''size:'''  10.5 MB, '''start date/time:''' Nov 22, 2017 18:37:05.06&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_1.bag test_13_1.bag]: '''duration:''' 01:01s, '''size:''' 14.9 MB, '''start date/time:''' Nov 22, 2017 18:40:13.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_2.bag test_13_2.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:41:28.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_3.bag test_13_3.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:42:39.52&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_14.bag test_14.bag]: '''duration:''' 06:12s, '''size:''' 95.1 MB, '''start date/time:''' Nov 23, 2017 12:53:06.48&lt;br /&gt;
&lt;br /&gt;
=== v2.0 [Feb-2018] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a second version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
{|class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align:center;&amp;quot;| Location 1. Kitchen&lt;br /&gt;
! Location 2. Living room&lt;br /&gt;
! Location 3. Bedroom&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_01.bag cocina_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_02.bag cocina_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_03.bag cocina_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_04.bag cocina_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_05.bag cocina_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_06.bag cocina_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_07.bag cocina_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_08.bag cocina_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_09.bag cocina_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_10.bag cocina_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_11.bag cocina_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_12.bag cocina_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_13.bag cocina_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_14.bag cocina_14.bag]&lt;br /&gt;
|&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_01.bag salon_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_02.bag salon_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_03.bag salon_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_04.bag salon_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_05.bag salon_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_06.bag salon_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_07.bag salon_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_08.bag salon_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_09.bag salon_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_10.bag salon_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_11.bag salon_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_12.bag salon_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_13.bag salon_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_14.bag salon_14.bag]&lt;br /&gt;
|&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_01.bag habitacion_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_02.bag habitacion_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_03.bag habitacion_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_04.bag habitacion_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_05.bag habitacion_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_06.bag habitacion_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_07.bag habitacion_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_08.bag habitacion_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_09.bag habitacion_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_10.bag habitacion_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_11.bag habitacion_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_12.bag habitacion_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_13.bag habitacion_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_14.bag habitacion_14.bag]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5220</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5220"/>
				<updated>2018-02-13T15:54:31Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset that can be used to evaluate the performance of different approaches for detecting and tracking people using LIDAR sensors. The information contained in the dataset is especially suitable as training data for neural-network-based classifiers. &lt;br /&gt;
&lt;br /&gt;
Further information is available in [https://www.frontiersin.org/articles/10.3389/fnbot.2017.00072/full Álvarez-Aparicio et al. (2017)].&lt;br /&gt;
&lt;br /&gt;
== Materials and methods ==&lt;br /&gt;
&lt;br /&gt;
The following section describes the materials (shown in Figure 1) used to gather data, which include: a certified study area, an autonomous robot with an on-board LIDAR sensor, and a real-time location system (RTLS) to obtain ground-truth data about person location. Recorded data include location estimates calculated by two people trackers, LD and PeTra, also described below. Finally, the recording procedure used to build the dataset is explained.&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.jpg|frame|center|'''Fig. 1''': Materials used in the experiments. (A) Orbi-One robot. (B) Leon@Home Testbed plan. (C) Floor plan of the apartment. Red dots show the location of KIO anchors. Black numbered dots denote Orbi-One locations during the data gathering. (D) KIO anchors. (E) General view of the apartment. (F) Furniture in mock-up apartment.]]&lt;br /&gt;
&lt;br /&gt;
=== Leon@Home Testbed ===&lt;br /&gt;
&lt;br /&gt;
Data have been gathered at Leon@Home Testbed, a Certified Testbed of the European Robotics League (ERL). Its main purpose is to benchmark service robots in a realistic home environment. Our testbed is made up of four parts, shown in Figure 1B: a mock-up apartment, a control zone with direct vision (through a glass wall) into the apartment, a small workshop, and a larger development zone, where researchers work.&lt;br /&gt;
&lt;br /&gt;
Leon@Home Testbed is located on the second floor of the Módulo de Investigación en Cibernética (Building for Research in Cybernetics) on the Vegazana Campus of the University of León (Spain). The apartment is a single bedroom mock-up home built in an 8 m × 7 m space. Figure 1C shows a plan of the apartment. 60 cm high walls divide it into a kitchen, living room, bathroom, and bedroom. The furniture (Figures 1E,F) has been chosen to test different robot abilities. For instance, the kitchen cabinets all have different types of handles.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One (Figure 1A) is an assistant robot manufactured by Robotnik. It has several sensors, among them an RGBD camera, a LIDAR sensor, and an inertial unit. It can operate a manipulator arm attached to its torso and has a wheeled base for moving around the room. Orbi-One includes a wireless access point, which allows WiFi communication with other robots and computers.&lt;br /&gt;
&lt;br /&gt;
The software controlling the robot hardware is based on the ROS framework. ROS is essentially a set of libraries for robotics that provides services similar to those of an operating system: hardware abstraction for sensors and actuators, low-level device control, and inter-process communication. Computation takes place in processes named Nodes, which can receive and send Messages. Nodes publish Messages into information buffers called Topics.&lt;br /&gt;
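The Node/Topic/Message model can be illustrated with a minimal publish/subscribe sketch in plain Python. This models the concept only; it does not use the actual rospy API, and the class and topic names are illustrative:

```python
from collections import defaultdict

class MiniTopicBus:
    """Toy model of ROS-style topics: named buffers that fan out
    each published message to every subscribed callback."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self._subscribers[topic]:
            callback(message)

# Two "nodes": one publishes a fake laser scan, the other consumes it.
bus = MiniTopicBus()
received = []
bus.subscribe("/scan", received.append)
bus.publish("/scan", {"ranges": [1.2, 1.1, 3.4]})
```

In real ROS the bus is distributed: the master only brokers connections, and nodes exchange Messages directly over the network.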
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
In order to acquire ground-truth data about person location in the study area, we need an RTLS for indoor environments. The KIO RTLS, a commercial solution by Eliko, has been used. KIO is a precise RTLS for tracking any object in 2- or 3-dimensional space. Its Ultra-Wideband technology makes it possible to micro-position objects through obstructions. KIO also works in non-line-of-sight conditions, both indoors and outdoors.&lt;br /&gt;
&lt;br /&gt;
KIO comes in two main configurations. The Regular Cell configuration guarantees a reliable accuracy of ±30 cm, according to the manufacturer’s specifications. The Small Cell configuration is designed for location-critical applications and provides a reliable ±5 cm accuracy, again according to the manufacturer’s specifications. A calibration carried out by the authors of this paper in the mock-up apartment shows that the error is higher in some areas and lower in others, but that on average the manufacturer’s claims are correct.&lt;br /&gt;
&lt;br /&gt;
KIO calculates the position of a mobile transceiver, called a Tag, using radio beacons, called Anchors, distributed at known positions in the surroundings. Figure 1D shows a KIO anchor. KIO tags are the same size as the anchors and must be carried by the tracked subject, in our case a person. The red dots in Figure 1C show the locations of the six anchors used in these experiments; they are placed on the ceiling. The distribution of the anchors was chosen following the method described in Guerrero-Higueras et al. (2017).&lt;br /&gt;
&lt;br /&gt;
=== Leg Detector (LD) ===&lt;br /&gt;
&lt;br /&gt;
LD is a ROS package that takes the messages published by a LIDAR sensor as input and uses a machine-learning classifier to identify groups of laser readings as possible legs. The code is available in a public repository, but it is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
LD publishes the locations of the individual legs. It can also attempt to pair the legs and publish their average as an estimate of where the center of a person is. LD may optionally publish visualization Marker messages to indicate where detections happened.&lt;br /&gt;
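A simplified version of that leg-pairing step can be sketched in plain Python. This is a greedy nearest-neighbor pairing under a distance threshold, an illustration only, not the actual package code, and the 0.6 m threshold is an assumed value:

```python
import math

def pair_legs(legs, max_gap=0.6):
    """Greedily pair 2-D leg detections closer than max_gap (meters)
    and return the midpoint of each pair as a person-center estimate."""
    unused = list(legs)
    centers = []
    while len(unused) >= 2:
        a = unused.pop(0)
        # Find the closest remaining leg candidate to 'a'.
        b = min(unused, key=lambda p: math.dist(a, p))
        if math.dist(a, b) <= max_gap:
            unused.remove(b)
            centers.append(((a[0] + b[0]) / 2, (a[1] + b[1]) / 2))
    return centers

# Two legs ~0.3 m apart yield one center; the far detection stays unpaired.
centers = pair_legs([(1.0, 0.0), (1.0, 0.3), (4.0, 4.0)])
```

A real tracker would additionally keep pair identities consistent over time; this sketch only shows the geometric averaging.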
&lt;br /&gt;
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a people detection and tracking tool developed by the Robotics Group at the University of León. The system is based on a Convolutional Neural Network (CNN) with a configuration based on the U-Net architecture by Ronneberger et al. (2015).&lt;br /&gt;
&lt;br /&gt;
The system performs the following steps in real time:&lt;br /&gt;
&lt;br /&gt;
1. First, the data provided by the LIDAR sensor are processed to build a two-dimensional occupancy map centered on the robot. This occupancy map is represented as a binary matrix, where 1s denote positions where the LIDAR scan found an obstacle, and 0s denote positions that the scan either went through without detecting any obstacle or did not reach.&lt;br /&gt;
&lt;br /&gt;
2. Then, the occupancy map is relayed to the network as input data. The network produces a second occupancy map representing the zones where legs have been detected.&lt;br /&gt;
&lt;br /&gt;
3. Finally, center-of-mass calculations over the detected zones return the locations of the people. PeTra also publishes locations for the individual legs and Marker messages for visualization.&lt;br /&gt;
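Steps 1 and 3 can be sketched in plain Python. The grid size, resolution, and function names here are illustrative assumptions, not PeTra's actual parameters, and step 2 (the CNN) is omitted:

```python
import math

def occupancy_map(angles, ranges, size=64, resolution=0.1):
    """Rasterize polar LIDAR readings (radians, meters) into a binary
    size x size grid centered on the robot; 1 marks an obstacle cell."""
    grid = [[0] * size for _ in range(size)]
    half = size // 2
    for angle, rng in zip(angles, ranges):
        col = half + round(rng * math.cos(angle) / resolution)
        row = half + round(rng * math.sin(angle) / resolution)
        if 0 <= row < size and 0 <= col < size:
            grid[row][col] = 1
    return grid

def center_of_mass(grid):
    """Average the coordinates of all occupied cells."""
    cells = [(r, c) for r, row in enumerate(grid)
             for c, v in enumerate(row) if v]
    n = len(cells)
    return (sum(r for r, _ in cells) / n, sum(c for _, c in cells) / n)

# Two readings 1 m from the robot, at 0 and 90 degrees.
grid = occupancy_map([0.0, math.pi / 2], [1.0, 1.0])
com = center_of_mass(grid)
```

In PeTra the center of mass is computed per detected zone of the network's output map, not over the whole grid as in this minimal sketch.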
&lt;br /&gt;
The PeTra system has been trained using the following data: [http://robotica.unileon.es/~datasets/LegTracking/PeTra_training_dataset/npy_train_test_globales.tar.gz npy_train_test_globales.tar.gz]&lt;br /&gt;
&lt;br /&gt;
=== Recording procedure ===&lt;br /&gt;
&lt;br /&gt;
The data were gathered in 14 different situations. In all of them, Orbi-One stood still while one or more people, carrying a KIO tag, moved around it. Three different locations for Orbi-One were defined (see Figure 1C), resulting in 42 scenarios (14 situations × 3 Orbi-One locations). Figure 2 shows the 14 situations recorded. They were chosen to reflect situations that may occur in robotics competitions such as the ERL or RoboCup.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': Recognition scenarios recorded, as shown in Álvarez-Aparicio et al. (2017).]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
A rosbag file was created for each scenario (except for situations 3, 12, and 13, where 3 rosbag files each were recorded; situation 4, where 4 were recorded; and situation 9, where 5 were recorded), containing LIDAR sensor measurements, location estimates from PeTra and LD, locations from the KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* LIDAR sensor data. Data from LIDAR sensors are provided as ROS LaserScan Messages, which include, among other information, the following: acquisition time of the first ray in the scan, start/end angle of the scan, angular distance between measurements, and range data.&lt;br /&gt;
* PeTra location estimates provided as ROS PointStamped Messages, which include a position [x, y, z] and a timestamp.&lt;br /&gt;
* Location estimates calculated by LD, which publishes data for individual legs as ROS PositionMeasurementArray Messages. LD also attempts to pair the legs and publishes their average, an estimate of where the center of a person is, as a ROS PositionMeasurement Message.&lt;br /&gt;
* Locations provided by KIO RTLS also provided as ROS PointStamped Messages.&lt;br /&gt;
* Messages from /map, /odom, and /tf ROS topics, which include map information, odometry of the robot base, and transformation information, respectively.&lt;br /&gt;
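The LaserScan fields listed above are sufficient to recover Cartesian points from the range data. A plain-Python sketch follows; the parameter names mirror the ROS LaserScan message fields, but the message object itself is mocked as loose arguments here:

```python
import math

def scan_to_points(angle_min, angle_increment, ranges, range_max):
    """Convert a LaserScan-style range array into (x, y) points in the
    sensor frame; the i-th ray lies at angle_min + i * angle_increment."""
    points = []
    for i, rng in enumerate(ranges):
        if rng >= range_max:   # discard out-of-range readings
            continue
        theta = angle_min + i * angle_increment
        points.append((rng * math.cos(theta), rng * math.sin(theta)))
    return points

# Three rays at -90, 0, and +90 degrees; the middle reading is out of range.
pts = scan_to_points(-math.pi / 2, math.pi / 2, [2.0, 30.0, 1.0], 25.0)
```

When working with the recorded bags, the same conversion applies to each LaserScan message after reading it back from the file.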
&lt;br /&gt;
The different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Nov-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording procedure explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_4.bag test_04_4.bag]: '''duration:''' 1:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:'''  9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;br /&gt;
&lt;br /&gt;
====Location 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_01.bag test_01.bag]: '''duration:''' 15:37s, '''size:''' 239.7 MB, '''start date/time:''' Nov 16, 2017 12:20:19.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_02.bag test_02.bag]: '''duration:''' 15:05s, '''size:''' 231.4 MB, '''start date/time:''' Nov 27, 2017 18:32:24.35 &lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_1.bag test_03_1.bag]: '''duration:''' 55.2s, '''size:''' 14.5 MB, '''start date/time:''' Nov 16, 2017 12:43:58.29 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_2.bag test_03_2.bag]: '''duration:''' 44.4s, '''size:''' 10.2 MB, '''start date/time:''' Nov 16, 2017 12:46:31.99&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_3.bag test_03_3.bag]: '''duration:''' 51.1s, '''size:''' 13.4 MB, '''start date/time:''' Nov 16, 2017 12:47:29.66&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_1.bag test_04_1.bag]: '''duration:''' 01:23s, '''size:''' 21.6 MB, '''start date/time:''' Nov 16, 2017 1:50:22.18 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_2.bag test_04_2.bag]: '''duration:''' 01:20s, '''size:''' 21.0 MB, '''start date/time:''' Nov 16, 2017 12:52:18.44&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_3.bag test_04_3.bag]: '''duration:''' 01:09s, '''size:''' 18.1 MB, '''start date/time:''' Nov 16, 2017 12:53:51.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_4.bag test_04_4.bag]: '''duration:''' 1:21s, '''size:''' 21.3 MB, '''start date/time:''' Nov 16, 2017 12:55:31.81&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_05.bag test_05.bag]: '''duration:''' 15:07s, '''size:''' 232.4 MB, '''start date/time:''' Nov 28 2017 18:33:16.24&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_06.bag test_06.bag]: '''duration:''' 09:14s, '''size:''' 136.2 MB, '''start date/time:''' Nov 29 2017 13:12:42.61 &lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_07.bag test_07.bag]: '''duration:''' 4:39s, '''size:''' 71.5 MB, '''start date/time:''' Nov 22 2017 12:48:33.86 &lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_08.bag test_08.bag]: '''duration:''' 03:04s, '''size:''' 47.2 MB, '''start date/time:''' Nov 22 2017 12:59:08.58&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_1.bag test_09_1.bag]: '''duration:''' 28.8s, '''size:''' 7.7 MB, '''start date/time:''' Nov 28, 2017 17:17:46.49 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_2.bag test_09_2.bag]: '''duration:''' 31.9s, '''size:''' 8.4 MB, '''start date/time:''' Nov 28, 2017 17:20:48.26&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Nov 28, 2017 17:20:05.96&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_4.bag test_09_4.bag]: '''duration:''' 29.9s, '''size:''' 7.9 MB, '''start date/time:''' Nov 28, 2017 17:19:26.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_5.bag test_09_5.bag]: '''duration:''' 30.8s, '''size:''' 8.1 MB, '''start date/time:''' Nov 28, 2017 17:21:20.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_10.bag test_10.bag]: '''duration:''' 16:05s, '''size:''' 246.6 MB, '''start date/time:''' Nov 22 2017 11:55:02.39&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_11.bag test_11.bag]: '''duration:''' 03:25s, '''size:''' 52.8 MB, '''start date/time:''' Nov 22 2017 13:08:11.99&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_1.bag test_12_1.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:55:26.25&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_2.bag test_12_2.bag]: '''duration:''' 1:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:56:52.47&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_3.bag test_12_3.bag]: '''duration:''' 50.2s, '''size:'''  13.1 MB, '''start date/time:''' Nov 16, 2017 11:44:00.10&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_1.bag test_13_1.bag]: '''duration:''' 58.2s, '''size:''' 15.1 MB, '''start date/time:''' Nov 22, 2017 11:48:35.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_2.bag test_13_2.bag]: '''duration:''' 01:05s, '''size:''' 16.9 MB, '''start date/time:''' Nov 22, 2017 11:49:48.18&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_3.bag test_13_3.bag]: '''duration:''' 01:14s, '''size:''' 19.2 MB, '''start date/time:''' Nov 22, 2017 11:51:22.68&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_14.bag test_14.bag]: '''duration:''' 06:17s, '''size:''' 96.3 MB, '''start date/time:''' Nov 22 2017 12:23:07.92&lt;br /&gt;
&lt;br /&gt;
====Location 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_01.bag test_01.bag]: '''duration:''' 15:17s, '''size:''' 234.6 MB, '''start date/time:''' Nov 21, 2017 18:56:54.87&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 17:37:34.08&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_1.bag test_03_1.bag]: '''duration:''' 44.7s, '''size:''' 11.7 MB, '''start date/time:''' Nov 21, 2017 13:16:51.65&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_2.bag test_03_2.bag]: '''duration:''' 39.9s, '''size:''' 10.5 MB, '''start date/time:''' Nov 21, 2017 19:19:06.27&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_3.bag test_03_3.bag]: '''duration:''' 41.2s, '''size:''' 10.8 MB, '''start date/time:''' Nov 21, 2017 19:18:06.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_1.bag test_04_1.bag]: '''duration:''' 01:01s, '''size:''' 16.0 MB, '''start date/time:''' Nov 21, 2017 19:21:30.99 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_2.bag test_04_2.bag]: '''duration:''' 01:01s, '''size:''' 15.9 MB, '''start date/time:''' Nov 21, 2017 19:25:38.00&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_3.bag test_04_3.bag]: '''duration:''' 58.0s, '''size:''' 15.1 MB, '''start date/time:''' Nov 21, 2017 19:27:16.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_4.bag test_04_4.bag]: '''duration:''' 1:00s, '''size:''' 15.8 MB, '''start date/time:''' Nov 21, 2017 19:29:39.28&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_05.bag test_05.bag]: '''duration:''' 15:11s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 19:05:20.19&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_06.bag test_06.bag]: '''duration:''' 09:03s, '''size:''' 133.6 MB, '''start date/time:''' Nov 26, 2017 13:24:25.64&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Nov 27, 2017 17:34:50.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_08.bag test_08.bag]: '''duration:''' 03:47s, '''size:''' 58.3 MB, '''start date/time:''' Nov 27, 2017 17:41:48.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_1.bag test_09_1.bag]: '''duration:''' 31.3s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:23:58.31 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_2.bag test_09_2.bag]: '''duration:''' 30.4s, '''size:''' 8.0 MB, '''start date/time:''' Nov 28, 2017 17:24:41.09&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_3.bag test_09_3.bag]: '''duration:''' 31.2s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:25:22.22&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_4.bag test_09_4.bag]: '''duration:''' 31.4s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:26:06.98&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_5.bag test_09_5.bag]: '''duration:''' 33.2s, '''size:''' 8.8 MB, '''start date/time:''' Nov 28, 2017 17:26:51.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_10.bag test_10.bag]: '''duration:''' 10:00s, '''size:''' 153.7 MB, '''start date/time:''' Nov 22, 2017 18:15:29.53&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 59.1 MB, '''start date/time:''' Nov 22, 2017 18:25:54.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_1.bag test_12_1.bag]: '''duration:''' 41.3s, '''size:''' 10.8 MB, '''start date/time:''' Nov 22, 2017 18:35:11.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_2.bag test_12_2.bag]: '''duration:''' 41.8s, '''size:''' 10.9 MB, '''start date/time:''' Nov 22, 2017 18:36:12.11&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_3.bag test_12_3.bag]: '''duration:''' 39.8s, '''size:'''  10.5 MB, '''start date/time:''' Nov 22, 2017 18:37:05.06&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_1.bag test_13_1.bag]: '''duration:''' 01:01s, '''size:''' 14.9 MB, '''start date/time:''' Nov 22, 2017 18:40:13.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_2.bag test_13_2.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:41:28.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_3.bag test_13_3.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:42:39.52&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_14.bag test_14.bag]: '''duration:''' 06:12s, '''size:''' 95.1 MB, '''start date/time:''' Nov 23, 2017 12:53:06.48&lt;br /&gt;
&lt;br /&gt;
=== v2.0 [Feb-2018] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording procedure explained above, a second version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_01.bag cocina_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_02.bag cocina_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_03.bag cocina_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_04.bag cocina_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_05.bag cocina_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_06.bag cocina_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_07.bag cocina_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_08.bag cocina_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_09.bag cocina_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_10.bag cocina_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_11.bag cocina_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_12.bag cocina_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_13.bag cocina_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_14.bag cocina_14.bag]&lt;br /&gt;
&lt;br /&gt;
====Location 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_01.bag salon_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_02.bag salon_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_03.bag salon_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_04.bag salon_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_05.bag salon_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_06.bag salon_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_07.bag salon_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_08.bag salon_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_09.bag salon_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_10.bag salon_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_11.bag salon_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_12.bag salon_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_13.bag salon_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_14.bag salon_14.bag]&lt;br /&gt;
&lt;br /&gt;
====Location 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_01.bag habitacion_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_02.bag habitacion_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_03.bag habitacion_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_04.bag habitacion_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_05.bag habitacion_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_06.bag habitacion_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_07.bag habitacion_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_08.bag habitacion_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_09.bag habitacion_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_10.bag habitacion_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_11.bag habitacion_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_12.bag habitacion_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_13.bag habitacion_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_14.bag habitacion_14.bag]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5219</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5219"/>
				<updated>2018-02-13T15:54:14Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: /* Recording procedure */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset that can be used to evaluate the performance of different approaches for detecting and tracking people using LIDAR sensors. The information contained in the dataset is especially suitable for use as training data for neural-network-based classifiers.&lt;br /&gt;
&lt;br /&gt;
Further information can be found in [https://www.frontiersin.org/articles/10.3389/fnbot.2017.00072/full Álvarez-Aparicio et al. (2018)].&lt;br /&gt;
&lt;br /&gt;
== Materials and methods ==&lt;br /&gt;
&lt;br /&gt;
The following section describes the materials (shown in Figure 1) used to gather data, which include: a certified study area, an autonomous robot with an on-board LIDAR sensor, and a real-time location system (RTLS) to obtain ground-truth data about person location. Recorded data include location estimates calculated by two people trackers, LD and PeTra, also described below. Finally, the recording procedure used to build the dataset is explained.&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.jpg|frame|center|'''Fig. 1''': Materials used in the experiments. (A) Orbi-One robot. (B) Leon@Home Testbed plan. (C) Floor plan of the apartment. Red dots show the location of KIO anchors. Black numbered dots denote Orbi-One locations during the data gathering. (D) KIO anchors. (E) General view of the apartment. (F) Furniture in mock-up apartment.]]&lt;br /&gt;
&lt;br /&gt;
=== Leon@Home Testbed ===&lt;br /&gt;
&lt;br /&gt;
Data have been gathered at Leon@Home Testbed. This is a Certified Testbed of the European Robotics League (ERL). Its main purpose is to benchmark service robots in a realistic home environment. Our testbed is made up of four parts, shown in Figure 1B: a mock-up apartment, a control zone with direct vision (glass wall) into the apartment, a small workshop, and a larger development zone, where researchers work.&lt;br /&gt;
&lt;br /&gt;
Leon@Home Testbed is located on the second floor of the Módulo de Investigación en Cibernética (Building for Research in Cybernetics) on the Vegazana Campus of the University of León (Spain). The apartment is a single bedroom mock-up home built in an 8 m × 7 m space. Figure 1C shows a plan of the apartment. 60 cm high walls divide it into a kitchen, living room, bathroom, and bedroom. The furniture (Figures 1E,F) has been chosen to test different robot abilities. For instance, the kitchen cabinets all have different types of handles.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One (Figure 1A) is an assistant robot manufactured by Robotnik. It has several sensors, among them an RGBD camera, a LIDAR sensor, and an inertial unit. It can operate a manipulator arm attached to its torso and has a wheeled base for moving around the room. Orbi-One includes a wireless access point, which allows WiFi communication with other robots and computers.&lt;br /&gt;
&lt;br /&gt;
The software controlling the robot hardware is based on the ROS framework. ROS is essentially a set of robotics libraries providing services similar to those of an operating system: hardware abstraction for sensors and actuators, low-level device control, and inter-process communication. Computation takes place in processes called Nodes, which receive and send Messages. Nodes publish Messages to information buffers called Topics.&lt;br /&gt;
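The node/topic pattern described above can be sketched with a plain-Python stand-in (an illustrative toy, not the actual ROS API; real nodes would use ROS client libraries such as rospy):&lt;br /&gt;

```python
# Toy illustration of the ROS publish/subscribe pattern:
# nodes exchange Messages through named buffers (Topics).
from collections import defaultdict

class TopicBus:
    """Minimal stand-in for the ROS topic layer."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Deliver the message to every node subscribed to this topic.
        for callback in self.subscribers[topic]:
            callback(message)

bus = TopicBus()
received = []

# A "node" subscribing to laser scans, as LD or PeTra would.
bus.subscribe("/scan", lambda msg: received.append(msg))

# A "node" publishing a (fake) scan message.
bus.publish("/scan", {"ranges": [1.2, 1.1, 3.4]})
```

In ROS itself the middleware handles message transport between processes; here a single in-process dispatcher plays that role.&lt;br /&gt;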
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
In order to acquire ground-truth data about person location in the study area, an RTLS for indoor environments is needed. The commercial KIO RTLS solution by Eliko has been used. KIO is a precise RTLS for tracking any object in two- or three-dimensional space. Its Ultra-Wideband technology makes it possible to micro-position objects through obstructions. KIO also works in non-line-of-sight conditions, both indoors and outdoors.&lt;br /&gt;
&lt;br /&gt;
KIO comes in two main configurations. The Regular Cell configuration guarantees a reliable accuracy of ±30 cm, according to the manufacturer’s specifications. The Small Cell configuration is designed for location-critical applications and provides a reliable ±5 cm accuracy, again according to the manufacturer’s specifications. Calibration carried out by the authors in the mock-up apartment shows that the error is higher in some areas and lower in others, but that on average the manufacturer’s claims hold.&lt;br /&gt;
&lt;br /&gt;
KIO calculates the position of a mobile transceiver, called a Tag, using radio beacons, called Anchors, placed at known positions in the surroundings. Figure 1D shows a KIO anchor. KIO tags are the same size as the anchors and must be carried by the tracked subject, in our case a person. The red dots in Figure 1C show the locations of the six anchors used in these experiments; they are mounted on the ceiling. The distribution of the anchors was chosen following the method described in Guerrero-Higueras et al. (2017).&lt;br /&gt;
&lt;br /&gt;
=== Leg Detector (LD) ===&lt;br /&gt;
&lt;br /&gt;
LD is a ROS package that takes the messages published by a LIDAR sensor as input and uses a machine-learning-trained classifier to detect groups of laser readings as possible legs. The code is available in a public repository, but is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
LD publishes the locations of the individual legs. It can also attempt to pair the legs and publish their average as an estimate of where the center of a person is. LD may optionally publish visualization Marker messages to indicate where detections happened.&lt;br /&gt;
&lt;br /&gt;
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a tool for detecting and tracking people, developed by the Robotics Group at the University of León. The system is based on a Convolutional Neural Network (CNN) with a configuration based on the U-Net architecture by Ronneberger et al. (2015).&lt;br /&gt;
&lt;br /&gt;
The system performs the following steps in real time:&lt;br /&gt;
&lt;br /&gt;
1. First, the data provided by the LIDAR sensor are processed to build a two-dimensional occupancy map centered on the robot. This occupancy map is represented as a binary matrix, where 1s denote positions where the LIDAR scan found an obstacle, and 0s denote positions where the scan either passed through without detecting an obstacle or did not reach.&lt;br /&gt;
&lt;br /&gt;
2. Then, the occupancy map is fed to the network as input data. The network produces a second occupancy map representing the zones where legs have been detected.&lt;br /&gt;
&lt;br /&gt;
3. Finally, center of mass calculations return the location of persons. PeTra also publishes locations for the individual legs and Marker messages for visualization.&lt;br /&gt;
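Steps 1 and 3 above can be sketched in NumPy (a minimal sketch: the grid size and resolution are illustrative assumptions, not PeTra's actual parameters, and the CNN of step 2 is omitted by using the input map directly):&lt;br /&gt;

```python
import numpy as np

def occupancy_map(ranges, angles, size=64, resolution=0.1):
    """Step 1: binary occupancy grid centered on the robot.
    1s mark cells where the LIDAR found an obstacle."""
    grid = np.zeros((size, size), dtype=np.uint8)
    # Polar LIDAR readings to Cartesian coordinates (meters).
    x = ranges * np.cos(angles)
    y = ranges * np.sin(angles)
    # Cartesian coordinates to grid cells, robot at the center.
    cols = (x / resolution + size // 2).astype(int)
    rows = (y / resolution + size // 2).astype(int)
    ok = (rows >= 0) & (rows < size) & (cols >= 0) & (cols < size)
    grid[rows[ok], cols[ok]] = 1
    return grid

def center_of_mass(leg_map, size=64, resolution=0.1):
    """Step 3: person location as the centroid of detected cells,
    converted back to meters in the robot frame."""
    rows, cols = np.nonzero(leg_map)
    cx = (cols.mean() - size // 2) * resolution
    cy = (rows.mean() - size // 2) * resolution
    return cx, cy

# Two readings roughly 1 m in front of the robot.
grid = occupancy_map(np.array([1.0, 1.0]), np.array([-0.05, 0.05]))
cx, cy = center_of_mass(grid)  # centroid near (1.0, 0.0)
```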
&lt;br /&gt;
The PeTra system has been trained using the following data: [http://robotica.unileon.es/~datasets/LegTracking/PeTra_training_dataset/npy_train_test_globales.tar.gz npy_train_test_globales.tar.gz]&lt;br /&gt;
&lt;br /&gt;
=== Recording procedure ===&lt;br /&gt;
&lt;br /&gt;
The data were gathered in 14 different situations. In all of them, Orbi-One stood still as one or more people, carrying a KIO tag, moved around it. Three different locations for Orbi-One were defined (see Figure 1C) resulting in 42 scenarios (14 situations × 3 Orbi-One locations). Figure 2 shows the 14 different recognition scenarios recorded. These scenarios have been chosen according to different situations that may occur in robotics competitions such as ERL or RoboCup.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': Recognition scenarios recorded, as shown in Álvarez-Aparicio et al. (2017).]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
A rosbag file was created for each scenario (except for situations 3, 12, and 13, where 3 rosbag files were recorded; situation 4, where 4 rosbag files were recorded; and situation 9, where 5 rosbag files were recorded), recording LIDAR sensor measurements, location estimates from PeTra and LD, locations from the KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* LIDAR sensor data. Data from LIDAR sensors are provided as ROS LaserScan Messages, which include, among other information, the following: acquisition time of the first ray in the scan, start/end angle of the scan, angular distance between measurements, and range data.&lt;br /&gt;
* PeTra location estimates provided as ROS PointStamped Messages, which include a position [x, y, z] and a timestamp.&lt;br /&gt;
* Location estimates calculated by LD. It publishes data for individual legs (as ROS PositionMeasurementArray Messages). It also attempts to pair the legs and publishes their average as an estimate of where the center of a person is as a ROS PositionMeasurement Message.&lt;br /&gt;
* Locations provided by KIO RTLS also provided as ROS PointStamped Messages.&lt;br /&gt;
* Messages from /map, /odom, and /tf ROS topics, which include map information, odometry of the robot base, and transformation information, respectively.&lt;br /&gt;
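As noted above, LD derives a person estimate by pairing two leg detections and publishing their average, i.e. the midpoint of the two leg positions. A one-function sketch (the coordinates are hypothetical examples):&lt;br /&gt;

```python
def person_center(leg_a, leg_b):
    """LD-style person estimate: the average of two paired
    leg positions, each given as an (x, y) tuple in meters."""
    return ((leg_a[0] + leg_b[0]) / 2.0,
            (leg_a[1] + leg_b[1]) / 2.0)

# Hypothetical leg detections 30 cm apart, 1 m ahead of the sensor.
center = person_center((1.0, 0.15), (1.0, -0.15))  # -> (1.0, 0.0)
```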
&lt;br /&gt;
Different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Nov-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_4.bag test_04_4.bag]: '''duration:''' 1:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:'''  9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;br /&gt;
&lt;br /&gt;
====Location 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_01.bag test_01.bag]: '''duration:''' 15:37s, '''size:''' 239.7 MB, '''start date/time:''' Nov 16, 2017 12:20:19.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_02.bag test_02.bag]: '''duration:''' 15:05s, '''size:''' 231.4 MB, '''start date/time:''' Nov 27, 2017 18:32:24.35 &lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_1.bag test_03_1.bag]: '''duration:''' 55.2s, '''size:''' 14.5 MB, '''start date/time:''' Nov 16, 2017 12:43:58.29 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_2.bag test_03_2.bag]: '''duration:''' 44.4s, '''size:''' 10.2 MB, '''start date/time:''' Nov 16, 2017 12:46:31.99&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_3.bag test_03_3.bag]: '''duration:''' 51.1s, '''size:''' 13.4 MB, '''start date/time:''' Nov 16, 2017 12:47:29.66&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_1.bag test_04_1.bag]: '''duration:''' 01:23s, '''size:''' 21.6 MB, '''start date/time:''' Nov 16, 2017 1:50:22.18 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_2.bag test_04_2.bag]: '''duration:''' 01:20s, '''size:''' 21.0 MB, '''start date/time:''' Nov 16, 2017 12:52:18.44&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_3.bag test_04_3.bag]: '''duration:''' 01:09s, '''size:''' 18.1 MB, '''start date/time:''' Nov 16, 2017 12:53:51.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_4.bag test_04_4.bag]: '''duration:''' 1:21s, '''size:''' 21.3 MB, '''start date/time:''' Nov 16, 2017 12:55:31.81&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_05.bag test_05.bag]: '''duration:''' 15:07s, '''size:''' 232.4 MB, '''start date/time:''' Nov 28 2017 18:33:16.24&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_06.bag test_06.bag]: '''duration:''' 09:14s, '''size:''' 136.2 MB, '''start date/time:''' Nov 29 2017 13:12:42.61 &lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_07.bag test_07.bag]: '''duration:''' 4:39s, '''size:''' 71.5 MB, '''start date/time:''' Nov 22 2017 12:48:33.86 &lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_08.bag test_08.bag]: '''duration:''' 03:04s, '''size:''' 47.2 MB, '''start date/time:''' Nov 22 2017 12:59:08.58&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_1.bag test_09_1.bag]: '''duration:''' 28.8s, '''size:''' 7.7 MB, '''start date/time:''' Nov 28, 2017 17:17:46.49 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_2.bag test_09_2.bag]: '''duration:''' 31.9s, '''size:''' 8.4 MB, '''start date/time:''' Nov 28, 2017 17:20:48.26&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Nov 28, 2017 17:20:05.96&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_4.bag test_09_4.bag]: '''duration:''' 29.9s, '''size:''' 7.9 MB, '''start date/time:''' Nov 28, 2017 17:19:26.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_5.bag test_09_5.bag]: '''duration:''' 30.8s, '''size:''' 8.1 MB, '''start date/time:''' Nov 28, 2017 17:21:20.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_10.bag test_10.bag]: '''duration:''' 16:05s, '''size:''' 246.6 MB, '''start date/time:''' Nov 22 2017 11:55:02.39&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_11.bag test_11.bag]: '''duration:''' 03:25s, '''size:''' 52.8 MB, '''start date/time:''' Nov 22 2017 13:08:11.99&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_1.bag test_12_1.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:55:26.25&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_2.bag test_12_2.bag]: '''duration:''' 1:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:56:52.47&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_3.bag test_12_3.bag]: '''duration:''' 50.2s, '''size:'''  13.1 MB, '''start date/time:''' Nov 16, 2017 11:44:00.10&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_1.bag test_13_1.bag]: '''duration:''' 58.2s, '''size:''' 15.1 MB, '''start date/time:''' Nov 22, 2017 11:48:35.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_2.bag test_13_2.bag]: '''duration:''' 01:05s, '''size:''' 16.9 MB, '''start date/time:''' Nov 22, 2017 11:49:48.18&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_3.bag test_13_3.bag]: '''duration:''' 01:14s, '''size:''' 19.2 MB, '''start date/time:''' Nov 22, 2017 11:51:22.68&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_14.bag test_14.bag]: '''duration:''' 06:17s, '''size:''' 96.3 MB, '''start date/time:''' Nov 22 2017 12:23:07.92&lt;br /&gt;
&lt;br /&gt;
====Location 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_01.bag test_01.bag]: '''duration:''' 15:17s, '''size:''' 234.6 MB, '''start date/time:''' Nov 21, 2017 18:56:54.87&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 17:37:34.08&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_1.bag test_03_1.bag]: '''duration:''' 44.7s, '''size:''' 11.7 MB, '''start date/time:''' Nov 21, 2017 13:16:51.65&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_2.bag test_03_2.bag]: '''duration:''' 39.9s, '''size:''' 10.5 MB, '''start date/time:''' Nov 21, 2017 19:19:06.27&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_3.bag test_03_3.bag]: '''duration:''' 41.2s, '''size:''' 10.8 MB, '''start date/time:''' Nov 21, 2017 19:18:06.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_1.bag test_04_1.bag]: '''duration:''' 01:01s, '''size:''' 16.0 MB, '''start date/time:''' Nov 21, 2017 19:21:30.99 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_2.bag test_04_2.bag]: '''duration:''' 01:01s, '''size:''' 15.9 MB, '''start date/time:''' Nov 21, 2017 19:25:38.00&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_3.bag test_04_3.bag]: '''duration:''' 58.0s, '''size:''' 15.1 MB, '''start date/time:''' Nov 21, 2017 19:27:16.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_4.bag test_04_4.bag]: '''duration:''' 1:00s, '''size:''' 15.8 MB, '''start date/time:''' Nov 21, 2017 19:29:39.28&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_05.bag test_05.bag]: '''duration:''' 15:11s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 19:05:20.19&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_06.bag test_06.bag]: '''duration:''' 09:03s, '''size:''' 133.6 MB, '''start date/time:''' Nov 26, 2017 13:24:25.64&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Nov 27, 2017 17:34:50.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_08.bag test_08.bag]: '''duration:''' 03:47s, '''size:''' 58.3 MB, '''start date/time:''' Nov 27, 2017 17:41:48.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_1.bag test_09_1.bag]: '''duration:''' 31.3s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:23:58.31 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_2.bag test_09_2.bag]: '''duration:''' 30.4s, '''size:''' 8.0 MB, '''start date/time:''' Nov 28, 2017 17:24:41.09&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_3.bag test_09_3.bag]: '''duration:''' 31.2s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:25:22.22&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_4.bag test_09_4.bag]: '''duration:''' 31.4s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:26:06.98&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_5.bag test_09_5.bag]: '''duration:''' 33.2s, '''size:''' 8.8 MB, '''start date/time:''' Nov 28, 2017 17:26:51.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_10.bag test_10.bag]: '''duration:''' 10:00s, '''size:''' 153.7 MB, '''start date/time:''' Nov 22, 2017 18:15:29.53&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 59.1 MB, '''start date/time:''' Nov 22, 2017 18:25:54.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_1.bag test_12_1.bag]: '''duration:''' 41.3s, '''size:''' 10.8 MB, '''start date/time:''' Nov 22, 2017 18:35:11.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_2.bag test_12_2.bag]: '''duration:''' 41.8s, '''size:''' 10.9 MB, '''start date/time:''' Nov 22, 2017 18:36:12.11&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_3.bag test_12_3.bag]: '''duration:''' 39.8s, '''size:'''  10.5 MB, '''start date/time:''' Nov 22, 2017 18:37:05.06&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_1.bag test_13_1.bag]: '''duration:''' 01:01s, '''size:''' 14.9 MB, '''start date/time:''' Nov 22, 2017 18:40:13.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_2.bag test_13_2.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:41:28.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_3.bag test_13_3.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:42:39.52&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_14.bag test_14.bag]: '''duration:''' 06:12s, '''size:''' 95.1 MB, '''start date/time:''' Nov 23, 2017 12:53:06.48&lt;br /&gt;
&lt;br /&gt;
=== v2.0 [Feb-2018] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a second version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_01.bag cocina_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_02.bag cocina_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_03.bag cocina_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_04.bag cocina_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_05.bag cocina_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_06.bag cocina_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_07.bag cocina_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_08.bag cocina_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_09.bag cocina_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_10.bag cocina_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_11.bag cocina_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_12.bag cocina_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_13.bag cocina_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_14.bag cocina_14.bag]&lt;br /&gt;
&lt;br /&gt;
====Location 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_01.bag salon_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_02.bag salon_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_03.bag salon_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_04.bag salon_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_05.bag salon_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_06.bag salon_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_07.bag salon_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_08.bag salon_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_09.bag salon_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_10.bag salon_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_11.bag salon_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_12.bag salon_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_13.bag salon_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_14.bag salon_14.bag]&lt;br /&gt;
&lt;br /&gt;
====Location 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_01.bag habitacion_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_02.bag habitacion_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_03.bag habitacion_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_04.bag habitacion_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_05.bag habitacion_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_06.bag habitacion_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_07.bag habitacion_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_08.bag habitacion_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_09.bag habitacion_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_10.bag habitacion_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_11.bag habitacion_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_12.bag habitacion_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_13.bag habitacion_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_14.bag habitacion_14.bag]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=File:PeopleTrackingFig2.png&amp;diff=5218</id>
		<title>File:PeopleTrackingFig2.png</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=File:PeopleTrackingFig2.png&amp;diff=5218"/>
				<updated>2018-02-13T15:52:39Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: Am6 uploaded a new version of File:PeopleTrackingFig2.png&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5217</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5217"/>
				<updated>2018-02-13T12:08:05Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: /* PeTra */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset that can be used to evaluate the performance of different approaches to detecting and tracking people with LIDAR sensors. The information contained in the dataset is especially suitable as training data for neural-network-based classifiers.&lt;br /&gt;
&lt;br /&gt;
Further information at [https://www.frontiersin.org/articles/10.3389/fnbot.2017.00072/full Álvarez-Aparicio et al. (2018)].&lt;br /&gt;
&lt;br /&gt;
== Materials and methods ==&lt;br /&gt;
&lt;br /&gt;
This section describes the materials (shown in Figure 1) used to gather data: a certified study area, an autonomous robot with an on-board LIDAR sensor, and a real-time location system (RTLS) providing ground-truth data about a person's location. The recorded data include location estimates calculated by two people trackers, LD and PeTra, also described below. Finally, the recording procedure used to build the dataset is explained.&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.jpg|frame|center|'''Fig. 1''': Materials used in the experiments. (A) Orbi-One robot. (B) Leon@Home Testbed plan. (C) Floor plan of the apartment. Red dots show the location of KIO anchors. Black numbered dots denote Orbi-One locations during the data gathering. (D) KIO anchors. (E) General view of the apartment. (F) Furniture in mock-up apartment.]]&lt;br /&gt;
&lt;br /&gt;
=== Leon@Home Testbed ===&lt;br /&gt;
&lt;br /&gt;
Data have been gathered at Leon@Home Testbed. This is a Certified Testbed of the European Robotics League (ERL). Its main purpose is to benchmark service robots in a realistic home environment. Our testbed is made up of four parts, shown in Figure 1B: a mock-up apartment, a control zone with direct vision (glass wall) into the apartment, a small workshop, and a larger development zone, where researchers work.&lt;br /&gt;
&lt;br /&gt;
Leon@Home Testbed is located on the second floor of the Módulo de Investigación en Cibernética (Building for Research in Cybernetics) on the Vegazana Campus of the University of León (Spain). The apartment is a single-bedroom mock-up home built in an 8 m × 7 m space. Figure 1C shows a plan of the apartment. Walls 60 cm high divide it into a kitchen, living room, bathroom, and bedroom. The furniture (Figures 1E,F) has been chosen to test different robot abilities. For instance, the kitchen cabinets all have different types of handles.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One (Figure 1A) is an assistant robot manufactured by Robotnik. It has several sensors, among them an RGBD camera, a LIDAR sensor, and an inertial unit. It can operate a manipulator arm attached to its torso and has a wheeled base for moving around the room. Orbi-One includes a wireless access point, which allows WiFi communication with other robots and computers.&lt;br /&gt;
&lt;br /&gt;
The software controlling the robot hardware is based on the ROS framework. ROS is essentially a set of robotics libraries that provide services similar to those of an operating system: hardware abstraction for sensors and actuators, low-level device control, and inter-process communication. Computation takes place in processes named Nodes, which can receive and send Messages. Nodes publish Messages into information buffers called Topics.&lt;br /&gt;
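&lt;br /&gt;
The Node/Topic pattern described above can be illustrated with a short Python sketch. This is a toy in-process model of publish/subscribe, not the actual ROS client library; the topic name and message content are illustrative:&lt;br /&gt;
&lt;br /&gt;
```python
# Minimal in-process sketch of the publish/subscribe pattern used by ROS.
# Real ROS nodes communicate across processes via the rospy/rclpy client
# libraries; this toy model only illustrates the Topic/Message idea.

class Topic:
    """An information buffer that relays published Messages to subscribers."""
    def __init__(self, name):
        self.name = name
        self.callbacks = []

    def subscribe(self, callback):
        self.callbacks.append(callback)

    def publish(self, message):
        for callback in self.callbacks:
            callback(message)

# A "sensor node" publishes a scan; a "tracker node" receives it.
scan_topic = Topic("/scan")
received = []
scan_topic.subscribe(received.append)
scan_topic.publish({"ranges": [1.2, 0.9, 3.4]})
```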
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
Acquiring ground-truth data about people's locations in the study area requires an RTLS for indoor environments. The commercial KIO RTLS by Eliko has been used. KIO is a precise RTLS for tracking any object in two- or three-dimensional space. Its Ultra-Wideband technology makes it possible to micro-position objects through obstructions, and KIO works in non-line-of-sight conditions both indoors and outdoors.&lt;br /&gt;
&lt;br /&gt;
KIO comes in two main configurations. The Regular Cell configuration guarantees a reliable accuracy of ±30 cm, according to the manufacturer's specifications. The Small Cell configuration is designed for location-critical applications and provides a reliable ±5 cm accuracy, again according to the manufacturer's specifications. Calibration carried out by the authors in the mock-up apartment shows that the error is higher in some areas and lower in others, but on average the manufacturer's claims hold.&lt;br /&gt;
&lt;br /&gt;
KIO calculates the position of a mobile transceiver called a Tag. To do so, KIO uses radio beacons called Anchors, distributed at known positions in the surroundings. Figure 1D shows a KIO anchor. KIO tags are the same size as the anchors and must be carried by the tracked subject, in our case a person. The red dots in Figure 1C show the locations of the six anchors used in these experiments; they are placed on the ceiling. The distribution of the anchors was chosen following the method described in Guerrero-Higueras et al. (2017).&lt;br /&gt;
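&lt;br /&gt;
Since the tag position is recovered from tag-anchor distances, the anchor geometry matters. The sketch below shows a minimal 2-D trilateration; the anchor coordinates and distances are illustrative, and KIO's actual positioning algorithm is the manufacturer's own:&lt;br /&gt;
&lt;br /&gt;
```python
import math

# Known anchor positions (metres) and, for the demo, distances computed
# from a known tag position. All values are illustrative.
anchors = [(0.0, 0.0), (8.0, 0.0), (0.0, 7.0)]
true_tag = (3.0, 2.0)
dists = [math.dist(a, true_tag) for a in anchors]

def trilaterate(anchors, dists):
    """Linearized 2-D trilateration using the first anchor as reference."""
    (x0, y0), d0 = anchors[0], dists[0]
    # Subtracting the first circle equation from the others leaves a
    # linear system in the tag coordinates (x, y).
    rows, rhs = [], []
    for (xi, yi), di in zip(anchors[1:], dists[1:]):
        rows.append((2 * (xi - x0), 2 * (yi - y0)))
        rhs.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    # With three anchors the system is exactly determined (2x2).
    (a, b), (c, d) = rows
    det = a * d - b * c
    x = (rhs[0] * d - b * rhs[1]) / det
    y = (a * rhs[1] - c * rhs[0]) / det
    return x, y
```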
&lt;br /&gt;
=== Leg Detector (LD) ===&lt;br /&gt;
&lt;br /&gt;
LD is a ROS package that takes messages published by a LIDAR sensor as input and uses a machine-learning-trained classifier to detect groups of laser readings as possible legs. The code is available in a public repository but is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
LD publishes the locations of the individual legs. It can also attempt to pair the legs and publish their average as an estimate of the person's center. Optionally, LD can publish visualization Marker messages to indicate where detections occurred.&lt;br /&gt;
&lt;br /&gt;
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a tool for detecting and tracking people, developed by the Robotics Group at the University of León. The system is based on a Convolutional Neural Network (CNN) with a configuration derived from the U-Net architecture of Ronneberger et al. (2015).&lt;br /&gt;
&lt;br /&gt;
The system performs the following steps in real time:&lt;br /&gt;
&lt;br /&gt;
1. First, the data provided by the LIDAR sensor are processed to build a two-dimensional occupancy map centered on the robot. This occupancy map is represented as a binary matrix, where 1s denote positions where the LIDAR scan found an obstacle, and 0s denote positions that the scan either passed through without detecting an obstacle or did not reach.&lt;br /&gt;
&lt;br /&gt;
2. Then, the occupancy map is relayed to the network as input data. The network produces a second occupancy map representing the zones where legs have been detected.&lt;br /&gt;
&lt;br /&gt;
3. Finally, center-of-mass calculations return the locations of people. PeTra also publishes locations for the individual legs and Marker messages for visualization.&lt;br /&gt;
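&lt;br /&gt;
The steps above can be sketched as follows. This simplified stand-in uses an illustrative grid size and resolution, and the CNN of step 2 is omitted: the center-of-mass step is applied directly to the input map rather than to the network's output:&lt;br /&gt;
&lt;br /&gt;
```python
import math

# Step 1: rasterize LIDAR ranges into a binary occupancy map centered
# on the robot. Grid size and resolution are illustrative.
W = H = 64      # cells per side
RES = 0.1       # metres per cell

def occupancy_map(ranges, angle_min, angle_inc):
    grid = [[0] * W for _ in range(H)]
    for i, r in enumerate(ranges):
        if math.isfinite(r):                 # finite range = obstacle hit
            a = angle_min + i * angle_inc
            ix = int(r * math.cos(a) / RES) + W // 2
            iy = int(r * math.sin(a) / RES) + H // 2
            if ix in range(W) and iy in range(H):
                grid[iy][ix] = 1             # 1 = obstacle, 0 = free/unseen
    return grid

# Step 3: center of mass of the occupied cells. In PeTra this is applied
# to the occupancy map produced by the network, not to the input map.
def center_of_mass(grid):
    cells = [(x, y) for y, row in enumerate(grid)
             for x, v in enumerate(row) if v]
    n = len(cells)
    return (sum(x for x, _ in cells) / n, sum(y for _, y in cells) / n)

grid = occupancy_map([1.0, 1.0, 1.0], 0.0, 0.05)
cx, cy = center_of_mass(grid)
```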
&lt;br /&gt;
The PeTra system has been trained using the following data: [http://robotica.unileon.es/~datasets/LegTracking/PeTra_training_dataset/npy_train_test_globales.tar.gz npy_train_test_globales.tar.gz]&lt;br /&gt;
&lt;br /&gt;
=== Recording procedure ===&lt;br /&gt;
&lt;br /&gt;
The data were gathered in 14 different situations. In all of them, Orbi-One stood still while one or more people, each carrying a KIO tag, moved around it. Three different locations for Orbi-One were defined (see Figure 1C), resulting in 42 scenarios (14 situations × 3 Orbi-One locations). Figure 2 shows the 14 recognition scenarios recorded. These scenarios were chosen according to situations that may occur in robotics competitions such as ERL or RoboCup.&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': Recognition scenarios recorded.]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
A rosbag file was created for each scenario (except for situations 3, 12, and 13, for which 3 rosbag files were recorded; situation 4, for which 4 rosbag files were recorded; and situation 9, for which 5 rosbag files were recorded), recording LIDAR sensor measurements, location estimates from PeTra and LD, locations from the KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* LIDAR sensor data. Data from LIDAR sensors are provided as ROS LaserScan Messages, which include, among other information, the following: acquisition time of the first ray in the scan, start/end angle of the scan, angular distance between measurements, and range data.&lt;br /&gt;
* PeTra location estimates provided as ROS PointStamped Messages, which include a position [x, y, z] and a timestamp.&lt;br /&gt;
* Location estimates calculated by LD. Data for individual legs are published as ROS PositionMeasurementArray Messages; LD also attempts to pair the legs and publishes their average, an estimate of the person's center, as a ROS PositionMeasurement Message.&lt;br /&gt;
* Locations provided by the KIO RTLS, also as ROS PointStamped Messages.&lt;br /&gt;
* Messages from /map, /odom, and /tf ROS topics, which include map information, odometry of the robot base, and transformation information, respectively.&lt;br /&gt;
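&lt;br /&gt;
A common way to use these streams together is to pair each tracker estimate with the ground-truth sample closest in time and compute the Euclidean error. The sketch below uses illustrative (time, x, y) tuples rather than actual ROS message objects:&lt;br /&gt;
&lt;br /&gt;
```python
import math

# Illustrative (time, x, y) samples; in the dataset these come from the
# PointStamped messages published by PeTra or LD and by the KIO RTLS.
petra = [(0.10, 1.02, 0.48), (0.35, 1.21, 0.61)]
kio = [(0.09, 1.00, 0.50), (0.21, 1.10, 0.55), (0.36, 1.20, 0.60)]

def nearest(sample, reference):
    """Ground-truth sample closest in time to a tracker estimate."""
    t = sample[0]
    return min(reference, key=lambda s: abs(s[0] - t))

def errors(estimates, reference):
    """Euclidean error of each estimate against its time-matched sample."""
    out = []
    for est in estimates:
        gt = nearest(est, reference)
        out.append(math.hypot(est[1] - gt[1], est[2] - gt[2]))
    return out

errs = errors(petra, kio)
```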
&lt;br /&gt;
The different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Nov-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording procedure explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_4.bag test_04_4.bag]: '''duration:''' 1:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:'''  9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;br /&gt;
&lt;br /&gt;
====Location 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_01.bag test_01.bag]: '''duration:''' 15:37s, '''size:''' 239.7 MB, '''start date/time:''' Nov 16, 2017 12:20:19.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_02.bag test_02.bag]: '''duration:''' 15:05s, '''size:''' 231.4 MB, '''start date/time:''' Nov 27, 2017 18:32:24.35 &lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_1.bag test_03_1.bag]: '''duration:''' 55.2s, '''size:''' 14.5 MB, '''start date/time:''' Nov 16, 2017 12:43:58.29 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_2.bag test_03_2.bag]: '''duration:''' 44.4s, '''size:''' 10.2 MB, '''start date/time:''' Nov 16, 2017 12:46:31.99&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_3.bag test_03_3.bag]: '''duration:''' 51.1s, '''size:''' 13.4 MB, '''start date/time:''' Nov 16, 2017 12:47:29.66&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_1.bag test_04_1.bag]: '''duration:''' 01:23s, '''size:''' 21.6 MB, '''start date/time:''' Nov 16, 2017 1:50:22.18 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_2.bag test_04_2.bag]: '''duration:''' 01:20s, '''size:''' 21.0 MB, '''start date/time:''' Nov 16, 2017 12:52:18.44&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_3.bag test_04_3.bag]: '''duration:''' 01:09s, '''size:''' 18.1 MB, '''start date/time:''' Nov 16, 2017 12:53:51.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_4.bag test_04_4.bag]: '''duration:''' 1:21s, '''size:''' 21.3 MB, '''start date/time:''' Nov 16, 2017 12:55:31.81&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_05.bag test_05.bag]: '''duration:''' 15:07s, '''size:''' 232.4 MB, '''start date/time:''' Nov 28 2017 18:33:16.24&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_06.bag test_06.bag]: '''duration:''' 09:14s, '''size:''' 136.2 MB, '''start date/time:''' Nov 29 2017 13:12:42.61 &lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_07.bag test_07.bag]: '''duration:''' 4:39s, '''size:''' 71.5 MB, '''start date/time:''' Nov 22 2017 12:48:33.86 &lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_08.bag test_08.bag]: '''duration:''' 03:04s, '''size:''' 47.2 MB, '''start date/time:''' Nov 22 2017 12:59:08.58&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_1.bag test_09_1.bag]: '''duration:''' 28.8s, '''size:''' 7.7 MB, '''start date/time:''' Nov 28, 2017 17:17:46.49 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_2.bag test_09_2.bag]: '''duration:''' 31.9s, '''size:''' 8.4 MB, '''start date/time:''' Nov 28, 2017 17:20:48.26&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Nov 28, 2017 17:20:05.96&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_4.bag test_09_4.bag]: '''duration:''' 29.9s, '''size:''' 7.9 MB, '''start date/time:''' Nov 28, 2017 17:19:26.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_5.bag test_09_5.bag]: '''duration:''' 30.8s, '''size:''' 8.1 MB, '''start date/time:''' Nov 28, 2017 17:21:20.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_10.bag test_10.bag]: '''duration:''' 16:05s, '''size:''' 246.6 MB, '''start date/time:''' Nov 22 2017 11:55:02.39&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_11.bag test_11.bag]: '''duration:''' 03:25s, '''size:''' 52.8 MB, '''start date/time:''' Nov 22 2017 13:08:11.99&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_1.bag test_12_1.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:55:26.25&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_2.bag test_12_2.bag]: '''duration:''' 1:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:56:52.47&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_3.bag test_12_3.bag]: '''duration:''' 50.2s, '''size:'''  13.1 MB, '''start date/time:''' Nov 16, 2017 11:44:00.10&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_1.bag test_13_1.bag]: '''duration:''' 58.2s, '''size:''' 15.1 MB, '''start date/time:''' Nov 22, 2017 11:48:35.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_2.bag test_13_2.bag]: '''duration:''' 01:05s, '''size:''' 16.9 MB, '''start date/time:''' Nov 22, 2017 11:49:48.18&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_3.bag test_13_3.bag]: '''duration:''' 01:14s, '''size:''' 19.2 MB, '''start date/time:''' Nov 22, 2017 11:51:22.68&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_14.bag test_14.bag]: '''duration:''' 06:17s, '''size:''' 96.3 MB, '''start date/time:''' Nov 22 2017 12:23:07.92&lt;br /&gt;
&lt;br /&gt;
====Location 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_01.bag test_01.bag]: '''duration:''' 15:17s, '''size:''' 234.6 MB, '''start date/time:''' Nov 21, 2017 18:56:54.87&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 17:37:34.08&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_1.bag test_03_1.bag]: '''duration:''' 44.7s, '''size:''' 11.7 MB, '''start date/time:''' Nov 21, 2017 13:16:51.65&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_2.bag test_03_2.bag]: '''duration:''' 39.9s, '''size:''' 10.5 MB, '''start date/time:''' Nov 21, 2017 19:19:06.27&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_3.bag test_03_3.bag]: '''duration:''' 41.2s, '''size:''' 10.8 MB, '''start date/time:''' Nov 21, 2017 19:18:06.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_1.bag test_04_1.bag]: '''duration:''' 01:01s, '''size:''' 16.0 MB, '''start date/time:''' Nov 21, 2017 19:21:30.99 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_2.bag test_04_2.bag]: '''duration:''' 01:01s, '''size:''' 15.9 MB, '''start date/time:''' Nov 21, 2017 19:25:38.00&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_3.bag test_04_3.bag]: '''duration:''' 58.0s, '''size:''' 15.1 MB, '''start date/time:''' Nov 21, 2017 19:27:16.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_4.bag test_04_4.bag]: '''duration:''' 1:00s, '''size:''' 15.8 MB, '''start date/time:''' Nov 21, 2017 19:29:39.28&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_05.bag test_05.bag]: '''duration:''' 15:11s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 19:05:20.19&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_06.bag test_06.bag]: '''duration:''' 09:03s, '''size:''' 133.6 MB, '''start date/time:''' Nov 26, 2017 13:24:25.64&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Nov 27, 2017 17:34:50.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_08.bag test_08.bag]: '''duration:''' 03:47s, '''size:''' 58.3 MB, '''start date/time:''' Nov 27, 2017 17:41:48.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_1.bag test_09_1.bag]: '''duration:''' 31.3s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:23:58.31 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_2.bag test_09_2.bag]: '''duration:''' 30.4s, '''size:''' 8.0 MB, '''start date/time:''' Nov 28, 2017 17:24:41.09&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_3.bag test_09_3.bag]: '''duration:''' 31.2s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:25:22.22&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_4.bag test_09_4.bag]: '''duration:''' 31.4s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:26:06.98&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_5.bag test_09_5.bag]: '''duration:''' 33.2s, '''size:''' 8.8 MB, '''start date/time:''' Nov 28, 2017 17:26:51.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_10.bag test_10.bag]: '''duration:''' 10:00s, '''size:''' 153.7 MB, '''start date/time:''' Nov 22, 2017 18:15:29.53&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 59.1 MB, '''start date/time:''' Nov 22, 2017 18:25:54.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_1.bag test_12_1.bag]: '''duration:''' 41.3s, '''size:''' 10.8 MB, '''start date/time:''' Nov 22, 2017 18:35:11.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_2.bag test_12_2.bag]: '''duration:''' 41.8s, '''size:''' 10.9 MB, '''start date/time:''' Nov 22, 2017 18:36:12.11&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_3.bag test_12_3.bag]: '''duration:''' 39.8s, '''size:'''  10.5 MB, '''start date/time:''' Nov 22, 2017 18:37:05.06&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_1.bag test_13_1.bag]: '''duration:''' 01:01s, '''size:''' 14.9 MB, '''start date/time:''' Nov 22, 2017 18:40:13.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_2.bag test_13_2.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:41:28.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_3.bag test_13_3.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:42:39.52&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_14.bag test_14.bag]: '''duration:''' 06:12s, '''size:''' 95.1 MB, '''start date/time:''' Nov 23, 2017 12:53:06.48&lt;br /&gt;
&lt;br /&gt;
=== v2.0 [Feb-2018] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording procedure explained above, a second version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_01.bag cocina_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_02.bag cocina_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_03.bag cocina_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_04.bag cocina_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_05.bag cocina_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_06.bag cocina_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_07.bag cocina_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_08.bag cocina_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_09.bag cocina_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_10.bag cocina_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_11.bag cocina_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_12.bag cocina_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_13.bag cocina_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_14.bag cocina_14.bag]&lt;br /&gt;
&lt;br /&gt;
====Location 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_01.bag salon_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_02.bag salon_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_03.bag salon_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_04.bag salon_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_05.bag salon_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_06.bag salon_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_07.bag salon_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_08.bag salon_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_09.bag salon_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_10.bag salon_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_11.bag salon_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_12.bag salon_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_13.bag salon_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_14.bag salon_14.bag]&lt;br /&gt;
&lt;br /&gt;
====Location 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_01.bag habitacion_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_02.bag habitacion_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_03.bag habitacion_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_04.bag habitacion_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_05.bag habitacion_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_06.bag habitacion_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_07.bag habitacion_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_08.bag habitacion_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_09.bag habitacion_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_10.bag habitacion_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_11.bag habitacion_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_12.bag habitacion_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_13.bag habitacion_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_14.bag habitacion_14.bag]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5216</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5216"/>
				<updated>2018-02-13T12:03:47Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: /* PeTra */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset which can be used to evaluate the performance of different approaches for detecting and tracking people using LIDAR sensors. The information contained in the dataset is especially suitable for use as training data for neural-network-based classifiers.&lt;br /&gt;
&lt;br /&gt;
Further information at [https://www.frontiersin.org/articles/10.3389/fnbot.2017.00072/full Álvarez-Aparicio et al. (2018)].&lt;br /&gt;
&lt;br /&gt;
== Materials and methods ==&lt;br /&gt;
&lt;br /&gt;
The following section describes the materials (shown in Figure 1) used to gather data, which include: a certified study area, an autonomous robot with an on-board LIDAR sensor, and a real-time location system (RTLS) to obtain ground-truth data about person location. Recorded data include location estimates calculated by two people trackers, LD and PeTra, also described below. Finally, the recording procedure used to build the dataset is explained.&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.jpg|frame|center|'''Fig. 1''': Materials used in the experiments. (A) Orbi-One robot. (B) Leon@Home Testbed plan. (C) Floor plan of the apartment. Red dots show the location of KIO anchors. Black numbered dots denote Orbi-One locations during the data gathering. (D) KIO anchors. (E) General view of the apartment. (F) Furniture in mock-up apartment.]]&lt;br /&gt;
&lt;br /&gt;
=== Leon@Home Testbed ===&lt;br /&gt;
&lt;br /&gt;
Data have been gathered at Leon@Home Testbed, a Certified Testbed of the European Robotics League (ERL). Its main purpose is to benchmark service robots in a realistic home environment. Our testbed is made up of four parts, shown in Figure 1B: a mock-up apartment, a control zone with direct vision (glass wall) into the apartment, a small workshop, and a larger development zone where researchers work.&lt;br /&gt;
&lt;br /&gt;
Leon@Home Testbed is located on the second floor of the Módulo de Investigación en Cibernética (Building for Research in Cybernetics) on the Vegazana Campus of the University of León (Spain). The apartment is a single bedroom mock-up home built in an 8 m × 7 m space. Figure 1C shows a plan of the apartment. 60 cm high walls divide it into a kitchen, living room, bathroom, and bedroom. The furniture (Figures 1E,F) has been chosen to test different robot abilities. For instance, the kitchen cabinets all have different types of handles.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One (Figure 1A) is an assistant robot manufactured by Robotnik. It has several sensors, among them an RGB-D camera, a LIDAR sensor, and an inertial measurement unit. It can operate a manipulator arm attached to its torso and has a wheeled base for moving around the room. Orbi-One includes a wireless access point, which allows WiFi communication with other robots and computers.&lt;br /&gt;
&lt;br /&gt;
The software controlling the robot hardware is based on the ROS framework. ROS is essentially a set of robotics libraries providing operating-system-like services: hardware abstraction for sensors and actuators, low-level device control, and inter-process communication. Computation takes place in processes named Nodes, which can receive and send Messages. Nodes publish Messages into information buffers called Topics.&lt;br /&gt;
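The publish/subscribe model described above can be sketched in a few lines of plain Python. The `Topic` class below is only an illustration of the concept, not the actual ROS API:&lt;br /&gt;

```python
# Conceptual sketch of ROS-style publish/subscribe (NOT the real ROS API).
class Topic:
    """An information buffer that relays published messages to subscribers."""

    def __init__(self, name):
        self.name = name
        self.subscribers = []

    def subscribe(self, callback):
        # A subscriber node registers a callback for incoming messages.
        self.subscribers.append(callback)

    def publish(self, message):
        # A publisher node sends a message; every subscriber receives it.
        for callback in self.subscribers:
            callback(message)


# One node publishes a LaserScan-like message; another receives it.
scan_topic = Topic("/scan")
received = []
scan_topic.subscribe(received.append)
scan_topic.publish({"ranges": [1.2, 1.3, 1.1]})
```

In real ROS, the same pattern appears as `rospy.Publisher` and `rospy.Subscriber` objects exchanging typed messages such as `sensor_msgs/LaserScan`.&lt;br /&gt;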
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
In order to acquire ground-truth data about person location in the study area, an RTLS for indoor environments is needed. The commercial KIO RTLS by Eliko has been used. KIO is a precise RTLS for tracking any object in 2- or 3-dimensional space. Its Ultra-Wideband technology enables micro-positioning of objects through obstructions, and KIO works in non-line-of-sight conditions, both indoors and outdoors.&lt;br /&gt;
&lt;br /&gt;
KIO comes in two main configurations. The Regular Cell configuration guarantees a reliable accuracy of ±30 cm, according to the manufacturer’s specifications; the Small Cell configuration is designed for location-critical applications and provides a reliable ±5 cm accuracy. Calibration carried out by the authors in the mock-up apartment shows that the error is higher in some areas and lower in others but, on average, matches the manufacturer’s claims.&lt;br /&gt;
&lt;br /&gt;
KIO calculates the position of a mobile transceiver, called a Tag, using radio beacons, called Anchors, placed at known positions in the surroundings. Figure 1D shows a KIO anchor. KIO tags are the same size and must be carried by the tracked subject, in our case people. The red dots in Figure 1C show the location of the six anchors used in these experiments; they are placed on the ceiling. The distribution of the anchors was chosen following the method described in Guerrero-Higueras et al. (2017).&lt;br /&gt;
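The principle behind such an RTLS, recovering a tag position from range measurements to anchors at known positions, can be illustrated with a textbook 2-D multilateration sketch. This is not Eliko's proprietary algorithm, and the anchor coordinates below are made-up values:&lt;br /&gt;

```python
import math

def trilaterate(anchors, distances):
    """Solve for the tag position (x, y) given three anchors at known
    2-D positions and the measured distance to each one.

    Subtracting the first squared-distance equation from the other two
    cancels the quadratic terms and leaves a 2x2 linear system, solved
    here with Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = distances
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Hypothetical ceiling anchors in an 8 m x 7 m space, and a tag at (3, 2).
anchors = [(0.0, 0.0), (8.0, 0.0), (0.0, 7.0)]
tag = (3.0, 2.0)
dists = [math.dist(tag, a) for a in anchors]
estimate = trilaterate(anchors, dists)  # recovers (3.0, 2.0)
```

With noisy ranges, as in a real deployment, more anchors are used and the resulting overdetermined system is solved in a least-squares sense, which is one reason anchor placement matters (cf. Guerrero-Higueras et al., 2017).&lt;br /&gt;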
&lt;br /&gt;
=== Leg Detector (LD) ===&lt;br /&gt;
&lt;br /&gt;
LD is a ROS package that takes messages published by a LIDAR sensor as input and uses a machine-learning classifier to detect groups of laser readings as possible legs. The code is available in a public repository but is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
LD publishes the location for the individual legs. It can also attempt to pair the legs and publish their average as an estimate of where the center of a person is. LD may optionally also publish visualization marker messages to indicate where detections happened.&lt;br /&gt;
&lt;br /&gt;
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a tool for detecting and tracking people, developed by the Robotics Group at the University of León. The system is based on a Convolutional Neural Network (CNN) with a configuration based on the U-Net architecture by Ronneberger et al. (2015).&lt;br /&gt;
&lt;br /&gt;
The system performs the following steps in real time:&lt;br /&gt;
&lt;br /&gt;
1. First, the data provided by the LIDAR sensor are processed to build a two-dimensional occupancy map centered on the robot. This occupancy map is represented as a binary matrix, where 1s denote positions where the LIDAR scan found an obstacle, and 0s denote positions where the scan either passed through without detecting any obstacle or did not reach.&lt;br /&gt;
&lt;br /&gt;
2. Then, the occupancy map is relayed to the network as input data. The network produces a second occupancy map representing the zones where legs have been detected.&lt;br /&gt;
&lt;br /&gt;
3. Finally, center of mass calculations return the location of persons. PeTra also publishes locations for the individual legs and Marker messages for visualization.&lt;br /&gt;
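Steps 1 and 3 can be sketched in plain Python as follows. Grid size, resolution, and function names are illustrative assumptions, not PeTra's actual implementation (which feeds the map to a CNN in step 2):&lt;br /&gt;

```python
import math

def occupancy_map(ranges, angle_min, angle_inc, size=16, resolution=0.25):
    """Step 1 (sketch): rasterize a LIDAR scan into a binary grid centered
    on the robot. 1 = obstacle hit, 0 = free or never observed.
    Grid size (cells) and resolution (m/cell) are assumed values."""
    grid = [[0] * size for _ in range(size)]
    for i, r in enumerate(ranges):
        a = angle_min + i * angle_inc
        x, y = r * math.cos(a), r * math.sin(a)
        col = int(x / resolution) + size // 2
        row = int(y / resolution) + size // 2
        if 0 <= row < size and 0 <= col < size:
            grid[row][col] = 1
    return grid

def center_of_mass(grid, resolution=0.25):
    """Step 3 (sketch): center of mass of the occupied cells, converted
    back to metric coordinates in the robot frame."""
    size = len(grid)
    cells = [(r, c) for r in range(size) for c in range(size) if grid[r][c]]
    row = sum(r for r, _ in cells) / len(cells)
    col = sum(c for _, c in cells) / len(cells)
    return ((col - size // 2) * resolution, (row - size // 2) * resolution)

# Three rays about 1 m in front of the robot collapse to a single estimate.
grid = occupancy_map([1.0, 1.0, 1.0], angle_min=0.0, angle_inc=0.05)
estimate = center_of_mass(grid)
```

In PeTra, the center-of-mass step operates on the leg zones output by the network rather than on the raw map, so nearby clutter does not pull the estimate.&lt;br /&gt;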
&lt;br /&gt;
The PeTra system has been trained using the following data: [http://niebla.unileon.es/proyectos/attachments/download/343/npy_train_test_globales.tar.gz PeTra_train_data]&lt;br /&gt;
&lt;br /&gt;
=== Recording procedure ===&lt;br /&gt;
&lt;br /&gt;
The data were gathered in 14 different situations. In all of them, Orbi-One stood still as one or more people, carrying a KIO tag, moved around it. Three different locations for Orbi-One were defined (see Figure 1C), resulting in 42 scenarios (14 situations × 3 Orbi-One locations). Figure 2 shows the 14 recognition scenarios recorded. These scenarios were chosen according to situations that may occur in robotics competitions such as ERL or RoboCup.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': recognition scenarios recorded.]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
A rosbag file was created for each scenario (except for situations 3, 12, and 13, where 3 rosbag files were recorded; situation 4, where 4 rosbag files were recorded; and situation 9, where 5 rosbag files were recorded), containing LIDAR sensor measurements, location estimates from PeTra and LD, locations from KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* LIDAR sensor data. Data from LIDAR sensors are provided as ROS LaserScan Messages, which include, among other information, the following: acquisition time of the first ray in the scan, start/end angle of the scan, angular distance between measurements, and range data.&lt;br /&gt;
* PeTra location estimates provided as ROS PointStamped Messages, which include a position [x, y, z] and a timestamp.&lt;br /&gt;
* Location estimates calculated by LD. It publishes data for individual legs (as ROS PositionMeasurementArray Messages). It also attempts to pair the legs and publishes their average, as a ROS PositionMeasurement Message, as an estimate of where the center of a person is.&lt;br /&gt;
* Locations provided by KIO RTLS, also as ROS PointStamped Messages.&lt;br /&gt;
* Messages from /map, /odom, and /tf ROS topics, which include map information, odometry of the robot base, and transformation information, respectively.&lt;br /&gt;
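A LaserScan message encodes the scan in polar form, so a common first step when working with the recorded bags is converting each ray to Cartesian coordinates in the sensor frame. The sketch below assumes the standard `sensor_msgs/LaserScan` fields (`angle_min`, `angle_increment`, `ranges`); the `range_max` cutoff value is an assumption:&lt;br /&gt;

```python
import math

def scan_to_points(angle_min, angle_increment, ranges, range_max=25.0):
    """Convert LaserScan-style polar readings to (x, y) points in the
    sensor frame. Rays with non-positive or out-of-range values are
    discarded, as they carry no obstacle information."""
    points = []
    for i, r in enumerate(ranges):
        if 0.0 < r < range_max:
            a = angle_min + i * angle_increment
            points.append((r * math.cos(a), r * math.sin(a)))
    return points

# One ray straight ahead (angle 0) at 2 m and one at 90 degrees at 1 m:
pts = scan_to_points(angle_min=0.0,
                     angle_increment=math.pi / 2,
                     ranges=[2.0, 1.0])
```

The same conversion underlies the occupancy maps used by PeTra and the leg candidates grouped by LD.&lt;br /&gt;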
&lt;br /&gt;
The different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Nov-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_4.bag test_04_4.bag]: '''duration:''' 1:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:'''  9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;br /&gt;
&lt;br /&gt;
====Location 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_01.bag test_01.bag]: '''duration:''' 15:37s, '''size:''' 239.7 MB, '''start date/time:''' Nov 16, 2017 12:20:19.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_02.bag test_02.bag]: '''duration:''' 15:05s, '''size:''' 231.4 MB, '''start date/time:''' Nov 27, 2017 18:32:24.35 &lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_1.bag test_03_1.bag]: '''duration:''' 55.2s, '''size:''' 14.5 MB, '''start date/time:''' Nov 16, 2017 12:43:58.29 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_2.bag test_03_2.bag]: '''duration:''' 44.4s, '''size:''' 10.2 MB, '''start date/time:''' Nov 16, 2017 12:46:31.99&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_3.bag test_03_3.bag]: '''duration:''' 51.1s, '''size:''' 13.4 MB, '''start date/time:''' Nov 16, 2017 12:47:29.66&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_1.bag test_04_1.bag]: '''duration:''' 01:23s, '''size:''' 21.6 MB, '''start date/time:''' Nov 16, 2017 1:50:22.18 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_2.bag test_04_2.bag]: '''duration:''' 01:20s, '''size:''' 21.0 MB, '''start date/time:''' Nov 16, 2017 12:52:18.44&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_3.bag test_04_3.bag]: '''duration:''' 01:09s, '''size:''' 18.1 MB, '''start date/time:''' Nov 16, 2017 12:53:51.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_4.bag test_04_4.bag]: '''duration:''' 1:21s, '''size:''' 21.3 MB, '''start date/time:''' Nov 16, 2017 12:55:31.81&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_05.bag test_05.bag]: '''duration:''' 15:07s, '''size:''' 232.4 MB, '''start date/time:''' Nov 28 2017 18:33:16.24&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_06.bag test_06.bag]: '''duration:''' 09:14s, '''size:''' 136.2 MB, '''start date/time:''' Nov 29 2017 13:12:42.61 &lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_07.bag test_07.bag]: '''duration:''' 4:39s, '''size:''' 71.5 MB, '''start date/time:''' Nov 22 2017 12:48:33.86 &lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_08.bag test_08.bag]: '''duration:''' 03:04s, '''size:''' 47.2 MB, '''start date/time:''' Nov 22 2017 12:59:08.58&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_1.bag test_09_1.bag]: '''duration:''' 28.8s, '''size:''' 7.7 MB, '''start date/time:''' Nov 28, 2017 17:17:46.49 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_2.bag test_09_2.bag]: '''duration:''' 31.9s, '''size:''' 8.4 MB, '''start date/time:''' Nov 28, 2017 17:20:48.26&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Nov 28, 2017 17:20:05.96&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_4.bag test_09_4.bag]: '''duration:''' 29.9s, '''size:''' 7.9 MB, '''start date/time:''' Nov 28, 2017 17:19:26.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_5.bag test_09_5.bag]: '''duration:''' 30.8s, '''size:''' 8.1 MB, '''start date/time:''' Nov 28, 2017 17:21:20.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_10.bag test_10.bag]: '''duration:''' 16:05s, '''size:''' 246.6 MB, '''start date/time:''' Nov 22 2017 11:55:02.39&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_11.bag test_11.bag]: '''duration:''' 03:25s, '''size:''' 52.8 MB, '''start date/time:''' Nov 22 2017 13:08:11.99&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_1.bag test_12_1.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:55:26.25&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_2.bag test_12_2.bag]: '''duration:''' 1:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:56:52.47&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_3.bag test_12_3.bag]: '''duration:''' 50.2s, '''size:'''  13.1 MB, '''start date/time:''' Nov 16, 2017 11:44:00.10&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_1.bag test_13_1.bag]: '''duration:''' 58.2s, '''size:''' 15.1 MB, '''start date/time:''' Nov 22, 2017 11:48:35.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_2.bag test_13_2.bag]: '''duration:''' 01:05s, '''size:''' 16.9 MB, '''start date/time:''' Nov 22, 2017 11:49:48.18&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_3.bag test_13_3.bag]: '''duration:''' 01:14s, '''size:''' 19.2 MB, '''start date/time:''' Nov 22, 2017 11:51:22.68&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_14.bag test_14.bag]: '''duration:''' 06:17s, '''size:''' 96.3 MB, '''start date/time:''' Nov 22 2017 12:23:07.92&lt;br /&gt;
&lt;br /&gt;
====Location 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_01.bag test_01.bag]: '''duration:''' 15:17s, '''size:''' 234.6 MB, '''start date/time:''' Nov 21, 2017 18:56:54.87&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 17:37:34.08&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_1.bag test_03_1.bag]: '''duration:''' 44.7s, '''size:''' 11.7 MB, '''start date/time:''' Nov 21, 2017 13:16:51.65&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_2.bag test_03_2.bag]: '''duration:''' 39.9s, '''size:''' 10.5 MB, '''start date/time:''' Nov 21, 2017 19:19:06.27&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_3.bag test_03_3.bag]: '''duration:''' 41.2s, '''size:''' 10.8 MB, '''start date/time:''' Nov 21, 2017 19:18:06.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_1.bag test_04_1.bag]: '''duration:''' 01:01s, '''size:''' 16.0 MB, '''start date/time:''' Nov 21, 2017 19:21:30.99 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_2.bag test_04_2.bag]: '''duration:''' 01:01s, '''size:''' 15.9 MB, '''start date/time:''' Nov 21, 2017 19:25:38.00&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_3.bag test_04_3.bag]: '''duration:''' 58.0s, '''size:''' 15.1 MB, '''start date/time:''' Nov 21, 2017 19:27:16.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_4.bag test_04_4.bag]: '''duration:''' 1:00s, '''size:''' 15.8 MB, '''start date/time:''' Nov 21, 2017 19:29:39.28&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_05.bag test_05.bag]: '''duration:''' 15:11s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 19:05:20.19&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_06.bag test_06.bag]: '''duration:''' 09:03s, '''size:''' 133.6 MB, '''start date/time:''' Nov 26, 2017 13:24:25.64&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Nov 27, 2017 17:34:50.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_08.bag test_08.bag]: '''duration:''' 03:47s, '''size:''' 58.3 MB, '''start date/time:''' Nov 27, 2017 17:41:48.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_1.bag test_09_1.bag]: '''duration:''' 31.3s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:23:58.31 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_2.bag test_09_2.bag]: '''duration:''' 30.4s, '''size:''' 8.0 MB, '''start date/time:''' Nov 28, 2017 17:24:41.09&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_3.bag test_09_3.bag]: '''duration:''' 31.2s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:25:22.22&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_4.bag test_09_4.bag]: '''duration:''' 31.4s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:26:06.98&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_5.bag test_09_5.bag]: '''duration:''' 33.2s, '''size:''' 8.8 MB, '''start date/time:''' Nov 28, 2017 17:26:51.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_10.bag test_10.bag]: '''duration:''' 10:00s, '''size:''' 153.7 MB, '''start date/time:''' Nov 22, 2017 18:15:29.53&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 59.1 MB, '''start date/time:''' Nov 22, 2017 18:25:54.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_1.bag test_12_1.bag]: '''duration:''' 41.3s, '''size:''' 10.8 MB, '''start date/time:''' Nov 22, 2017 18:35:11.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_2.bag test_12_2.bag]: '''duration:''' 41.8s, '''size:''' 10.9 MB, '''start date/time:''' Nov 22, 2017 18:36:12.11&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_3.bag test_12_3.bag]: '''duration:''' 39.8s, '''size:'''  10.5 MB, '''start date/time:''' Nov 22, 2017 18:37:05.06&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_1.bag test_13_1.bag]: '''duration:''' 01:01s, '''size:''' 14.9 MB, '''start date/time:''' Nov 22, 2017 18:40:13.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_2.bag test_13_2.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:41:28.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_3.bag test_13_3.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:42:39.52&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_14.bag test_14.bag]: '''duration:''' 06:12s, '''size:''' 95.1 MB, '''start date/time:''' Nov 23, 2017 12:53:06.48&lt;br /&gt;
&lt;br /&gt;
=== v2.0 [Feb-2018] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a second version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_01.bag cocina_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_02.bag cocina_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_03.bag cocina_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_04.bag cocina_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_05.bag cocina_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_06.bag cocina_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_07.bag cocina_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_08.bag cocina_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_09.bag cocina_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_10.bag cocina_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_11.bag cocina_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_12.bag cocina_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_13.bag cocina_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_14.bag cocina_14.bag]&lt;br /&gt;
&lt;br /&gt;
====Location 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_01.bag salon_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_02.bag salon_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_03.bag salon_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_04.bag salon_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_05.bag salon_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_06.bag salon_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_07.bag salon_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_08.bag salon_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_09.bag salon_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_10.bag salon_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_11.bag salon_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_12.bag salon_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_13.bag salon_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_14.bag salon_14.bag]&lt;br /&gt;
&lt;br /&gt;
====Location 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_01.bag habitacion_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_02.bag habitacion_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_03.bag habitacion_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_04.bag habitacion_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_05.bag habitacion_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_06.bag habitacion_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_07.bag habitacion_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_08.bag habitacion_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_09.bag habitacion_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_10.bag habitacion_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_11.bag habitacion_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_12.bag habitacion_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_13.bag habitacion_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_14.bag habitacion_14.bag]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5215</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5215"/>
				<updated>2018-02-13T12:02:22Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: /* PeTra */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset that can be used to evaluate the performance of different approaches for detecting and tracking people using LIDAR sensors. The information contained in the dataset is especially suitable for use as training data for neural-network-based classifiers. &lt;br /&gt;
&lt;br /&gt;
Further information can be found in [https://www.frontiersin.org/articles/10.3389/fnbot.2017.00072/full Álvarez-Aparicio et al. (2018)].&lt;br /&gt;
&lt;br /&gt;
== Materials and methods ==&lt;br /&gt;
&lt;br /&gt;
The following section describes the materials (shown in Figure 1) used to gather the data: a certified study area, an autonomous robot with an on-board LIDAR sensor, and a real-time location system (RTLS) providing ground-truth data about person location. The recorded data include location estimates calculated by two people trackers, LD and PeTra, both described below. Finally, the recording procedure used to build the dataset is explained.&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.jpg|frame|center|'''Fig. 1''': Materials used in the experiments. (A) Orbi-One robot. (B) Leon@Home Testbed plan. (C) Floor plan of the apartment. Red dots show the location of KIO anchors. Black numbered dots denote Orbi-One locations during the data gathering. (D) KIO anchors. (E) General view of the apartment. (F) Furniture in mock-up apartment.]]&lt;br /&gt;
&lt;br /&gt;
=== Leon@Home Testbed ===&lt;br /&gt;
&lt;br /&gt;
Data have been gathered at Leon@Home Testbed, a Certified Testbed of the European Robotics League (ERL). Its main purpose is to benchmark service robots in a realistic home environment. The testbed is made up of four parts, shown in Figure 1B: a mock-up apartment, a control zone with direct vision (through a glass wall) into the apartment, a small workshop, and a larger development zone where researchers work.&lt;br /&gt;
&lt;br /&gt;
Leon@Home Testbed is located on the second floor of the Módulo de Investigación en Cibernética (Building for Research in Cybernetics) on the Vegazana Campus of the University of León (Spain). The apartment is a single bedroom mock-up home built in an 8 m × 7 m space. Figure 1C shows a plan of the apartment. 60 cm high walls divide it into a kitchen, living room, bathroom, and bedroom. The furniture (Figures 1E,F) has been chosen to test different robot abilities. For instance, the kitchen cabinets all have different types of handles.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-one ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One (Figure 1A) is an assistant robot manufactured by Robotnik. It has several sensors, among them an RGBD camera, a LIDAR sensor, and an inertial unit. It can operate a manipulator arm attached to its torso and has a wheeled base for moving around the room. Orbi-One includes a wireless access point, which allows WiFi communication with other robots and computers.&lt;br /&gt;
&lt;br /&gt;
The software controlling the robot hardware is based on the ROS framework. ROS is a set of robotics libraries similar to operating-system services, providing hardware abstraction for sensors and actuators, low-level device control, and inter-process communication. Computation takes place in processes called Nodes, which can receive and send Messages. Nodes publish Messages into information buffers called Topics.&lt;br /&gt;
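The Node/Topic/Message model described above can be illustrated with a minimal, self-contained sketch. This is not actual ROS code: `TopicBus`, the `/scan` topic name, and the dictionary message are illustrative stand-ins for ROS's real publish/subscribe machinery.&lt;br /&gt;

```python
# Schematic sketch (not actual ROS code) of the publish/subscribe model:
# nodes exchange messages through named topics.
from collections import defaultdict

class TopicBus:
    """Minimal in-memory stand-in for the ROS topic mechanism."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        # A "node" registers interest in a topic.
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # A "node" publishes a message; all subscribers receive it.
        for callback in self._subscribers[topic]:
            callback(message)

bus = TopicBus()
received = []
bus.subscribe("/scan", received.append)        # listener node
bus.publish("/scan", {"ranges": [1.2, 1.3]})   # publisher node
```

In real ROS, `rospy.Publisher` and `rospy.Subscriber` play these roles and messages are typed (e.g. LaserScan); the sketch only mirrors the data flow.&lt;br /&gt;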
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
To acquire ground-truth data about person location in the study area, an RTLS for indoor environments is needed. The KIO RTLS, a commercial solution by Eliko, has been used. KIO is a precise RTLS for tracking any object in two- or three-dimensional space. Its Ultra-Wideband technology enables micro-positioning of objects through obstructions, and KIO works in non-line-of-sight conditions, both indoors and outdoors.&lt;br /&gt;
&lt;br /&gt;
KIO comes in two main configurations. The Regular Cell configuration guarantees a reliable accuracy of ±30 cm, and the Small Cell configuration, designed for location-critical applications, provides a reliable ±5 cm accuracy, both according to the manufacturer’s specifications. Calibration performed by the authors in the mock-up apartment shows that the error is higher in some areas and lower in others, but that on average the manufacturer’s claims hold.&lt;br /&gt;
&lt;br /&gt;
KIO calculates the position of a mobile transceiver, called a Tag, using radio beacons, called Anchors, placed at known positions in the surroundings. Figure 1D shows a KIO anchor. KIO tags are the same size as the anchors and must be carried by the tracked subject, in our case a person. The red dots in Figure 1C show the locations of the six anchors used in these experiments; they are placed on the ceiling. The distribution of the anchors was chosen following the method described in Guerrero-Higueras et al. (2017).&lt;br /&gt;
&lt;br /&gt;
=== Leg Detector (LD) ===&lt;br /&gt;
&lt;br /&gt;
LD is a ROS package that takes messages published by a LIDAR sensor as input and uses a machine-learning-trained classifier to detect groups of laser readings as possible legs. The code is available in a public repository but is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
LD publishes the locations of the individual legs. It can also attempt to pair the legs and publish their average as an estimate of where the center of a person is. Optionally, LD also publishes visualization Marker messages to indicate where detections happened.&lt;br /&gt;
&lt;br /&gt;
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a people detection and tracking tool developed by the Robotics Group at the University of León. The system is based on a Convolutional Neural Network (CNN) with a configuration based on the U-Net architecture by Ronneberger et al. (2015).&lt;br /&gt;
&lt;br /&gt;
The system performs the following steps in real time:&lt;br /&gt;
&lt;br /&gt;
1. First, the data provided by the LIDAR sensor are processed to build a two-dimensional occupancy map centered on the robot. This occupancy map is represented as a binary matrix, where 1s denote positions where the LIDAR scan found an obstacle, and 0s denote positions where the scan either passed through without detecting an obstacle or did not cover that position.&lt;br /&gt;
&lt;br /&gt;
2. Then, the occupancy map is relayed to the network as input data. The network produces a second occupancy map representing the zones where legs have been detected.&lt;br /&gt;
&lt;br /&gt;
3. Finally, center-of-mass calculations return the locations of the detected persons. PeTra also publishes locations for the individual legs and Marker messages for visualization.&lt;br /&gt;
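The steps above can be sketched in a minimal form. The grid size, resolution, and function names below are illustrative assumptions, not PeTra's actual parameters, and the real system interposes the CNN between the two steps shown:&lt;br /&gt;

```python
# Hedged sketch of steps 1 and 3: build a robot-centered binary occupancy
# map from LIDAR ranges, then recover a location as the center of mass of
# occupied cells. Parameters are illustrative, not PeTra's real values.
import numpy as np

def occupancy_map(ranges, angles, grid_size=64, resolution=0.1):
    """Binary grid centered on the robot; 1 marks a LIDAR return."""
    grid = np.zeros((grid_size, grid_size), dtype=np.uint8)
    xs = ranges * np.cos(angles)          # Cartesian coordinates (m)
    ys = ranges * np.sin(angles)
    cols = (xs / resolution + grid_size // 2).astype(int)
    rows = (ys / resolution + grid_size // 2).astype(int)
    valid = (rows >= 0) & (rows < grid_size) & (cols >= 0) & (cols < grid_size)
    grid[rows[valid], cols[valid]] = 1    # mark cells with obstacles
    return grid

def center_of_mass(grid, resolution=0.1):
    """Mean position of occupied cells, in meters relative to the robot."""
    rows, cols = np.nonzero(grid)
    center = grid.shape[0] // 2
    return ((cols.mean() - center) * resolution,
            (rows.mean() - center) * resolution)

# Two synthetic returns, 1 m ahead and 1 m to the side of the robot.
angles = np.array([0.0, np.pi / 2])
ranges = np.array([1.0, 1.0])
grid = occupancy_map(ranges, angles)
com = center_of_mass(grid)               # midway between the two returns
```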
&lt;br /&gt;
Training data for the network is available for download: [http://niebla.unileon.es/proyectos/attachments/download/343/npy_train_test_globales.tar.gz train data]&lt;br /&gt;
&lt;br /&gt;
=== Recording procedure ===&lt;br /&gt;
&lt;br /&gt;
The data were gathered in 14 different situations. In all of them, Orbi-One stood still while one or more people, carrying a KIO tag, moved around it. Three different locations for Orbi-One were defined (see Figure 1C), resulting in 42 scenarios (14 situations × 3 Orbi-One locations). Figure 2 shows the 14 recognition scenarios recorded. These scenarios were chosen according to situations that may occur in robotics competitions such as ERL or RoboCup.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': recognition scenarios recorded.]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
A rosbag file was created for each scenario (except for situations 3, 12, and 13, where 3 rosbag files were recorded; situation 4, where 4 rosbag files were recorded; and situation 9, where 5 rosbag files were recorded), recording LIDAR sensor measurements, location estimates from PeTra and LD, locations from the KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* LIDAR sensor data. Data from LIDAR sensors are provided as ROS LaserScan Messages, which include, among other information, the following: acquisition time of the first ray in the scan, start/end angle of the scan, angular distance between measurements, and range data.&lt;br /&gt;
* PeTra location estimates provided as ROS PointStamped Messages, which include a position [x, y, z] and a timestamp.&lt;br /&gt;
* Location estimates calculated by LD, published for individual legs as ROS PositionMeasurementArray Messages. LD also attempts to pair the legs and publishes their average, an estimate of where the center of a person is, as a ROS PositionMeasurement Message.&lt;br /&gt;
* Locations from the KIO RTLS, also provided as ROS PointStamped Messages.&lt;br /&gt;
* Messages from /map, /odom, and /tf ROS topics, which include map information, odometry of the robot base, and transformation information, respectively.&lt;br /&gt;
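As an illustration of how these streams might be used once extracted from a bag file, the sketch below matches each tracker estimate with the nearest-in-time ground-truth location and computes the Euclidean error. The tuple layout and function name are assumptions for the example, not part of the dataset tooling; the fields mirror the (timestamp, x, y) content of the PointStamped Messages described above.&lt;br /&gt;

```python
# Illustrative evaluation sketch: compare tracker estimates against KIO
# ground truth by nearest-timestamp matching and Euclidean distance.
# Data layout (timestamp, x, y) is an assumption for this example.
import numpy as np

def tracking_errors(estimates, ground_truth):
    """estimates, ground_truth: lists of (timestamp, x, y) tuples."""
    gt = np.array(ground_truth)
    errors = []
    for t, x, y in estimates:
        nearest = gt[np.argmin(np.abs(gt[:, 0] - t))]  # closest timestamp
        errors.append(np.hypot(x - nearest[1], y - nearest[2]))
    return errors

# Synthetic values, not taken from the dataset.
petra = [(0.0, 1.0, 2.0), (1.0, 1.5, 2.5)]   # tracker estimates
kio   = [(0.05, 1.1, 2.0), (1.02, 1.5, 2.1)] # ground-truth locations
errs = tracking_errors(petra, kio)
```

In practice the tuples would be filled by iterating over the bag's PointStamped topics (e.g. with the rosbag Python API) before running a comparison of this kind.&lt;br /&gt;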
&lt;br /&gt;
Different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Nov-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_4.bag test_04_4.bag]: '''duration:''' 1:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:'''  9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;br /&gt;
&lt;br /&gt;
====Location 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_01.bag test_01.bag]: '''duration:''' 15:37s, '''size:''' 239.7 MB, '''start date/time:''' Nov 16, 2017 12:20:19.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_02.bag test_02.bag]: '''duration:''' 15:05s, '''size:''' 231.4 MB, '''start date/time:''' Nov 27, 2017 18:32:24.35 &lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_1.bag test_03_1.bag]: '''duration:''' 55.2s, '''size:''' 14.5 MB, '''start date/time:''' Nov 16, 2017 12:43:58.29 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_2.bag test_03_2.bag]: '''duration:''' 44.4s, '''size:''' 10.2 MB, '''start date/time:''' Nov 16, 2017 12:46:31.99&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_3.bag test_03_3.bag]: '''duration:''' 51.1s, '''size:''' 13.4 MB, '''start date/time:''' Nov 16, 2017 12:47:29.66&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_1.bag test_04_1.bag]: '''duration:''' 01:23s, '''size:''' 21.6 MB, '''start date/time:''' Nov 16, 2017 1:50:22.18 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_2.bag test_04_2.bag]: '''duration:''' 01:20s, '''size:''' 21.0 MB, '''start date/time:''' Nov 16, 2017 12:52:18.44&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_3.bag test_04_3.bag]: '''duration:''' 01:09s, '''size:''' 18.1 MB, '''start date/time:''' Nov 16, 2017 12:53:51.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_4.bag test_04_4.bag]: '''duration:''' 1:21s, '''size:''' 21.3 MB, '''start date/time:''' Nov 16, 2017 12:55:31.81&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_05.bag test_05.bag]: '''duration:''' 15:07s, '''size:''' 232.4 MB, '''start date/time:''' Nov 28, 2017 18:33:16.24&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_06.bag test_06.bag]: '''duration:''' 09:14s, '''size:''' 136.2 MB, '''start date/time:''' Nov 29, 2017 13:12:42.61 &lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 71.5 MB, '''start date/time:''' Nov 22, 2017 12:48:33.86 &lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_08.bag test_08.bag]: '''duration:''' 03:04s, '''size:''' 47.2 MB, '''start date/time:''' Nov 22, 2017 12:59:08.58&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_1.bag test_09_1.bag]: '''duration:''' 28.8s, '''size:''' 7.7 MB, '''start date/time:''' Nov 28, 2017 17:17:46.49 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_2.bag test_09_2.bag]: '''duration:''' 31.9s, '''size:''' 8.4 MB, '''start date/time:''' Nov 28, 2017 17:20:48.26&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Nov 28, 2017 17:20:05.96&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_4.bag test_09_4.bag]: '''duration:''' 29.9s, '''size:''' 7.9 MB, '''start date/time:''' Nov 28, 2017 17:19:26.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_5.bag test_09_5.bag]: '''duration:''' 30.8s, '''size:''' 8.1 MB, '''start date/time:''' Nov 28, 2017 17:21:20.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_10.bag test_10.bag]: '''duration:''' 16:05s, '''size:''' 246.6 MB, '''start date/time:''' Nov 22, 2017 11:55:02.39&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_11.bag test_11.bag]: '''duration:''' 03:25s, '''size:''' 52.8 MB, '''start date/time:''' Nov 22, 2017 13:08:11.99&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_1.bag test_12_1.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:55:26.25&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_2.bag test_12_2.bag]: '''duration:''' 1:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:56:52.47&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_3.bag test_12_3.bag]: '''duration:''' 50.2s, '''size:'''  13.1 MB, '''start date/time:''' Nov 16, 2017 11:44:00.10&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_1.bag test_13_1.bag]: '''duration:''' 58.2s, '''size:''' 15.1 MB, '''start date/time:''' Nov 22, 2017 11:48:35.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_2.bag test_13_2.bag]: '''duration:''' 01:05s, '''size:''' 16.9 MB, '''start date/time:''' Nov 22, 2017 11:49:48.18&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_3.bag test_13_3.bag]: '''duration:''' 01:14s, '''size:''' 19.2 MB, '''start date/time:''' Nov 22, 2017 11:51:22.68&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_14.bag test_14.bag]: '''duration:''' 06:17s, '''size:''' 96.3 MB, '''start date/time:''' Nov 22, 2017 12:23:07.92&lt;br /&gt;
&lt;br /&gt;
====Location 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_01.bag test_01.bag]: '''duration:''' 15:17s, '''size:''' 234.6 MB, '''start date/time:''' Nov 21, 2017 18:56:54.87&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 17:37:34.08&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_1.bag test_03_1.bag]: '''duration:''' 44.7s, '''size:''' 11.7 MB, '''start date/time:''' Nov 21, 2017 13:16:51.65&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_2.bag test_03_2.bag]: '''duration:''' 39.9s, '''size:''' 10.5 MB, '''start date/time:''' Nov 21, 2017 19:19:06.27&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_3.bag test_03_3.bag]: '''duration:''' 41.2s, '''size:''' 10.8 MB, '''start date/time:''' Nov 21, 2017 19:18:06.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_1.bag test_04_1.bag]: '''duration:''' 01:01s, '''size:''' 16.0 MB, '''start date/time:''' Nov 21, 2017 19:21:30.99 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_2.bag test_04_2.bag]: '''duration:''' 01:01s, '''size:''' 15.9 MB, '''start date/time:''' Nov 21, 2017 19:25:38.00&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_3.bag test_04_3.bag]: '''duration:''' 58.0s, '''size:''' 15.1 MB, '''start date/time:''' Nov 21, 2017 19:27:16.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_4.bag test_04_4.bag]: '''duration:''' 01:00s, '''size:''' 15.8 MB, '''start date/time:''' Nov 21, 2017 19:29:39.28&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_05.bag test_05.bag]: '''duration:''' 15:11s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 19:05:20.19&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_06.bag test_06.bag]: '''duration:''' 09:03s, '''size:''' 133.6 MB, '''start date/time:''' Nov 26, 2017 13:24:25.64&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_07.bag test_07.bag]: '''duration:''' 04:16s, '''size:''' 65.1 MB, '''start date/time:''' Nov 27, 2017 17:34:50.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_08.bag test_08.bag]: '''duration:''' 03:47s, '''size:''' 58.3 MB, '''start date/time:''' Nov 27, 2017 17:41:48.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_1.bag test_09_1.bag]: '''duration:''' 31.3s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:23:58.31 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_2.bag test_09_2.bag]: '''duration:''' 30.4s, '''size:''' 8.0 MB, '''start date/time:''' Nov 28, 2017 17:24:41.09&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_3.bag test_09_3.bag]: '''duration:''' 31.2s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:25:22.22&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_4.bag test_09_4.bag]: '''duration:''' 31.4s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:26:06.98&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_5.bag test_09_5.bag]: '''duration:''' 33.2s, '''size:''' 8.8 MB, '''start date/time:''' Nov 28, 2017 17:26:51.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_10.bag test_10.bag]: '''duration:''' 10:00s, '''size:''' 153.7 MB, '''start date/time:''' Nov 22, 2017 18:15:29.53&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 59.1 MB, '''start date/time:''' Nov 22, 2017 18:25:54.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_1.bag test_12_1.bag]: '''duration:''' 41.3s, '''size:''' 10.8 MB, '''start date/time:''' Nov 22, 2017 18:35:11.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_2.bag test_12_2.bag]: '''duration:''' 41.8s, '''size:''' 10.9 MB, '''start date/time:''' Nov 22, 2017 18:36:12.11&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_3.bag test_12_3.bag]: '''duration:''' 39.8s, '''size:'''  10.5 MB, '''start date/time:''' Nov 22, 2017 18:37:05.06&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_1.bag test_13_1.bag]: '''duration:''' 01:01s, '''size:''' 14.9 MB, '''start date/time:''' Nov 22, 2017 18:40:13.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_2.bag test_13_2.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:41:28.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_3.bag test_13_3.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:42:39.52&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_14.bag test_14.bag]: '''duration:''' 06:12s, '''size:''' 95.1 MB, '''start date/time:''' Nov 23, 2017 12:53:06.48&lt;br /&gt;
&lt;br /&gt;
=== v2.0 [Feb-2018] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a second version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_01.bag cocina_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_02.bag cocina_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_03.bag cocina_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_04.bag cocina_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_05.bag cocina_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_06.bag cocina_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_07.bag cocina_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_08.bag cocina_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_09.bag cocina_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_10.bag cocina_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_11.bag cocina_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_12.bag cocina_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_13.bag cocina_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_14.bag cocina_14.bag]&lt;br /&gt;
&lt;br /&gt;
====Location 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_01.bag salon_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_02.bag salon_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_03.bag salon_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_04.bag salon_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_05.bag salon_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_06.bag salon_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_07.bag salon_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_08.bag salon_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_09.bag salon_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_10.bag salon_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_11.bag salon_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_12.bag salon_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_13.bag salon_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_14.bag salon_14.bag]&lt;br /&gt;
&lt;br /&gt;
====Location 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_01.bag habitacion_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_02.bag habitacion_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_03.bag habitacion_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_04.bag habitacion_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_05.bag habitacion_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_06.bag habitacion_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_07.bag habitacion_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_08.bag habitacion_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_09.bag habitacion_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_10.bag habitacion_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_11.bag habitacion_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_12.bag habitacion_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_13.bag habitacion_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_14.bag habitacion_14.bag]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5214</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5214"/>
				<updated>2018-02-13T12:00:14Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: /* Location 2. Living room */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset that can be used to evaluate the performance of different approaches for detecting and tracking people using LIDAR sensors. The information contained in the dataset is especially suitable for use as training data for neural-network-based classifiers.&lt;br /&gt;
&lt;br /&gt;
Further information at [https://www.frontiersin.org/articles/10.3389/fnbot.2017.00072/full Álvarez-Aparicio et al. (2018)].&lt;br /&gt;
&lt;br /&gt;
== Materials and methods ==&lt;br /&gt;
&lt;br /&gt;
The following section describes the materials (shown in Figure 1) used to gather data, which include: a certified study area, an autonomous robot with an on-board LIDAR sensor, and a real-time location system (RTLS) to obtain ground-truth data about person location. Recorded data include location estimates calculated by two people trackers, LD and PeTra, also described below. Finally, the recording procedure used to build the dataset is explained.&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.jpg|frame|center|'''Fig. 1''': Materials used in the experiments. (A) Orbi-One robot. (B) Leon@Home Testbed plan. (C) Floor plan of the apartment. Red dots show the location of KIO anchors. Black numbered dots denote Orbi-One locations during the data gathering. (D) KIO anchors. (E) General view of the apartment. (F) Furniture in mock-up apartment.]]&lt;br /&gt;
&lt;br /&gt;
=== Leon@Home Testbed ===&lt;br /&gt;
&lt;br /&gt;
Data have been gathered at Leon@Home Testbed, a Certified Testbed of the European Robotics League (ERL). Its main purpose is to benchmark service robots in a realistic home environment. Our testbed is made up of four parts, shown in Figure 1B: a mock-up apartment, a control zone with direct vision (glass wall) into the apartment, a small workshop, and a larger development zone where researchers work.&lt;br /&gt;
&lt;br /&gt;
Leon@Home Testbed is located on the second floor of the Módulo de Investigación en Cibernética (Building for Research in Cybernetics) on the Vegazana Campus of the University of León (Spain). The apartment is a single bedroom mock-up home built in an 8 m × 7 m space. Figure 1C shows a plan of the apartment. 60 cm high walls divide it into a kitchen, living room, bathroom, and bedroom. The furniture (Figures 1E,F) has been chosen to test different robot abilities. For instance, the kitchen cabinets all have different types of handles.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-one ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One (Figure 1A) is an assistant robot manufactured by Robotnik. It has several sensors, among them an RGBD camera, a LIDAR sensor, and an inertial unit. It can operate a manipulator arm attached to its torso and has a wheeled base for moving around the room. Orbi-One includes a wireless access point, which allows WiFi communication with other robots and computers.&lt;br /&gt;
&lt;br /&gt;
The software controlling the robot hardware is based on the ROS framework. ROS is essentially a set of libraries for robotics, similar to operating system services, providing hardware abstraction for sensors and actuators, low-level device control, and inter-process communication. Computation takes place in processes named Nodes, which can receive and send Messages. Nodes publish Messages into information buffers called Topics.&lt;br /&gt;
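The Node/Topic/Message model can be sketched in plain Python. This is not the rospy API; the TopicBus class and its methods are invented for illustration of the publish/subscribe pattern that ROS implements through its middleware:&lt;br /&gt;

```python
# Minimal in-process sketch of ROS-style publish/subscribe.
# NOTE: this is NOT the rospy API; it only illustrates the model
# described above: Nodes publish Messages into named Topics, and
# subscriber callbacks receive them.
from collections import defaultdict


class TopicBus:
    """Routes Messages from publishers to subscriber callbacks by Topic name."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self._subscribers[topic]:
            callback(message)


# Two "nodes" communicating over a hypothetical /scan Topic.
bus = TopicBus()
received = []
bus.subscribe("/scan", received.append)       # subscriber node
bus.publish("/scan", {"ranges": [1.2, 1.3]})  # publisher node
```

In ROS itself this routing is performed transparently over the network, which is how nodes on the robot and on external computers exchange Messages.&lt;br /&gt;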
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
In order to acquire ground-truth data about person location in the study area, we need an RTLS for indoor environments. The commercial KIO RTLS solution by Eliko has been used. KIO is a precise RTLS for tracking any object in 2- or 3-dimensional space. Its Ultra-Wideband technology enables micro-positioning of objects through obstructions, and KIO works in non-line-of-sight conditions, both indoors and outdoors.&lt;br /&gt;
&lt;br /&gt;
KIO comes in two main configurations. According to the manufacturer’s specifications, the Regular Cell configuration guarantees a reliable accuracy of ±30 cm, while the Small Cell configuration, designed for location-critical applications, provides a reliable ±5 cm accuracy. Calibration performed by the authors in the mock-up apartment shows that the error is higher in some areas and lower in others, but on average the manufacturer’s claims hold.&lt;br /&gt;
&lt;br /&gt;
KIO calculates the position of a mobile transceiver, called a Tag, using radio beacons, called Anchors, distributed at known positions in the surroundings. Figure 1D shows a KIO anchor. KIO tags are the same size and must be carried by the tracked subject, in our case a person. The red dots in Figure 1C show the locations of the six anchors used in these experiments; they are placed on the ceiling. The distribution of the anchors was chosen following the method described in Guerrero-Higueras et al. (2017).&lt;br /&gt;
&lt;br /&gt;
=== Leg Detector (LD) ===&lt;br /&gt;
&lt;br /&gt;
LD is a ROS package that takes the messages published by a LIDAR sensor as input and uses a machine-learning-trained classifier to detect groups of laser readings as possible legs. The code is available in a public repository but is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
LD publishes the location for the individual legs. It can also attempt to pair the legs and publish their average as an estimate of where the center of a person is. LD may optionally also publish visualization marker messages to indicate where detections happened.&lt;br /&gt;
&lt;br /&gt;
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a people detection and tracking tool developed by the Robotics Group at the University of León. The system is based on a Convolutional Neural Network (CNN) whose configuration follows the U-Net architecture by Ronneberger et al. (2015).&lt;br /&gt;
&lt;br /&gt;
The system performs the following steps in real time:&lt;br /&gt;
&lt;br /&gt;
1. First, the data provided by the LIDAR sensor are processed to build a two-dimensional occupancy map centered on the robot. This occupancy map is represented as a binary matrix, where 1s denote positions where the LIDAR scan found an obstacle, and 0s denote positions where the scan either passed through without detecting an obstacle or did not reach.&lt;br /&gt;
&lt;br /&gt;
2. Then, the occupancy map is relayed to the network as input data. The network produces a second occupancy map representing the zones where legs have been detected.&lt;br /&gt;
&lt;br /&gt;
3. Finally, a center-of-mass calculation returns the location of each person. PeTra also publishes locations for the individual legs and Marker messages for visualization.&lt;br /&gt;
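Steps 1 and 3 above can be sketched in Python. The grid size, resolution, and function names below are illustrative assumptions, not PeTra's actual implementation; step 2 (the U-Net) would map the first occupancy map to the detected-legs map handed to the center-of-mass step:&lt;br /&gt;

```python
import math


def occupancy_map(ranges, angle_min, angle_increment, size=64, resolution=0.1):
    """Step 1: build a binary occupancy grid centered on the robot.
    Grid size (cells) and resolution (m/cell) are illustrative values."""
    grid = [[0] * size for _ in range(size)]
    half = size // 2
    for i, r in enumerate(ranges):
        if math.isinf(r) or math.isnan(r):
            continue  # no obstacle detected along this ray
        angle = angle_min + i * angle_increment
        col = half + int((r * math.cos(angle)) / resolution)
        row = half + int((r * math.sin(angle)) / resolution)
        if 0 <= row < size and 0 <= col < size:
            grid[row][col] = 1  # the LIDAR scan found an obstacle here
    return grid


def center_of_mass(grid):
    """Step 3: reduce a detected-legs map to a single (row, col) location."""
    cells = [(r, c) for r, row in enumerate(grid)
             for c, v in enumerate(row) if v]
    n = len(cells)
    return (sum(r for r, _ in cells) / n, sum(c for _, c in cells) / n)
```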
&lt;br /&gt;
=== Recording procedure ===&lt;br /&gt;
&lt;br /&gt;
The data were gathered in 14 different situations. In all of them, Orbi-One stood still while one or more people, carrying a KIO tag, moved around it. Three different locations for Orbi-One were defined (see Figure 1C), resulting in 42 scenarios (14 situations × 3 Orbi-One locations). Figure 2 shows the 14 recognition scenarios recorded. These scenarios were chosen according to situations that may occur in robotics competitions such as ERL or RoboCup.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': recognition scenarios recorded.]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
A rosbag file was created for each scenario (except for situations 3, 12, and 13, where 3 rosbag files were recorded; situation 4, where 4 rosbag files were recorded; and situation 9, where 5 rosbag files were recorded), recording LIDAR sensor measurements, location estimates from PeTra and LD, locations from KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* LIDAR sensor data. Data from LIDAR sensors are provided as ROS LaserScan Messages, which include, among other information, the following: acquisition time of the first ray in the scan, start/end angle of the scan, angular distance between measurements, and range data.&lt;br /&gt;
* PeTra location estimates provided as ROS PointStamped Messages, which include a position [x, y, z] and a timestamp.&lt;br /&gt;
* Location estimates calculated by LD. It publishes data for individual legs (as ROS PositionMeasurementArray Messages) and also attempts to pair the legs, publishing their average, as a ROS PositionMeasurement Message, as an estimate of the center of the person.&lt;br /&gt;
* Locations from KIO RTLS, also provided as ROS PointStamped Messages.&lt;br /&gt;
* Messages from /map, /odom, and /tf ROS topics, which include map information, odometry of the robot base, and transformation information, respectively.&lt;br /&gt;
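The LaserScan fields listed above (start angle, angular distance between measurements, range data) are enough to recover Cartesian obstacle points in the sensor frame. A minimal sketch, with a hypothetical helper whose parameters mirror those field names:&lt;br /&gt;

```python
import math


def scan_to_points(angle_min, angle_increment, ranges, range_min, range_max):
    """Convert LaserScan-style fields into (x, y) points in the sensor
    frame, discarding readings outside the sensor's valid range."""
    points = []
    for i, r in enumerate(ranges):
        if not (range_min <= r <= range_max):
            continue  # invalid or out-of-range reading
        a = angle_min + i * angle_increment
        points.append((r * math.cos(a), r * math.sin(a)))
    return points
```

This is the kind of preprocessing both LD and PeTra perform before classifying the readings.&lt;br /&gt;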
&lt;br /&gt;
Different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Nov-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_4.bag test_04_4.bag]: '''duration:''' 01:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_07.bag test_07.bag]: '''duration:''' 04:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:'''  9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;br /&gt;
&lt;br /&gt;
====Location 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_01.bag test_01.bag]: '''duration:''' 15:37s, '''size:''' 239.7 MB, '''start date/time:''' Nov 16, 2017 12:20:19.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_02.bag test_02.bag]: '''duration:''' 15:05s, '''size:''' 231.4 MB, '''start date/time:''' Nov 27, 2017 18:32:24.35 &lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_1.bag test_03_1.bag]: '''duration:''' 55.2s, '''size:''' 14.5 MB, '''start date/time:''' Nov 16, 2017 12:43:58.29 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_2.bag test_03_2.bag]: '''duration:''' 44.4s, '''size:''' 10.2 MB, '''start date/time:''' Nov 16, 2017 12:46:31.99&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_3.bag test_03_3.bag]: '''duration:''' 51.1s, '''size:''' 13.4 MB, '''start date/time:''' Nov 16, 2017 12:47:29.66&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_1.bag test_04_1.bag]: '''duration:''' 01:23s, '''size:''' 21.6 MB, '''start date/time:''' Nov 16, 2017 1:50:22.18 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_2.bag test_04_2.bag]: '''duration:''' 01:20s, '''size:''' 21.0 MB, '''start date/time:''' Nov 16, 2017 12:52:18.44&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_3.bag test_04_3.bag]: '''duration:''' 01:09s, '''size:''' 18.1 MB, '''start date/time:''' Nov 16, 2017 12:53:51.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_4.bag test_04_4.bag]: '''duration:''' 01:21s, '''size:''' 21.3 MB, '''start date/time:''' Nov 16, 2017 12:55:31.81&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_05.bag test_05.bag]: '''duration:''' 15:07s, '''size:''' 232.4 MB, '''start date/time:''' Nov 28, 2017 18:33:16.24&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_06.bag test_06.bag]: '''duration:''' 09:14s, '''size:''' 136.2 MB, '''start date/time:''' Nov 29, 2017 13:12:42.61&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_07.bag test_07.bag]: '''duration:''' 04:39s, '''size:''' 71.5 MB, '''start date/time:''' Nov 22, 2017 12:48:33.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_08.bag test_08.bag]: '''duration:''' 03:04s, '''size:''' 47.2 MB, '''start date/time:''' Nov 22, 2017 12:59:08.58&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_1.bag test_09_1.bag]: '''duration:''' 28.8s, '''size:''' 7.7 MB, '''start date/time:''' Nov 28, 2017 17:17:46.49 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_2.bag test_09_2.bag]: '''duration:''' 31.9s, '''size:''' 8.4 MB, '''start date/time:''' Nov 28, 2017 17:20:48.26&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Nov 28, 2017 17:20:05.96&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_4.bag test_09_4.bag]: '''duration:''' 29.9s, '''size:''' 7.9 MB, '''start date/time:''' Nov 28, 2017 17:19:26.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_5.bag test_09_5.bag]: '''duration:''' 30.8s, '''size:''' 8.1 MB, '''start date/time:''' Nov 28, 2017 17:21:20.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_10.bag test_10.bag]: '''duration:''' 16:05s, '''size:''' 246.6 MB, '''start date/time:''' Nov 22, 2017 11:55:02.39&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_11.bag test_11.bag]: '''duration:''' 03:25s, '''size:''' 52.8 MB, '''start date/time:''' Nov 22, 2017 13:08:11.99&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_1.bag test_12_1.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:55:26.25&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_2.bag test_12_2.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:56:52.47&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_3.bag test_12_3.bag]: '''duration:''' 50.2s, '''size:'''  13.1 MB, '''start date/time:''' Nov 16, 2017 11:44:00.10&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_1.bag test_13_1.bag]: '''duration:''' 58.2s, '''size:''' 15.1 MB, '''start date/time:''' Nov 22, 2017 11:48:35.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_2.bag test_13_2.bag]: '''duration:''' 01:05s, '''size:''' 16.9 MB, '''start date/time:''' Nov 22, 2017 11:49:48.18&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_3.bag test_13_3.bag]: '''duration:''' 01:14s, '''size:''' 19.2 MB, '''start date/time:''' Nov 22, 2017 11:51:22.68&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_14.bag test_14.bag]: '''duration:''' 06:17s, '''size:''' 96.3 MB, '''start date/time:''' Nov 22, 2017 12:23:07.92&lt;br /&gt;
&lt;br /&gt;
====Location 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_01.bag test_01.bag]: '''duration:''' 15:17s, '''size:''' 234.6 MB, '''start date/time:''' Nov 21, 2017 18:56:54.87&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 17:37:34.08&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_1.bag test_03_1.bag]: '''duration:''' 44.7s, '''size:''' 11.7 MB, '''start date/time:''' Nov 21, 2017 13:16:51.65&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_2.bag test_03_2.bag]: '''duration:''' 39.9s, '''size:''' 10.5 MB, '''start date/time:''' Nov 21, 2017 19:19:06.27&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_3.bag test_03_3.bag]: '''duration:''' 41.2s, '''size:''' 10.8 MB, '''start date/time:''' Nov 21, 2017 19:18:06.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_1.bag test_04_1.bag]: '''duration:''' 01:01s, '''size:''' 16.0 MB, '''start date/time:''' Nov 21, 2017 19:21:30.99 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_2.bag test_04_2.bag]: '''duration:''' 01:01s, '''size:''' 15.9 MB, '''start date/time:''' Nov 21, 2017 19:25:38.00&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_3.bag test_04_3.bag]: '''duration:''' 58.0s, '''size:''' 15.1 MB, '''start date/time:''' Nov 21, 2017 19:27:16.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_4.bag test_04_4.bag]: '''duration:''' 01:00s, '''size:''' 15.8 MB, '''start date/time:''' Nov 21, 2017 19:29:39.28&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_05.bag test_05.bag]: '''duration:''' 15:11s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 19:05:20.19&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_06.bag test_06.bag]: '''duration:''' 09:03s, '''size:''' 133.6 MB, '''start date/time:''' Nov 26, 2017 13:24:25.64&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_07.bag test_07.bag]: '''duration:''' 04:16s, '''size:''' 65.1 MB, '''start date/time:''' Nov 27, 2017 17:34:50.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_08.bag test_08.bag]: '''duration:''' 03:47s, '''size:''' 58.3 MB, '''start date/time:''' Nov 27, 2017 17:41:48.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_1.bag test_09_1.bag]: '''duration:''' 31.3s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:23:58.31 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_2.bag test_09_2.bag]: '''duration:''' 30.4s, '''size:''' 8.0 MB, '''start date/time:''' Nov 28, 2017 17:24:41.09&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_3.bag test_09_3.bag]: '''duration:''' 31.2s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:25:22.22&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_4.bag test_09_4.bag]: '''duration:''' 31.4s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:26:06.98&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_5.bag test_09_5.bag]: '''duration:''' 33.2s, '''size:''' 8.8 MB, '''start date/time:''' Nov 28, 2017 17:26:51.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_10.bag test_10.bag]: '''duration:''' 10:00s, '''size:''' 153.7 MB, '''start date/time:''' Nov 22, 2017 18:15:29.53&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 59.1 MB, '''start date/time:''' Nov 22, 2017 18:25:54.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_1.bag test_12_1.bag]: '''duration:''' 41.3s, '''size:''' 10.8 MB, '''start date/time:''' Nov 22, 2017 18:35:11.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_2.bag test_12_2.bag]: '''duration:''' 41.8s, '''size:''' 10.9 MB, '''start date/time:''' Nov 22, 2017 18:36:12.11&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_3.bag test_12_3.bag]: '''duration:''' 39.8s, '''size:'''  10.5 MB, '''start date/time:''' Nov 22, 2017 18:37:05.06&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_1.bag test_13_1.bag]: '''duration:''' 01:01s, '''size:''' 14.9 MB, '''start date/time:''' Nov 22, 2017 18:40:13.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_2.bag test_13_2.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:41:28.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_3.bag test_13_3.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:42:39.52&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_14.bag test_14.bag]: '''duration:''' 06:12s, '''size:''' 95.1 MB, '''start date/time:''' Nov 23, 2017 12:53:06.48&lt;br /&gt;
&lt;br /&gt;
=== v2.0 [Feb-2018] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a second version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_01.bag cocina_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_02.bag cocina_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_03.bag cocina_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_04.bag cocina_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_05.bag cocina_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_06.bag cocina_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_07.bag cocina_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_08.bag cocina_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_09.bag cocina_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_10.bag cocina_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_11.bag cocina_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_12.bag cocina_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_13.bag cocina_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_14.bag cocina_14.bag]&lt;br /&gt;
&lt;br /&gt;
====Location 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_01.bag salon_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_02.bag salon_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_03.bag salon_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_04.bag salon_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_05.bag salon_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_06.bag salon_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_07.bag salon_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_08.bag salon_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_09.bag salon_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_10.bag salon_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_11.bag salon_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_12.bag salon_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_13.bag salon_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_14.bag salon_14.bag]&lt;br /&gt;
&lt;br /&gt;
====Location 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_01.bag habitacion_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_02.bag habitacion_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_03.bag habitacion_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_04.bag habitacion_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_05.bag habitacion_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_06.bag habitacion_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_07.bag habitacion_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_08.bag habitacion_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_09.bag habitacion_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_10.bag habitacion_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_11.bag habitacion_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_12.bag habitacion_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_13.bag habitacion_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location3/habitacion_14.bag habitacion_14.bag]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5213</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5213"/>
				<updated>2018-02-13T11:59:13Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: /* Location 1. Kitchen */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset that can be used to evaluate the performance of different approaches for detecting and tracking people using LIDAR sensors. The information contained in the dataset is especially suitable as training data for neural network-based classifiers.&lt;br /&gt;
&lt;br /&gt;
Further information at [https://www.frontiersin.org/articles/10.3389/fnbot.2017.00072/full Álvarez-Aparicio et al. (2018)].&lt;br /&gt;
&lt;br /&gt;
== Materials and methods ==&lt;br /&gt;
&lt;br /&gt;
The following section describes the materials (shown in Figure 1) used to gather data, which include: a certified study area, an autonomous robot with an on-board LIDAR sensor, and a real-time location system (RTLS) to obtain ground-truth data about person location. Recorded data include location estimates calculated by two people trackers, LD and PeTra, also described below. Finally, the recording procedure used to build the dataset is explained.&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.jpg|frame|center|'''Fig. 1''': Materials used in the experiments. (A) Orbi-One robot. (B) Leon@Home Testbed plan. (C) Floor plan of the apartment. Red dots show the location of KIO anchors. Black numbered dots denote Orbi-One locations during the data gathering. (D) KIO anchors. (E) General view of the apartment. (F) Furniture in mock-up apartment.]]&lt;br /&gt;
&lt;br /&gt;
=== Leon@Home Testbed ===&lt;br /&gt;
&lt;br /&gt;
Data have been gathered at Leon@Home Testbed, a Certified Testbed of the European Robotics League (ERL). Its main purpose is to benchmark service robots in a realistic home environment. Our testbed is made up of four parts, shown in Figure 1B: a mock-up apartment, a control zone with direct vision (glass wall) into the apartment, a small workshop, and a larger development zone, where researchers work.&lt;br /&gt;
&lt;br /&gt;
Leon@Home Testbed is located on the second floor of the Módulo de Investigación en Cibernética (Building for Research in Cybernetics) on the Vegazana Campus of the University of León (Spain). The apartment is a single-bedroom mock-up home built in an 8 m × 7 m space. Figure 1C shows a plan of the apartment. Walls 60 cm high divide it into a kitchen, living room, bathroom, and bedroom. The furniture (Figures 1E,F) has been chosen to test different robot abilities. For instance, the kitchen cabinets all have different types of handles.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-one ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One (Figure 1A) is an assistant robot manufactured by Robotnik. It has several sensors, among them an RGBD camera, a LIDAR sensor, and an inertial unit. It can operate a manipulator arm attached to its torso and has a wheeled base for moving around the room. Orbi-One includes a wireless access point, which allows WiFi communications with other robots and computers.&lt;br /&gt;
&lt;br /&gt;
The software to control the robot hardware is based on the ROS framework. ROS is a set of robotics libraries that provides operating-system-like services: hardware abstraction for sensors and actuators, low-level device control, and inter-process communication. Computation takes place in processes named Nodes, which can receive and send Messages. Nodes publish Messages into information buffers called Topics.&lt;br /&gt;
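&lt;br /&gt;
As a toy illustration of this publish/subscribe pattern (plain Python, not the actual ROS client API; the class and names below are purely illustrative), nodes can be modeled as callbacks registered on named topics:&lt;br /&gt;

```python
from collections import defaultdict

class Bus:
    """Toy sketch of ROS-style topics: nodes publish messages into
    named buffers, and subscribed callbacks receive them."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        # A node declares interest in a topic by registering a callback.
        self._subs[topic].append(callback)

    def publish(self, topic, msg):
        # Every subscriber of the topic receives the message.
        for cb in self._subs[topic]:
            cb(msg)

bus = Bus()
received = []
bus.subscribe("/scan", received.append)        # a node listening on /scan
bus.publish("/scan", {"ranges": [1.0, 2.0]})   # another node publishing
```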
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
In order to acquire ground-truth data about person location in the study area, we need an RTLS for indoor environments. The KIO RTLS, a commercial solution by Eliko, has been used. KIO is a precise RTLS for tracking any object in two- or three-dimensional space. Its Ultra-Wideband technology makes it possible to micro-position objects through obstructions, and KIO works in non-line-of-sight conditions, both indoors and outdoors.&lt;br /&gt;
&lt;br /&gt;
KIO comes in two main configurations. According to the manufacturer's specifications, the Regular Cell configuration guarantees a reliable accuracy of ±30 cm, while the Small Cell configuration, designed for location-critical applications, provides a reliable ±5 cm accuracy. Calibration performed by the authors in the mock-up apartment shows that the error is higher in some areas and lower in others, but on average the manufacturer's claims hold.&lt;br /&gt;
&lt;br /&gt;
KIO calculates the position of a mobile transceiver, called a Tag. To do so, it uses radio beacons, called Anchors, placed at known positions in the surroundings. Figure 1D shows a KIO anchor. KIO tags are the same size as the anchors and must be carried by the tracked subject, in our case a person. The red dots in Figure 1C show the locations of the six anchors used in these experiments; they are mounted on the ceiling. The distribution of the anchors was chosen following the method described in Guerrero-Higueras et al. (2017).&lt;br /&gt;
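&lt;br /&gt;
KIO's internal positioning algorithm is proprietary; the sketch below only illustrates the generic kind of 2-D multilateration that UWB anchor/tag systems rely on, where each anchor-to-tag range equation is linearized against a reference anchor and the result solved by least squares (function and variable names are illustrative assumptions):&lt;br /&gt;

```python
def trilaterate(anchors, dists):
    """Estimate a 2-D tag position from >= 3 anchors at known (x, y)
    positions and measured tag-to-anchor distances.
    Subtracting the first range equation from the others yields the
    linear system  2(xi-x1)x + 2(yi-y1)y = xi^2 - x1^2 + yi^2 - y1^2
    + d1^2 - di^2, solved here via the normal equations."""
    (x1, y1), d1 = anchors[0], dists[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], dists[1:]):
        A.append((2 * (xi - x1), 2 * (yi - y1)))
        b.append((xi**2 - x1**2) + (yi**2 - y1**2) + (d1**2 - di**2))
    # Solve (A^T A) p = A^T b for p = (x, y) with Cramer's rule.
    s11 = sum(a[0] * a[0] for a in A)
    s12 = sum(a[0] * a[1] for a in A)
    s22 = sum(a[1] * a[1] for a in A)
    t1 = sum(a[0] * bi for a, bi in zip(A, b))
    t2 = sum(a[1] * bi for a, bi in zip(A, b))
    det = s11 * s22 - s12 * s12
    return ((s22 * t1 - s12 * t2) / det, (s11 * t2 - s12 * t1) / det)
```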
&lt;br /&gt;
=== Leg Detector (LD) ===&lt;br /&gt;
&lt;br /&gt;
LD is a ROS package, which takes messages published by a LIDAR sensor as input and uses a machine-learning-trained classifier to detect groups of laser readings as possible legs. The code is available in a public repository, but is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
LD publishes the location for the individual legs. It can also attempt to pair the legs and publish their average as an estimate of where the center of a person is. LD may optionally also publish visualization marker messages to indicate where detections happened.&lt;br /&gt;
&lt;br /&gt;
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a tool for detecting and tracking people, developed by the Robotics Group at the University of León. The system is based on a Convolutional Neural Network (CNN) whose configuration follows the U-Net architecture of Ronneberger et al. (2015).&lt;br /&gt;
&lt;br /&gt;
The system performs the following steps in real time:&lt;br /&gt;
&lt;br /&gt;
1. First, the data provided by the LIDAR sensor are processed to build a two-dimensional occupancy map centered on the robot. This occupancy map is represented as a binary matrix, where 1s denote positions where the LIDAR scan found an obstacle, and 0s denote positions that the scan either passed through without detecting an obstacle or did not cover.&lt;br /&gt;
&lt;br /&gt;
2. Then, the occupancy map is relayed to the network as input data. The network produces a second occupancy map representing the zones where legs have been detected.&lt;br /&gt;
&lt;br /&gt;
3. Finally, center of mass calculations return the location of persons. PeTra also publishes locations for the individual legs and Marker messages for visualization.&lt;br /&gt;
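&lt;br /&gt;
Steps 1 and 3 can be sketched in plain Python as follows (a minimal illustration only: grid size, resolution, and function names are assumptions, the CNN of step 2 is omitted, and real scans contain many readings per person):&lt;br /&gt;

```python
import math

def occupancy_map(ranges, angle_min, angle_inc, size=64, resolution=0.1):
    """Step 1: rasterize polar LIDAR readings into a size x size binary
    grid centered on the robot; 1 marks a cell where a ray hit an obstacle."""
    grid = [[0] * size for _ in range(size)]
    half = size // 2
    for i, r in enumerate(ranges):
        if not math.isfinite(r):
            continue  # skip rays that returned no echo
        a = angle_min + i * angle_inc
        col = half + int(round(r * math.cos(a) / resolution))
        row = half + int(round(r * math.sin(a) / resolution))
        if 0 <= row < size and 0 <= col < size:
            grid[row][col] = 1
    return grid

def center_of_mass(grid, resolution=0.1):
    """Step 3: average the coordinates of occupied cells to obtain a
    single location estimate (here, for one detected cluster)."""
    cells = [(r, c) for r, row in enumerate(grid)
             for c, v in enumerate(row) if v]
    half = len(grid) // 2
    x = sum((c - half) * resolution for _, c in cells) / len(cells)
    y = sum((r - half) * resolution for r, _ in cells) / len(cells)
    return x, y
```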
&lt;br /&gt;
=== Recording procedure ===&lt;br /&gt;
&lt;br /&gt;
The data were gathered in 14 different situations. In all of them, Orbi-One stood still as one or more people, carrying a KIO tag, moved around it. Three different locations for Orbi-One were defined (see Figure 1C) resulting in 42 scenarios (14 situations × 3 Orbi-One locations). Figure 2 shows the 14 different recognition scenarios recorded. These scenarios have been chosen according to different situations that may occur in robotics competitions such as ERL or RoboCup.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': recognition scenarios recorded.]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
A rosbag file was created for each scenario (except for situations 3, 12, and 13, where 3 rosbag files were recorded; situation 4, where 4 rosbag files were recorded; and situation 9, where 5 rosbag files were recorded), recording LIDAR sensor measurements, location estimates from PeTra and LD, locations from KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* LIDAR sensor data. Data from LIDAR sensors are provided as ROS LaserScan Messages, which include, among other information, the following: acquisition time of the first ray in the scan, start/end angle of the scan, angular distance between measurements, and range data.&lt;br /&gt;
* PeTra location estimates provided as ROS PointStamped Messages, which include a position [x, y, z] and a timestamp.&lt;br /&gt;
* Location estimates calculated by LD. LD publishes data for individual legs (as ROS PositionMeasurementArray Messages). It also attempts to pair the legs and publishes their average, as an estimate of where the center of a person is, as a ROS PositionMeasurement Message.&lt;br /&gt;
* Locations provided by KIO RTLS, also as ROS PointStamped Messages.&lt;br /&gt;
* Messages from /map, /odom, and /tf ROS topics, which include map information, odometry of the robot base, and transformation information, respectively.&lt;br /&gt;
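&lt;br /&gt;
Given these recordings, one straightforward way to score a tracker offline (a sketch, not part of any dataset tooling: after extracting the PointStamped-like estimates and KIO ground truth as timestamped (t, x, y) tuples) is to match each estimate to the ground-truth sample nearest in time and average the Euclidean error:&lt;br /&gt;

```python
import math

def tracking_error(estimates, ground_truth):
    """For each timestamped estimate (t, x, y), find the ground-truth
    sample (t, x, y) closest in time and return the mean Euclidean
    distance between estimate and ground truth."""
    errs = []
    for t, x, y in estimates:
        gt, gx, gy = min(ground_truth, key=lambda s: abs(s[0] - t))
        errs.append(math.hypot(x - gx, y - gy))
    return sum(errs) / len(errs)
```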
&lt;br /&gt;
Different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Nov-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_4.bag test_04_4.bag]: '''duration:''' 01:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_07.bag test_07.bag]: '''duration:''' 04:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:'''  9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;br /&gt;
&lt;br /&gt;
====Location 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_01.bag test_01.bag]: '''duration:''' 15:37s, '''size:''' 239.7 MB, '''start date/time:''' Nov 16, 2017 12:20:19.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_02.bag test_02.bag]: '''duration:''' 15:05s, '''size:''' 231.4 MB, '''start date/time:''' Nov 27, 2017 18:32:24.35 &lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_1.bag test_03_1.bag]: '''duration:''' 55.2s, '''size:''' 14.5 MB, '''start date/time:''' Nov 16, 2017 12:43:58.29 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_2.bag test_03_2.bag]: '''duration:''' 44.4s, '''size:''' 10.2 MB, '''start date/time:''' Nov 16, 2017 12:46:31.99&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_3.bag test_03_3.bag]: '''duration:''' 51.1s, '''size:''' 13.4 MB, '''start date/time:''' Nov 16, 2017 12:47:29.66&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_1.bag test_04_1.bag]: '''duration:''' 01:23s, '''size:''' 21.6 MB, '''start date/time:''' Nov 16, 2017 1:50:22.18 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_2.bag test_04_2.bag]: '''duration:''' 01:20s, '''size:''' 21.0 MB, '''start date/time:''' Nov 16, 2017 12:52:18.44&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_3.bag test_04_3.bag]: '''duration:''' 01:09s, '''size:''' 18.1 MB, '''start date/time:''' Nov 16, 2017 12:53:51.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_4.bag test_04_4.bag]: '''duration:''' 01:21s, '''size:''' 21.3 MB, '''start date/time:''' Nov 16, 2017 12:55:31.81&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_05.bag test_05.bag]: '''duration:''' 15:07s, '''size:''' 232.4 MB, '''start date/time:''' Nov 28, 2017 18:33:16.24&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_06.bag test_06.bag]: '''duration:''' 09:14s, '''size:''' 136.2 MB, '''start date/time:''' Nov 29, 2017 13:12:42.61&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_07.bag test_07.bag]: '''duration:''' 04:39s, '''size:''' 71.5 MB, '''start date/time:''' Nov 22, 2017 12:48:33.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_08.bag test_08.bag]: '''duration:''' 03:04s, '''size:''' 47.2 MB, '''start date/time:''' Nov 22, 2017 12:59:08.58&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_1.bag test_09_1.bag]: '''duration:''' 28.8s, '''size:''' 7.7 MB, '''start date/time:''' Nov 28, 2017 17:17:46.49 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_2.bag test_09_2.bag]: '''duration:''' 31.9s, '''size:''' 8.4 MB, '''start date/time:''' Nov 28, 2017 17:20:48.26&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Nov 28, 2017 17:20:05.96&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_4.bag test_09_4.bag]: '''duration:''' 29.9s, '''size:''' 7.9 MB, '''start date/time:''' Nov 28, 2017 17:19:26.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_5.bag test_09_5.bag]: '''duration:''' 30.8s, '''size:''' 8.1 MB, '''start date/time:''' Nov 28, 2017 17:21:20.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_10.bag test_10.bag]: '''duration:''' 16:05s, '''size:''' 246.6 MB, '''start date/time:''' Nov 22 2017 11:55:02.39&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_11.bag test_11.bag]: '''duration:''' 03:25s, '''size:''' 52.8 MB, '''start date/time:''' Nov 22 2017 13:08:11.99&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_1.bag test_12_1.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:55:26.25&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_2.bag test_12_2.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:56:52.47&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_3.bag test_12_3.bag]: '''duration:''' 50.2s, '''size:'''  13.1 MB, '''start date/time:''' Nov 16, 2017 11:44:00.10&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_1.bag test_13_1.bag]: '''duration:''' 58.2s, '''size:''' 15.1 MB, '''start date/time:''' Nov 22, 2017 11:48:35.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_2.bag test_13_2.bag]: '''duration:''' 01:05s, '''size:''' 16.9 MB, '''start date/time:''' Nov 22, 2017 11:49:48.18&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_3.bag test_13_3.bag]: '''duration:''' 01:14s, '''size:''' 19.2 MB, '''start date/time:''' Nov 22, 2017 11:51:22.68&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_14.bag test_14.bag]: '''duration:''' 06:17s, '''size:''' 96.3 MB, '''start date/time:''' Nov 22 2017 12:23:07.92&lt;br /&gt;
&lt;br /&gt;
====Location 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_01.bag test_01.bag]: '''duration:''' 15:17s, '''size:''' 234.6 MB, '''start date/time:''' Nov 21, 2017 18:56:54.87&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 17:37:34.08&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_1.bag test_03_1.bag]: '''duration:''' 44.7s, '''size:''' 11.7 MB, '''start date/time:''' Nov 21, 2017 13:16:51.65&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_2.bag test_03_2.bag]: '''duration:''' 39.9s, '''size:''' 10.5 MB, '''start date/time:''' Nov 21, 2017 19:19:06.27&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_3.bag test_03_3.bag]: '''duration:''' 41.2s, '''size:''' 10.8 MB, '''start date/time:''' Nov 21, 2017 19:18:06.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_1.bag test_04_1.bag]: '''duration:''' 01:01s, '''size:''' 16.0 MB, '''start date/time:''' Nov 21, 2017 19:21:30.99 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_2.bag test_04_2.bag]: '''duration:''' 01:01s, '''size:''' 15.9 MB, '''start date/time:''' Nov 21, 2017 19:25:38.00&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_3.bag test_04_3.bag]: '''duration:''' 58.0s, '''size:''' 15.1 MB, '''start date/time:''' Nov 21, 2017 19:27:16.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_4.bag test_04_4.bag]: '''duration:''' 01:00s, '''size:''' 15.8 MB, '''start date/time:''' Nov 21, 2017 19:29:39.28&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_05.bag test_05.bag]: '''duration:''' 15:11s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 19:05:20.19&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_06.bag test_06.bag]: '''duration:''' 09:03s, '''size:''' 133.6 MB, '''start date/time:''' Nov 26, 2017 13:24:25.64&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_07.bag test_07.bag]: '''duration:''' 04:16s, '''size:''' 65.1 MB, '''start date/time:''' Nov 27, 2017 17:34:50.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_08.bag test_08.bag]: '''duration:''' 03:47s, '''size:''' 58.3 MB, '''start date/time:''' Nov 27, 2017 17:41:48.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_1.bag test_09_1.bag]: '''duration:''' 31.3s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:23:58.31 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_2.bag test_09_2.bag]: '''duration:''' 30.4s, '''size:''' 8.0 MB, '''start date/time:''' Nov 28, 2017 17:24:41.09&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_3.bag test_09_3.bag]: '''duration:''' 31.2s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:25:22.22&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_4.bag test_09_4.bag]: '''duration:''' 31.4s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:26:06.98&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_5.bag test_09_5.bag]: '''duration:''' 33.2s, '''size:''' 8.8 MB, '''start date/time:''' Nov 28, 2017 17:26:51.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_10.bag test_10.bag]: '''duration:''' 10:00s, '''size:''' 153.7 MB, '''start date/time:''' Nov 22, 2017 18:15:29.53&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 59.1 MB, '''start date/time:''' Nov 22, 2017 18:25:54.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_1.bag test_12_1.bag]: '''duration:''' 41.3s, '''size:''' 10.8 MB, '''start date/time:''' Nov 22, 2017 18:35:11.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_2.bag test_12_2.bag]: '''duration:''' 41.8s, '''size:''' 10.9 MB, '''start date/time:''' Nov 22, 2017 18:36:12.11&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_3.bag test_12_3.bag]: '''duration:''' 39.8s, '''size:'''  10.5 MB, '''start date/time:''' Nov 22, 2017 18:37:05.06&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_1.bag test_13_1.bag]: '''duration:''' 01:01s, '''size:''' 14.9 MB, '''start date/time:''' Nov 22, 2017 18:40:13.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_2.bag test_13_2.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:41:28.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_3.bag test_13_3.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:42:39.52&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_14.bag test_14.bag]: '''duration:''' 06:12s, '''size:''' 95.1 MB, '''start date/time:''' Nov 23, 2017 12:53:06.48&lt;br /&gt;
&lt;br /&gt;
=== v2.0 [Feb-2018] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a second version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_01.bag cocina_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_02.bag cocina_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_03.bag cocina_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_04.bag cocina_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_05.bag cocina_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_06.bag cocina_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_07.bag cocina_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_08.bag cocina_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_09.bag cocina_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_10.bag cocina_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_11.bag cocina_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_12.bag cocina_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_13.bag cocina_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_14.bag cocina_14.bag]&lt;br /&gt;
&lt;br /&gt;
====Location 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_01.bag salon_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_02.bag salon_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_03.bag salon_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_04.bag salon_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_05.bag salon_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_06.bag salon_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_07.bag salon_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_08.bag salon_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_09.bag salon_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_10.bag salon_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_11.bag salon_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_12.bag salon_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_13.bag salon_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location2/salon_14.bag salon_14.bag]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5212</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5212"/>
				<updated>2018-02-13T11:58:13Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: /* Location 1. Kitchen */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset that can be used to evaluate the performance of different approaches to detecting and tracking people using LIDAR sensors. The information contained in the dataset is especially suitable for use as training data for neural-network-based classifiers.&lt;br /&gt;
&lt;br /&gt;
Further information can be found in [https://www.frontiersin.org/articles/10.3389/fnbot.2017.00072/full Álvarez-Aparicio et al. (2018)].&lt;br /&gt;
&lt;br /&gt;
== Materials and methods ==&lt;br /&gt;
&lt;br /&gt;
This section describes the materials (shown in Figure 1) used to gather data: a certified study area, an autonomous robot with an on-board LIDAR sensor, and a real-time location system (RTLS) providing ground-truth data about person location. The recorded data include location estimates calculated by two people trackers, LD and PeTra, both described below. Finally, the recording procedure used to build the dataset is explained.&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.jpg|frame|center|'''Fig. 1''': Materials used in the experiments. (A) Orbi-One robot. (B) Leon@Home Testbed plan. (C) Floor plan of the apartment. Red dots show the location of KIO anchors. Black numbered dots denote Orbi-One locations during the data gathering. (D) KIO anchors. (E) General view of the apartment. (F) Furniture in mock-up apartment.]]&lt;br /&gt;
&lt;br /&gt;
=== Leon@Home Testbed ===&lt;br /&gt;
&lt;br /&gt;
Data have been gathered at Leon@Home Testbed, a Certified Testbed of the European Robotics League (ERL). Its main purpose is to benchmark service robots in a realistic home environment. The testbed is made up of four parts, shown in Figure 1B: a mock-up apartment, a control zone with direct vision (through a glass wall) into the apartment, a small workshop, and a larger development zone where researchers work.&lt;br /&gt;
&lt;br /&gt;
Leon@Home Testbed is located on the second floor of the Módulo de Investigación en Cibernética (Building for Research in Cybernetics) on the Vegazana Campus of the University of León (Spain). The apartment is a single-bedroom mock-up home built in an 8 m × 7 m space. Figure 1C shows a plan of the apartment. Walls 60 cm high divide it into a kitchen, living room, bathroom, and bedroom. The furniture (Figures 1E,F) has been chosen to test different robot abilities; for instance, the kitchen cabinets all have different types of handles.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One (Figure 1A) is an assistant robot manufactured by Robotnik. It has several sensors, among them an RGBD camera, a LIDAR sensor, and an inertial unit. It can operate a manipulator arm attached to its torso and has a wheeled base for moving around the room. Orbi-One includes a wireless access point, which allows WiFi communication with other robots and computers.&lt;br /&gt;
&lt;br /&gt;
The software that controls the robot hardware is based on the ROS framework. ROS is a set of libraries for robotics that provides operating-system-like services: hardware abstraction for sensors and actuators, low-level device control, and inter-process communication. Computation takes place in processes named Nodes, which can receive and send Messages. Nodes publish Messages into information buffers called Topics.&lt;br /&gt;
&lt;br /&gt;
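The Node/Topic pattern can be sketched, outside ROS, as a toy in-process message bus. The class and topic names below are illustrative only; the actual ROS Python API would use rospy.Publisher and rospy.Subscriber over a networked transport.&lt;br /&gt;

```python
# Minimal, dependency-free sketch of the publish/subscribe pattern:
# Nodes publish Messages into named Topics, and every Node subscribed
# to a Topic receives them. Illustrative only, not the rospy API.
class TopicBus:
    def __init__(self):
        self.subscribers = {}  # maps each topic name to its callback list

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers.get(topic, []):
            callback(message)

bus = TopicBus()
received = []
bus.subscribe("/scan", received.append)        # a listening "node"
bus.publish("/scan", {"ranges": [1.2, 1.3]})   # a publishing "node"
```
&lt;br /&gt;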
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
In order to acquire ground-truth data about person location in the study area, an RTLS for indoor environments is needed. The commercial KIO RTLS by Eliko has been used. KIO is a precise RTLS for tracking any object in 2- or 3-dimensional space. Its Ultra-Wideband technology enables micro-positioning of objects through obstructions, and KIO works in non-line-of-sight conditions, both indoors and outdoors.&lt;br /&gt;
&lt;br /&gt;
KIO comes in two main configurations. According to the manufacturer’s specifications, the Regular Cell configuration guarantees a reliable accuracy of ±30 cm, while the Small Cell configuration, designed for location-critical applications, provides a reliable ±5 cm accuracy. Calibration carried out by the authors in the mock-up apartment shows that the error is higher in some areas and lower in others, but on average the manufacturer’s claims hold.&lt;br /&gt;
&lt;br /&gt;
KIO calculates the position of a mobile transceiver, called a Tag, using radio beacons, called Anchors, distributed at known positions in the surroundings. Figure 1D shows a KIO anchor. KIO tags are the same size as the anchors and must be carried by the tracked subjects, in our case people. The red dots in Figure 1C show the location of the six anchors used in these experiments; they are placed on the ceiling. The distribution of the anchors was chosen following the method shown in Guerrero-Higueras et al. (2017).&lt;br /&gt;
&lt;br /&gt;
=== Leg Detector (LD) ===&lt;br /&gt;
&lt;br /&gt;
LD is a ROS package that takes the messages published by a LIDAR sensor as input and uses a machine-learning classifier to detect groups of laser readings as possible legs. The code is available in a public repository but is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
LD publishes the locations of the individual legs. It can also attempt to pair the legs and publish their average as an estimate of where the center of a person is. LD may optionally publish visualization Marker messages to indicate where detections occurred.&lt;br /&gt;
&lt;br /&gt;
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a people detection and tracking tool developed by the Robotics Group at the University of León. The system is based on a Convolutional Neural Network (CNN) with a configuration based on the U-Net architecture of Ronneberger et al. (2015).&lt;br /&gt;
&lt;br /&gt;
The system performs the following steps in real time:&lt;br /&gt;
&lt;br /&gt;
1. First, the data provided by the LIDAR sensor are processed to build a two-dimensional occupancy map centered on the robot. This occupancy map is represented as a binary matrix, where 1s denote positions at which the LIDAR scan found an obstacle, and 0s denote positions that the scan either traversed without detecting an obstacle or did not reach.&lt;br /&gt;
&lt;br /&gt;
2. Then, the occupancy map is fed to the network as input. The network produces a second occupancy map representing the zones where legs have been detected.&lt;br /&gt;
&lt;br /&gt;
3. Finally, center-of-mass calculations return the locations of persons. PeTra also publishes locations for the individual legs and Marker messages for visualization.&lt;br /&gt;
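&lt;br /&gt;
As a rough illustration of steps 1 and 3 (this is not PeTra's actual code; the grid size, resolution, and sample scan are made-up values), building the binary occupancy matrix from LIDAR ranges and recovering a location as a centre of mass might look like:&lt;br /&gt;

```python
import math

# Illustrative sketch, not PeTra's implementation: LIDAR ranges become a
# binary occupancy grid centred on the robot, and a detection's position
# is recovered as the centre of mass of the occupied cells.
SIZE = 20     # grid is SIZE x SIZE cells (made-up value)
RES = 0.1     # metres per cell (made-up value)
HALF = SIZE // 2

def occupancy_grid(ranges, angle_min, angle_increment):
    grid = [[0] * SIZE for _ in range(SIZE)]
    for i, r in enumerate(ranges):
        a = angle_min + i * angle_increment
        col = HALF + int(round(r * math.cos(a) / RES))
        row = HALF + int(round(r * math.sin(a) / RES))
        if row in range(SIZE) and col in range(SIZE):
            grid[row][col] = 1   # obstacle hit at this cell
    return grid

def center_of_mass(grid):
    total = 0
    sx = 0.0
    sy = 0.0
    for row in range(SIZE):
        for col in range(SIZE):
            if grid[row][col]:
                total += 1
                sx += (col - HALF) * RES
                sy += (row - HALF) * RES
    return (sx / total, sy / total)
```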
&lt;br /&gt;
=== Recording procedure ===&lt;br /&gt;
&lt;br /&gt;
The data were gathered in 14 different situations. In all of them, Orbi-One stood still as one or more people, carrying a KIO tag, moved around it. Three different locations for Orbi-One were defined (see Figure 1C) resulting in 42 scenarios (14 situations × 3 Orbi-One locations). Figure 2 shows the 14 different recognition scenarios recorded. These scenarios have been chosen according to different situations that may occur in robotics competitions such as ERL or RoboCup.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': Recognition scenarios recorded.]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
A rosbag file was created for each scenario (except for situations 3, 12, and 13, where 3 rosbag files were recorded; situation 4, where 4 rosbag files were recorded; and situation 9, where 5 rosbag files were recorded), recording LIDAR sensor measurements, location estimates from PeTra and LD, locations from KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* LIDAR sensor data. Data from LIDAR sensors are provided as ROS LaserScan Messages, which include, among other information, the following: acquisition time of the first ray in the scan, start/end angle of the scan, angular distance between measurements, and range data.&lt;br /&gt;
* PeTra location estimates provided as ROS PointStamped Messages, which include a position [x, y, z] and a timestamp.&lt;br /&gt;
* Location estimates calculated by LD, which publishes data for the individual legs as ROS PositionMeasurementArray Messages. LD also attempts to pair the legs and publishes their average, as a ROS PositionMeasurement Message, as an estimate of where the center of a person is.&lt;br /&gt;
* Locations provided by KIO RTLS, also as ROS PointStamped Messages.&lt;br /&gt;
* Messages from /map, /odom, and /tf ROS topics, which include map information, odometry of the robot base, and transformation information, respectively.&lt;br /&gt;
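&lt;br /&gt;
As an illustration of how the LaserScan fields listed above are typically used: the i-th ray points at angle_min + i × angle_increment, so each range can be converted to a Cartesian point in the sensor frame. The numeric values below are made up, not taken from the dataset:&lt;br /&gt;

```python
import math

# Illustrative LaserScan-like record (made-up values, plain dict instead
# of a real ROS message). The i-th ray's bearing is
# angle_min plus i times angle_increment.
scan = {
    "angle_min": -1.57,
    "angle_increment": 0.785,
    "ranges": [1.0, 2.0, 1.5, 2.5, 1.0],
}

def scan_to_points(msg):
    points = []
    for i, r in enumerate(msg["ranges"]):
        a = msg["angle_min"] + i * msg["angle_increment"]
        points.append((r * math.cos(a), r * math.sin(a)))
    return points
```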
&lt;br /&gt;
The different versions of the dataset are enumerated below.&lt;br /&gt;
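&lt;br /&gt;
The bag files enumerated below can be read programmatically; for instance, the rosbag Python API's Bag.read_messages() yields (topic, message, timestamp) tuples. Below is a dependency-free sketch of the usual per-topic bookkeeping loop, with a hand-made list and illustrative topic names standing in for a real bag so it runs without ROS installed:&lt;br /&gt;

```python
# A hand-made stand-in for rosbag.Bag.read_messages(), which yields
# (topic, message, timestamp) tuples. Topic names are illustrative.
fake_bag = [
    ("/scan", {"ranges": [1.0, 1.1]}, 0.00),
    ("/people_tracker/location", {"x": 0.4, "y": 1.2, "z": 0.0}, 0.05),
    ("/scan", {"ranges": [1.0, 1.2]}, 0.10),
]

def count_by_topic(messages):
    counts = {}
    for topic, _msg, _stamp in messages:
        counts[topic] = counts.get(topic, 0) + 1
    return counts
```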
&lt;br /&gt;
=== v1.0 [Nov-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_4.bag test_04_4.bag]: '''duration:''' 01:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_07.bag test_07.bag]: '''duration:''' 04:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:'''  9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;br /&gt;
&lt;br /&gt;
====Location 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_01.bag test_01.bag]: '''duration:''' 15:37s, '''size:''' 239.7 MB, '''start date/time:''' Nov 16, 2017 12:20:19.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_02.bag test_02.bag]: '''duration:''' 15:05s, '''size:''' 231.4 MB, '''start date/time:''' Nov 27, 2017 18:32:24.35 &lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_1.bag test_03_1.bag]: '''duration:''' 55.2s, '''size:''' 14.5 MB, '''start date/time:''' Nov 16, 2017 12:43:58.29 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_2.bag test_03_2.bag]: '''duration:''' 44.4s, '''size:''' 10.2 MB, '''start date/time:''' Nov 16, 2017 12:46:31.99&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_3.bag test_03_3.bag]: '''duration:''' 51.1s, '''size:''' 13.4 MB, '''start date/time:''' Nov 16, 2017 12:47:29.66&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_1.bag test_04_1.bag]: '''duration:''' 01:23s, '''size:''' 21.6 MB, '''start date/time:''' Nov 16, 2017 1:50:22.18 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_2.bag test_04_2.bag]: '''duration:''' 01:20s, '''size:''' 21.0 MB, '''start date/time:''' Nov 16, 2017 12:52:18.44&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_3.bag test_04_3.bag]: '''duration:''' 01:09s, '''size:''' 18.1 MB, '''start date/time:''' Nov 16, 2017 12:53:51.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_4.bag test_04_4.bag]: '''duration:''' 01:21s, '''size:''' 21.3 MB, '''start date/time:''' Nov 16, 2017 12:55:31.81&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_05.bag test_05.bag]: '''duration:''' 15:07s, '''size:''' 232.4 MB, '''start date/time:''' Nov 28 2017 18:33:16.24&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_06.bag test_06.bag]: '''duration:''' 09:14s, '''size:''' 136.2 MB, '''start date/time:''' Nov 29 2017 13:12:42.61 &lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_07.bag test_07.bag]: '''duration:''' 04:39s, '''size:''' 71.5 MB, '''start date/time:''' Nov 22, 2017 12:48:33.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_08.bag test_08.bag]: '''duration:''' 03:04s, '''size:''' 47.2 MB, '''start date/time:''' Nov 22, 2017 12:59:08.58&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_1.bag test_09_1.bag]: '''duration:''' 28.8s, '''size:''' 7.7 MB, '''start date/time:''' Nov 28, 2017 17:17:46.49 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_2.bag test_09_2.bag]: '''duration:''' 31.9s, '''size:''' 8.4 MB, '''start date/time:''' Nov 28, 2017 17:20:48.26&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Nov 28, 2017 17:20:05.96&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_4.bag test_09_4.bag]: '''duration:''' 29.9s, '''size:''' 7.9 MB, '''start date/time:''' Nov 28, 2017 17:19:26.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_5.bag test_09_5.bag]: '''duration:''' 30.8s, '''size:''' 8.1 MB, '''start date/time:''' Nov 28, 2017 17:21:20.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_10.bag test_10.bag]: '''duration:''' 16:05s, '''size:''' 246.6 MB, '''start date/time:''' Nov 22, 2017 11:55:02.39&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_11.bag test_11.bag]: '''duration:''' 03:25s, '''size:''' 52.8 MB, '''start date/time:''' Nov 22, 2017 13:08:11.99&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_1.bag test_12_1.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:55:26.25&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_2.bag test_12_2.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:56:52.47&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_3.bag test_12_3.bag]: '''duration:''' 50.2s, '''size:'''  13.1 MB, '''start date/time:''' Nov 16, 2017 11:44:00.10&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_1.bag test_13_1.bag]: '''duration:''' 58.2s, '''size:''' 15.1 MB, '''start date/time:''' Nov 22, 2017 11:48:35.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_2.bag test_13_2.bag]: '''duration:''' 01:05s, '''size:''' 16.9 MB, '''start date/time:''' Nov 22, 2017 11:49:48.18&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_3.bag test_13_3.bag]: '''duration:''' 01:14s, '''size:''' 19.2 MB, '''start date/time:''' Nov 22, 2017 11:51:22.68&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_14.bag test_14.bag]: '''duration:''' 06:17s, '''size:''' 96.3 MB, '''start date/time:''' Nov 22, 2017 12:23:07.92&lt;br /&gt;
&lt;br /&gt;
====Location 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_01.bag test_01.bag]: '''duration:''' 15:17s, '''size:''' 234.6 MB, '''start date/time:''' Nov 21, 2017 18:56:54.87&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 17:37:34.08&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_1.bag test_03_1.bag]: '''duration:''' 44.7s, '''size:''' 11.7 MB, '''start date/time:''' Nov 21, 2017 13:16:51.65&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_2.bag test_03_2.bag]: '''duration:''' 39.9s, '''size:''' 10.5 MB, '''start date/time:''' Nov 21, 2017 19:19:06.27&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_3.bag test_03_3.bag]: '''duration:''' 41.2s, '''size:''' 10.8 MB, '''start date/time:''' Nov 21, 2017 19:18:06.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_1.bag test_04_1.bag]: '''duration:''' 01:01s, '''size:''' 16.0 MB, '''start date/time:''' Nov 21, 2017 19:21:30.99 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_2.bag test_04_2.bag]: '''duration:''' 01:01s, '''size:''' 15.9 MB, '''start date/time:''' Nov 21, 2017 19:25:38.00&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_3.bag test_04_3.bag]: '''duration:''' 58.0s, '''size:''' 15.1 MB, '''start date/time:''' Nov 21, 2017 19:27:16.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_4.bag test_04_4.bag]: '''duration:''' 01:00s, '''size:''' 15.8 MB, '''start date/time:''' Nov 21, 2017 19:29:39.28&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_05.bag test_05.bag]: '''duration:''' 15:11s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 19:05:20.19&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_06.bag test_06.bag]: '''duration:''' 09:03s, '''size:''' 133.6 MB, '''start date/time:''' Nov 26, 2017 13:24:25.64&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_07.bag test_07.bag]: '''duration:''' 04:16s, '''size:''' 65.1 MB, '''start date/time:''' Nov 27, 2017 17:34:50.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_08.bag test_08.bag]: '''duration:''' 03:47s, '''size:''' 58.3 MB, '''start date/time:''' Nov 27, 2017 17:41:48.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_1.bag test_09_1.bag]: '''duration:''' 31.3s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:23:58.31 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_2.bag test_09_2.bag]: '''duration:''' 30.4s, '''size:''' 8.0 MB, '''start date/time:''' Nov 28, 2017 17:24:41.09&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_3.bag test_09_3.bag]: '''duration:''' 31.2s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:25:22.22&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_4.bag test_09_4.bag]: '''duration:''' 31.4s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:26:06.98&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_5.bag test_09_5.bag]: '''duration:''' 33.2s, '''size:''' 8.8 MB, '''start date/time:''' Nov 28, 2017 17:26:51.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_10.bag test_10.bag]: '''duration:''' 10:00s, '''size:''' 153.7 MB, '''start date/time:''' Nov 22, 2017 18:15:29.53&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 59.1 MB, '''start date/time:''' Nov 22, 2017 18:25:54.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_1.bag test_12_1.bag]: '''duration:''' 41.3s, '''size:''' 10.8 MB, '''start date/time:''' Nov 22, 2017 18:35:11.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_2.bag test_12_2.bag]: '''duration:''' 41.8s, '''size:''' 10.9 MB, '''start date/time:''' Nov 22, 2017 18:36:12.11&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_3.bag test_12_3.bag]: '''duration:''' 39.8s, '''size:'''  10.5 MB, '''start date/time:''' Nov 22, 2017 18:37:05.06&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_1.bag test_13_1.bag]: '''duration:''' 01:01s, '''size:''' 14.9 MB, '''start date/time:''' Nov 22, 2017 18:40:13.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_2.bag test_13_2.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:41:28.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_3.bag test_13_3.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:42:39.52&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_14.bag test_14.bag]: '''duration:''' 06:12s, '''size:''' 95.1 MB, '''start date/time:''' Nov 23, 2017 12:53:06.48&lt;br /&gt;
&lt;br /&gt;
=== v2.0 [Feb-2018] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a second version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_01.bag cocina_01.bag]&lt;br /&gt;
* Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_02.bag cocina_02.bag]&lt;br /&gt;
* Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_03.bag cocina_03.bag]&lt;br /&gt;
* Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_04.bag cocina_04.bag]&lt;br /&gt;
* Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_05.bag cocina_05.bag]&lt;br /&gt;
* Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_06.bag cocina_06.bag]&lt;br /&gt;
* Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_07.bag cocina_07.bag]&lt;br /&gt;
* Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_08.bag cocina_08.bag]&lt;br /&gt;
* Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_09.bag cocina_09.bag]&lt;br /&gt;
* Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_10.bag cocina_10.bag]&lt;br /&gt;
* Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_11.bag cocina_11.bag]&lt;br /&gt;
* Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_12.bag cocina_12.bag]&lt;br /&gt;
* Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_13.bag cocina_13.bag]&lt;br /&gt;
* Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_14.bag cocina_14.bag]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5211</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5211"/>
				<updated>2018-02-13T11:57:39Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: /* Location 1. Kitchen */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset that can be used to evaluate the performance of different approaches for detecting and tracking people using LIDAR sensors. The information contained in the dataset is especially suitable as training data for neural network-based classifiers.&lt;br /&gt;
&lt;br /&gt;
Further information is available at [https://www.frontiersin.org/articles/10.3389/fnbot.2017.00072/full Álvarez-Aparicio et al. (2018)].&lt;br /&gt;
&lt;br /&gt;
== Materials and methods ==&lt;br /&gt;
&lt;br /&gt;
The following section describes the materials (shown in Figure 1) used to gather data: a certified study area, an autonomous robot with an on-board LIDAR sensor, and a real-time location system (RTLS) providing ground-truth data on people's locations. The recorded data include location estimates calculated by two people trackers, LD and PeTra, also described below. Finally, the recording procedure used to build the dataset is explained.&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.jpg|frame|center|'''Fig. 1''': Materials used in the experiments. (A) Orbi-One robot. (B) Leon@Home Testbed plan. (C) Floor plan of the apartment. Red dots show the location of KIO anchors. Black numbered dots denote Orbi-One locations during the data gathering. (D) KIO anchors. (E) General view of the apartment. (F) Furniture in mock-up apartment.]]&lt;br /&gt;
&lt;br /&gt;
=== Leon@Home Testbed ===&lt;br /&gt;
&lt;br /&gt;
Data have been gathered at Leon@Home Testbed, a Certified Testbed of the European Robotics League (ERL). Its main purpose is to benchmark service robots in a realistic home environment. Our testbed is made up of four parts, shown in Figure 1B: a mock-up apartment, a control zone with direct vision (through a glass wall) into the apartment, a small workshop, and a larger development zone where researchers work.&lt;br /&gt;
&lt;br /&gt;
Leon@Home Testbed is located on the second floor of the Módulo de Investigación en Cibernética (Building for Research in Cybernetics) on the Vegazana Campus of the University of León (Spain). The apartment is a single-bedroom mock-up home built in an 8 m × 7 m space. Figure 1C shows a plan of the apartment. Walls 60 cm high divide it into a kitchen, living room, bathroom, and bedroom. The furniture (Figures 1E,F) has been chosen to test different robot abilities. For instance, the kitchen cabinets all have different types of handles.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One (Figure 1A) is an assistant robot manufactured by Robotnik. It has several sensors, among them an RGBD camera, a LIDAR sensor, and an inertial unit. It can operate a manipulator arm attached to its torso and has a wheeled base for moving around the room. Orbi-One includes a wireless access point, which allows WiFi communication with other robots and computers.&lt;br /&gt;
&lt;br /&gt;
The software controlling the robot hardware is based on the ROS framework. ROS is a set of libraries for robotics that provides operating-system-like services: hardware abstraction for sensors and actuators, low-level device control, and inter-process communication. Computation takes place in processes named Nodes, which can send and receive Messages. Nodes publish Messages into information buffers called Topics.&lt;br /&gt;
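&lt;br /&gt;
As a plain-Python illustration of this publish/subscribe pattern (a toy sketch only, not the actual ROS API, where rospy.Publisher and rospy.Subscriber play these roles and Nodes run as separate processes):&lt;br /&gt;

```python
# Minimal publish/subscribe sketch mimicking the ROS Node/Topic/Message
# pattern. Illustrative only: real ROS Nodes are separate processes that
# communicate through the ROS middleware, not through a shared object.

class Bus:
    """Routes Messages published on named Topics to subscriber callbacks."""
    def __init__(self):
        self.subscribers = {}   # topic name -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers.get(topic, []):
            callback(message)

bus = Bus()
received = []

# A "node" subscribing to the LIDAR topic.
bus.subscribe("/scan", lambda msg: received.append(msg))

# Another "node" publishing a fake LaserScan-like message.
bus.publish("/scan", {"ranges": [1.2, 1.3, 1.1]})

print(received[0]["ranges"])  # the subscriber saw the published message
```

In real ROS, Messages are typed data structures (e.g., LaserScan) and Topics are managed by the middleware rather than by a shared dictionary.&lt;br /&gt;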
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
In order to acquire ground-truth data about people's locations in the study area, we need an RTLS for indoor environments. The commercial KIO RTLS by Eliko has been used. KIO is a precise RTLS for tracking any object in two- or three-dimensional space. Its Ultra-Wideband technology enables micro-positioning of objects through obstructions. KIO also works in non-line-of-sight conditions, both indoors and outdoors.&lt;br /&gt;
&lt;br /&gt;
KIO comes in two main configurations. The Regular Cell configuration guarantees a reliable accuracy of ±30 cm, according to the manufacturer’s specifications. The Small Cell configuration is designed for location-critical applications and provides a reliable ±5 cm accuracy. Calibration performed by the authors in the mock-up apartment shows that the error is higher in some areas and lower in others, but that on average the manufacturer’s claims are correct.&lt;br /&gt;
&lt;br /&gt;
KIO calculates the position of a mobile transceiver, called a Tag. To do so, KIO uses radio beacons, called Anchors, distributed at known positions in the surroundings. Figure 1D shows a KIO Anchor. KIO Tags are the same size as the Anchors and must be carried by the tracked subject, in our case a person. The red dots in Figure 1C show the locations of the six Anchors used in these experiments; they are placed on the ceiling. The distribution of the Anchors has been chosen following the method described in Guerrero-Higueras et al. (2017).&lt;br /&gt;
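&lt;br /&gt;
KIO's internal positioning algorithm is not detailed here, but the idea of locating a Tag from its ranges to Anchors at known positions can be sketched with a generic two-dimensional least-squares multilateration. All anchor coordinates and ranges below are invented for illustration:&lt;br /&gt;

```python
import math

# Hypothetical anchor positions on the ceiling (meters); the real KIO
# anchor coordinates are not part of this sketch.
anchors = [(0.0, 0.0), (8.0, 0.0), (8.0, 7.0), (0.0, 7.0)]
true_tag = (3.0, 2.5)

# Simulated UWB range measurements from the tag to each anchor.
ranges = [math.dist(a, true_tag) for a in anchors]

# Linearize by subtracting the first range equation from the others:
# 2(xi-x0)x + 2(yi-y0)y = r0^2 - ri^2 + xi^2 - x0^2 + yi^2 - y0^2
x0, y0 = anchors[0]
rows, rhs = [], []
for (xi, yi), ri in zip(anchors[1:], ranges[1:]):
    rows.append((2*(xi - x0), 2*(yi - y0)))
    rhs.append(ranges[0]**2 - ri**2 + xi**2 - x0**2 + yi**2 - y0**2)

# Solve the 2x2 normal equations (A^T A) p = A^T b by hand.
ata = [[sum(r[i]*r[j] for r in rows) for j in range(2)] for i in range(2)]
atb = [sum(r[i]*b for r, b in zip(rows, rhs)) for i in range(2)]
det = ata[0][0]*ata[1][1] - ata[0][1]*ata[1][0]
x = (atb[0]*ata[1][1] - atb[1]*ata[0][1]) / det
y = (ata[0][0]*atb[1] - ata[1][0]*atb[0]) / det

print(round(x, 3), round(y, 3))  # recovers the tag position (3.0, 2.5)
```

With noisy real measurements the same least-squares solve returns the best-fit position rather than the exact one.&lt;br /&gt;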
&lt;br /&gt;
=== Leg Detector (LD) ===&lt;br /&gt;
&lt;br /&gt;
LD is a ROS package that takes the messages published by a LIDAR sensor as input and uses a machine-learning-trained classifier to detect groups of laser readings as possible legs. The code is available in a public repository, but is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
LD publishes the locations of the individual legs. It can also attempt to pair the legs and publish their average as an estimate of where the center of a person is. Optionally, LD can also publish visualization Marker messages to indicate where detections happened.&lt;br /&gt;
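&lt;br /&gt;
The pairing step can be sketched as follows. This is a simplified stand-in for LD's actual pairing logic, with made-up detections and an assumed distance threshold: legs closer than the threshold are paired and their midpoint is reported as the person estimate:&lt;br /&gt;

```python
import math

# Hypothetical leg detections in the robot frame (meters).
legs = [(1.0, 0.2), (1.0, -0.2), (3.5, 1.0)]

PAIR_DIST = 0.6  # assume the two legs of one person are at most 0.6 m apart

people, used = [], set()
for i, a in enumerate(legs):
    if i in used:
        continue
    # Look for the nearest unused leg within the pairing distance.
    best, best_d = None, PAIR_DIST
    for j in range(i + 1, len(legs)):
        if j in used:
            continue
        d = math.dist(a, legs[j])
        if best_d >= d:
            best, best_d = j, d
    if best is None:
        people.append(a)                      # single unpaired leg
    else:
        b = legs[best]
        used.add(best)
        people.append(((a[0] + b[0]) / 2, (a[1] + b[1]) / 2))

print(people)  # [(1.0, 0.0), (3.5, 1.0)]
```

The first two detections are paired into one person at their midpoint; the third, isolated detection is reported on its own.&lt;br /&gt;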
&lt;br /&gt;
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a tool for detecting and tracking people, developed by the Robotics Group at the University of León. The system is based on a Convolutional Neural Network (CNN) with a configuration based on the U-Net architecture by Ronneberger et al. (2015).&lt;br /&gt;
&lt;br /&gt;
The system performs the following steps in real time:&lt;br /&gt;
&lt;br /&gt;
1. First, the data provided by the LIDAR sensor are processed to build a two-dimensional occupancy map centered on the robot. This occupancy map is represented as a binary matrix, where 1s denote positions where the LIDAR scan found an obstacle, and 0s denote positions where the scan either passed through without detecting any obstacle or did not reach.&lt;br /&gt;
&lt;br /&gt;
2. Then, the occupancy map is relayed to the network as input data. The network produces a second occupancy map representing the zones where legs have been detected.&lt;br /&gt;
&lt;br /&gt;
3. Finally, center-of-mass calculations return the locations of the detected people. PeTra also publishes locations for the individual legs and Marker messages for visualization.&lt;br /&gt;
&lt;br /&gt;
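Steps 1 and 3 above can be sketched in plain Python. This is an illustration, not PeTra's actual code: the grid size, resolution, and scan values below are made up, and the CNN of step 2 is skipped by treating the occupied cells directly as the network output:&lt;br /&gt;

```python
import math

# Made-up grid parameters; PeTra's real map size and resolution may differ.
GRID = 8     # 8x8 cells centered on the robot
RES = 0.5    # each cell covers 0.5 m, so the map spans 4 m x 4 m

# Step 1: build a binary occupancy map from LaserScan-style data
# (start angle, angular increment, and range readings).
angle_min, angle_inc = -math.pi / 2, math.pi / 4
ranges = [1.0, 1.1, 1.0, 0.9, 1.0]   # fake readings, in meters

occ = [[0] * GRID for _ in range(GRID)]
for k, rng in enumerate(ranges):
    a = angle_min + k * angle_inc
    x, y = rng * math.cos(a), rng * math.sin(a)   # robot-centric coordinates
    col = math.floor(x / RES) + GRID // 2
    row = math.floor(y / RES) + GRID // 2
    if GRID > row >= 0 and GRID > col >= 0:       # inside the map
        occ[row][col] = 1

# Step 2 would feed `occ` to the CNN; here the occupied cells are used as-is.

# Step 3: center of mass of the occupied cells, converted back to meters.
cells = [(r, c) for r in range(GRID) for c in range(GRID) if occ[r][c] == 1]
x_m = (sum(c for _, c in cells) / len(cells) - GRID // 2) * RES
y_m = (sum(r for r, _ in cells) / len(cells) - GRID // 2) * RES
print(round(x_m, 2), round(y_m, 2))
```

In PeTra the center of mass is computed per detected leg cluster rather than over the whole map, yielding one location per person.&lt;br /&gt;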
=== Recording procedure ===&lt;br /&gt;
&lt;br /&gt;
The data were gathered in 14 different situations. In all of them, Orbi-One stood still while one or more people, carrying a KIO Tag, moved around it. Three different locations for Orbi-One were defined (see Figure 1C), resulting in 42 scenarios (14 situations × 3 Orbi-One locations). Figure 2 shows the 14 recognition scenarios recorded. These scenarios were chosen according to situations that may occur in robotics competitions such as ERL or RoboCup.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': recognition scenarios recorded.]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
A rosbag file was created for each scenario (except for situations 3, 12, and 13, where 3 rosbag files were recorded; situation 4, where 4 rosbag files were recorded; and situation 9, where 5 rosbag files were recorded), recording LIDAR sensor measurements, location estimates from PeTra and LD, locations from KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* LIDAR sensor data. Data from LIDAR sensors are provided as ROS LaserScan Messages, which include, among other information, the following: acquisition time of the first ray in the scan, start/end angle of the scan, angular distance between measurements, and range data.&lt;br /&gt;
* PeTra location estimates provided as ROS PointStamped Messages, which include a position [x, y, z] and a timestamp.&lt;br /&gt;
* Location estimates calculated by LD. It publishes data for individual legs (as ROS PositionMeasurementArray Messages). It also attempts to pair the legs and publishes their average, as a ROS PositionMeasurement Message, as an estimate of where the center of a person is.&lt;br /&gt;
* Locations provided by KIO RTLS, also provided as ROS PointStamped Messages.&lt;br /&gt;
* Messages from /map, /odom, and /tf ROS topics, which include map information, odometry of the robot base, and transformation information, respectively.&lt;br /&gt;
&lt;br /&gt;
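As an illustration of how these streams can be combined, the sketch below matches each tracker estimate with the ground-truth sample nearest in time and computes a mean position error. The timestamps and positions are invented, and real files would be read with the rosbag Python API rather than hard-coded lists:&lt;br /&gt;

```python
import math

# Hypothetical recordings: (timestamp in seconds, (x, y) in meters).
petra = [(0.10, (1.00, 2.00)), (0.60, (1.10, 2.05)), (1.10, (1.22, 2.10))]
kio   = [(0.00, (1.02, 1.98)), (0.50, (1.12, 2.02)), (1.00, (1.20, 2.08))]

def nearest(t, samples):
    """Return the sample whose timestamp is closest to t."""
    return min(samples, key=lambda s: abs(s[0] - t))

# Compare each tracker estimate against the temporally closest ground truth.
errors = []
for t, est in petra:
    _, truth = nearest(t, kio)
    errors.append(math.dist(est, truth))

mean_err = sum(errors) / len(errors)
print(round(mean_err, 3))
```

The same nearest-timestamp matching works for any pair of the recorded topics, since rosbag preserves per-message timestamps.&lt;br /&gt;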
Different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Nov-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_4.bag test_04_4.bag]: '''duration:''' 01:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_07.bag test_07.bag]: '''duration:''' 04:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:'''  9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;br /&gt;
&lt;br /&gt;
====Location 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_01.bag test_01.bag]: '''duration:''' 15:37s, '''size:''' 239.7 MB, '''start date/time:''' Nov 16, 2017 12:20:19.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_02.bag test_02.bag]: '''duration:''' 15:05s, '''size:''' 231.4 MB, '''start date/time:''' Nov 27, 2017 18:32:24.35 &lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_1.bag test_03_1.bag]: '''duration:''' 55.2s, '''size:''' 14.5 MB, '''start date/time:''' Nov 16, 2017 12:43:58.29 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_2.bag test_03_2.bag]: '''duration:''' 44.4s, '''size:''' 10.2 MB, '''start date/time:''' Nov 16, 2017 12:46:31.99&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_3.bag test_03_3.bag]: '''duration:''' 51.1s, '''size:''' 13.4 MB, '''start date/time:''' Nov 16, 2017 12:47:29.66&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_1.bag test_04_1.bag]: '''duration:''' 01:23s, '''size:''' 21.6 MB, '''start date/time:''' Nov 16, 2017 1:50:22.18 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_2.bag test_04_2.bag]: '''duration:''' 01:20s, '''size:''' 21.0 MB, '''start date/time:''' Nov 16, 2017 12:52:18.44&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_3.bag test_04_3.bag]: '''duration:''' 01:09s, '''size:''' 18.1 MB, '''start date/time:''' Nov 16, 2017 12:53:51.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_4.bag test_04_4.bag]: '''duration:''' 01:21s, '''size:''' 21.3 MB, '''start date/time:''' Nov 16, 2017 12:55:31.81&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_05.bag test_05.bag]: '''duration:''' 15:07s, '''size:''' 232.4 MB, '''start date/time:''' Nov 28, 2017 18:33:16.24&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_06.bag test_06.bag]: '''duration:''' 09:14s, '''size:''' 136.2 MB, '''start date/time:''' Nov 29, 2017 13:12:42.61&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_07.bag test_07.bag]: '''duration:''' 04:39s, '''size:''' 71.5 MB, '''start date/time:''' Nov 22, 2017 12:48:33.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_08.bag test_08.bag]: '''duration:''' 03:04s, '''size:''' 47.2 MB, '''start date/time:''' Nov 22, 2017 12:59:08.58&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_1.bag test_09_1.bag]: '''duration:''' 28.8s, '''size:''' 7.7 MB, '''start date/time:''' Nov 28, 2017 17:17:46.49 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_2.bag test_09_2.bag]: '''duration:''' 31.9s, '''size:''' 8.4 MB, '''start date/time:''' Nov 28, 2017 17:20:48.26&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Nov 28, 2017 17:20:05.96&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_4.bag test_09_4.bag]: '''duration:''' 29.9s, '''size:''' 7.9 MB, '''start date/time:''' Nov 28, 2017 17:19:26.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_5.bag test_09_5.bag]: '''duration:''' 30.8s, '''size:''' 8.1 MB, '''start date/time:''' Nov 28, 2017 17:21:20.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_10.bag test_10.bag]: '''duration:''' 16:05s, '''size:''' 246.6 MB, '''start date/time:''' Nov 22, 2017 11:55:02.39&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_11.bag test_11.bag]: '''duration:''' 03:25s, '''size:''' 52.8 MB, '''start date/time:''' Nov 22, 2017 13:08:11.99&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_1.bag test_12_1.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:55:26.25&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_2.bag test_12_2.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:56:52.47&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_3.bag test_12_3.bag]: '''duration:''' 50.2s, '''size:''' 13.1 MB, '''start date/time:''' Nov 16, 2017 11:44:00.10&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_1.bag test_13_1.bag]: '''duration:''' 58.2s, '''size:''' 15.1 MB, '''start date/time:''' Nov 22, 2017 11:48:35.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_2.bag test_13_2.bag]: '''duration:''' 01:05s, '''size:''' 16.9 MB, '''start date/time:''' Nov 22, 2017 11:49:48.18&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_3.bag test_13_3.bag]: '''duration:''' 01:14s, '''size:''' 19.2 MB, '''start date/time:''' Nov 22, 2017 11:51:22.68&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_14.bag test_14.bag]: '''duration:''' 06:17s, '''size:''' 96.3 MB, '''start date/time:''' Nov 22, 2017 12:23:07.92&lt;br /&gt;
&lt;br /&gt;
====Location 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_01.bag test_01.bag]: '''duration:''' 15:17s, '''size:''' 234.6 MB, '''start date/time:''' Nov 21, 2017 18:56:54.87&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 17:37:34.08&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_1.bag test_03_1.bag]: '''duration:''' 44.7s, '''size:''' 11.7 MB, '''start date/time:''' Nov 21, 2017 13:16:51.65&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_2.bag test_03_2.bag]: '''duration:''' 39.9s, '''size:''' 10.5 MB, '''start date/time:''' Nov 21, 2017 19:19:06.27&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_3.bag test_03_3.bag]: '''duration:''' 41.2s, '''size:''' 10.8 MB, '''start date/time:''' Nov 21, 2017 19:18:06.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_1.bag test_04_1.bag]: '''duration:''' 01:01s, '''size:''' 16.0 MB, '''start date/time:''' Nov 21, 2017 19:21:30.99 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_2.bag test_04_2.bag]: '''duration:''' 01:01s, '''size:''' 15.9 MB, '''start date/time:''' Nov 21, 2017 19:25:38.00&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_3.bag test_04_3.bag]: '''duration:''' 58.0s, '''size:''' 15.1 MB, '''start date/time:''' Nov 21, 2017 19:27:16.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_4.bag test_04_4.bag]: '''duration:''' 01:00s, '''size:''' 15.8 MB, '''start date/time:''' Nov 21, 2017 19:29:39.28&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_05.bag test_05.bag]: '''duration:''' 15:11s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 19:05:20.19&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_06.bag test_06.bag]: '''duration:''' 09:03s, '''size:''' 133.6 MB, '''start date/time:''' Nov 26, 2017 13:24:25.64&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_07.bag test_07.bag]: '''duration:''' 04:16s, '''size:''' 65.1 MB, '''start date/time:''' Nov 27, 2017 17:34:50.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_08.bag test_08.bag]: '''duration:''' 03:47s, '''size:''' 58.3 MB, '''start date/time:''' Nov 27, 2017 17:41:48.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_1.bag test_09_1.bag]: '''duration:''' 31.3s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:23:58.31 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_2.bag test_09_2.bag]: '''duration:''' 30.4s, '''size:''' 8.0 MB, '''start date/time:''' Nov 28, 2017 17:24:41.09&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_3.bag test_09_3.bag]: '''duration:''' 31.2s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:25:22.22&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_4.bag test_09_4.bag]: '''duration:''' 31.4s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:26:06.98&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_5.bag test_09_5.bag]: '''duration:''' 33.2s, '''size:''' 8.8 MB, '''start date/time:''' Nov 28, 2017 17:26:51.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_10.bag test_10.bag]: '''duration:''' 10:00s, '''size:''' 153.7 MB, '''start date/time:''' Nov 22, 2017 18:15:29.53&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 59.1 MB, '''start date/time:''' Nov 22, 2017 18:25:54.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_1.bag test_12_1.bag]: '''duration:''' 41.3s, '''size:''' 10.8 MB, '''start date/time:''' Nov 22, 2017 18:35:11.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_2.bag test_12_2.bag]: '''duration:''' 41.8s, '''size:''' 10.9 MB, '''start date/time:''' Nov 22, 2017 18:36:12.11&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_3.bag test_12_3.bag]: '''duration:''' 39.8s, '''size:''' 10.5 MB, '''start date/time:''' Nov 22, 2017 18:37:05.06&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_1.bag test_13_1.bag]: '''duration:''' 01:01s, '''size:''' 14.9 MB, '''start date/time:''' Nov 22, 2017 18:40:13.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_2.bag test_13_2.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:41:28.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_3.bag test_13_3.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:42:39.52&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_14.bag test_14.bag]: '''duration:''' 06:12s, '''size:''' 95.1 MB, '''start date/time:''' Nov 23, 2017 12:53:06.48&lt;br /&gt;
&lt;br /&gt;
=== v2.0 [Feb-2018] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a second version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
Situation  1: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_01.bag cocina_01.bag]&lt;br /&gt;
Situation  2: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_02.bag cocina_02.bag]&lt;br /&gt;
Situation  3: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_03.bag cocina_03.bag]&lt;br /&gt;
Situation  4: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_04.bag cocina_04.bag]&lt;br /&gt;
Situation  5: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_05.bag cocina_05.bag]&lt;br /&gt;
Situation  6: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_06.bag cocina_06.bag]&lt;br /&gt;
Situation  7: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_07.bag cocina_07.bag]&lt;br /&gt;
Situation  8: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_08.bag cocina_08.bag]&lt;br /&gt;
Situation  9: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_09.bag cocina_09.bag]&lt;br /&gt;
Situation 10: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_10.bag cocina_10.bag]&lt;br /&gt;
Situation 11: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_11.bag cocina_11.bag]&lt;br /&gt;
Situation 12: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_12.bag cocina_12.bag]&lt;br /&gt;
Situation 13: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_13.bag cocina_13.bag]&lt;br /&gt;
Situation 14: [http://robotica.unileon.es/~datasets/LegTracking/v2.0/location1/cocina_14.bag cocina_14.bag]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5210</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5210"/>
				<updated>2018-02-13T11:53:41Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset that can be used to evaluate the performance of different approaches to detecting and tracking people using LIDAR sensors. The information contained in the dataset is especially suitable as training data for neural-network-based classifiers. &lt;br /&gt;
&lt;br /&gt;
Further information can be found in [https://www.frontiersin.org/articles/10.3389/fnbot.2017.00072/full Álvarez-Aparicio et al. (2018)].&lt;br /&gt;
&lt;br /&gt;
== Materials and methods ==&lt;br /&gt;
&lt;br /&gt;
The following section describes the materials (shown in Figure 1) used to gather data: a certified study area, an autonomous robot with an on-board LIDAR sensor, and a real-time location system (RTLS) used to obtain ground-truth data about person location. The recorded data include location estimates calculated by two people trackers, LD and PeTra, which are also described below. Finally, the recording procedure used to build the dataset is explained.&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.jpg|frame|center|'''Fig. 1''': Materials used in the experiments. (A) Orbi-One robot. (B) Leon@Home Testbed plan. (C) Floor plan of the apartment. Red dots show the location of KIO anchors. Black numbered dots denote Orbi-One locations during the data gathering. (D) KIO anchors. (E) General view of the apartment. (F) Furniture in mock-up apartment.]]&lt;br /&gt;
&lt;br /&gt;
=== Leon@Home Testbed ===&lt;br /&gt;
&lt;br /&gt;
Data have been gathered at Leon@Home Testbed, a Certified Testbed of the European Robotics League (ERL). Its main purpose is to benchmark service robots in a realistic home environment. The testbed is made up of four parts, shown in Figure 1B: a mock-up apartment, a control zone with direct vision (through a glass wall) into the apartment, a small workshop, and a larger development zone where researchers work.&lt;br /&gt;
&lt;br /&gt;
Leon@Home Testbed is located on the second floor of the Módulo de Investigación en Cibernética (Building for Research in Cybernetics) on the Vegazana Campus of the University of León (Spain). The apartment is a single bedroom mock-up home built in an 8 m × 7 m space. Figure 1C shows a plan of the apartment. 60 cm high walls divide it into a kitchen, living room, bathroom, and bedroom. The furniture (Figures 1E,F) has been chosen to test different robot abilities. For instance, the kitchen cabinets all have different types of handles.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One (Figure 1A) is an assistant robot manufactured by Robotnik. It has several sensors, among them an RGBD camera, a LIDAR sensor, and an inertial unit. It can operate a manipulator arm attached to its torso and has a wheeled base for moving around the room. Orbi-One includes a wireless access point, which allows WiFi communication with other robots and computers.&lt;br /&gt;
&lt;br /&gt;
The software that controls the robot hardware is based on the ROS framework. ROS is a set of libraries for robotics, similar to operating system services, providing hardware abstraction for sensors and actuators, low-level device control, and inter-process communication. Computation takes place in processes called Nodes, which can send and receive Messages. Nodes publish Messages to information buffers called Topics.&lt;br /&gt;
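&lt;br /&gt;
The Node/Topic/Message pattern described above can be illustrated with a minimal, self-contained sketch. Note that these toy classes only model the concept; they are not the actual rospy API.&lt;br /&gt;

```python
# Conceptual model of ROS publish/subscribe (illustrative only, not rospy).

class Topic:
    """A named information buffer that relays published Messages to subscribers."""
    def __init__(self, name):
        self.name = name
        self.callbacks = []

    def publish(self, message):
        # Deliver the Message to every subscribed callback.
        for callback in self.callbacks:
            callback(message)


class Node:
    """A process that can publish Messages to Topics and subscribe to them."""
    def __init__(self, name):
        self.name = name

    def subscribe(self, topic, callback):
        topic.callbacks.append(callback)


# A LIDAR node publishes a scan; a tracker node receives it via the Topic.
scan_topic = Topic('/scan')
received = []
tracker = Node('tracker')
tracker.subscribe(scan_topic, received.append)
lidar = Node('lidar')
scan_topic.publish({'ranges': [1.2, 1.3, 1.1]})
```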
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
In order to acquire ground-truth data about person location in the study area, an RTLS for indoor environments is needed. The KIO RTLS, a commercial solution by Eliko, has been used. KIO is a precise RTLS for tracking any object in two- or three-dimensional space. Its Ultra-Wideband technology makes it possible to micro-position objects through obstructions. KIO also works in non-line-of-sight conditions, both indoors and outdoors.&lt;br /&gt;
&lt;br /&gt;
KIO comes in two main configurations. The Regular Cell configuration guarantees a reliable accuracy of ±30 cm, according to the manufacturer’s specifications. The Small Cell configuration is designed for location-critical applications and provides a reliable ±5 cm accuracy. A calibration carried out by the authors in the mock-up apartment shows that the error is higher in some areas and lower in others, but that, on average, the manufacturer’s claims hold.&lt;br /&gt;
&lt;br /&gt;
KIO calculates the position of a mobile transceiver, called a Tag, using radio beacons, called Anchors, placed at known positions in the surroundings. Figure 1D shows a KIO anchor. KIO tags are the same size as the anchors and must be carried by the tracked subject, in our case a person. The red dots in Figure 1C show the locations of the six anchors used in these experiments; they are placed on the ceiling. The distribution of the anchors was chosen following the method described in Guerrero-Higueras et al. (2017).&lt;br /&gt;
&lt;br /&gt;
=== Leg Detector (LD) ===&lt;br /&gt;
&lt;br /&gt;
LD is a ROS package that takes the messages published by a LIDAR sensor as input and uses a machine-learning-trained classifier to detect groups of laser readings as possible legs. The code is available in a public repository but is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
LD publishes the locations of the individual legs. It can also attempt to pair the legs and publish their average as an estimate of where the center of a person is. LD may optionally publish visualization Marker messages to indicate where detections happened.&lt;br /&gt;
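&lt;br /&gt;
The leg-pairing step can be sketched as follows. This is only an illustration of the idea, not LD's actual code, and the 0.6 m separation threshold is an assumed value.&lt;br /&gt;

```python
import math

def pair_legs(legs, max_separation=0.6):
    # Greedily pair leg detections (x, y) and return the midpoint of each
    # pair as a person-centre estimate. max_separation (metres) is an
    # assumed threshold; legs with no nearby partner are discarded.
    remaining = list(legs)
    people = []
    while len(remaining) >= 2:
        a = remaining.pop(0)
        b = min(remaining, key=lambda p: math.dist(a, p))
        if math.dist(a, b) > max_separation:
            continue  # nearest candidate is too far away to be the other leg
        remaining.remove(b)
        people.append(((a[0] + b[0]) / 2, (a[1] + b[1]) / 2))
    return people

# Two nearby legs form one person estimate; the far-off detection is ignored.
people = pair_legs([(0.0, 0.0), (0.4, 0.0), (5.0, 5.0)])
```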
&lt;br /&gt;
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a people detection and tracking tool developed by the Robotics Group at the University of León. The system is based on a Convolutional Neural Network (CNN) with a configuration based on the U-Net architecture by Ronneberger et al. (2015).&lt;br /&gt;
&lt;br /&gt;
The system performs the following steps in real time:&lt;br /&gt;
&lt;br /&gt;
1. First, the data provided by the LIDAR sensor are processed to build a two-dimensional occupancy map centered on the robot. This occupancy map is represented as a binary matrix, where 1s denote positions where the LIDAR scan found an obstacle, and 0s denote positions that the scan either went through without detecting any obstacle or did not cover at all.&lt;br /&gt;
&lt;br /&gt;
2. Then, the occupancy map is fed to the network as input. The network produces a second occupancy map representing the zones where legs have been detected.&lt;br /&gt;
&lt;br /&gt;
3. Finally, center-of-mass calculations return the locations of the detected people. PeTra also publishes locations for the individual legs, as well as Marker messages for visualization.&lt;br /&gt;
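&lt;br /&gt;
Steps 1 and 3 above can be sketched in a few lines of plain Python. This is a simplified illustration under assumed parameter values (grid size, cell resolution), not PeTra's actual implementation, and step 2 (the CNN itself) is omitted.&lt;br /&gt;

```python
import math

def occupancy_grid(ranges, angle_min, angle_inc, size, resolution):
    # Step 1: build a size-by-size binary grid centred on the robot; a cell
    # is 1 where a LIDAR return falls and 0 elsewhere. Non-finite ranges
    # are treated as "no return".
    grid = [[0] * size for _ in range(size)]
    half = size // 2
    for i, r in enumerate(ranges):
        if math.isinf(r) or math.isnan(r):
            continue
        a = angle_min + i * angle_inc
        col = half + int(r * math.cos(a) / resolution)
        row = half + int(r * math.sin(a) / resolution)
        if row >= 0 and col >= 0 and size > row and size > col:
            grid[row][col] = 1
    return grid

def centre_of_mass(grid):
    # Step 3: average the coordinates of the occupied cells to obtain a
    # single location estimate (here for the whole grid; PeTra works per
    # detected region).
    cells = [(r, c) for r, row in enumerate(grid) for c, v in enumerate(row) if v == 1]
    if len(cells) == 0:
        return None
    n = len(cells)
    return (sum(r for r, c in cells) / n, sum(c for r, c in cells) / n)

# Two rays of 1.0 m at 0 and 90 degrees, on a 9x9 grid with 0.5 m cells.
grid = occupancy_grid([1.0, 1.0], 0.0, math.pi / 2, 9, 0.5)
```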
&lt;br /&gt;
=== Recording procedure ===&lt;br /&gt;
&lt;br /&gt;
The data were gathered in 14 different situations. In all of them, Orbi-One stood still while one or more people, each carrying a KIO tag, moved around it. Three different locations for Orbi-One were defined (see Figure 1C), resulting in 42 scenarios (14 situations × 3 Orbi-One locations). Figure 2 shows the 14 recognition scenarios recorded. These scenarios were chosen to reflect situations that may occur in robotics competitions such as the ERL or RoboCup.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': recognition scenarios recorded.]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
A rosbag file was created for each scenario (except for situations 3, 12, and 13, where 3 rosbag files were recorded; situation 4, where 4 files were recorded; and situation 9, where 5 files were recorded), recording LIDAR sensor measurements, location estimates from PeTra and LD, locations from the KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* LIDAR sensor data. Data from LIDAR sensors are provided as ROS LaserScan Messages, which include, among other information, the following: acquisition time of the first ray in the scan, start/end angle of the scan, angular distance between measurements, and range data.&lt;br /&gt;
* PeTra location estimates provided as ROS PointStamped Messages, which include a position [x, y, z] and a timestamp.&lt;br /&gt;
* Location estimates calculated by LD. It publishes data for individual legs (as ROS PositionMeasurementArray Messages). It also attempts to pair the legs and publishes their average as an estimate of where the center of a person is as a ROS PositionMeasurement Message.&lt;br /&gt;
* Locations provided by the KIO RTLS, also as ROS PointStamped Messages.&lt;br /&gt;
* Messages from /map, /odom, and /tf ROS topics, which include map information, odometry of the robot base, and transformation information, respectively.&lt;br /&gt;
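&lt;br /&gt;
A typical use of these recordings is to compare a tracker's PointStamped estimates against the KIO ground truth. The sketch below aligns the two streams by nearest timestamp; the record layout (stamp, x, y) and the 0.1 s tolerance are illustrative assumptions, and reading the actual rosbag files is left out.&lt;br /&gt;

```python
def nearest_by_stamp(estimates, ground_truth, tolerance=0.1):
    # Match each tracker estimate (stamp, x, y) to the ground-truth sample
    # with the closest timestamp; estimates with no ground-truth sample
    # within `tolerance` seconds are skipped.
    pairs = []
    for est in estimates:
        gt = min(ground_truth, key=lambda g: abs(g[0] - est[0]))
        if abs(gt[0] - est[0]) > tolerance:
            continue
        pairs.append((est, gt))
    return pairs

# The second estimate has no ground-truth sample close enough in time.
estimates = [(0.00, 1.0, 1.0), (1.00, 2.0, 2.0)]
ground_truth = [(0.02, 1.1, 1.0), (5.00, 0.0, 0.0)]
pairs = nearest_by_stamp(estimates, ground_truth)
```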
&lt;br /&gt;
Different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Nov-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_4.bag test_04_4.bag]: '''duration:''' 01:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_07.bag test_07.bag]: '''duration:''' 04:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:''' 9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;br /&gt;
&lt;br /&gt;
====Location 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_01.bag test_01.bag]: '''duration:''' 15:37s, '''size:''' 239.7 MB, '''start date/time:''' Nov 16, 2017 12:20:19.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_02.bag test_02.bag]: '''duration:''' 15:05s, '''size:''' 231.4 MB, '''start date/time:''' Nov 27, 2017 18:32:24.35 &lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_1.bag test_03_1.bag]: '''duration:''' 55.2s, '''size:''' 14.5 MB, '''start date/time:''' Nov 16, 2017 12:43:58.29 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_2.bag test_03_2.bag]: '''duration:''' 44.4s, '''size:''' 10.2 MB, '''start date/time:''' Nov 16, 2017 12:46:31.99&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_3.bag test_03_3.bag]: '''duration:''' 51.1s, '''size:''' 13.4 MB, '''start date/time:''' Nov 16, 2017 12:47:29.66&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_1.bag test_04_1.bag]: '''duration:''' 01:23s, '''size:''' 21.6 MB, '''start date/time:''' Nov 16, 2017 12:50:22.18&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_2.bag test_04_2.bag]: '''duration:''' 01:20s, '''size:''' 21.0 MB, '''start date/time:''' Nov 16, 2017 12:52:18.44&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_3.bag test_04_3.bag]: '''duration:''' 01:09s, '''size:''' 18.1 MB, '''start date/time:''' Nov 16, 2017 12:53:51.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_4.bag test_04_4.bag]: '''duration:''' 01:21s, '''size:''' 21.3 MB, '''start date/time:''' Nov 16, 2017 12:55:31.81&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_05.bag test_05.bag]: '''duration:''' 15:07s, '''size:''' 232.4 MB, '''start date/time:''' Nov 28, 2017 18:33:16.24&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_06.bag test_06.bag]: '''duration:''' 09:14s, '''size:''' 136.2 MB, '''start date/time:''' Nov 29, 2017 13:12:42.61&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_07.bag test_07.bag]: '''duration:''' 04:39s, '''size:''' 71.5 MB, '''start date/time:''' Nov 22, 2017 12:48:33.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_08.bag test_08.bag]: '''duration:''' 03:04s, '''size:''' 47.2 MB, '''start date/time:''' Nov 22, 2017 12:59:08.58&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_1.bag test_09_1.bag]: '''duration:''' 28.8s, '''size:''' 7.7 MB, '''start date/time:''' Nov 28, 2017 17:17:46.49 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_2.bag test_09_2.bag]: '''duration:''' 31.9s, '''size:''' 8.4 MB, '''start date/time:''' Nov 28, 2017 17:20:48.26&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Nov 28, 2017 17:20:05.96&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_4.bag test_09_4.bag]: '''duration:''' 29.9s, '''size:''' 7.9 MB, '''start date/time:''' Nov 28, 2017 17:19:26.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_5.bag test_09_5.bag]: '''duration:''' 30.8s, '''size:''' 8.1 MB, '''start date/time:''' Nov 28, 2017 17:21:20.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_10.bag test_10.bag]: '''duration:''' 16:05s, '''size:''' 246.6 MB, '''start date/time:''' Nov 22, 2017 11:55:02.39&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_11.bag test_11.bag]: '''duration:''' 03:25s, '''size:''' 52.8 MB, '''start date/time:''' Nov 22, 2017 13:08:11.99&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_1.bag test_12_1.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:55:26.25&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_2.bag test_12_2.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:56:52.47&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_3.bag test_12_3.bag]: '''duration:''' 50.2s, '''size:''' 13.1 MB, '''start date/time:''' Nov 16, 2017 11:44:00.10&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_1.bag test_13_1.bag]: '''duration:''' 58.2s, '''size:''' 15.1 MB, '''start date/time:''' Nov 22, 2017 11:48:35.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_2.bag test_13_2.bag]: '''duration:''' 01:05s, '''size:''' 16.9 MB, '''start date/time:''' Nov 22, 2017 11:49:48.18&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_3.bag test_13_3.bag]: '''duration:''' 01:14s, '''size:''' 19.2 MB, '''start date/time:''' Nov 22, 2017 11:51:22.68&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_14.bag test_14.bag]: '''duration:''' 06:17s, '''size:''' 96.3 MB, '''start date/time:''' Nov 22 2017 12:23:07.92&lt;br /&gt;
&lt;br /&gt;
====Location 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_01.bag test_01.bag]: '''duration:''' 15:17s, '''size:''' 234.6 MB, '''start date/time:''' Nov 21, 2017 18:56:54.87&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 17:37:34.08&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_1.bag test_03_1.bag]: '''duration:''' 44.7s, '''size:''' 11.7 MB, '''start date/time:''' Nov 21, 2017 13:16:51.65&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_2.bag test_03_2.bag]: '''duration:''' 39.9s, '''size:''' 10.5 MB, '''start date/time:''' Nov 21, 2017 19:19:06.27&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_3.bag test_03_3.bag]: '''duration:''' 41.2s, '''size:''' 10.8 MB, '''start date/time:''' Nov 21, 2017 19:18:06.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_1.bag test_04_1.bag]: '''duration:''' 01:01s, '''size:''' 16.0 MB, '''start date/time:''' Nov 21, 2017 19:21:30.99 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_2.bag test_04_2.bag]: '''duration:''' 01:01s, '''size:''' 15.9 MB, '''start date/time:''' Nov 21, 2017 19:25:38.00&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_3.bag test_04_3.bag]: '''duration:''' 58.0s, '''size:''' 15.1 MB, '''start date/time:''' Nov 21, 2017 19:27:16.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_4.bag test_04_4.bag]: '''duration:''' 01:00s, '''size:''' 15.8 MB, '''start date/time:''' Nov 21, 2017 19:29:39.28&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_05.bag test_05.bag]: '''duration:''' 15:11s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 19:05:20.19&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_06.bag test_06.bag]: '''duration:''' 09:03s, '''size:''' 133.6 MB, '''start date/time:''' Nov 26, 2017 13:24:25.64&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_07.bag test_07.bag]: '''duration:''' 04:16s, '''size:''' 65.1 MB, '''start date/time:''' Nov 27, 2017 17:34:50.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_08.bag test_08.bag]: '''duration:''' 03:47s, '''size:''' 58.3 MB, '''start date/time:''' Nov 27, 2017 17:41:48.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_1.bag test_09_1.bag]: '''duration:''' 31.3s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:23:58.31 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_2.bag test_09_2.bag]: '''duration:''' 30.4s, '''size:''' 8.0 MB, '''start date/time:''' Nov 28, 2017 17:24:41.09&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_3.bag test_09_3.bag]: '''duration:''' 31.2s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:25:22.22&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_4.bag test_09_4.bag]: '''duration:''' 31.4s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:26:06.98&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_5.bag test_09_5.bag]: '''duration:''' 33.2s, '''size:''' 8.8 MB, '''start date/time:''' Nov 28, 2017 17:26:51.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_10.bag test_10.bag]: '''duration:''' 10:00s, '''size:''' 153.7 MB, '''start date/time:''' Nov 22, 2017 18:15:29.53&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 59.1 MB, '''start date/time:''' Nov 22, 2017 18:25:54.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_1.bag test_12_1.bag]: '''duration:''' 41.3s, '''size:''' 10.8 MB, '''start date/time:''' Nov 22, 2017 18:35:11.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_2.bag test_12_2.bag]: '''duration:''' 41.8s, '''size:''' 10.9 MB, '''start date/time:''' Nov 22, 2017 18:36:12.11&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_3.bag test_12_3.bag]: '''duration:''' 39.8s, '''size:'''  10.5 MB, '''start date/time:''' Nov 22, 2017 18:37:05.06&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_1.bag test_13_1.bag]: '''duration:''' 01:01s, '''size:''' 14.9 MB, '''start date/time:''' Nov 22, 2017 18:40:13.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_2.bag test_13_2.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:41:28.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_3.bag test_13_3.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:42:39.52&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_14.bag test_14.bag]: '''duration:''' 06:12s, '''size:''' 95.1 MB, '''start date/time:''' Nov 23, 2017 12:53:06.48&lt;br /&gt;
&lt;br /&gt;
=== v2.0 [Feb-2018] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a second version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=File:PeopleTrackingFig2.png&amp;diff=5209</id>
		<title>File:PeopleTrackingFig2.png</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=File:PeopleTrackingFig2.png&amp;diff=5209"/>
				<updated>2018-02-13T11:43:32Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: Am6 uploaded a new version of File:PeopleTrackingFig2.png&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5208</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5208"/>
				<updated>2018-02-13T11:41:55Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset that can be used to evaluate the performance of different approaches for detecting and tracking people using LIDAR sensors. The information contained in the dataset is especially suitable as training data for neural-network-based classifiers.&lt;br /&gt;
&lt;br /&gt;
Further information at [https://www.frontiersin.org/articles/10.3389/fnbot.2017.00072/full Álvarez-Aparicio et al. (2018)].&lt;br /&gt;
&lt;br /&gt;
== Materials and methods ==&lt;br /&gt;
&lt;br /&gt;
The following section describes the materials (shown in Figure 1) used to gather data, which include: a certified study area, an autonomous robot with an on-board LIDAR sensor, and a real-time location system (RTLS) to obtain ground-truth data about person location. Recorded data include location estimates calculated by two people trackers, LD and PeTra, also described below. Finally, the recording procedure used to build the dataset is explained.&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.jpg|frame|center|'''Fig. 1''': Materials used in the experiments. (A) Orbi-One robot. (B) Leon@Home Testbed plan. (C) Floor plan of the apartment. Red dots show the location of KIO anchors. Black numbered dots denote Orbi-One locations during the data gathering. (D) KIO anchors. (E) General view of the apartment. (F) Furniture in mock-up apartment.]]&lt;br /&gt;
&lt;br /&gt;
=== Leon@Home Testbed ===&lt;br /&gt;
&lt;br /&gt;
Data have been gathered at Leon@Home Testbed, a Certified Testbed of the European Robotics League (ERL). Its main purpose is to benchmark service robots in a realistic home environment. Our testbed is made up of four parts, shown in Figure 1B: a mock-up apartment, a control zone with direct vision (glass wall) into the apartment, a small workshop, and a larger development zone, where researchers work.&lt;br /&gt;
&lt;br /&gt;
Leon@Home Testbed is located on the second floor of the Módulo de Investigación en Cibernética (Building for Research in Cybernetics) on the Vegazana Campus of the University of León (Spain). The apartment is a single bedroom mock-up home built in an 8 m × 7 m space. Figure 1C shows a plan of the apartment. 60 cm high walls divide it into a kitchen, living room, bathroom, and bedroom. The furniture (Figures 1E,F) has been chosen to test different robot abilities. For instance, the kitchen cabinets all have different types of handles.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-one ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One (Figure 1A) is an assistant robot manufactured by Robotnik. It has several sensors, among them an RGBD camera, a LIDAR sensor, and an inertial unit. It can operate a manipulator arm attached to its torso and has a wheeled base for moving around the room. Orbi-One includes a wireless access point, which allows WiFi communication with other robots and computers.&lt;br /&gt;
&lt;br /&gt;
The software to control the robot hardware is based on the ROS framework. ROS is essentially a set of robotics libraries providing operating-system-like services: hardware abstraction for sensors and actuators, low-level device control, and inter-process communication. Computation takes place in processes named Nodes, which can receive and send Messages. Nodes publish Messages into information buffers called Topics.&lt;br /&gt;
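The Node/Topic/Message model described above can be illustrated with a minimal, ROS-free sketch. This is a toy stand-in for the concept only; the real rospy/roscpp APIs and the `/scan` message layout differ:

```python
# Toy model of ROS-style publish/subscribe (illustration only; real ROS
# nodes communicate through the ROS middleware, not a local object).

class Topic:
    """An information buffer that relays published Messages to subscribers."""

    def __init__(self, name):
        self.name = name
        self._subscribers = []

    def subscribe(self, callback):
        # A subscriber Node registers a callback to receive Messages.
        self._subscribers.append(callback)

    def publish(self, message):
        # A publisher Node sends a Message; every subscriber sees it.
        for callback in self._subscribers:
            callback(message)


# One "node" listens on /scan, another publishes a (fake) laser Message.
received = []
scan_topic = Topic("/scan")
scan_topic.subscribe(received.append)
scan_topic.publish({"ranges": [1.2, 0.9]})
print(received)
```

The same decoupling is what lets the trackers below (LD and PeTra) consume LIDAR Messages without knowing anything about the sensor driver that produced them.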
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
In order to acquire ground-truth data about person location in the study area, we need an RTLS for indoor environments. The KIO RTLS, a commercial solution by Eliko, has been used. KIO is a precise RTLS for tracking any object in 2- or 3-dimensional space. Its Ultra-Wideband technology enables micro-positioning of objects through obstructions. KIO also works in non-line-of-sight conditions, both indoors and outdoors.&lt;br /&gt;
&lt;br /&gt;
KIO comes in two main configurations. The Regular Cell configuration guarantees a reliable accuracy of ±30 cm, according to the manufacturer’s specifications. The Small Cell configuration is designed for location-critical applications and provides a reliable ±5 cm accuracy, again according to the manufacturer’s specifications. Calibration performed by the authors in the mock-up apartment shows that the error is higher in some areas and lower in others, but on average the manufacturer’s claims are correct.&lt;br /&gt;
&lt;br /&gt;
KIO calculates the position of a mobile transceiver, called a Tag. To do so, KIO uses radio beacons, called Anchors, placed at known positions in the surroundings. Figure 1D shows a KIO anchor. KIO tags are the same size as the anchors and must be carried by the tracked subject, in our case a person. The red dots in Figure 1C show the locations of the six anchors used in these experiments; they are placed on the ceiling. The distribution of the anchors was chosen following the method described in Guerrero-Higueras et al. (2017).&lt;br /&gt;
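KIO's internal positioning algorithm is proprietary, but the general idea of computing a tag position from ranges to anchors at known positions can be sketched with standard least-squares trilateration. The anchor layout and tag position below are made up for illustration:

```python
import math

def trilaterate(anchors, ranges):
    """Least-squares 2-D position from distances to >= 3 known anchors.

    Linearizes the circle equations (x-xi)^2 + (y-yi)^2 = ri^2 by
    subtracting the first one, then solves the 2x2 normal equations.
    """
    (x1, y1), r1 = anchors[0], ranges[0]
    A, b = [], []
    for (xi, yi), ri in zip(anchors[1:], ranges[1:]):
        A.append((2 * (xi - x1), 2 * (yi - y1)))
        b.append(xi**2 - x1**2 + yi**2 - y1**2 + r1**2 - ri**2)
    # Normal equations A^T A p = A^T b for p = (x, y).
    a11 = sum(ax * ax for ax, _ in A)
    a12 = sum(ax * ay for ax, ay in A)
    a22 = sum(ay * ay for _, ay in A)
    b1 = sum(ax * bi for (ax, _), bi in zip(A, b))
    b2 = sum(ay * bi for (_, ay), bi in zip(A, b))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Hypothetical 8 m x 7 m room with four ceiling anchors and a tag at (2, 3).
anchors = [(0.0, 0.0), (8.0, 0.0), (8.0, 7.0), (0.0, 7.0)]
tag = (2.0, 3.0)
ranges = [math.dist(a, tag) for a in anchors]
x, y = trilaterate(anchors, ranges)
```

With noisy UWB ranges the least-squares solution averages out part of the error, which is why more anchors (six in these experiments) improve robustness.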
&lt;br /&gt;
=== Leg Detector (LD) ===&lt;br /&gt;
&lt;br /&gt;
LD is a ROS package that takes the messages published by a LIDAR sensor as input and uses a machine-learning-trained classifier to detect groups of laser readings as possible legs. The code is available in a public repository, but is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
LD publishes the location for the individual legs. It can also attempt to pair the legs and publish their average as an estimate of where the center of a person is. LD may optionally also publish visualization marker messages to indicate where detections happened.&lt;br /&gt;
&lt;br /&gt;
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a tool for detecting and tracking people, developed by the Robotics Group at the University of León. The system is based on a Convolutional Neural Network (CNN) with a configuration based on the U-Net architecture by Ronneberger et al. (2015).&lt;br /&gt;
&lt;br /&gt;
The system performs the following steps in real time:&lt;br /&gt;
&lt;br /&gt;
1. First, the data provided by the LIDAR sensor are processed to build a two-dimensional occupancy map centered on the robot. This occupancy map is represented as a binary matrix, where 1s denote positions where the LIDAR scan found an obstacle, and 0s denote positions where the scan either passed through without detecting any obstacle or did not reach.&lt;br /&gt;
&lt;br /&gt;
2. Then, the occupancy map is relayed to the network as input data. The network produces a second occupancy map representing the zones where legs have been detected.&lt;br /&gt;
&lt;br /&gt;
3. Finally, center of mass calculations return the location of persons. PeTra also publishes locations for the individual legs and Marker messages for visualization.&lt;br /&gt;
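Steps 1 and 3 of the pipeline can be sketched without the CNN itself: build a binary occupancy grid from polar LIDAR readings, then recover a location as the center of mass of occupied cells. The grid size and resolution below are illustrative assumptions, not PeTra's actual parameters:

```python
import math

def occupancy_map(ranges, angle_min, angle_inc, size=64, resolution=0.1):
    """Binary occupancy grid centred on the robot: 1 = obstacle hit.

    `size` (cells per side) and `resolution` (metres/cell) are assumed
    values for illustration. Rays with no return are inf/NaN and skipped.
    """
    grid = [[0] * size for _ in range(size)]
    for i, r in enumerate(ranges):
        if math.isinf(r) or math.isnan(r):
            continue
        angle = angle_min + i * angle_inc
        col = int(r * math.cos(angle) / resolution) + size // 2
        row = int(r * math.sin(angle) / resolution) + size // 2
        if 0 <= row < size and 0 <= col < size:
            grid[row][col] = 1
    return grid

def center_of_mass(grid, resolution=0.1):
    """Mean position of occupied cells, in metres relative to the robot."""
    size = len(grid)
    cells = [(r, c) for r in range(size) for c in range(size) if grid[r][c]]
    row = sum(r for r, _ in cells) / len(cells)
    col = sum(c for _, c in cells) / len(cells)
    return ((col - size // 2) * resolution, (row - size // 2) * resolution)

# Fake 360-degree scan: every ray misses except a small cluster ~1 m ahead.
ranges = [float("inf")] * 360
for i in range(-2, 3):
    ranges[i % 360] = 1.0
grid = occupancy_map(ranges, angle_min=0.0, angle_inc=math.radians(1))
```

In PeTra, the grid produced by step 1 is the CNN input, and the center-of-mass calculation is applied to the network's output map rather than the raw one.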
&lt;br /&gt;
=== Recording procedure ===&lt;br /&gt;
&lt;br /&gt;
The data were gathered in 14 different situations. In all of them, Orbi-One stood still as one or more people, carrying a KIO tag, moved around it. Three different locations for Orbi-One were defined (see Figure 1C) resulting in 42 scenarios (14 situations × 3 Orbi-One locations). Figure 2 shows the 14 different recognition scenarios recorded. These scenarios have been chosen according to different situations that may occur in robotics competitions such as ERL or RoboCup.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': recognition scenarios recorded.]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
A rosbag file was created for each scenario (except for situations 3, 12, and 13, where 3 rosbag files were recorded; situation 4, where 4 files were recorded; and situation 9, where 5 files were recorded), recording LIDAR sensor measurements, location estimates from PeTra and LD, locations from KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* LIDAR sensor data. Data from LIDAR sensors are provided as ROS LaserScan Messages, which include, among other information, the following: acquisition time of the first ray in the scan, start/end angle of the scan, angular distance between measurements, and range data.&lt;br /&gt;
* PeTra location estimates provided as ROS PointStamped Messages, which include a position [x, y, z] and a timestamp.&lt;br /&gt;
* Location estimates calculated by LD, which publishes data for individual legs (as ROS PositionMeasurementArray Messages) and also attempts to pair the legs, publishing their average as an estimate of the person's center (as a ROS PositionMeasurement Message).&lt;br /&gt;
* Locations provided by KIO RTLS also provided as ROS PointStamped Messages.&lt;br /&gt;
* Messages from /map, /odom, and /tf ROS topics, which include map information, odometry of the robot base, and transformation information, respectively.&lt;br /&gt;
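Because the tracker estimates and the KIO ground truth are separate timestamped streams, a typical first step when evaluating a tracker against these bags is to match each estimate to the ground-truth sample closest in time. A minimal nearest-timestamp matcher (the `(t, x, y)` tuples are an illustrative simplification of the actual PointStamped message layout):

```python
import bisect

def match_nearest(estimates, ground_truth):
    """Pair each (t, x, y) estimate with the ground-truth sample whose
    timestamp is closest.

    Assumes `ground_truth` is sorted by timestamp, as it is when read back
    from a bag in recording order.
    """
    gt_times = [g[0] for g in ground_truth]
    pairs = []
    for est in estimates:
        i = bisect.bisect_left(gt_times, est[0])
        # Candidate neighbours: the sample at/after t and the one before it.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(ground_truth)]
        best = min(candidates, key=lambda j: abs(gt_times[j] - est[0]))
        pairs.append((est, ground_truth[best]))
    return pairs

# Toy data: tracker output at 10 Hz vs KIO samples at irregular times.
estimates = [(0.10, 1.0, 2.0), (0.20, 1.1, 2.0)]
ground_truth = [(0.08, 1.02, 1.99), (0.19, 1.09, 2.01), (0.31, 1.2, 2.0)]
print(match_nearest(estimates, ground_truth))
```

Once matched, the Euclidean distance between each estimate and its ground-truth sample gives a per-sample tracking error.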
&lt;br /&gt;
Different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Nov-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Location 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_4.bag test_04_4.bag]: '''duration:''' 01:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_07.bag test_07.bag]: '''duration:''' 04:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:'''  9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
====Location 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_01.bag test_01.bag]: '''duration:''' 15:37s, '''size:''' 239.7 MB, '''start date/time:''' Nov 16, 2017 12:20:19.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_02.bag test_02.bag]: '''duration:''' 15:05s, '''size:''' 231.4 MB, '''start date/time:''' Nov 27, 2017 18:32:24.35 &lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_1.bag test_03_1.bag]: '''duration:''' 55.2s, '''size:''' 14.5 MB, '''start date/time:''' Nov 16, 2017 12:43:58.29 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_2.bag test_03_2.bag]: '''duration:''' 44.4s, '''size:''' 10.2 MB, '''start date/time:''' Nov 16, 2017 12:46:31.99&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_3.bag test_03_3.bag]: '''duration:''' 51.1s, '''size:''' 13.4 MB, '''start date/time:''' Nov 16, 2017 12:47:29.66&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_1.bag test_04_1.bag]: '''duration:''' 01:23s, '''size:''' 21.6 MB, '''start date/time:''' Nov 16, 2017 12:50:22.18 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_2.bag test_04_2.bag]: '''duration:''' 01:20s, '''size:''' 21.0 MB, '''start date/time:''' Nov 16, 2017 12:52:18.44&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_3.bag test_04_3.bag]: '''duration:''' 01:09s, '''size:''' 18.1 MB, '''start date/time:''' Nov 16, 2017 12:53:51.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_4.bag test_04_4.bag]: '''duration:''' 01:21s, '''size:''' 21.3 MB, '''start date/time:''' Nov 16, 2017 12:55:31.81&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_05.bag test_05.bag]: '''duration:''' 15:07s, '''size:''' 232.4 MB, '''start date/time:''' Nov 28 2017 18:33:16.24&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_06.bag test_06.bag]: '''duration:''' 09:14s, '''size:''' 136.2 MB, '''start date/time:''' Nov 29 2017 13:12:42.61 &lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_07.bag test_07.bag]: '''duration:''' 04:39s, '''size:''' 71.5 MB, '''start date/time:''' Nov 22 2017 12:48:33.86 &lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_08.bag test_08.bag]: '''duration:''' 03:04s, '''size:''' 47.2 MB, '''start date/time:''' Nov 22 2017 12:59:08.58&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_1.bag test_09_1.bag]: '''duration:''' 28.8s, '''size:''' 7.7 MB, '''start date/time:''' Nov 28, 2017 17:17:46.49 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_2.bag test_09_2.bag]: '''duration:''' 31.9s, '''size:''' 8.4 MB, '''start date/time:''' Nov 28, 2017 17:20:48.26&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Nov 28, 2017 17:20:05.96&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_4.bag test_09_4.bag]: '''duration:''' 29.9s, '''size:''' 7.9 MB, '''start date/time:''' Nov 28, 2017 17:19:26.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_5.bag test_09_5.bag]: '''duration:''' 30.8s, '''size:''' 8.1 MB, '''start date/time:''' Nov 28, 2017 17:21:20.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_10.bag test_10.bag]: '''duration:''' 16:05s, '''size:''' 246.6 MB, '''start date/time:''' Nov 22 2017 11:55:02.39&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_11.bag test_11.bag]: '''duration:''' 03:25s, '''size:''' 52.8 MB, '''start date/time:''' Nov 22 2017 13:08:11.99&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_1.bag test_12_1.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:55:26.25&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_2.bag test_12_2.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:56:52.47&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_3.bag test_12_3.bag]: '''duration:''' 50.2s, '''size:'''  13.1 MB, '''start date/time:''' Nov 16, 2017 11:44:00.10&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_1.bag test_13_1.bag]: '''duration:''' 58.2s, '''size:''' 15.1 MB, '''start date/time:''' Nov 22, 2017 11:48:35.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_2.bag test_13_2.bag]: '''duration:''' 01:05s, '''size:''' 16.9 MB, '''start date/time:''' Nov 22, 2017 11:49:48.18&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_3.bag test_13_3.bag]: '''duration:''' 01:14s, '''size:''' 19.2 MB, '''start date/time:''' Nov 22, 2017 11:51:22.68&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_14.bag test_14.bag]: '''duration:''' 06:17s, '''size:''' 96.3 MB, '''start date/time:''' Nov 22 2017 12:23:07.92&lt;br /&gt;
&lt;br /&gt;
====Location 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_01.bag test_01.bag]: '''duration:''' 15:17s, '''size:''' 234.6 MB, '''start date/time:''' Nov 21, 2017 18:56:54.87&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 17:37:34.08&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_1.bag test_03_1.bag]: '''duration:''' 44.7s, '''size:''' 11.7 MB, '''start date/time:''' Nov 21, 2017 13:16:51.65&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_2.bag test_03_2.bag]: '''duration:''' 39.9s, '''size:''' 10.5 MB, '''start date/time:''' Nov 21, 2017 19:19:06.27&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_3.bag test_03_3.bag]: '''duration:''' 41.2s, '''size:''' 10.8 MB, '''start date/time:''' Nov 21, 2017 19:18:06.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_1.bag test_04_1.bag]: '''duration:''' 01:01s, '''size:''' 16.0 MB, '''start date/time:''' Nov 21, 2017 19:21:30.99 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_2.bag test_04_2.bag]: '''duration:''' 01:01s, '''size:''' 15.9 MB, '''start date/time:''' Nov 21, 2017 19:25:38.00&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_3.bag test_04_3.bag]: '''duration:''' 58.0s, '''size:''' 15.1 MB, '''start date/time:''' Nov 21, 2017 19:27:16.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_4.bag test_04_4.bag]: '''duration:''' 01:00s, '''size:''' 15.8 MB, '''start date/time:''' Nov 21, 2017 19:29:39.28&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_05.bag test_05.bag]: '''duration:''' 15:11s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 19:05:20.19&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_06.bag test_06.bag]: '''duration:''' 09:03s, '''size:''' 133.6 MB, '''start date/time:''' Nov 26, 2017 13:24:25.64&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_07.bag test_07.bag]: '''duration:''' 04:16s, '''size:''' 65.1 MB, '''start date/time:''' Nov 27, 2017 17:34:50.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_08.bag test_08.bag]: '''duration:''' 03:47s, '''size:''' 58.3 MB, '''start date/time:''' Nov 27, 2017 17:41:48.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_1.bag test_09_1.bag]: '''duration:''' 31.3s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:23:58.31 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_2.bag test_09_2.bag]: '''duration:''' 30.4s, '''size:''' 8.0 MB, '''start date/time:''' Nov 28, 2017 17:24:41.09&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_3.bag test_09_3.bag]: '''duration:''' 31.2s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:25:22.22&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_4.bag test_09_4.bag]: '''duration:''' 31.4s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:26:06.98&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_5.bag test_09_5.bag]: '''duration:''' 33.2s, '''size:''' 8.8 MB, '''start date/time:''' Nov 28, 2017 17:26:51.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_10.bag test_10.bag]: '''duration:''' 10:00s, '''size:''' 153.7 MB, '''start date/time:''' Nov 22, 2017 18:15:29.53&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 59.1 MB, '''start date/time:''' Nov 22, 2017 18:25:54.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_1.bag test_12_1.bag]: '''duration:''' 41.3s, '''size:''' 10.8 MB, '''start date/time:''' Nov 22, 2017 18:35:11.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_2.bag test_12_2.bag]: '''duration:''' 41.8s, '''size:''' 10.9 MB, '''start date/time:''' Nov 22, 2017 18:36:12.11&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_3.bag test_12_3.bag]: '''duration:''' 39.8s, '''size:''' 10.5 MB, '''start date/time:''' Nov 22, 2017 18:37:05.06&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_1.bag test_13_1.bag]: '''duration:''' 01:01s, '''size:''' 14.9 MB, '''start date/time:''' Nov 22, 2017 18:40:13.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_2.bag test_13_2.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:41:28.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_3.bag test_13_3.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:42:39.52&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_14.bag test_14.bag]: '''duration:''' 06:12s, '''size:''' 95.1 MB, '''start date/time:''' Nov 23, 2017 12:53:06.48&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5207</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5207"/>
				<updated>2018-02-13T11:38:59Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset that can be used to evaluate the performance of different approaches for detecting and tracking people using LIDAR sensors. The information contained in the dataset is especially suitable as training data for neural-network-based classifiers. &lt;br /&gt;
&lt;br /&gt;
Further information is available in [https://www.frontiersin.org/articles/10.3389/fnbot.2017.00072/full Álvarez-Aparicio et al. (2018)].&lt;br /&gt;
&lt;br /&gt;
== Materials and methods ==&lt;br /&gt;
&lt;br /&gt;
The following section describes the materials (shown in Figure 1) used to gather data, which include: a certified study area, an autonomous robot with an on-board LIDAR sensor, and a real-time location system (RTLS) to obtain ground-truth data about person location. Recorded data include location estimates calculated by two people trackers, LD and PeTra, also described below. Finally, the recording procedure used to build the dataset is explained.&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.jpg|frame|center|'''Fig. 1''': Materials used in the experiments. (A) Orbi-One robot. (B) Leon@Home Testbed plan. (C) Floor plan of the apartment. Red dots show the location of KIO anchors. Black numbered dots denote Orbi-One locations during the data gathering. (D) KIO anchors. (E) General view of the apartment. (F) Furniture in mock-up apartment.]]&lt;br /&gt;
&lt;br /&gt;
=== Leon@Home Testbed ===&lt;br /&gt;
&lt;br /&gt;
Data have been gathered at Leon@Home Testbed, a Certified Testbed of the European Robotics League (ERL). Its main purpose is to benchmark service robots in a realistic home environment. Our testbed is made up of four parts, shown in Figure 1B: a mock-up apartment, a control zone with direct vision (glass wall) into the apartment, a small workshop, and a larger development zone where researchers work.&lt;br /&gt;
&lt;br /&gt;
Leon@Home Testbed is located on the second floor of the Módulo de Investigación en Cibernética (Building for Research in Cybernetics) on the Vegazana Campus of the University of León (Spain). The apartment is a single bedroom mock-up home built in an 8 m × 7 m space. Figure 1C shows a plan of the apartment. 60 cm high walls divide it into a kitchen, living room, bathroom, and bedroom. The furniture (Figures 1E,F) has been chosen to test different robot abilities. For instance, the kitchen cabinets all have different types of handles.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One (Figure 1A) is an assistant robot manufactured by Robotnik. It has several sensors, among them an RGBD camera, a LIDAR sensor, and an inertial unit. It can operate a manipulator arm attached to its torso and has a wheeled base for moving around the room. Orbi-One includes a wireless access point, which allows WiFi communication with other robots and computers.&lt;br /&gt;
&lt;br /&gt;
The software controlling the robot hardware is based on the ROS framework. ROS is a set of libraries and tools for robotics that provides services similar to those of an operating system: hardware abstraction for sensors and actuators, low-level device control, and inter-process communication. Computation takes place in processes named Nodes, which can receive and send Messages. Nodes publish Messages into information buffers called Topics.&lt;br /&gt;
&lt;br /&gt;
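The Node/Topic model can be illustrated with a minimal publish/subscribe sketch (a conceptual plain-Python illustration of the pattern, not the actual rospy API):&lt;br /&gt;

```python
# Conceptual sketch of ROS-style publish/subscribe (not the rospy API).
class Topic:
    """An information buffer relaying Messages from publishers to subscribers."""
    def __init__(self, name):
        self.name = name
        self._callbacks = []

    def subscribe(self, callback):
        # A Node registers a callback to receive Messages on this Topic.
        self._callbacks.append(callback)

    def publish(self, message):
        # A Node publishes a Message; every subscribed Node is notified.
        for callback in self._callbacks:
            callback(message)

# Two "Nodes" communicating through a Topic, as the robot's LIDAR driver
# does when it publishes LaserScan Messages.
scan_topic = Topic("/scan")
received = []
scan_topic.subscribe(received.append)       # subscriber Node
scan_topic.publish({"ranges": [1.2, 0.8]})  # publisher Node
```
&lt;br /&gt;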
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
In order to acquire ground-truth data about person location in the study area, an RTLS for indoor environments is needed. The KIO RTLS, a commercial solution by Eliko, has been used. KIO is a precise RTLS for tracking any object in two- or three-dimensional space. Its Ultra-Wideband technology makes it possible to position objects with high precision even through obstructions, and KIO works in non-line-of-sight conditions, both indoors and outdoors.&lt;br /&gt;
&lt;br /&gt;
KIO comes in two main configurations. The Regular Cell configuration guarantees a reliable accuracy of ±30 cm, according to the manufacturer’s specifications; the Small Cell configuration is designed for location-critical applications and provides a reliable ±5 cm accuracy. Calibration performed by the authors in the mock-up apartment shows that the error is higher in some areas and lower in others but, on average, the manufacturer’s claims hold.&lt;br /&gt;
&lt;br /&gt;
KIO calculates the position of a mobile transceiver, called a Tag, using radio beacons, called Anchors, distributed at known positions in the surroundings. Figure 1D shows a KIO anchor. KIO tags are the same size as the anchors and must be carried by the tracked subject, in our case a person. The red dots in Figure 1C show the locations of the six anchors used in these experiments; they are placed on the ceiling. The distribution of the anchors was chosen following the method described in Guerrero-Higueras et al. (2017).&lt;br /&gt;
&lt;br /&gt;
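As an illustration of how an anchor-based RTLS can compute a tag position, the sketch below solves a 2D multilateration problem from three anchors at known positions (a simplified, hypothetical computation; Eliko does not publish KIO's actual algorithm):&lt;br /&gt;

```python
# Hedged sketch: 2D multilateration from three anchors, the kind of
# computation an RTLS such as KIO could perform internally (KIO's actual
# algorithm is proprietary and not described here).
def locate_tag(anchors, distances):
    """Estimate the (x, y) position of a tag given three non-collinear
    anchors at known positions and the measured tag-anchor distances."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = distances
    # Subtracting the first circle equation (x - xi)^2 + (y - yi)^2 = di^2
    # from the other two yields a linear 2x2 system in x and y.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1 ** 2 - d2 ** 2 + x2 ** 2 - x1 ** 2 + y2 ** 2 - y1 ** 2
    b2 = d1 ** 2 - d3 ** 2 + x3 ** 2 - x1 ** 2 + y3 ** 2 - y1 ** 2
    det = a11 * a22 - a12 * a21  # nonzero when anchors are not collinear
    return (b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det
```
&lt;br /&gt;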
=== Leg Detector (LD) ===&lt;br /&gt;
&lt;br /&gt;
LD is a ROS package that takes the messages published by a LIDAR sensor as input and uses a machine-learning classifier to detect groups of laser readings as possible legs. The code is available in a public repository but is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
LD publishes the locations of the individual legs. It can also attempt to pair the legs and publish their average as an estimate of where the center of a person is. Optionally, LD also publishes visualization Marker messages to indicate where detections happened.&lt;br /&gt;
&lt;br /&gt;
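The leg-pairing idea can be sketched as follows (a simplified, hypothetical illustration; the actual leg_detector package differs in detail, enforcing a maximum leg separation and tracking pairs over time):&lt;br /&gt;

```python
import math

# Hedged sketch of the leg-pairing idea: greedily pair each detected leg
# with its nearest unused neighbor and report the midpoint as the
# person-center estimate.
def pair_legs(legs):
    """legs: list of (x, y) leg detections. Returns person-center estimates."""
    centers, used = [], set()
    for i, (xi, yi) in enumerate(legs):
        if i in used:
            continue
        others = [j for j in range(len(legs)) if j not in used and j != i]
        if not others:
            break
        # Nearest remaining detection is assumed to be the other leg.
        best = min(others, key=lambda j: math.hypot(legs[j][0] - xi, legs[j][1] - yi))
        used.update((i, best))
        centers.append(((xi + legs[best][0]) / 2, (yi + legs[best][1]) / 2))
    return centers
```
&lt;br /&gt;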
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a tool for detecting and tracking people, developed by the Robotics Group at the University of León. The system is based on a Convolutional Neural Network (CNN) with a configuration based on the U-Net architecture by Ronneberger et al. (2015).&lt;br /&gt;
&lt;br /&gt;
The system performs the following steps in real time:&lt;br /&gt;
&lt;br /&gt;
1. First, the data provided by the LIDAR sensor are processed to build a two-dimensional occupancy map centered on the robot. This occupancy map is represented as a binary matrix, where 1s denote positions where the LIDAR scan found an obstacle and 0s denote positions the scan either passed through without detecting an obstacle or did not reach.&lt;br /&gt;
&lt;br /&gt;
2. Then, the occupancy map is fed to the network as input. The network produces a second occupancy map representing the zones where legs have been detected.&lt;br /&gt;
&lt;br /&gt;
3. Finally, center-of-mass calculations return the locations of the detected persons. PeTra also publishes locations for the individual legs and Marker messages for visualization.&lt;br /&gt;
&lt;br /&gt;
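Steps 1 and 3 above can be sketched as follows (a simplified illustration with assumed grid size and resolution; step 2, the CNN itself, is omitted):&lt;br /&gt;

```python
import math

# Hedged sketch of PeTra's pre- and post-processing (step 2, the CNN, is
# omitted): build a binary occupancy map from a LIDAR scan, then recover a
# location from a detected region via its center of mass.
def occupancy_map(ranges, angle_min, angle_inc, size=64, resolution=0.1):
    """Binary grid centered on the robot; 1 marks a LIDAR return."""
    grid = [[0] * size for _ in range(size)]
    for k, r in enumerate(ranges):
        a = angle_min + k * angle_inc
        col = int(r * math.cos(a) / resolution) + size // 2
        row = int(r * math.sin(a) / resolution) + size // 2
        if row in range(size) and col in range(size):
            grid[row][col] = 1
    return grid

def center_of_mass(grid, resolution=0.1):
    """Mean position of occupied cells, in metres relative to the robot."""
    cells = [(r, c) for r, row in enumerate(grid) for c, v in enumerate(row) if v]
    half = len(grid) // 2
    x = (sum(c for _, c in cells) / len(cells) - half) * resolution
    y = (sum(r for r, _ in cells) / len(cells) - half) * resolution
    return x, y
```
&lt;br /&gt;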
=== Recording procedure ===&lt;br /&gt;
&lt;br /&gt;
The data were gathered in 14 different situations. In all of them, Orbi-One stood still as one or more people, carrying a KIO tag, moved around it. Three different locations for Orbi-One were defined (see Figure 1C) resulting in 42 scenarios (14 situations × 3 Orbi-One locations). Figure 2 shows the 14 different recognition scenarios recorded. These scenarios have been chosen according to different situations that may occur in robotics competitions such as ERL or RoboCup.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': recognition scenarios recorded.]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
A rosbag file was created for each scenario (except for situations 3, 12, and 13, where 3 rosbag files were recorded; situation 4, where 4 rosbag files were recorded; and situation 9, where 5 rosbag files were recorded), recording LIDAR sensor measurements, location estimates from PeTra and LD, locations from KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* LIDAR sensor data. Data from LIDAR sensors are provided as ROS LaserScan Messages, which include, among other information, the following: acquisition time of the first ray in the scan, start/end angle of the scan, angular distance between measurements, and range data.&lt;br /&gt;
* PeTra location estimates provided as ROS PointStamped Messages, which include a position [x, y, z] and a timestamp.&lt;br /&gt;
* Location estimates calculated by LD. It publishes data for individual legs (as ROS PositionMeasurementArray Messages). It also attempts to pair the legs and publishes their average as an estimate of where the center of a person is as a ROS PositionMeasurement Message.&lt;br /&gt;
* Locations provided by KIO RTLS, also provided as ROS PointStamped Messages.&lt;br /&gt;
* Messages from /map, /odom, and /tf ROS topics, which include map information, odometry of the robot base, and transformation information, respectively.&lt;br /&gt;
&lt;br /&gt;
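As an example of how these recordings can be used to evaluate a tracker, the sketch below matches each location estimate to the ground-truth point nearest in time and computes the mean Euclidean error (a hypothetical evaluation helper, not part of the dataset; both streams are assumed to be lists of (timestamp, x, y) tuples extracted from the PointStamped messages):&lt;br /&gt;

```python
import math

# Hedged sketch: score a tracker (PeTra or LD) against KIO ground truth.
def mean_tracking_error(estimates, ground_truth):
    """Mean Euclidean distance, in metres, between each tracker estimate and
    the ground-truth point nearest to it in time. Both inputs are lists of
    (timestamp, x, y) tuples."""
    errors = []
    for t, x, y in estimates:
        # Match on timestamp, since the two streams are not synchronized.
        _, gx, gy = min(ground_truth, key=lambda p: abs(p[0] - t))
        errors.append(math.hypot(x - gx, y - gy))
    return sum(errors) / len(errors)
```
&lt;br /&gt;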
The different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Nov-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Scenario 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_4.bag test_04_4.bag]: '''duration:''' 01:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_07.bag test_07.bag]: '''duration:''' 04:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:''' 9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
====Scenario 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_01.bag test_01.bag]: '''duration:''' 15:37s, '''size:''' 239.7 MB, '''start date/time:''' Nov 16, 2017 12:20:19.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_02.bag test_02.bag]: '''duration:''' 15:05s, '''size:''' 231.4 MB, '''start date/time:''' Nov 27, 2017 18:32:24.35 &lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_1.bag test_03_1.bag]: '''duration:''' 55.2s, '''size:''' 14.5 MB, '''start date/time:''' Nov 16, 2017 12:43:58.29 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_2.bag test_03_2.bag]: '''duration:''' 44.4s, '''size:''' 10.2 MB, '''start date/time:''' Nov 16, 2017 12:46:31.99&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_3.bag test_03_3.bag]: '''duration:''' 51.1s, '''size:''' 13.4 MB, '''start date/time:''' Nov 16, 2017 12:47:29.66&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_1.bag test_04_1.bag]: '''duration:''' 01:23s, '''size:''' 21.6 MB, '''start date/time:''' Nov 16, 2017 1:50:22.18 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_2.bag test_04_2.bag]: '''duration:''' 01:20s, '''size:''' 21.0 MB, '''start date/time:''' Nov 16, 2017 12:52:18.44&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_3.bag test_04_3.bag]: '''duration:''' 01:09s, '''size:''' 18.1 MB, '''start date/time:''' Nov 16, 2017 12:53:51.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_4.bag test_04_4.bag]: '''duration:''' 01:21s, '''size:''' 21.3 MB, '''start date/time:''' Nov 16, 2017 12:55:31.81&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_05.bag test_05.bag]: '''duration:''' 15:07s, '''size:''' 232.4 MB, '''start date/time:''' Nov 28, 2017 18:33:16.24&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_06.bag test_06.bag]: '''duration:''' 09:14s, '''size:''' 136.2 MB, '''start date/time:''' Nov 29, 2017 13:12:42.61 &lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_07.bag test_07.bag]: '''duration:''' 04:39s, '''size:''' 71.5 MB, '''start date/time:''' Nov 22, 2017 12:48:33.86 &lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_08.bag test_08.bag]: '''duration:''' 03:04s, '''size:''' 47.2 MB, '''start date/time:''' Nov 22, 2017 12:59:08.58&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_1.bag test_09_1.bag]: '''duration:''' 28.8s, '''size:''' 7.7 MB, '''start date/time:''' Nov 28, 2017 17:17:46.49 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_2.bag test_09_2.bag]: '''duration:''' 31.9s, '''size:''' 8.4 MB, '''start date/time:''' Nov 28, 2017 17:20:48.26&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Nov 28, 2017 17:20:05.96&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_4.bag test_09_4.bag]: '''duration:''' 29.9s, '''size:''' 7.9 MB, '''start date/time:''' Nov 28, 2017 17:19:26.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_5.bag test_09_5.bag]: '''duration:''' 30.8s, '''size:''' 8.1 MB, '''start date/time:''' Nov 28, 2017 17:21:20.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_10.bag test_10.bag]: '''duration:''' 16:05s, '''size:''' 246.6 MB, '''start date/time:''' Nov 22, 2017 11:55:02.39&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_11.bag test_11.bag]: '''duration:''' 03:25s, '''size:''' 52.8 MB, '''start date/time:''' Nov 22, 2017 13:08:11.99&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_1.bag test_12_1.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:55:26.25&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_2.bag test_12_2.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:56:52.47&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_3.bag test_12_3.bag]: '''duration:''' 50.2s, '''size:''' 13.1 MB, '''start date/time:''' Nov 16, 2017 11:44:00.10&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_1.bag test_13_1.bag]: '''duration:''' 58.2s, '''size:''' 15.1 MB, '''start date/time:''' Nov 22, 2017 11:48:35.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_2.bag test_13_2.bag]: '''duration:''' 01:05s, '''size:''' 16.9 MB, '''start date/time:''' Nov 22, 2017 11:49:48.18&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_3.bag test_13_3.bag]: '''duration:''' 01:14s, '''size:''' 19.2 MB, '''start date/time:''' Nov 22, 2017 11:51:22.68&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_14.bag test_14.bag]: '''duration:''' 06:17s, '''size:''' 96.3 MB, '''start date/time:''' Nov 22, 2017 12:23:07.92&lt;br /&gt;
&lt;br /&gt;
====Scenario 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_01.bag test_01.bag]: '''duration:''' 15:17s, '''size:''' 234.6 MB, '''start date/time:''' Nov 21, 2017 18:56:54.87&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 17:37:34.08&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_1.bag test_03_1.bag]: '''duration:''' 44.7s, '''size:''' 11.7 MB, '''start date/time:''' Nov 21, 2017 13:16:51.65&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_2.bag test_03_2.bag]: '''duration:''' 39.9s, '''size:''' 10.5 MB, '''start date/time:''' Nov 21, 2017 19:19:06.27&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_3.bag test_03_3.bag]: '''duration:''' 41.2s, '''size:''' 10.8 MB, '''start date/time:''' Nov 21, 2017 19:18:06.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_1.bag test_04_1.bag]: '''duration:''' 01:01s, '''size:''' 16.0 MB, '''start date/time:''' Nov 21, 2017 19:21:30.99 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_2.bag test_04_2.bag]: '''duration:''' 01:01s, '''size:''' 15.9 MB, '''start date/time:''' Nov 21, 2017 19:25:38.00&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_3.bag test_04_3.bag]: '''duration:''' 58.0s, '''size:''' 15.1 MB, '''start date/time:''' Nov 21, 2017 19:27:16.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_4.bag test_04_4.bag]: '''duration:''' 01:00s, '''size:''' 15.8 MB, '''start date/time:''' Nov 21, 2017 19:29:39.28&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_05.bag test_05.bag]: '''duration:''' 15:11s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 19:05:20.19&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_06.bag test_06.bag]: '''duration:''' 09:03s, '''size:''' 133.6 MB, '''start date/time:''' Nov 26, 2017 13:24:25.64&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_07.bag test_07.bag]: '''duration:''' 04:16s, '''size:''' 65.1 MB, '''start date/time:''' Nov 27, 2017 17:34:50.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_08.bag test_08.bag]: '''duration:''' 03:47s, '''size:''' 58.3 MB, '''start date/time:''' Nov 27, 2017 17:41:48.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_1.bag test_09_1.bag]: '''duration:''' 31.3s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:23:58.31 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_2.bag test_09_2.bag]: '''duration:''' 30.4s, '''size:''' 8.0 MB, '''start date/time:''' Nov 28, 2017 17:24:41.09&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_3.bag test_09_3.bag]: '''duration:''' 31.2s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:25:22.22&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_4.bag test_09_4.bag]: '''duration:''' 31.4s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:26:06.98&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_5.bag test_09_5.bag]: '''duration:''' 33.2s, '''size:''' 8.8 MB, '''start date/time:''' Nov 28, 2017 17:26:51.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_10.bag test_10.bag]: '''duration:''' 10:00s, '''size:''' 153.7 MB, '''start date/time:''' Nov 22, 2017 18:15:29.53&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 59.1 MB, '''start date/time:''' Nov 22, 2017 18:25:54.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_1.bag test_12_1.bag]: '''duration:''' 41.3s, '''size:''' 10.8 MB, '''start date/time:''' Nov 22, 2017 18:35:11.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_2.bag test_12_2.bag]: '''duration:''' 41.8s, '''size:''' 10.9 MB, '''start date/time:''' Nov 22, 2017 18:36:12.11&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_3.bag test_12_3.bag]: '''duration:''' 39.8s, '''size:'''  10.5 MB, '''start date/time:''' Nov 22, 2017 18:37:05.06&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_1.bag test_13_1.bag]: '''duration:''' 01:01s, '''size:''' 14.9 MB, '''start date/time:''' Nov 22, 2017 18:40:13.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_2.bag test_13_2.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:41:28.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_3.bag test_13_3.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:42:39.52&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_14.bag test_14.bag]: '''duration:''' 06:12s, '''size:''' 95.1 MB, '''start date/time:''' Nov 23, 2017 12:53:06.48&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=File:PeopleTrackingFig1.jpg&amp;diff=5206</id>
		<title>File:PeopleTrackingFig1.jpg</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=File:PeopleTrackingFig1.jpg&amp;diff=5206"/>
				<updated>2018-02-13T11:31:33Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5205</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5205"/>
				<updated>2018-02-13T11:21:21Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset which can be used to evaluate the performance of different approaches for detecting and tracking people using LIDAR sensors. The information contained in the dataset is especially suitable for use as training data for neural-network-based classifiers.&lt;br /&gt;
&lt;br /&gt;
Further information at [https://www.frontiersin.org/articles/10.3389/fnbot.2017.00072/full Álvarez-Aparicio et al. (2018)].&lt;br /&gt;
&lt;br /&gt;
== Materials ==&lt;br /&gt;
&lt;br /&gt;
The following section describes the materials (shown in Figure 1) used to gather data, which include: a certified study area, an autonomous robot with an on-board LIDAR sensor, and a real-time location system (RTLS) to obtain ground-truth data about person location. Recorded data include location estimates calculated by two people trackers, LD and PeTra, also described below. Finally, the recording procedure used to build the dataset is explained.&lt;br /&gt;
&lt;br /&gt;
=== Leon@Home Testbed ===&lt;br /&gt;
&lt;br /&gt;
Data have been gathered at Leon@Home Testbed, a Certified Testbed of the European Robotics League (ERL). Its main purpose is to benchmark service robots in a realistic home environment. Our testbed is made up of four parts, shown in Figure 1B: a mock-up apartment, a control zone with direct vision (through a glass wall) into the apartment, a small workshop, and a larger development zone where researchers work.&lt;br /&gt;
&lt;br /&gt;
Leon@Home Testbed is located on the second floor of the Módulo de Investigación en Cibernética (Building for Research in Cybernetics) on the Vegazana Campus of the University of León (Spain). The apartment is a single bedroom mock-up home built in an 8 m × 7 m space. Figure 1C shows a plan of the apartment. 60 cm high walls divide it into a kitchen, living room, bathroom, and bedroom. The furniture (Figures 1E,F) has been chosen to test different robot abilities. For instance, the kitchen cabinets all have different types of handles.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One (Figure 1A) is an assistant robot manufactured by Robotnik. Its sensors include an RGBD camera, a LIDAR sensor, and an inertial unit. It can operate a manipulator arm attached to its torso and has a wheeled base for moving around the room. Orbi-One also includes a wireless access point, which allows WiFi communication with other robots and computers.&lt;br /&gt;
&lt;br /&gt;
The software that controls the robot hardware is based on the ROS framework. ROS is a set of robotics libraries that provide operating-system-like services: hardware abstraction for sensors and actuators, low-level device control, and inter-process communication. Computation takes place in processes named Nodes, which can receive and send Messages. Nodes publish Messages into information buffers called Topics.&lt;br /&gt;
&lt;br /&gt;
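The Node/Topic/Message pattern described above can be illustrated with a toy, self-contained sketch. Note this is NOT the real ROS API; the class and topic names here are made up for illustration.&lt;br /&gt;

```python
# Toy illustration of ROS-style publish/subscribe (not actual ROS code):
# "nodes" exchange messages through named topics.

class Topic:
    """A named message buffer that forwards each message to its subscribers."""
    def __init__(self, name):
        self.name = name
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, message):
        for callback in self.subscribers:
            callback(message)

received = []
scan_topic = Topic("/scan")
scan_topic.subscribe(received.append)        # a "node" listening on /scan
scan_topic.publish({"ranges": [1.2, 1.3]})   # another "node" publishing
```

In real ROS, the same pattern is distributed across processes and machines, with typed Messages instead of plain dictionaries.&lt;br /&gt;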
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
In order to acquire ground-truth data about person location in the study area, an RTLS for indoor environments is needed. The commercial KIO RTLS solution by Eliko has been used. KIO is a precise RTLS for tracking any object in 2- or 3-dimensional space. Its Ultra-Wideband technology makes it possible to micro-position objects through obstructions; KIO also works in non-line-of-sight conditions, both indoors and outdoors.&lt;br /&gt;
&lt;br /&gt;
KIO comes in two main configurations. The Regular Cell configuration guarantees a reliable accuracy of ±30 cm, according to the manufacturer’s specifications; the Small Cell configuration is designed for location-critical applications and provides a reliable ±5 cm accuracy. Calibration carried out by the authors in the mock-up apartment shows that the error is higher in some areas and lower in others but that, on average, the manufacturer’s claims hold.&lt;br /&gt;
&lt;br /&gt;
KIO calculates the position of a mobile transceiver, called a Tag. To do so, KIO uses radio beacons, called Anchors, distributed at known positions in the surroundings. Figure 1D shows a KIO anchor. KIO tags are the same size as the anchors and must be carried by the tracked subjects, in our case people. The red dots in Figure 1C show the location of the six anchors used in these experiments; they are placed on the ceiling. The distribution of the anchors was chosen following the method described in Guerrero-Higueras et al. (2017).&lt;br /&gt;
&lt;br /&gt;
=== Leg Detector (LD) ===&lt;br /&gt;
&lt;br /&gt;
LD is a ROS package that takes the messages published by a LIDAR sensor as input and uses a machine-learning-trained classifier to detect groups of laser readings as possible legs. The code is available in a public repository but is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
LD publishes the location for the individual legs. It can also attempt to pair the legs and publish their average as an estimate of where the center of a person is. LD may optionally also publish visualization marker messages to indicate where detections happened.&lt;br /&gt;
&lt;br /&gt;
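The leg-pairing idea can be sketched as follows. This is an illustrative greedy pairing with a made-up 0.8 m separation threshold, not LD's actual algorithm.&lt;br /&gt;

```python
# Illustrative sketch of pairing detected legs into person-center estimates:
# greedily pair legs closer than a separation threshold and report midpoints.

import math

def pair_legs(legs, max_separation=0.8):
    """Return person-center estimates: midpoints of nearby leg pairs."""
    centers = []
    used = set()
    for i, a in enumerate(legs):
        for j, b in enumerate(legs):
            if j > i and used.isdisjoint({i, j}):
                if max_separation >= math.dist(a, b):
                    used.update({i, j})
                    centers.append(((a[0] + b[0]) / 2, (a[1] + b[1]) / 2))
    return centers

legs = [(1.0, 0.2), (1.0, -0.2), (3.0, 0.5)]
people = pair_legs(legs)   # the lone third leg is left unpaired
```

A single unpaired leg (e.g. one leg occluded by furniture) yields no center estimate, which is one reason LD also publishes the individual leg locations.&lt;br /&gt;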
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a tool for detecting and tracking people, developed by the Robotics Group at the University of León. The system is based on a Convolutional Neural Network (CNN) with a configuration derived from the U-Net architecture of Ronneberger et al. (2015).&lt;br /&gt;
&lt;br /&gt;
The system performs the following steps in real time:&lt;br /&gt;
&lt;br /&gt;
1. First, the data provided by the LIDAR sensor are processed to build a two-dimensional occupancy map centered on the robot. This occupancy map is represented as a binary matrix, where 1s denote positions where the LIDAR scan found an obstacle, and 0s denote positions that the scan either passed through without detecting an obstacle or did not reach.&lt;br /&gt;
&lt;br /&gt;
2. Then, the occupancy map is relayed to the network as input data. The network produces a second occupancy map representing the zones where legs have been detected.&lt;br /&gt;
&lt;br /&gt;
3. Finally, center of mass calculations return the location of persons. PeTra also publishes locations for the individual legs and Marker messages for visualization.&lt;br /&gt;
&lt;br /&gt;
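Steps 1 and 3 above can be sketched as follows. The grid size, resolution, and obstacle points are hypothetical; PeTra's real map parameters are not given here, and step 2 (the CNN) is omitted.&lt;br /&gt;

```python
# Sketch of the pre/post-processing around the network: rasterize 2-D points
# around the robot into a binary occupancy matrix, then recover a location
# as the center of mass of the occupied cells.

def occupancy_map(points, size=8, resolution=0.5):
    """Binary matrix: 1 where the LIDAR saw an obstacle, 0 elsewhere."""
    grid = [[0] * size for _ in range(size)]
    half = size * resolution / 2.0          # map is centered on the robot
    for x, y in points:
        row = int((y + half) / resolution)
        col = int((x + half) / resolution)
        if row in range(size) and col in range(size):
            grid[row][col] = 1
    return grid

def center_of_mass(grid, size=8, resolution=0.5):
    """Mean position of the occupied cells, in metres."""
    half = size * resolution / 2.0
    cells = [(r, c) for r, row in enumerate(grid)
             for c, v in enumerate(row) if v]
    x = sum((c + 0.5) * resolution - half for _, c in cells) / len(cells)
    y = sum((r + 0.5) * resolution - half for r, _ in cells) / len(cells)
    return (x, y)

# Two leg-like obstacle points around x = 1 m, y = 0.9 m:
grid = occupancy_map([(0.9, 0.9), (1.4, 0.9)])
```

In PeTra the center of mass is computed over the network's output map (the zones classified as legs) rather than over the raw input map as in this sketch.&lt;br /&gt;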
=== Recording procedure ===&lt;br /&gt;
&lt;br /&gt;
The data were gathered in 14 different situations. In all of them, Orbi-One stood still as one or more people, carrying a KIO tag, moved around it. Three different locations for Orbi-One were defined (see Figure 1C) resulting in 42 scenarios (14 situations × 3 Orbi-One locations). Figure 2 shows the 14 different recognition scenarios recorded. These scenarios have been chosen according to different situations that may occur in robotics competitions such as ERL or RoboCup.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': recognition scenarios recorded.]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
A rosbag file was created for each scenario (except for situations 3, 12, and 13, where three rosbag files were recorded; situation 4, where four were recorded; and situation 9, where five were recorded), recording LIDAR sensor measurements, location estimates from PeTra and LD, locations from KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* LIDAR sensor data. Data from LIDAR sensors are provided as ROS LaserScan Messages, which include, among other information, the following: acquisition time of the first ray in the scan, start/end angle of the scan, angular distance between measurements, and range data.&lt;br /&gt;
* PeTra location estimates provided as ROS PointStamped Messages, which include a position [x, y, z] and a timestamp.&lt;br /&gt;
* Location estimates calculated by LD. It publishes data for individual legs (as ROS PositionMeasurementArray Messages). It also attempts to pair the legs and publishes their average as an estimate of where the center of a person is as a ROS PositionMeasurement Message.&lt;br /&gt;
* Locations provided by KIO RTLS also provided as ROS PointStamped Messages.&lt;br /&gt;
* Messages from /map, /odom, and /tf ROS topics, which include map information, odometry of the robot base, and transformation information, respectively.&lt;br /&gt;
&lt;br /&gt;
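As an illustration of how the recorded LaserScan fields are typically consumed, the sketch below converts polar readings to Cartesian points in the sensor frame. The angles and ranges are made-up values, not data from the bags.&lt;br /&gt;

```python
# Convert LaserScan-style fields (angle_min, angle_increment, ranges) into
# Cartesian (x, y) points in the sensor frame, skipping invalid readings.

import math

def scan_to_points(angle_min, angle_increment, ranges):
    points = []
    for i, r in enumerate(ranges):
        if math.isfinite(r):                 # drop inf/nan "no return" rays
            a = angle_min + i * angle_increment
            points.append((r * math.cos(a), r * math.sin(a)))
    return points

# A 3-ray scan sweeping -90, 0, and +90 degrees, all at 1 m range:
pts = scan_to_points(-math.pi / 2, math.pi / 2, [1.0, 1.0, 1.0])
```

The same conversion, applied per message, is the usual first step before feeding the bags to a tracker or to a training pipeline.&lt;br /&gt;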
The different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Nov-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
====Scenario 1. Kitchen====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_04_4.bag test_04_4.bag]: '''duration:''' 1:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:'''  9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario01/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
====Scenario 2. Living room====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_01.bag test_01.bag]: '''duration:''' 15:37s, '''size:''' 239.7 MB, '''start date/time:''' Nov 16, 2017 12:20:19.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_02.bag test_02.bag]: '''duration:''' 15:05s, '''size:''' 231.4 MB, '''start date/time:''' Nov 27, 2017 18:32:24.35 &lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_1.bag test_03_1.bag]: '''duration:''' 55.2s, '''size:''' 14.5 MB, '''start date/time:''' Nov 16, 2017 12:43:58.29 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_2.bag test_03_2.bag]: '''duration:''' 44.4s, '''size:''' 10.2 MB, '''start date/time:''' Nov 16, 2017 12:46:31.99&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_03_3.bag test_03_3.bag]: '''duration:''' 51.1s, '''size:''' 13.4 MB, '''start date/time:''' Nov 16, 2017 12:47:29.66&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_1.bag test_04_1.bag]: '''duration:''' 01:23s, '''size:''' 21.6 MB, '''start date/time:''' Nov 16, 2017 1:50:22.18 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_2.bag test_04_2.bag]: '''duration:''' 01:20s, '''size:''' 21.0 MB, '''start date/time:''' Nov 16, 2017 12:52:18.44&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_3.bag test_04_3.bag]: '''duration:''' 01:09s, '''size:''' 18.1 MB, '''start date/time:''' Nov 16, 2017 12:53:51.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_04_4.bag test_04_4.bag]: '''duration:''' 1:21s, '''size:''' 21.3 MB, '''start date/time:''' Nov 16, 2017 12:55:31.81&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_05.bag test_05.bag]: '''duration:''' 15:07s, '''size:''' 232.4 MB, '''start date/time:''' Nov 28 2017 18:33:16.24&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_06.bag test_06.bag]: '''duration:''' 09:14s, '''size:''' 136.2 MB, '''start date/time:''' Nov 29 2017 13:12:42.61 &lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_07.bag test_07.bag]: '''duration:''' 4:39s, '''size:''' 71.5 MB, '''start date/time:''' Nov 22 2017 12:48:33.86 &lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_08.bag test_08.bag]: '''duration:''' 03:04s, '''size:''' 47.2 MB, '''start date/time:''' Nov 22 2017 12:59:08.58&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_1.bag test_09_1.bag]: '''duration:''' 28.8s, '''size:''' 7.7 MB, '''start date/time:''' Nov 28, 2017 17:17:46.49 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_2.bag test_09_2.bag]: '''duration:''' 31.9s, '''size:''' 8.4 MB, '''start date/time:''' Nov 28, 2017 17:20:48.26&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Nov 28, 2017 17:20:05.96&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_4.bag test_09_4.bag]: '''duration:''' 29.9s, '''size:''' 7.9 MB, '''start date/time:''' Nov 28, 2017 17:19:26.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_09_5.bag test_09_5.bag]: '''duration:''' 30.8s, '''size:''' 8.1 MB, '''start date/time:''' Nov 28, 2017 17:21:20.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_10.bag test_10.bag]: '''duration:''' 16:05s, '''size:''' 246.6 MB, '''start date/time:''' Nov 22 2017 11:55:02.39&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_11.bag test_11.bag]: '''duration:''' 03:25s, '''size:''' 52.8 MB, '''start date/time:''' Nov 22 2017 13:08:11.99&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_1.bag test_12_1.bag]: '''duration:''' 01:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:55:26.25&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_2.bag test_12_2.bag]: '''duration:''' 1:06s, '''size:''' 17.4 MB, '''start date/time:''' Nov 16, 2017 12:56:52.47&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_12_3.bag test_12_3.bag]: '''duration:''' 50.2s, '''size:'''  13.1 MB, '''start date/time:''' Nov 16, 2017 11:44:00.10&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_1.bag test_13_1.bag]: '''duration:''' 58.2s, '''size:''' 15.1 MB, '''start date/time:''' Nov 22, 2017 11:48:35.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_2.bag test_13_2.bag]: '''duration:''' 01:05s, '''size:''' 16.9 MB, '''start date/time:''' Nov 22, 2017 11:49:48.18&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_13_3.bag test_13_3.bag]: '''duration:''' 01:14s, '''size:''' 19.2 MB, '''start date/time:''' Nov 22, 2017 11:51:22.68&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario02/test_14.bag test_14.bag]: '''duration:''' 06:17s, '''size:''' 96.3 MB, '''start date/time:''' Nov 22 2017 12:23:07.92&lt;br /&gt;
&lt;br /&gt;
====Scenario 3. Bedroom====&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_01.bag test_01.bag]: '''duration:''' 15:17s, '''size:''' 234.6 MB, '''start date/time:''' Nov 21, 2017 18:56:54.87&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 17:37:34.08&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_1.bag test_03_1.bag]: '''duration:''' 44.7s, '''size:''' 11.7 MB, '''start date/time:''' Nov 21, 2017 13:16:51.65&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_2.bag test_03_2.bag]: '''duration:''' 39.9s, '''size:''' 10.5 MB, '''start date/time:''' Nov 21, 2017 19:19:06.27&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_03_3.bag test_03_3.bag]: '''duration:''' 41.2s, '''size:''' 10.8 MB, '''start date/time:''' Nov 21, 2017 19:18:06.86&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_1.bag test_04_1.bag]: '''duration:''' 01:01s, '''size:''' 16.0 MB, '''start date/time:''' Nov 21, 2017 19:21:30.99 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_2.bag test_04_2.bag]: '''duration:''' 01:01s, '''size:''' 15.9 MB, '''start date/time:''' Nov 21, 2017 19:25:38.00&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_3.bag test_04_3.bag]: '''duration:''' 58.0s, '''size:''' 15.1 MB, '''start date/time:''' Nov 21, 2017 19:27:16.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_04_4.bag test_04_4.bag]: '''duration:''' 1:00s, '''size:''' 15.8 MB, '''start date/time:''' Nov 21, 2017 19:29:39.28&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_05.bag test_05.bag]: '''duration:''' 15:11s, '''size:''' 233.0 MB, '''start date/time:''' Nov 28, 2017 19:05:20.19&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_06.bag test_06.bag]: '''duration:''' 09:03s, '''size:''' 133.6 MB, '''start date/time:''' Nov 26, 2017 13:24:25.64&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Nov 27, 2017 17:34:50.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_08.bag test_08.bag]: '''duration:''' 03:47s, '''size:''' 58.3 MB, '''start date/time:''' Nov 27, 2017 17:41:48.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_1.bag test_09_1.bag]: '''duration:''' 31.3s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:23:58.31 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_2.bag test_09_2.bag]: '''duration:''' 30.4s, '''size:''' 8.0 MB, '''start date/time:''' Nov 28, 2017 17:24:41.09&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_3.bag test_09_3.bag]: '''duration:''' 31.2s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:25:22.22&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_4.bag test_09_4.bag]: '''duration:''' 31.4s, '''size:''' 8.3 MB, '''start date/time:''' Nov 28, 2017 17:26:06.98&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_09_5.bag test_09_5.bag]: '''duration:''' 33.2s, '''size:''' 8.8 MB, '''start date/time:''' Nov 28, 2017 17:26:51.32&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_10.bag test_10.bag]: '''duration:''' 10:00s, '''size:''' 153.7 MB, '''start date/time:''' Nov 22, 2017 18:15:29.53&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 59.1 MB, '''start date/time:''' Nov 22, 2017 18:25:54.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_1.bag test_12_1.bag]: '''duration:''' 41.3s, '''size:''' 10.8 MB, '''start date/time:''' Nov 22, 2017 18:35:11.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_2.bag test_12_2.bag]: '''duration:''' 41.8s, '''size:''' 10.9 MB, '''start date/time:''' Nov 22, 2017 18:36:12.11&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_12_3.bag test_12_3.bag]: '''duration:''' 39.8s, '''size:'''  10.5 MB, '''start date/time:''' Nov 22, 2017 18:37:05.06&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_1.bag test_13_1.bag]: '''duration:''' 01:01s, '''size:''' 14.9 MB, '''start date/time:''' Nov 22, 2017 18:40:13.36&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_2.bag test_13_2.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:41:28.33&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_13_3.bag test_13_3.bag]: '''duration:''' 01:00s, '''size:''' 15.7 MB, '''start date/time:''' Nov 22, 2017 18:42:39.52&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/scenario03/test_14.bag test_14.bag]: '''duration:''' 06:12s, '''size:''' 95.1 MB, '''start date/time:''' Nov 23, 2017 12:53:06.48&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_analysis_of_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5156</id>
		<title>Benchmark dataset for analysis of cyber-attacks to an indoor real time localization system for autonomous robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_analysis_of_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5156"/>
				<updated>2017-10-16T08:55:47Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''Real time localization system for autonomous robots Benchmark Dataset, RRID:SCR_015756.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset which can be used to analyze cyber-attacks on an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment, shown in Fig. 2, located at the Robotics Lab of the University of León (Spain). An autonomous robot called Karen, shown in Fig. 1, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
Further information can be found at: ''Guerrero-Higueras, Á. M., DeCastro-García, N., Rodríguez-Lera, F. J., &amp;amp; Matellán, V. (2017). '''Empirical analysis of cyber-attacks to an indoor real time localization system for autonomous robots'''. Computers &amp;amp; Security, 70, 422-435.''&lt;br /&gt;
&lt;br /&gt;
== Materials ==&lt;br /&gt;
&lt;br /&gt;
Data gathered by the Karen robot include: &lt;br /&gt;
* Karen location estimates provided by a commercial RTLS, called '''KIO'''.&lt;br /&gt;
&lt;br /&gt;
Additional information about Karen and the devices/packages used to get data is given below.&lt;br /&gt;
&lt;br /&gt;
=== Karen robot ===&lt;br /&gt;
&lt;br /&gt;
Fig. 1 shows the autonomous robot Karen, used in the experiments. Karen is a mobile manipulator built by the Robotics Group of the Universidad de León (Spain). It has two arms, each with 7 degrees of freedom and 3 fingers, and a mobile base. The control of the robot is based on the Robot Operating System (ROS) framework.&lt;br /&gt;
&lt;br /&gt;
[[File:KarenWithKIO.jpg|thumb|'''Fig. 1''': Karen and KIO RTLS: a beacon (1), and tag on the robot (2).]]&lt;br /&gt;
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
The KIO RTLS commercial solution by [https://www.eliko.ee/products/kio-rtls/ Eliko] has been used to provide location estimates in the study area. Fig. 1 shows a KIO beacon (1) and a KIO tag on the robot (2).&lt;br /&gt;
&lt;br /&gt;
== Recording procedure ==&lt;br /&gt;
&lt;br /&gt;
Three different distributions of the KIO beacons were defined. Fig. 2 presents the experimental area map and shows the location of the beacons in each case: red dots mark the placement of the six beacons for Distribution #1, blue dots for Distribution #2, and green dots for Distribution #3. Twelve checkpoints were defined in the experimental area; Karen was placed at each of them and the location estimates gathered by the tag were recorded for later analysis. Checkpoint locations are shown as black rounded numbered points in Fig. 2. Nine checkpoints lie inside the mock-up apartment, providing location estimates in all rooms; the other three are in the corridor, providing location estimates outside the mock-up apartment.&lt;br /&gt;
&lt;br /&gt;
[[File:apartamento-leon.jpg|thumb|'''Fig. 2''': Robotics mobile lab plane. Red dots indicate beacons for Distribution #1, blue dots for Distribution #2 and green dots for Distribution #3. Black rounded numbered dots indicate checkpoints.]]&lt;br /&gt;
&lt;br /&gt;
Cyber-attacks were performed on the beacons of the KIO RTLS used by the Karen robot to estimate its location. Two types of attacks were performed. The first one (A1i) simulated three DoS attacks by interfering with the signal emitted by beacons 411A, 501C, and 408A, thus making it impossible for the tag to obtain distances from them. The DoS attack on beacon 411A is labeled A1a, the one on 501C, A1b, and the one on 408A, A1c. The second type of attack (A2j) comprised four spoofing attacks, altering the signal emitted by beacons 411A and 501C by introducing either a fixed or a variable error, causing the tag to calculate a wrong location. Spoofing attacks on beacon 411A with a fixed and a variable error are labeled A2a and A2b respectively; those on 501C with a fixed and a variable error are labeled A2c and A2d respectively.&lt;br /&gt;
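The attack labels above are also encoded in the file names listed in the Data section ("without" marks a DoS attack on a beacon, "moving" a spoofing attack, and the "Randomly" suffix a variable rather than fixed error; note that Distribution 2 file names spell the suffix "Ramdomly"). A minimal sketch of how a bag file name could be mapped back to its checkpoint and attack label, assuming this naming convention (the helper attack_label is hypothetical, not part of the dataset):&lt;br /&gt;

```python
import re

# Label tables derived from the attack description above.
DOS = {"411a": "A1a", "501c": "A1b", "408a": "A1c"}          # DoS attacks (A1i)
SPOOF_FIXED = {"411a": "A2a", "501c": "A2c"}                 # fixed-error spoofing
SPOOF_VARIABLE = {"411a": "A2b", "501c": "A2d"}              # variable-error spoofing

def attack_label(bag_name):
    """Return (checkpoint, label) for a bag file name; label None = baseline."""
    m = re.match(
        r"kio-point-(\d+)"
        r"(?:-(without|moving)(\w{4})(-Ra[nm]domly)?)?"  # dist-2 spells "Ramdomly"
        r"_[\d-]+\.bag$",
        bag_name,
    )
    if not m:
        raise ValueError("unrecognised file name: " + bag_name)
    point, kind, beacon, rand = m.groups()
    if kind is None:
        return int(point), None                  # baseline recording, no attack
    beacon = beacon.lower()                      # some names use "411A"
    if kind == "without":
        return int(point), DOS[beacon]           # DoS: beacon signal jammed
    table = SPOOF_VARIABLE if rand else SPOOF_FIXED
    return int(point), table[beacon]             # spoofing: fixed or variable error
```

&lt;br /&gt;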
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
Different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 ===&lt;br /&gt;
&lt;br /&gt;
==== Distribution 1 (96 files) ====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-01-moving411a_2016-07-14-10-55-04.bag            kio-point-01-moving411a_2016-07-14-10-55-04.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-01-moving411a-Randomly_2016-07-14-10-54-11.bag   kio-point-01-moving411a-Randomly_2016-07-14-10-54-11.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-01-moving501c-Randomly_2016-07-14-10-53-13.bag   kio-point-01-moving501c-Randomly_2016-07-14-10-53-13.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-01-moving501c_2016-07-14-10-51-35.bag            kio-point-01-moving501c_2016-07-14-10-51-35.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-01-without408a_2016-07-14-10-49-27.bag           kio-point-01-without408a_2016-07-14-10-49-27.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-01-without411a_2016-07-14-10-47-06.bag           kio-point-01-without411a_2016-07-14-10-47-06.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-01-without501c_2016-07-14-10-48-24.bag           kio-point-01-without501c_2016-07-14-10-48-24.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-01_2016-07-14-10-44-29.bag                       kio-point-01_2016-07-14-10-44-29.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-02-moving411a-Randomly_2016-07-14-10-57-46.bag   kio-point-02-moving411a-Randomly_2016-07-14-10-57-46.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-02-moving411a_2016-07-14-10-56-38.bag            kio-point-02-moving411a_2016-07-14-10-56-38.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-02-moving501c-Randomly_2016-07-14-11-01-47.bag   kio-point-02-moving501c-Randomly_2016-07-14-11-01-47.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-02-moving501c_2016-07-14-11-02-40.bag            kio-point-02-moving501c_2016-07-14-11-02-40.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-02-without408a_2016-07-14-11-12-24.bag           kio-point-02-without408a_2016-07-14-11-12-24.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-02-without411a_2016-07-14-10-59-11.bag           kio-point-02-without411a_2016-07-14-10-59-11.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-02-without501c_2016-07-14-11-00-28.bag           kio-point-02-without501c_2016-07-14-11-00-28.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-02_2016-07-14-11-35-29.bag                       kio-point-02_2016-07-14-11-35-29.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-03-moving411a-Randomly_2016-07-14-11-06-22.bag   kio-point-03-moving411a-Randomly_2016-07-14-11-06-22.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-03-moving411a_2016-07-14-11-07-16.bag            kio-point-03-moving411a_2016-07-14-11-07-16.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-03-moving501c-Randomly_2016-07-14-11-05-01.bag   kio-point-03-moving501c-Randomly_2016-07-14-11-05-01.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-03-moving501c_2016-07-14-11-04-02.bag            kio-point-03-moving501c_2016-07-14-11-04-02.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-03-without408a_2016-07-14-11-11-06.bag           kio-point-03-without408a_2016-07-14-11-11-06.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-03-without411a_2016-07-14-11-08-26.bag           kio-point-03-without411a_2016-07-14-11-08-26.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-03-without501c_2016-07-14-11-09-43.bag           kio-point-03-without501c_2016-07-14-11-09-43.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-03_2016-07-14-11-34-42.bag                       kio-point-03_2016-07-14-11-34-42.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-04-moving411a-Randomly_2016-07-14-11-21-07.bag   kio-point-04-moving411a-Randomly_2016-07-14-11-21-07.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-04-moving411a_2016-07-14-11-21-58.bag            kio-point-04-moving411a_2016-07-14-11-21-58.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-04-moving501c-Randomly_2016-07-14-11-20-03.bag   kio-point-04-moving501c-Randomly_2016-07-14-11-20-03.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-04-moving501c_2016-07-14-11-18-59.bag            kio-point-04-moving501c_2016-07-14-11-18-59.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-04-without408a_2016-07-14-11-14-10.bag           kio-point-04-without408a_2016-07-14-11-14-10.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-04-without411a_2016-07-14-11-16-19.bag           kio-point-04-without411a_2016-07-14-11-16-19.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-04-without501c_2016-07-14-11-17-37.bag           kio-point-04-without501c_2016-07-14-11-17-37.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-04_2016-07-14-11-33-58.bag                       kio-point-04_2016-07-14-11-33-58.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-05-moving411a-Randomly_2016-07-14-11-25-33.bag   kio-point-05-moving411a-Randomly_2016-07-14-11-25-33.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-05-moving411a_2016-07-14-11-24-33.bag            kio-point-05-moving411a_2016-07-14-11-24-33.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-05-moving501c-Randomly_2016-07-14-11-26-34.bag   kio-point-05-moving501c-Randomly_2016-07-14-11-26-34.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-05-moving501c_2016-07-14-11-27-26.bag            kio-point-05-moving501c_2016-07-14-11-27-26.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-05-without408a_2016-07-14-11-31-19.bag           kio-point-05-without408a_2016-07-14-11-31-19.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-05-without411a_2016-07-14-11-29-41.bag           kio-point-05-without411a_2016-07-14-11-29-41.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-05-without501c_2016-07-14-11-28-32.bag           kio-point-05-without501c_2016-07-14-11-28-32.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-05_2016-07-14-11-33-11.bag                       kio-point-05_2016-07-14-11-33-11.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-06-moving411a-Randomly_2016-07-14-11-44-06.bag   kio-point-06-moving411a-Randomly_2016-07-14-11-44-06.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-06-moving411a_2016-07-14-11-44-59.bag            kio-point-06-moving411a_2016-07-14-11-44-59.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-06-moving501c-Randomly_2016-07-14-11-43-01.bag   kio-point-06-moving501c-Randomly_2016-07-14-11-43-01.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-06-moving501c_2016-07-14-11-42-04.bag            kio-point-06-moving501c_2016-07-14-11-42-04.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-06-without408a_2016-07-14-11-39-19.bag           kio-point-06-without408a_2016-07-14-11-39-19.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-06-without411a_2016-07-14-11-37-58.bag           kio-point-06-without411a_2016-07-14-11-37-58.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-06-without501c_2016-07-14-11-40-49.bag           kio-point-06-without501c_2016-07-14-11-40-49.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-06_2016-07-14-11-36-58.bag                       kio-point-06_2016-07-14-11-36-58.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-07-moving411a-Randomly_2016-07-15-10-27-02.bag   kio-point-07-moving411a-Randomly_2016-07-15-10-27-02.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-07-moving411a_2016-07-15-10-24-57.bag            kio-point-07-moving411a_2016-07-15-10-24-57.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-07-moving501c-Randomly_2016-07-15-10-28-10.bag   kio-point-07-moving501c-Randomly_2016-07-15-10-28-10.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-07-moving501c_2016-07-15-10-29-01.bag            kio-point-07-moving501c_2016-07-15-10-29-01.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-07-without408a_2016-07-15-10-22-45.bag           kio-point-07-without408a_2016-07-15-10-22-45.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-07-without411A_2016-07-15-10-17-59.bag           kio-point-07-without411A_2016-07-15-10-17-59.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-07-without501c_2016-07-15-10-20-27.bag           kio-point-07-without501c_2016-07-15-10-20-27.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-07_2016-07-15-10-16-24.bag                       kio-point-07_2016-07-15-10-16-24.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-08-moving411a-Randomly_2016-07-15-10-52-00.bag   kio-point-08-moving411a-Randomly_2016-07-15-10-52-00.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-08-moving411a_2016-07-15-10-51-18.bag            kio-point-08-moving411a_2016-07-15-10-51-18.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-08-moving501c-Randomly_2016-07-15-10-43-31.bag   kio-point-08-moving501c-Randomly_2016-07-15-10-43-31.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-08-moving501c_2016-07-15-10-44-19.bag            kio-point-08-moving501c_2016-07-15-10-44-19.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-08-without408a_2016-07-15-10-49-32.bag           kio-point-08-without408a_2016-07-15-10-49-32.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-08-without411a_2016-07-15-10-48-21.bag           kio-point-08-without411a_2016-07-15-10-48-21.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-08-without501c_2016-07-15-10-46-32.bag           kio-point-08-without501c_2016-07-15-10-46-32.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-08_2016-07-15-10-41-05.bag                       kio-point-08_2016-07-15-10-41-05.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-09-moving411a-Randomly_2016-07-15-10-32-43.bag   kio-point-09-moving411a-Randomly_2016-07-15-10-32-43.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-09-moving411a_2016-07-15-10-33-32.bag            kio-point-09-moving411a_2016-07-15-10-33-32.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-09-moving501c-Randomly_2016-07-15-10-31-50.bag   kio-point-09-moving501c-Randomly_2016-07-15-10-31-50.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-09-moving501c_2016-07-15-10-30-26.bag            kio-point-09-moving501c_2016-07-15-10-30-26.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-09-without408a_2016-07-15-10-38-23.bag           kio-point-09-without408a_2016-07-15-10-38-23.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-09-without411a_2016-07-15-10-35-35.bag           kio-point-09-without411a_2016-07-15-10-35-35.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-09-without501c_2016-07-15-10-36-50.bag           kio-point-09-without501c_2016-07-15-10-36-50.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-09_2016-07-15-10-39-52.bag                       kio-point-09_2016-07-15-10-39-52.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-10-moving411a-Randomly_2016-07-15-12-26-12.bag   kio-point-10-moving411a-Randomly_2016-07-15-12-26-12.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-10-moving411a_2016-07-15-12-24-51.bag            kio-point-10-moving411a_2016-07-15-12-24-51.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-10-moving501c-Randomly_2016-07-15-12-28-02.bag   kio-point-10-moving501c-Randomly_2016-07-15-12-28-02.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-10-moving501c_2016-07-15-12-23-10.bag            kio-point-10-moving501c_2016-07-15-12-23-10.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-10-without408a_2016-07-15-12-21-13.bag           kio-point-10-without408a_2016-07-15-12-21-13.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-10-without411a_2016-07-15-12-19-02.bag           kio-point-10-without411a_2016-07-15-12-19-02.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-10-without501c_2016-07-15-12-20-12.bag           kio-point-10-without501c_2016-07-15-12-20-12.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-10_2016-07-15-12-22-26.bag                       kio-point-10_2016-07-15-12-22-26.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-11-moving411a-Randomly_2016-07-15-13-05-05.bag   kio-point-11-moving411a-Randomly_2016-07-15-13-05-05.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-11-moving411a_2016-07-15-13-04-04.bag            kio-point-11-moving411a_2016-07-15-13-04-04.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-11-moving501c-Randomly_2016-07-15-13-06-01.bag   kio-point-11-moving501c-Randomly_2016-07-15-13-06-01.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-11-moving501c_2016-07-15-13-06-54.bag            kio-point-11-moving501c_2016-07-15-13-06-54.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-11-without408a_2016-07-15-13-00-25.bag           kio-point-11-without408a_2016-07-15-13-00-25.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-11-without411a_2016-07-15-13-01-37.bag           kio-point-11-without411a_2016-07-15-13-01-37.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-11-without501c_2016-07-15-13-02-47.bag           kio-point-11-without501c_2016-07-15-13-02-47.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-11_2016-07-15-12-59-31.bag                       kio-point-11_2016-07-15-12-59-31.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-12-moving411a-Randomly_2016-07-15-11-44-32.bag   kio-point-12-moving411a-Randomly_2016-07-15-11-44-32.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-12-moving411a_2016-07-15-11-43-14.bag            kio-point-12-moving411a_2016-07-15-11-43-14.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-12-moving501c-Randomly_2016-07-15-11-41-26.bag   kio-point-12-moving501c-Randomly_2016-07-15-11-41-26.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-12-moving501c_2016-07-15-11-42-20.bag            kio-point-12-moving501c_2016-07-15-11-42-20.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-12-without408a_2016-07-15-11-47-52.bag           kio-point-12-without408a_2016-07-15-11-47-52.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-12-without411a_2016-07-15-11-49-04.bag           kio-point-12-without411a_2016-07-15-11-49-04.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-12-without501c_2016-07-15-11-46-48.bag           kio-point-12-without501c_2016-07-15-11-46-48.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-12_2016-07-15-11-45-52.bag                       kio-point-12_2016-07-15-11-45-52.bag                    ]&lt;br /&gt;
&lt;br /&gt;
==== Distribution 2 (96 files) ====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-01-moving411a-Ramdomly_2016-09-09-13-04-48.bag   kio-point-01-moving411a-Ramdomly_2016-09-09-13-04-48.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-01-moving411a_2016-09-09-13-03-44.bag            kio-point-01-moving411a_2016-09-09-13-03-44.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-01-moving501c-Ramdomly_2016-09-09-13-00-15.bag   kio-point-01-moving501c-Ramdomly_2016-09-09-13-00-15.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-01-moving501c_2016-09-09-12-59-21.bag            kio-point-01-moving501c_2016-09-09-12-59-21.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-01-without408a_2016-09-09-12-53-22.bag           kio-point-01-without408a_2016-09-09-12-53-22.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-01-without411a_2016-09-09-12-49-51.bag           kio-point-01-without411a_2016-09-09-12-49-51.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-01-without501c_2016-09-09-12-52-24.bag           kio-point-01-without501c_2016-09-09-12-52-24.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-01_2016-09-09-12-48-25.bag                       kio-point-01_2016-09-09-12-48-25.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-02-moving411a-Ramdomly_2016-09-09-13-17-28.bag   kio-point-02-moving411a-Ramdomly_2016-09-09-13-17-28.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-02-moving411a_2016-09-09-13-16-42.bag            kio-point-02-moving411a_2016-09-09-13-16-42.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-02-moving501c-Ramdomly_2016-09-09-13-19-56.bag   kio-point-02-moving501c-Ramdomly_2016-09-09-13-19-56.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-02-moving501c_2016-09-09-13-19-15.bag            kio-point-02-moving501c_2016-09-09-13-19-15.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-02-without408a_2016-09-09-13-11-32.bag           kio-point-02-without408a_2016-09-09-13-11-32.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-02-without411a_2016-09-09-13-13-11.bag           kio-point-02-without411a_2016-09-09-13-13-11.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-02-without501c_2016-09-09-13-14-30.bag           kio-point-02-without501c_2016-09-09-13-14-30.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-02_2016-09-09-13-09-51.bag                       kio-point-02_2016-09-09-13-09-51.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-03-moving411a-Ramdomly_2016-09-09-13-31-12.bag   kio-point-03-moving411a-Ramdomly_2016-09-09-13-31-12.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-03-moving411a_2016-09-09-13-30-26.bag            kio-point-03-moving411a_2016-09-09-13-30-26.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-03-moving501c-Ramdomly_2016-09-09-13-34-14.bag   kio-point-03-moving501c-Ramdomly_2016-09-09-13-34-14.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-03-moving501c_2016-09-09-13-33-34.bag            kio-point-03-moving501c_2016-09-09-13-33-34.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-03-without408a_2016-09-09-13-25-57.bag           kio-point-03-without408a_2016-09-09-13-25-57.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-03-without411a_2016-09-09-13-27-42.bag           kio-point-03-without411a_2016-09-09-13-27-42.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-03-without501c_2016-09-09-13-28-57.bag           kio-point-03-without501c_2016-09-09-13-28-57.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-03_2016-09-09-13-23-01.bag                       kio-point-03_2016-09-09-13-23-01.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-04-moving411a-Ramdomly_2016-09-09-13-54-38.bag   kio-point-04-moving411a-Ramdomly_2016-09-09-13-54-38.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-04-moving411a_2016-09-09-13-53-58.bag            kio-point-04-moving411a_2016-09-09-13-53-58.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-04-moving501c-Ramdomly_2016-09-09-13-57-17.bag   kio-point-04-moving501c-Ramdomly_2016-09-09-13-57-17.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-04-moving501c_2016-09-09-13-56-27.bag            kio-point-04-moving501c_2016-09-09-13-56-27.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-04-without408a_2016-09-09-13-43-24.bag           kio-point-04-without408a_2016-09-09-13-43-24.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-04-without411a_2016-09-09-13-46-23.bag           kio-point-04-without411a_2016-09-09-13-46-23.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-04-without501c_2016-09-09-13-47-56.bag           kio-point-04-without501c_2016-09-09-13-47-56.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-04_2016-09-09-13-36-56.bag                       kio-point-04_2016-09-09-13-36-56.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-05-moving411a-Ramdomly_2016-09-12-12-05-51.bag   kio-point-05-moving411a-Ramdomly_2016-09-12-12-05-51.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-05-moving411a_2016-09-12-12-05-17.bag            kio-point-05-moving411a_2016-09-12-12-05-17.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-05-moving501c-Ramdomly_2016-09-12-12-08-29.bag   kio-point-05-moving501c-Ramdomly_2016-09-12-12-08-29.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-05-moving501c_2016-09-12-12-07-30.bag            kio-point-05-moving501c_2016-09-12-12-07-30.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-05-without408a_2016-09-12-11-56-20.bag           kio-point-05-without408a_2016-09-12-11-56-20.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-05-without411a_2016-09-12-11-58-20.bag           kio-point-05-without411a_2016-09-12-11-58-20.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-05-without501c_2016-09-12-11-59-38.bag           kio-point-05-without501c_2016-09-12-11-59-38.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-05_2016-09-12-11-54-32.bag                       kio-point-05_2016-09-12-11-54-32.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-06-moving411a-Ramdomly_2016-09-12-12-18-40.bag   kio-point-06-moving411a-Ramdomly_2016-09-12-12-18-40.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-06-moving411a_2016-09-12-12-17-56.bag            kio-point-06-moving411a_2016-09-12-12-17-56.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-06-moving501c-Ramdomly_2016-09-12-12-20-43.bag   kio-point-06-moving501c-Ramdomly_2016-09-12-12-20-43.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-06-moving501c_2016-09-12-12-19-57.bag            kio-point-06-moving501c_2016-09-12-12-19-57.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-06-without408a_2016-09-12-12-13-01.bag           kio-point-06-without408a_2016-09-12-12-13-01.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-06-without411a_2016-09-12-12-14-14.bag           kio-point-06-without411a_2016-09-12-12-14-14.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-06-without501c_2016-09-12-12-15-34.bag           kio-point-06-without501c_2016-09-12-12-15-34.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-06_2016-09-12-12-11-14.bag                       kio-point-06_2016-09-12-12-11-14.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-07-moving411a-Ramdomly_2016-09-12-12-34-34.bag   kio-point-07-moving411a-Ramdomly_2016-09-12-12-34-34.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-07-moving411a_2016-09-12-12-33-52.bag            kio-point-07-moving411a_2016-09-12-12-33-52.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-07-moving501c-Ramdomly_2016-09-12-12-38-30.bag   kio-point-07-moving501c-Ramdomly_2016-09-12-12-38-30.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-07-moving501c_2016-09-12-12-36-06.bag            kio-point-07-moving501c_2016-09-12-12-36-06.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-07-without408a_2016-09-12-12-24-19.bag           kio-point-07-without408a_2016-09-12-12-24-19.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-07-without411a_2016-09-12-12-25-55.bag           kio-point-07-without411a_2016-09-12-12-25-55.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-07-without501c_2016-09-12-12-32-07.bag           kio-point-07-without501c_2016-09-12-12-32-07.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-07_2016-09-12-12-22-33.bag                       kio-point-07_2016-09-12-12-22-33.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-08-moving411a-Ramdomly_2016-09-12-12-46-40.bag   kio-point-08-moving411a-Ramdomly_2016-09-12-12-46-40.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-08-moving411a_2016-09-12-12-45-58.bag            kio-point-08-moving411a_2016-09-12-12-45-58.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-08-moving501c-Ramdomly_2016-09-12-12-48-51.bag   kio-point-08-moving501c-Ramdomly_2016-09-12-12-48-51.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-08-moving501c_2016-09-12-12-47-54.bag            kio-point-08-moving501c_2016-09-12-12-47-54.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-08-without408a_2016-09-12-12-42-02.bag           kio-point-08-without408a_2016-09-12-12-42-02.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-08-without411a_2016-09-12-12-43-31.bag           kio-point-08-without411a_2016-09-12-12-43-31.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-08-without501c_2016-09-12-12-44-41.bag           kio-point-08-without501c_2016-09-12-12-44-41.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-08_2016-09-12-12-40-38.bag                       kio-point-08_2016-09-12-12-40-38.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-09-moving411a-Ramdomly_2016-09-12-12-59-31.bag   kio-point-09-moving411a-Ramdomly_2016-09-12-12-59-31.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-09-moving411a_2016-09-12-12-56-12.bag            kio-point-09-moving411a_2016-09-12-12-56-12.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-09-moving501c-Ramdomly_2016-09-12-13-02-13.bag   kio-point-09-moving501c-Ramdomly_2016-09-12-13-02-13.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-09-moving501c_2016-09-12-13-01-28.bag            kio-point-09-moving501c_2016-09-12-13-01-28.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-09-without408a_2016-09-12-12-52-20.bag           kio-point-09-without408a_2016-09-12-12-52-20.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-09-without411a_2016-09-12-12-53-37.bag           kio-point-09-without411a_2016-09-12-12-53-37.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-09-without501c_2016-09-12-12-54-48.bag           kio-point-09-without501c_2016-09-12-12-54-48.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-09_2016-09-12-12-50-52.bag                       kio-point-09_2016-09-12-12-50-52.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-10-moving411a-Randomly_2016-09-12-17-13-05.bag   kio-point-10-moving411a-Randomly_2016-09-12-17-13-05.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-10-moving411a_2016-09-12-17-12-24.bag            kio-point-10-moving411a_2016-09-12-17-12-24.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-10-moving501c-Ramdomly_2016-09-12-17-15-02.bag   kio-point-10-moving501c-Ramdomly_2016-09-12-17-15-02.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-10-moving501c_2016-09-12-17-14-18.bag            kio-point-10-moving501c_2016-09-12-17-14-18.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-10-without408a_2016-09-12-17-08-20.bag           kio-point-10-without408a_2016-09-12-17-08-20.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-10-without411a_2016-09-12-17-09-47.bag           kio-point-10-without411a_2016-09-12-17-09-47.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-10-without501c_2016-09-12-17-10-52.bag           kio-point-10-without501c_2016-09-12-17-10-52.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-10_2016-09-12-17-07-10.bag                       kio-point-10_2016-09-12-17-07-10.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-11-moving411a-Ramdomly_2016-09-12-17-24-25.bag   kio-point-11-moving411a-Ramdomly_2016-09-12-17-24-25.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-11-moving411a_2016-09-12-17-23-47.bag            kio-point-11-moving411a_2016-09-12-17-23-47.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-11-moving501c-Ramdomly_2016-09-12-17-26-08.bag   kio-point-11-moving501c-Ramdomly_2016-09-12-17-26-08.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-11-moving501c_2016-09-12-17-25-30.bag            kio-point-11-moving501c_2016-09-12-17-25-30.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-11-without408a_2016-09-12-17-18-28.bag           kio-point-11-without408a_2016-09-12-17-18-28.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-11-without411a_2016-09-12-17-20-56.bag           kio-point-11-without411a_2016-09-12-17-20-56.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-11-without501c_2016-09-12-17-22-12.bag           kio-point-11-without501c_2016-09-12-17-22-12.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-11_2016-09-12-17-17-16.bag                       kio-point-11_2016-09-12-17-17-16.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-12-moving411a-Ramdomly_2016-09-12-17-34-49.bag   kio-point-12-moving411a-Ramdomly_2016-09-12-17-34-49.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-12-moving411a_2016-09-12-17-34-09.bag            kio-point-12-moving411a_2016-09-12-17-34-09.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-12-moving501c-Ramdomly_2016-09-12-17-36-45.bag   kio-point-12-moving501c-Ramdomly_2016-09-12-17-36-45.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-12-moving501c_2016-09-12-17-36-04.bag            kio-point-12-moving501c_2016-09-12-17-36-04.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-12-without408a_2016-09-12-17-29-02.bag           kio-point-12-without408a_2016-09-12-17-29-02.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-12-without411a_2016-09-12-17-30-32.bag           kio-point-12-without411a_2016-09-12-17-30-32.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-12-without501c_2016-09-12-17-32-21.bag           kio-point-12-without501c_2016-09-12-17-32-21.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-12_2016-09-12-17-27-51.bag                       kio-point-12_2016-09-12-17-27-51.bag                    ]&lt;br /&gt;
&lt;br /&gt;
==== Distribution 3 (96 files) ====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-01-moving411a-Ramdomly_2016-09-15-14-22-20.bag   kio-point-01-moving411a-Ramdomly_2016-09-15-14-22-20.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-01-moving411a_2016-09-15-14-21-36.bag            kio-point-01-moving411a_2016-09-15-14-21-36.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-01-moving501c-Ramdomly_2016-09-15-14-24-32.bag   kio-point-01-moving501c-Ramdomly_2016-09-15-14-24-32.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-01-moving501c_2016-09-15-14-23-41.bag            kio-point-01-moving501c_2016-09-15-14-23-41.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-01-without408a_2016-09-15-14-17-30.bag           kio-point-01-without408a_2016-09-15-14-17-30.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-01-without411a_2016-09-15-14-18-50.bag           kio-point-01-without411a_2016-09-15-14-18-50.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-01-without501c_2016-09-15-14-19-57.bag           kio-point-01-without501c_2016-09-15-14-19-57.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-01_2016-09-15-14-16-31.bag                       kio-point-01_2016-09-15-14-16-31.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-02-moving411a-Ramdomly_2016-09-15-14-34-24.bag   kio-point-02-moving411a-Ramdomly_2016-09-15-14-34-24.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-02-moving411a_2016-09-15-14-32-43.bag            kio-point-02-moving411a_2016-09-15-14-32-43.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-02-moving501c-Ramdomly_2016-09-15-14-36-20.bag   kio-point-02-moving501c-Ramdomly_2016-09-15-14-36-20.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-02-moving501c_2016-09-15-14-35-43.bag            kio-point-02-moving501c_2016-09-15-14-35-43.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-02-without408a_2016-09-15-14-27-20.bag           kio-point-02-without408a_2016-09-15-14-27-20.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-02-without411a_2016-09-15-14-28-39.bag           kio-point-02-without411a_2016-09-15-14-28-39.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-02-without501c_2016-09-15-14-29-44.bag           kio-point-02-without501c_2016-09-15-14-29-44.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-02_2016-09-15-14-26-02.bag                       kio-point-02_2016-09-15-14-26-02.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-03-moving411a-Ramdomly_2016-09-16-09-41-26.bag   kio-point-03-moving411a-Ramdomly_2016-09-16-09-41-26.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-03-moving411a_2016-09-16-09-40-39.bag            kio-point-03-moving411a_2016-09-16-09-40-39.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-03-moving501c-Ramdomly_2016-09-16-09-44-42.bag   kio-point-03-moving501c-Ramdomly_2016-09-16-09-44-42.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-03-moving501c_2016-09-16-09-43-41.bag            kio-point-03-moving501c_2016-09-16-09-43-41.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-03-without408a_2016-09-16-09-35-55.bag           kio-point-03-without408a_2016-09-16-09-35-55.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-03-without411a_2016-09-16-09-38-20.bag           kio-point-03-without411a_2016-09-16-09-38-20.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-03-without501c_2016-09-16-09-39-22.bag           kio-point-03-without501c_2016-09-16-09-39-22.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-03_2016-09-16-09-34-45.bag                       kio-point-03_2016-09-16-09-34-45.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-04-moving411a-Ramdomly_2016-09-16-09-52-40.bag   kio-point-04-moving411a-Ramdomly_2016-09-16-09-52-40.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-04-moving411a_2016-09-16-09-51-45.bag            kio-point-04-moving411a_2016-09-16-09-51-45.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-04-moving501c-Ramdomly_2016-09-16-09-55-21.bag   kio-point-04-moving501c-Ramdomly_2016-09-16-09-55-21.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-04-moving501c_2016-09-16-09-54-22.bag            kio-point-04-moving501c_2016-09-16-09-54-22.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-04-without408a_2016-09-16-09-47-30.bag           kio-point-04-without408a_2016-09-16-09-47-30.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-04-without411a_2016-09-16-09-48-55.bag           kio-point-04-without411a_2016-09-16-09-48-55.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-04-without501c_2016-09-16-09-50-09.bag           kio-point-04-without501c_2016-09-16-09-50-09.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-04_2016-09-16-09-45-57.bag                       kio-point-04_2016-09-16-09-45-57.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-05-moving411a-Ramdomly_2016-09-16-10-03-02.bag   kio-point-05-moving411a-Ramdomly_2016-09-16-10-03-02.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-05-moving411a_2016-09-16-10-02-20.bag            kio-point-05-moving411a_2016-09-16-10-02-20.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-05-moving501c-Ramdomly_2016-09-16-10-05-25.bag   kio-point-05-moving501c-Ramdomly_2016-09-16-10-05-25.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-05-moving501c_2016-09-16-10-04-37.bag            kio-point-05-moving501c_2016-09-16-10-04-37.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-05-without408a_2016-09-16-09-57-36.bag           kio-point-05-without408a_2016-09-16-09-57-36.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-05-without411a_2016-09-16-09-59-16.bag           kio-point-05-without411a_2016-09-16-09-59-16.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-05-without501c_2016-09-16-10-00-26.bag           kio-point-05-without501c_2016-09-16-10-00-26.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-05_2016-09-16-09-56-33.bag                       kio-point-05_2016-09-16-09-56-33.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-06-moving411a-Ramdomly_2016-09-16-10-14-00.bag   kio-point-06-moving411a-Ramdomly_2016-09-16-10-14-00.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-06-moving411a_2016-09-16-10-13-12.bag            kio-point-06-moving411a_2016-09-16-10-13-12.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-06-moving501c-Ramdomly_2016-09-16-10-16-20.bag   kio-point-06-moving501c-Ramdomly_2016-09-16-10-16-20.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-06-moving501c_2016-09-16-10-15-27.bag            kio-point-06-moving501c_2016-09-16-10-15-27.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-06-without408a_2016-09-16-10-07-53.bag           kio-point-06-without408a_2016-09-16-10-07-53.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-06-without411a_2016-09-16-10-09-10.bag           kio-point-06-without411a_2016-09-16-10-09-10.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-06-without501c_2016-09-16-10-11-24.bag           kio-point-06-without501c_2016-09-16-10-11-24.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-06_2016-09-16-10-06-44.bag                       kio-point-06_2016-09-16-10-06-44.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-07-moving411a-Ramdomly_2016-09-16-10-30-01.bag   kio-point-07-moving411a-Ramdomly_2016-09-16-10-30-01.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-07-moving411a_2016-09-16-10-28-06.bag            kio-point-07-moving411a_2016-09-16-10-28-06.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-07-moving501c-Ramdomly_2016-09-16-10-32-20.bag   kio-point-07-moving501c-Ramdomly_2016-09-16-10-32-20.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-07-moving501c_2016-09-16-10-31-40.bag            kio-point-07-moving501c_2016-09-16-10-31-40.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-07-without408a_2016-09-16-10-21-38.bag           kio-point-07-without408a_2016-09-16-10-21-38.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-07-without411a_2016-09-16-10-22-50.bag           kio-point-07-without411a_2016-09-16-10-22-50.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-07-without501c_2016-09-16-10-23-57.bag           kio-point-07-without501c_2016-09-16-10-23-57.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-07_2016-09-16-10-20-09.bag                       kio-point-07_2016-09-16-10-20-09.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-08-moving411a-Ramdomly_2016-09-16-12-46-34.bag   kio-point-08-moving411a-Ramdomly_2016-09-16-12-46-34.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-08-moving411a_2016-09-16-12-45-51.bag            kio-point-08-moving411a_2016-09-16-12-45-51.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-08-moving501c-Ramdomly_2016-09-16-12-48-49.bag   kio-point-08-moving501c-Ramdomly_2016-09-16-12-48-49.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-08-moving501c_2016-09-16-12-48-03.bag            kio-point-08-moving501c_2016-09-16-12-48-03.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-08-without408a_2016-09-16-12-42-05.bag           kio-point-08-without408a_2016-09-16-12-42-05.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-08-without411a_2016-09-16-12-43-23.bag           kio-point-08-without411a_2016-09-16-12-43-23.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-08-without501c_2016-09-16-12-44-33.bag           kio-point-08-without501c_2016-09-16-12-44-33.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-08_2016-09-16-12-41-01.bag                       kio-point-08_2016-09-16-12-41-01.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-09-moving411a-Ramdomly_2016-09-16-10-44-49.bag   kio-point-09-moving411a-Ramdomly_2016-09-16-10-44-49.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-09-moving411a_2016-09-16-10-44-09.bag            kio-point-09-moving411a_2016-09-16-10-44-09.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-09-moving501c-Ramdomly_2016-09-16-10-43-14.bag   kio-point-09-moving501c-Ramdomly_2016-09-16-10-43-14.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-09-moving501c_2016-09-16-10-42-38.bag            kio-point-09-moving501c_2016-09-16-10-42-38.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-09-without408a_2016-09-16-10-35-54.bag           kio-point-09-without408a_2016-09-16-10-35-54.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-09-without411a_2016-09-16-10-37-42.bag           kio-point-09-without411a_2016-09-16-10-37-42.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-09-without501c_2016-09-16-10-38-58.bag           kio-point-09-without501c_2016-09-16-10-38-58.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-09_2016-09-16-10-34-53.bag                       kio-point-09_2016-09-16-10-34-53.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-10-moving411a-Ramdomly_2016-09-16-12-57-41.bag   kio-point-10-moving411a-Ramdomly_2016-09-16-12-57-41.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-10-moving411a_2016-09-16-12-56-55.bag            kio-point-10-moving411a_2016-09-16-12-56-55.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-10-moving501c-Ramdomly_2016-09-16-12-59-32.bag   kio-point-10-moving501c-Ramdomly_2016-09-16-12-59-32.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-10-moving501c_2016-09-16-12-58-53.bag            kio-point-10-moving501c_2016-09-16-12-58-53.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-10-without408a_2016-09-16-12-53-06.bag           kio-point-10-without408a_2016-09-16-12-53-06.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-10-without411a_2016-09-16-12-54-05.bag           kio-point-10-without411a_2016-09-16-12-54-05.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-10-without501c_2016-09-16-12-55-07.bag           kio-point-10-without501c_2016-09-16-12-55-07.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-10_2016-09-16-12-51-33.bag                       kio-point-10_2016-09-16-12-51-33.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-11-moving411a-Ramdomly_2016-09-16-13-52-20.bag   kio-point-11-moving411a-Ramdomly_2016-09-16-13-52-20.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-11-moving411a_2016-09-16-13-51-41.bag            kio-point-11-moving411a_2016-09-16-13-51-41.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-11-moving501c-Ramdomly_2016-09-16-13-54-04.bag   kio-point-11-moving501c-Ramdomly_2016-09-16-13-54-04.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-11-moving501c_2016-09-16-13-53-19.bag            kio-point-11-moving501c_2016-09-16-13-53-19.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-11-without408a_2016-09-16-13-35-32.bag           kio-point-11-without408a_2016-09-16-13-35-32.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-11-without411a_2016-09-16-13-37-00.bag           kio-point-11-without411a_2016-09-16-13-37-00.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-11-without501c_2016-09-16-13-39-08.bag           kio-point-11-without501c_2016-09-16-13-39-08.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-11_2016-09-16-13-34-14.bag                       kio-point-11_2016-09-16-13-34-14.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-12-moving411a-Ramdomly_2016-09-16-13-59-34.bag   kio-point-12-moving411a-Ramdomly_2016-09-16-13-59-34.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-12-moving411a_2016-09-16-13-58-56.bag            kio-point-12-moving411a_2016-09-16-13-58-56.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-12-moving501c-Ramdomly_2016-09-16-14-01-04.bag   kio-point-12-moving501c-Ramdomly_2016-09-16-14-01-04.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-12-moving501c_2016-09-16-14-00-26.bag            kio-point-12-moving501c_2016-09-16-14-00-26.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-12-without408a_2016-09-16-13-56-22.bag           kio-point-12-without408a_2016-09-16-13-56-22.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-12-without411a_2016-09-16-13-57-12.bag           kio-point-12-without411a_2016-09-16-13-57-12.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-12-without501c_2016-09-16-13-58-02.bag           kio-point-12-without501c_2016-09-16-13-58-02.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-12_2016-09-16-13-55-29.bag                       kio-point-12_2016-09-16-13-55-29.bag                    ]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_analysis_of_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5155</id>
		<title>Benchmark dataset for analysis of cyber-attacks to an indoor real time localization system for autonomous robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_analysis_of_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5155"/>
				<updated>2017-10-16T08:55:36Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''&lt;br /&gt;
Real time localization system for autonomous robots Benchmark Dataset, RRID:SCR_015756.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset which can be used to analyze cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment, shown in Fig. 2, located at the Robotics Lab of the University of León (Spain). An autonomous robot called Karen, shown in Fig. 1, with an on-board Real Time Location System (RTLS), was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
Further information can be found at: ''Guerrero-Higueras, Á. M., DeCastro-García, N., Rodríguez-Lera, F. J., &amp;amp; Matellán, V. (2017). '''Empirical analysis of cyber-attacks to an indoor real time localization system for autonomous robots'''. Computers &amp;amp; Security, 70, 422-435.''&lt;br /&gt;
&lt;br /&gt;
== Materials ==&lt;br /&gt;
&lt;br /&gt;
Data gathered by the Karen robot include: &lt;br /&gt;
* Karen location estimates provided by a commercial RTLS, called '''KIO'''.&lt;br /&gt;
&lt;br /&gt;
Additional information about Karen and the devices/packages used to get data is given below.&lt;br /&gt;
&lt;br /&gt;
=== Karen robot ===&lt;br /&gt;
&lt;br /&gt;
Fig. 1 shows the autonomous robot Karen, used in the experiments. Karen is a mobile manipulator built by the Robotics Group of the Universidad de León (Spain). It has two arms with 7 degrees of freedom, 3 fingers each, and a mobile base. The control of the robot is based on the Robot Operating System (ROS) framework.&lt;br /&gt;
&lt;br /&gt;
[[File:KarenWithKIO.jpg|thumb|'''Fig. 1''': Karen and KIO RTLS: a beacon (1), and tag on the robot (2).]]&lt;br /&gt;
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
The KIO RTLS, a commercial solution by [https://www.eliko.ee/products/kio-rtls/ Eliko], was used to provide location estimates within the study area. Fig. 1 shows a KIO beacon (1) and a KIO tag mounted on the robot (2).&lt;br /&gt;
&lt;br /&gt;
== Recording procedure ==&lt;br /&gt;
&lt;br /&gt;
Three different distributions of the KIO beacons were defined. Fig. 2 presents the map of the experimental area and shows the location of the beacons in each case: red dots mark the placement of the six beacons for Distribution #1, blue dots for Distribution #2, and green dots for Distribution #3. Twelve checkpoints were defined in the experimental area; Karen was placed at each of them and the location estimates gathered by the tag were recorded for later analysis. Checkpoint locations are shown as black rounded numbered points in Fig. 2. Nine checkpoints lie inside the mock-up apartment, so that location estimates are obtained in every room; the other three are in the corridor, providing location estimates outside the mock-up apartment.&lt;br /&gt;
&lt;br /&gt;
[[File:apartamento-leon.jpg|thumb|'''Fig. 2''': Robotics mobile lab plane. Red dots indicate beacons for Distribution #1, blue dots for Distribution #2 and green dots for Distribution #3. Black rounded numbered dots indicate checkpoints.]]&lt;br /&gt;
&lt;br /&gt;
Cyber-attacks were performed on the beacons of the KIO RTLS used by the Karen robot to estimate its location. Two types of attack were performed. The first type (A1i) comprised three DoS attacks, which interfered with the signal emitted by beacons 411A, 501C, and 408A, making it impossible for the tag to obtain distances from them. The DoS attack on beacon 411A is labelled A1a, the one on 501C A1b, and the one on 408A A1c. The second type (A2j) comprised four spoofing attacks, which altered the signal emitted by beacons 411A and 501C by introducing either a fixed or a variable error, causing the tag to compute a wrong location. The spoofing attacks on beacon 411A with a fixed and with a variable error are labelled A2a and A2b respectively; those on beacon 501C are labelled A2c and A2d respectively.&lt;br /&gt;
&lt;br /&gt;
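The two spoofing error models described above can be sketched as follows. This is a minimal, hypothetical illustration only: the function names and the 0.5 m error magnitude are assumptions for the sketch, not part of the dataset or the actual attack tooling.&lt;br /&gt;

```python
import random

def spoof_fixed(distance_m, offset_m=0.5):
    """Fixed-error spoofing (cf. attacks A2a/A2c): add a constant
    offset to the beacon-to-tag distance reported to the tag."""
    return distance_m + offset_m

def spoof_variable(distance_m, max_offset_m=0.5, rng=random):
    """Variable-error spoofing (cf. attacks A2b/A2d): add a random
    offset drawn anew on each reading, so the error changes over time."""
    return distance_m + rng.uniform(-max_offset_m, max_offset_m)

def dos(distance_m):
    """DoS by signal interference (cf. attacks A1a-A1c): the tag
    simply receives no distance from the jammed beacon."""
    return None
```

Under a fixed error the tag's position estimate is shifted consistently, whereas a variable error makes the estimate jitter; a DoS attack removes one beacon from the multilateration entirely.&lt;br /&gt;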
== Data ==&lt;br /&gt;
&lt;br /&gt;
The different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 ===&lt;br /&gt;
&lt;br /&gt;
==== Distribution 1 (96 files) ====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-01-moving411a_2016-07-14-10-55-04.bag            kio-point-01-moving411a_2016-07-14-10-55-04.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-01-moving411a-Randomly_2016-07-14-10-54-11.bag   kio-point-01-moving411a-Randomly_2016-07-14-10-54-11.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-01-moving501c-Randomly_2016-07-14-10-53-13.bag   kio-point-01-moving501c-Randomly_2016-07-14-10-53-13.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-01-moving501c_2016-07-14-10-51-35.bag            kio-point-01-moving501c_2016-07-14-10-51-35.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-01-without408a_2016-07-14-10-49-27.bag           kio-point-01-without408a_2016-07-14-10-49-27.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-01-without411a_2016-07-14-10-47-06.bag           kio-point-01-without411a_2016-07-14-10-47-06.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-01-without501c_2016-07-14-10-48-24.bag           kio-point-01-without501c_2016-07-14-10-48-24.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-01_2016-07-14-10-44-29.bag                       kio-point-01_2016-07-14-10-44-29.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-02-moving411a-Randomly_2016-07-14-10-57-46.bag   kio-point-02-moving411a-Randomly_2016-07-14-10-57-46.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-02-moving411a_2016-07-14-10-56-38.bag            kio-point-02-moving411a_2016-07-14-10-56-38.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-02-moving501c-Randomly_2016-07-14-11-01-47.bag   kio-point-02-moving501c-Randomly_2016-07-14-11-01-47.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-02-moving501c_2016-07-14-11-02-40.bag            kio-point-02-moving501c_2016-07-14-11-02-40.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-02-without408a_2016-07-14-11-12-24.bag           kio-point-02-without408a_2016-07-14-11-12-24.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-02-without411a_2016-07-14-10-59-11.bag           kio-point-02-without411a_2016-07-14-10-59-11.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-02-without501c_2016-07-14-11-00-28.bag           kio-point-02-without501c_2016-07-14-11-00-28.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-02_2016-07-14-11-35-29.bag                       kio-point-02_2016-07-14-11-35-29.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-03-moving411a-Randomly_2016-07-14-11-06-22.bag   kio-point-03-moving411a-Randomly_2016-07-14-11-06-22.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-03-moving411a_2016-07-14-11-07-16.bag            kio-point-03-moving411a_2016-07-14-11-07-16.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-03-moving501c-Randomly_2016-07-14-11-05-01.bag   kio-point-03-moving501c-Randomly_2016-07-14-11-05-01.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-03-moving501c_2016-07-14-11-04-02.bag            kio-point-03-moving501c_2016-07-14-11-04-02.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-03-without408a_2016-07-14-11-11-06.bag           kio-point-03-without408a_2016-07-14-11-11-06.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-03-without411a_2016-07-14-11-08-26.bag           kio-point-03-without411a_2016-07-14-11-08-26.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-03-without501c_2016-07-14-11-09-43.bag           kio-point-03-without501c_2016-07-14-11-09-43.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-03_2016-07-14-11-34-42.bag                       kio-point-03_2016-07-14-11-34-42.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-04-moving411a-Randomly_2016-07-14-11-21-07.bag   kio-point-04-moving411a-Randomly_2016-07-14-11-21-07.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-04-moving411a_2016-07-14-11-21-58.bag            kio-point-04-moving411a_2016-07-14-11-21-58.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-04-moving501c-Randomly_2016-07-14-11-20-03.bag   kio-point-04-moving501c-Randomly_2016-07-14-11-20-03.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-04-moving501c_2016-07-14-11-18-59.bag            kio-point-04-moving501c_2016-07-14-11-18-59.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-04-without408a_2016-07-14-11-14-10.bag           kio-point-04-without408a_2016-07-14-11-14-10.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-04-without411a_2016-07-14-11-16-19.bag           kio-point-04-without411a_2016-07-14-11-16-19.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-04-without501c_2016-07-14-11-17-37.bag           kio-point-04-without501c_2016-07-14-11-17-37.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-04_2016-07-14-11-33-58.bag                       kio-point-04_2016-07-14-11-33-58.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-05-moving411a-Randomly_2016-07-14-11-25-33.bag   kio-point-05-moving411a-Randomly_2016-07-14-11-25-33.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-05-moving411a_2016-07-14-11-24-33.bag            kio-point-05-moving411a_2016-07-14-11-24-33.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-05-moving501c-Randomly_2016-07-14-11-26-34.bag   kio-point-05-moving501c-Randomly_2016-07-14-11-26-34.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-05-moving501c_2016-07-14-11-27-26.bag            kio-point-05-moving501c_2016-07-14-11-27-26.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-05-without408a_2016-07-14-11-31-19.bag           kio-point-05-without408a_2016-07-14-11-31-19.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-05-without411a_2016-07-14-11-29-41.bag           kio-point-05-without411a_2016-07-14-11-29-41.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-05-without501c_2016-07-14-11-28-32.bag           kio-point-05-without501c_2016-07-14-11-28-32.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-05_2016-07-14-11-33-11.bag                       kio-point-05_2016-07-14-11-33-11.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-06-moving411a-Randomly_2016-07-14-11-44-06.bag   kio-point-06-moving411a-Randomly_2016-07-14-11-44-06.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-06-moving411a_2016-07-14-11-44-59.bag            kio-point-06-moving411a_2016-07-14-11-44-59.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-06-moving501c-Randomly_2016-07-14-11-43-01.bag   kio-point-06-moving501c-Randomly_2016-07-14-11-43-01.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-06-moving501c_2016-07-14-11-42-04.bag            kio-point-06-moving501c_2016-07-14-11-42-04.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-06-without408a_2016-07-14-11-39-19.bag           kio-point-06-without408a_2016-07-14-11-39-19.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-06-without411a_2016-07-14-11-37-58.bag           kio-point-06-without411a_2016-07-14-11-37-58.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-06-without501c_2016-07-14-11-40-49.bag           kio-point-06-without501c_2016-07-14-11-40-49.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-06_2016-07-14-11-36-58.bag                       kio-point-06_2016-07-14-11-36-58.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-07-moving411a-Randomly_2016-07-15-10-27-02.bag   kio-point-07-moving411a-Randomly_2016-07-15-10-27-02.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-07-moving411a_2016-07-15-10-24-57.bag            kio-point-07-moving411a_2016-07-15-10-24-57.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-07-moving501c-Randomly_2016-07-15-10-28-10.bag   kio-point-07-moving501c-Randomly_2016-07-15-10-28-10.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-07-moving501c_2016-07-15-10-29-01.bag            kio-point-07-moving501c_2016-07-15-10-29-01.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-07-without408a_2016-07-15-10-22-45.bag           kio-point-07-without408a_2016-07-15-10-22-45.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-07-without411A_2016-07-15-10-17-59.bag           kio-point-07-without411A_2016-07-15-10-17-59.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-07-without501c_2016-07-15-10-20-27.bag           kio-point-07-without501c_2016-07-15-10-20-27.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-07_2016-07-15-10-16-24.bag                       kio-point-07_2016-07-15-10-16-24.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-08-moving411a-Randomly_2016-07-15-10-52-00.bag   kio-point-08-moving411a-Randomly_2016-07-15-10-52-00.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-08-moving411a_2016-07-15-10-51-18.bag            kio-point-08-moving411a_2016-07-15-10-51-18.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-08-moving501c-Randomly_2016-07-15-10-43-31.bag   kio-point-08-moving501c-Randomly_2016-07-15-10-43-31.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-08-moving501c_2016-07-15-10-44-19.bag            kio-point-08-moving501c_2016-07-15-10-44-19.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-08-without408a_2016-07-15-10-49-32.bag           kio-point-08-without408a_2016-07-15-10-49-32.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-08-without411a_2016-07-15-10-48-21.bag           kio-point-08-without411a_2016-07-15-10-48-21.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-08-without501c_2016-07-15-10-46-32.bag           kio-point-08-without501c_2016-07-15-10-46-32.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-08_2016-07-15-10-41-05.bag                       kio-point-08_2016-07-15-10-41-05.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-09-moving411a-Randomly_2016-07-15-10-32-43.bag   kio-point-09-moving411a-Randomly_2016-07-15-10-32-43.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-09-moving411a_2016-07-15-10-33-32.bag            kio-point-09-moving411a_2016-07-15-10-33-32.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-09-moving501c-Randomly_2016-07-15-10-31-50.bag   kio-point-09-moving501c-Randomly_2016-07-15-10-31-50.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-09-moving501c_2016-07-15-10-30-26.bag            kio-point-09-moving501c_2016-07-15-10-30-26.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-09-without408a_2016-07-15-10-38-23.bag           kio-point-09-without408a_2016-07-15-10-38-23.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-09-without411a_2016-07-15-10-35-35.bag           kio-point-09-without411a_2016-07-15-10-35-35.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-09-without501c_2016-07-15-10-36-50.bag           kio-point-09-without501c_2016-07-15-10-36-50.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-09_2016-07-15-10-39-52.bag                       kio-point-09_2016-07-15-10-39-52.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-10-moving411a-Randomly_2016-07-15-12-26-12.bag   kio-point-10-moving411a-Randomly_2016-07-15-12-26-12.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-10-moving411a_2016-07-15-12-24-51.bag            kio-point-10-moving411a_2016-07-15-12-24-51.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-10-moving501c-Randomly_2016-07-15-12-28-02.bag   kio-point-10-moving501c-Randomly_2016-07-15-12-28-02.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-10-moving501c_2016-07-15-12-23-10.bag            kio-point-10-moving501c_2016-07-15-12-23-10.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-10-without408a_2016-07-15-12-21-13.bag           kio-point-10-without408a_2016-07-15-12-21-13.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-10-without411a_2016-07-15-12-19-02.bag           kio-point-10-without411a_2016-07-15-12-19-02.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-10-without501c_2016-07-15-12-20-12.bag           kio-point-10-without501c_2016-07-15-12-20-12.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-10_2016-07-15-12-22-26.bag                       kio-point-10_2016-07-15-12-22-26.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-11-moving411a-Randomly_2016-07-15-13-05-05.bag   kio-point-11-moving411a-Randomly_2016-07-15-13-05-05.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-11-moving411a_2016-07-15-13-04-04.bag            kio-point-11-moving411a_2016-07-15-13-04-04.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-11-moving501c-Randomly_2016-07-15-13-06-01.bag   kio-point-11-moving501c-Randomly_2016-07-15-13-06-01.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-11-moving501c_2016-07-15-13-06-54.bag            kio-point-11-moving501c_2016-07-15-13-06-54.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-11-without408a_2016-07-15-13-00-25.bag           kio-point-11-without408a_2016-07-15-13-00-25.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-11-without411a_2016-07-15-13-01-37.bag           kio-point-11-without411a_2016-07-15-13-01-37.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-11-without501c_2016-07-15-13-02-47.bag           kio-point-11-without501c_2016-07-15-13-02-47.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-11_2016-07-15-12-59-31.bag                       kio-point-11_2016-07-15-12-59-31.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-12-moving411a-Randomly_2016-07-15-11-44-32.bag   kio-point-12-moving411a-Randomly_2016-07-15-11-44-32.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-12-moving411a_2016-07-15-11-43-14.bag            kio-point-12-moving411a_2016-07-15-11-43-14.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-12-moving501c-Randomly_2016-07-15-11-41-26.bag   kio-point-12-moving501c-Randomly_2016-07-15-11-41-26.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-12-moving501c_2016-07-15-11-42-20.bag            kio-point-12-moving501c_2016-07-15-11-42-20.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-12-without408a_2016-07-15-11-47-52.bag           kio-point-12-without408a_2016-07-15-11-47-52.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-12-without411a_2016-07-15-11-49-04.bag           kio-point-12-without411a_2016-07-15-11-49-04.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-12-without501c_2016-07-15-11-46-48.bag           kio-point-12-without501c_2016-07-15-11-46-48.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-1/kio-point-12_2016-07-15-11-45-52.bag                       kio-point-12_2016-07-15-11-45-52.bag                    ]&lt;br /&gt;
&lt;br /&gt;
==== Distribution 2 (96 files) ====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-01-moving411a-Ramdomly_2016-09-09-13-04-48.bag   kio-point-01-moving411a-Ramdomly_2016-09-09-13-04-48.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-01-moving411a_2016-09-09-13-03-44.bag            kio-point-01-moving411a_2016-09-09-13-03-44.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-01-moving501c-Ramdomly_2016-09-09-13-00-15.bag   kio-point-01-moving501c-Ramdomly_2016-09-09-13-00-15.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-01-moving501c_2016-09-09-12-59-21.bag            kio-point-01-moving501c_2016-09-09-12-59-21.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-01-without408a_2016-09-09-12-53-22.bag           kio-point-01-without408a_2016-09-09-12-53-22.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-01-without411a_2016-09-09-12-49-51.bag           kio-point-01-without411a_2016-09-09-12-49-51.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-01-without501c_2016-09-09-12-52-24.bag           kio-point-01-without501c_2016-09-09-12-52-24.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-01_2016-09-09-12-48-25.bag                       kio-point-01_2016-09-09-12-48-25.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-02-moving411a-Ramdomly_2016-09-09-13-17-28.bag   kio-point-02-moving411a-Ramdomly_2016-09-09-13-17-28.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-02-moving411a_2016-09-09-13-16-42.bag            kio-point-02-moving411a_2016-09-09-13-16-42.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-02-moving501c-Ramdomly_2016-09-09-13-19-56.bag   kio-point-02-moving501c-Ramdomly_2016-09-09-13-19-56.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-02-moving501c_2016-09-09-13-19-15.bag            kio-point-02-moving501c_2016-09-09-13-19-15.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-02-without408a_2016-09-09-13-11-32.bag           kio-point-02-without408a_2016-09-09-13-11-32.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-02-without411a_2016-09-09-13-13-11.bag           kio-point-02-without411a_2016-09-09-13-13-11.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-02-without501c_2016-09-09-13-14-30.bag           kio-point-02-without501c_2016-09-09-13-14-30.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-02_2016-09-09-13-09-51.bag                       kio-point-02_2016-09-09-13-09-51.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-03-moving411a-Ramdomly_2016-09-09-13-31-12.bag   kio-point-03-moving411a-Ramdomly_2016-09-09-13-31-12.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-03-moving411a_2016-09-09-13-30-26.bag            kio-point-03-moving411a_2016-09-09-13-30-26.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-03-moving501c-Ramdomly_2016-09-09-13-34-14.bag   kio-point-03-moving501c-Ramdomly_2016-09-09-13-34-14.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-03-moving501c_2016-09-09-13-33-34.bag            kio-point-03-moving501c_2016-09-09-13-33-34.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-03-without408a_2016-09-09-13-25-57.bag           kio-point-03-without408a_2016-09-09-13-25-57.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-03-without411a_2016-09-09-13-27-42.bag           kio-point-03-without411a_2016-09-09-13-27-42.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-03-without501c_2016-09-09-13-28-57.bag           kio-point-03-without501c_2016-09-09-13-28-57.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-03_2016-09-09-13-23-01.bag                       kio-point-03_2016-09-09-13-23-01.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-04-moving411a-Ramdomly_2016-09-09-13-54-38.bag   kio-point-04-moving411a-Ramdomly_2016-09-09-13-54-38.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-04-moving411a_2016-09-09-13-53-58.bag            kio-point-04-moving411a_2016-09-09-13-53-58.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-04-moving501c-Ramdomly_2016-09-09-13-57-17.bag   kio-point-04-moving501c-Ramdomly_2016-09-09-13-57-17.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-04-moving501c_2016-09-09-13-56-27.bag            kio-point-04-moving501c_2016-09-09-13-56-27.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-04-without408a_2016-09-09-13-43-24.bag           kio-point-04-without408a_2016-09-09-13-43-24.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-04-without411a_2016-09-09-13-46-23.bag           kio-point-04-without411a_2016-09-09-13-46-23.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-04-without501c_2016-09-09-13-47-56.bag           kio-point-04-without501c_2016-09-09-13-47-56.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-04_2016-09-09-13-36-56.bag                       kio-point-04_2016-09-09-13-36-56.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-05-moving411a-Ramdomly_2016-09-12-12-05-51.bag   kio-point-05-moving411a-Ramdomly_2016-09-12-12-05-51.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-05-moving411a_2016-09-12-12-05-17.bag            kio-point-05-moving411a_2016-09-12-12-05-17.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-05-moving501c-Ramdomly_2016-09-12-12-08-29.bag   kio-point-05-moving501c-Ramdomly_2016-09-12-12-08-29.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-05-moving501c_2016-09-12-12-07-30.bag            kio-point-05-moving501c_2016-09-12-12-07-30.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-05-without408a_2016-09-12-11-56-20.bag           kio-point-05-without408a_2016-09-12-11-56-20.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-05-without411a_2016-09-12-11-58-20.bag           kio-point-05-without411a_2016-09-12-11-58-20.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-05-without501c_2016-09-12-11-59-38.bag           kio-point-05-without501c_2016-09-12-11-59-38.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-05_2016-09-12-11-54-32.bag                       kio-point-05_2016-09-12-11-54-32.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-06-moving411a-Ramdomly_2016-09-12-12-18-40.bag   kio-point-06-moving411a-Ramdomly_2016-09-12-12-18-40.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-06-moving411a_2016-09-12-12-17-56.bag            kio-point-06-moving411a_2016-09-12-12-17-56.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-06-moving501c-Ramdomly_2016-09-12-12-20-43.bag   kio-point-06-moving501c-Ramdomly_2016-09-12-12-20-43.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-06-moving501c_2016-09-12-12-19-57.bag            kio-point-06-moving501c_2016-09-12-12-19-57.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-06-without408a_2016-09-12-12-13-01.bag           kio-point-06-without408a_2016-09-12-12-13-01.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-06-without411a_2016-09-12-12-14-14.bag           kio-point-06-without411a_2016-09-12-12-14-14.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-06-without501c_2016-09-12-12-15-34.bag           kio-point-06-without501c_2016-09-12-12-15-34.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-06_2016-09-12-12-11-14.bag                       kio-point-06_2016-09-12-12-11-14.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-07-moving411a-Ramdomly_2016-09-12-12-34-34.bag   kio-point-07-moving411a-Ramdomly_2016-09-12-12-34-34.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-07-moving411a_2016-09-12-12-33-52.bag            kio-point-07-moving411a_2016-09-12-12-33-52.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-07-moving501c-Ramdomly_2016-09-12-12-38-30.bag   kio-point-07-moving501c-Ramdomly_2016-09-12-12-38-30.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-07-moving501c_2016-09-12-12-36-06.bag            kio-point-07-moving501c_2016-09-12-12-36-06.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-07-without408a_2016-09-12-12-24-19.bag           kio-point-07-without408a_2016-09-12-12-24-19.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-07-without411a_2016-09-12-12-25-55.bag           kio-point-07-without411a_2016-09-12-12-25-55.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-07-without501c_2016-09-12-12-32-07.bag           kio-point-07-without501c_2016-09-12-12-32-07.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-07_2016-09-12-12-22-33.bag                       kio-point-07_2016-09-12-12-22-33.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-08-moving411a-Ramdomly_2016-09-12-12-46-40.bag   kio-point-08-moving411a-Ramdomly_2016-09-12-12-46-40.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-08-moving411a_2016-09-12-12-45-58.bag            kio-point-08-moving411a_2016-09-12-12-45-58.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-08-moving501c-Ramdomly_2016-09-12-12-48-51.bag   kio-point-08-moving501c-Ramdomly_2016-09-12-12-48-51.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-08-moving501c_2016-09-12-12-47-54.bag            kio-point-08-moving501c_2016-09-12-12-47-54.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-08-without408a_2016-09-12-12-42-02.bag           kio-point-08-without408a_2016-09-12-12-42-02.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-08-without411a_2016-09-12-12-43-31.bag           kio-point-08-without411a_2016-09-12-12-43-31.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-08-without501c_2016-09-12-12-44-41.bag           kio-point-08-without501c_2016-09-12-12-44-41.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-08_2016-09-12-12-40-38.bag                       kio-point-08_2016-09-12-12-40-38.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-09-moving411a-Ramdomly_2016-09-12-12-59-31.bag   kio-point-09-moving411a-Ramdomly_2016-09-12-12-59-31.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-09-moving411a_2016-09-12-12-56-12.bag            kio-point-09-moving411a_2016-09-12-12-56-12.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-09-moving501c-Ramdomly_2016-09-12-13-02-13.bag   kio-point-09-moving501c-Ramdomly_2016-09-12-13-02-13.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-09-moving501c_2016-09-12-13-01-28.bag            kio-point-09-moving501c_2016-09-12-13-01-28.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-09-without408a_2016-09-12-12-52-20.bag           kio-point-09-without408a_2016-09-12-12-52-20.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-09-without411a_2016-09-12-12-53-37.bag           kio-point-09-without411a_2016-09-12-12-53-37.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-09-without501c_2016-09-12-12-54-48.bag           kio-point-09-without501c_2016-09-12-12-54-48.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-09_2016-09-12-12-50-52.bag                       kio-point-09_2016-09-12-12-50-52.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-10-moving411a-Randomly_2016-09-12-17-13-05.bag   kio-point-10-moving411a-Randomly_2016-09-12-17-13-05.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-10-moving411a_2016-09-12-17-12-24.bag            kio-point-10-moving411a_2016-09-12-17-12-24.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-10-moving501c-Ramdomly_2016-09-12-17-15-02.bag   kio-point-10-moving501c-Ramdomly_2016-09-12-17-15-02.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-10-moving501c_2016-09-12-17-14-18.bag            kio-point-10-moving501c_2016-09-12-17-14-18.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-10-without408a_2016-09-12-17-08-20.bag           kio-point-10-without408a_2016-09-12-17-08-20.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-10-without411a_2016-09-12-17-09-47.bag           kio-point-10-without411a_2016-09-12-17-09-47.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-10-without501c_2016-09-12-17-10-52.bag           kio-point-10-without501c_2016-09-12-17-10-52.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-10_2016-09-12-17-07-10.bag                       kio-point-10_2016-09-12-17-07-10.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-11-moving411a-Ramdomly_2016-09-12-17-24-25.bag   kio-point-11-moving411a-Ramdomly_2016-09-12-17-24-25.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-11-moving411a_2016-09-12-17-23-47.bag            kio-point-11-moving411a_2016-09-12-17-23-47.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-11-moving501c-Ramdomly_2016-09-12-17-26-08.bag   kio-point-11-moving501c-Ramdomly_2016-09-12-17-26-08.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-11-moving501c_2016-09-12-17-25-30.bag            kio-point-11-moving501c_2016-09-12-17-25-30.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-11-without408a_2016-09-12-17-18-28.bag           kio-point-11-without408a_2016-09-12-17-18-28.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-11-without411a_2016-09-12-17-20-56.bag           kio-point-11-without411a_2016-09-12-17-20-56.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-11-without501c_2016-09-12-17-22-12.bag           kio-point-11-without501c_2016-09-12-17-22-12.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-11_2016-09-12-17-17-16.bag                       kio-point-11_2016-09-12-17-17-16.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-12-moving411a-Ramdomly_2016-09-12-17-34-49.bag   kio-point-12-moving411a-Ramdomly_2016-09-12-17-34-49.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-12-moving411a_2016-09-12-17-34-09.bag            kio-point-12-moving411a_2016-09-12-17-34-09.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-12-moving501c-Ramdomly_2016-09-12-17-36-45.bag   kio-point-12-moving501c-Ramdomly_2016-09-12-17-36-45.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-12-moving501c_2016-09-12-17-36-04.bag            kio-point-12-moving501c_2016-09-12-17-36-04.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-12-without408a_2016-09-12-17-29-02.bag           kio-point-12-without408a_2016-09-12-17-29-02.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-12-without411a_2016-09-12-17-30-32.bag           kio-point-12-without411a_2016-09-12-17-30-32.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-12-without501c_2016-09-12-17-32-21.bag           kio-point-12-without501c_2016-09-12-17-32-21.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-2/kio-point-12_2016-09-12-17-27-51.bag                       kio-point-12_2016-09-12-17-27-51.bag                    ]&lt;br /&gt;
&lt;br /&gt;
==== Distribution 3 (96 files) ====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-01-moving411a-Ramdomly_2016-09-15-14-22-20.bag   kio-point-01-moving411a-Ramdomly_2016-09-15-14-22-20.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-01-moving411a_2016-09-15-14-21-36.bag            kio-point-01-moving411a_2016-09-15-14-21-36.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-01-moving501c-Ramdomly_2016-09-15-14-24-32.bag   kio-point-01-moving501c-Ramdomly_2016-09-15-14-24-32.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-01-moving501c_2016-09-15-14-23-41.bag            kio-point-01-moving501c_2016-09-15-14-23-41.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-01-without408a_2016-09-15-14-17-30.bag           kio-point-01-without408a_2016-09-15-14-17-30.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-01-without411a_2016-09-15-14-18-50.bag           kio-point-01-without411a_2016-09-15-14-18-50.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-01-without501c_2016-09-15-14-19-57.bag           kio-point-01-without501c_2016-09-15-14-19-57.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-01_2016-09-15-14-16-31.bag                       kio-point-01_2016-09-15-14-16-31.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-02-moving411a-Ramdomly_2016-09-15-14-34-24.bag   kio-point-02-moving411a-Ramdomly_2016-09-15-14-34-24.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-02-moving411a_2016-09-15-14-32-43.bag            kio-point-02-moving411a_2016-09-15-14-32-43.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-02-moving501c-Ramdomly_2016-09-15-14-36-20.bag   kio-point-02-moving501c-Ramdomly_2016-09-15-14-36-20.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-02-moving501c_2016-09-15-14-35-43.bag            kio-point-02-moving501c_2016-09-15-14-35-43.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-02-without408a_2016-09-15-14-27-20.bag           kio-point-02-without408a_2016-09-15-14-27-20.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-02-without411a_2016-09-15-14-28-39.bag           kio-point-02-without411a_2016-09-15-14-28-39.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-02-without501c_2016-09-15-14-29-44.bag           kio-point-02-without501c_2016-09-15-14-29-44.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-02_2016-09-15-14-26-02.bag                       kio-point-02_2016-09-15-14-26-02.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-03-moving411a-Ramdomly_2016-09-16-09-41-26.bag   kio-point-03-moving411a-Ramdomly_2016-09-16-09-41-26.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-03-moving411a_2016-09-16-09-40-39.bag            kio-point-03-moving411a_2016-09-16-09-40-39.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-03-moving501c-Ramdomly_2016-09-16-09-44-42.bag   kio-point-03-moving501c-Ramdomly_2016-09-16-09-44-42.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-03-moving501c_2016-09-16-09-43-41.bag            kio-point-03-moving501c_2016-09-16-09-43-41.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-03-without408a_2016-09-16-09-35-55.bag           kio-point-03-without408a_2016-09-16-09-35-55.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-03-without411a_2016-09-16-09-38-20.bag           kio-point-03-without411a_2016-09-16-09-38-20.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-03-without501c_2016-09-16-09-39-22.bag           kio-point-03-without501c_2016-09-16-09-39-22.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-03_2016-09-16-09-34-45.bag                       kio-point-03_2016-09-16-09-34-45.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-04-moving411a-Ramdomly_2016-09-16-09-52-40.bag   kio-point-04-moving411a-Ramdomly_2016-09-16-09-52-40.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-04-moving411a_2016-09-16-09-51-45.bag            kio-point-04-moving411a_2016-09-16-09-51-45.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-04-moving501c-Ramdomly_2016-09-16-09-55-21.bag   kio-point-04-moving501c-Ramdomly_2016-09-16-09-55-21.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-04-moving501c_2016-09-16-09-54-22.bag            kio-point-04-moving501c_2016-09-16-09-54-22.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-04-without408a_2016-09-16-09-47-30.bag           kio-point-04-without408a_2016-09-16-09-47-30.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-04-without411a_2016-09-16-09-48-55.bag           kio-point-04-without411a_2016-09-16-09-48-55.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-04-without501c_2016-09-16-09-50-09.bag           kio-point-04-without501c_2016-09-16-09-50-09.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-04_2016-09-16-09-45-57.bag                       kio-point-04_2016-09-16-09-45-57.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-05-moving411a-Ramdomly_2016-09-16-10-03-02.bag   kio-point-05-moving411a-Ramdomly_2016-09-16-10-03-02.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-05-moving411a_2016-09-16-10-02-20.bag            kio-point-05-moving411a_2016-09-16-10-02-20.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-05-moving501c-Ramdomly_2016-09-16-10-05-25.bag   kio-point-05-moving501c-Ramdomly_2016-09-16-10-05-25.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-05-moving501c_2016-09-16-10-04-37.bag            kio-point-05-moving501c_2016-09-16-10-04-37.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-05-without408a_2016-09-16-09-57-36.bag           kio-point-05-without408a_2016-09-16-09-57-36.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-05-without411a_2016-09-16-09-59-16.bag           kio-point-05-without411a_2016-09-16-09-59-16.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-05-without501c_2016-09-16-10-00-26.bag           kio-point-05-without501c_2016-09-16-10-00-26.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-05_2016-09-16-09-56-33.bag                       kio-point-05_2016-09-16-09-56-33.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-06-moving411a-Ramdomly_2016-09-16-10-14-00.bag   kio-point-06-moving411a-Ramdomly_2016-09-16-10-14-00.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-06-moving411a_2016-09-16-10-13-12.bag            kio-point-06-moving411a_2016-09-16-10-13-12.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-06-moving501c-Ramdomly_2016-09-16-10-16-20.bag   kio-point-06-moving501c-Ramdomly_2016-09-16-10-16-20.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-06-moving501c_2016-09-16-10-15-27.bag            kio-point-06-moving501c_2016-09-16-10-15-27.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-06-without408a_2016-09-16-10-07-53.bag           kio-point-06-without408a_2016-09-16-10-07-53.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-06-without411a_2016-09-16-10-09-10.bag           kio-point-06-without411a_2016-09-16-10-09-10.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-06-without501c_2016-09-16-10-11-24.bag           kio-point-06-without501c_2016-09-16-10-11-24.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-06_2016-09-16-10-06-44.bag                       kio-point-06_2016-09-16-10-06-44.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-07-moving411a-Ramdomly_2016-09-16-10-30-01.bag   kio-point-07-moving411a-Ramdomly_2016-09-16-10-30-01.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-07-moving411a_2016-09-16-10-28-06.bag            kio-point-07-moving411a_2016-09-16-10-28-06.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-07-moving501c-Ramdomly_2016-09-16-10-32-20.bag   kio-point-07-moving501c-Ramdomly_2016-09-16-10-32-20.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-07-moving501c_2016-09-16-10-31-40.bag            kio-point-07-moving501c_2016-09-16-10-31-40.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-07-without408a_2016-09-16-10-21-38.bag           kio-point-07-without408a_2016-09-16-10-21-38.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-07-without411a_2016-09-16-10-22-50.bag           kio-point-07-without411a_2016-09-16-10-22-50.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-07-without501c_2016-09-16-10-23-57.bag           kio-point-07-without501c_2016-09-16-10-23-57.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-07_2016-09-16-10-20-09.bag                       kio-point-07_2016-09-16-10-20-09.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-08-moving411a-Ramdomly_2016-09-16-12-46-34.bag   kio-point-08-moving411a-Ramdomly_2016-09-16-12-46-34.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-08-moving411a_2016-09-16-12-45-51.bag            kio-point-08-moving411a_2016-09-16-12-45-51.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-08-moving501c-Ramdomly_2016-09-16-12-48-49.bag   kio-point-08-moving501c-Ramdomly_2016-09-16-12-48-49.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-08-moving501c_2016-09-16-12-48-03.bag            kio-point-08-moving501c_2016-09-16-12-48-03.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-08-without408a_2016-09-16-12-42-05.bag           kio-point-08-without408a_2016-09-16-12-42-05.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-08-without411a_2016-09-16-12-43-23.bag           kio-point-08-without411a_2016-09-16-12-43-23.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-08-without501c_2016-09-16-12-44-33.bag           kio-point-08-without501c_2016-09-16-12-44-33.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-08_2016-09-16-12-41-01.bag                       kio-point-08_2016-09-16-12-41-01.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-09-moving411a-Ramdomly_2016-09-16-10-44-49.bag   kio-point-09-moving411a-Ramdomly_2016-09-16-10-44-49.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-09-moving411a_2016-09-16-10-44-09.bag            kio-point-09-moving411a_2016-09-16-10-44-09.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-09-moving501c-Ramdomly_2016-09-16-10-43-14.bag   kio-point-09-moving501c-Ramdomly_2016-09-16-10-43-14.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-09-moving501c_2016-09-16-10-42-38.bag            kio-point-09-moving501c_2016-09-16-10-42-38.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-09-without408a_2016-09-16-10-35-54.bag           kio-point-09-without408a_2016-09-16-10-35-54.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-09-without411a_2016-09-16-10-37-42.bag           kio-point-09-without411a_2016-09-16-10-37-42.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-09-without501c_2016-09-16-10-38-58.bag           kio-point-09-without501c_2016-09-16-10-38-58.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-09_2016-09-16-10-34-53.bag                       kio-point-09_2016-09-16-10-34-53.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-10-moving411a-Ramdomly_2016-09-16-12-57-41.bag   kio-point-10-moving411a-Ramdomly_2016-09-16-12-57-41.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-10-moving411a_2016-09-16-12-56-55.bag            kio-point-10-moving411a_2016-09-16-12-56-55.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-10-moving501c-Ramdomly_2016-09-16-12-59-32.bag   kio-point-10-moving501c-Ramdomly_2016-09-16-12-59-32.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-10-moving501c_2016-09-16-12-58-53.bag            kio-point-10-moving501c_2016-09-16-12-58-53.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-10-without408a_2016-09-16-12-53-06.bag           kio-point-10-without408a_2016-09-16-12-53-06.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-10-without411a_2016-09-16-12-54-05.bag           kio-point-10-without411a_2016-09-16-12-54-05.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-10-without501c_2016-09-16-12-55-07.bag           kio-point-10-without501c_2016-09-16-12-55-07.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-10_2016-09-16-12-51-33.bag                       kio-point-10_2016-09-16-12-51-33.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-11-moving411a-Ramdomly_2016-09-16-13-52-20.bag   kio-point-11-moving411a-Ramdomly_2016-09-16-13-52-20.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-11-moving411a_2016-09-16-13-51-41.bag            kio-point-11-moving411a_2016-09-16-13-51-41.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-11-moving501c-Ramdomly_2016-09-16-13-54-04.bag   kio-point-11-moving501c-Ramdomly_2016-09-16-13-54-04.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-11-moving501c_2016-09-16-13-53-19.bag            kio-point-11-moving501c_2016-09-16-13-53-19.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-11-without408a_2016-09-16-13-35-32.bag           kio-point-11-without408a_2016-09-16-13-35-32.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-11-without411a_2016-09-16-13-37-00.bag           kio-point-11-without411a_2016-09-16-13-37-00.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-11-without501c_2016-09-16-13-39-08.bag           kio-point-11-without501c_2016-09-16-13-39-08.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-11_2016-09-16-13-34-14.bag                       kio-point-11_2016-09-16-13-34-14.bag                    ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-12-moving411a-Ramdomly_2016-09-16-13-59-34.bag   kio-point-12-moving411a-Ramdomly_2016-09-16-13-59-34.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-12-moving411a_2016-09-16-13-58-56.bag            kio-point-12-moving411a_2016-09-16-13-58-56.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-12-moving501c-Ramdomly_2016-09-16-14-01-04.bag   kio-point-12-moving501c-Ramdomly_2016-09-16-14-01-04.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-12-moving501c_2016-09-16-14-00-26.bag            kio-point-12-moving501c_2016-09-16-14-00-26.bag         ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-12-without408a_2016-09-16-13-56-22.bag           kio-point-12-without408a_2016-09-16-13-56-22.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-12-without411a_2016-09-16-13-57-12.bag           kio-point-12-without411a_2016-09-16-13-57-12.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-12-without501c_2016-09-16-13-58-02.bag           kio-point-12-without501c_2016-09-16-13-58-02.bag        ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/AttacksOnRTLSs/v1.0/dist-3/kio-point-12_2016-09-16-13-55-29.bag                       kio-point-12_2016-09-16-13-55-29.bag                    ]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5154</id>
		<title>Benchmark dataset for training/testing of Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5154"/>
				<updated>2017-10-16T08:54:35Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''Machine learning models cyber-attack detection Benchmark Dataset, RRID:SCR_015757.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report describes a benchmark dataset that can be used to train and test machine learning models to detect cyber-attacks on an indoor real-time location system for autonomous robots. Data were gathered in an indoor mock-up apartment, shown in Fig 2, located at the Robotics Lab of the University of León (Spain). An autonomous robot called Orbi-One, shown in Fig 1 and carrying an on-board Real Time Location System (RTLS), was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
== Materials ==&lt;br /&gt;
&lt;br /&gt;
Data gathered by the Orbi-One robot include: &lt;br /&gt;
* Orbi-One location estimates provided by a commercial RTLS, called '''KIO'''.&lt;br /&gt;
&lt;br /&gt;
Additional information about Orbi-One and the devices/packages used to gather the data is given below.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One robot ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One, shown in Fig 1, is an assistant robot manufactured by [http://www.robotnik.es/manipuladores-roboticos-moviles/rb-one/ Robotnik]. The software that controls the robot hardware is based on [http://www.ros.org/ ROS].&lt;br /&gt;
&lt;br /&gt;
[[File:Orbi-Obe_and_KIO.jpg|thumb|'''Fig. 1''': Orbi-One and KIO RTLS.]]&lt;br /&gt;
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
The KIO RTLS, a commercial solution by [https://www.eliko.ee/products/kio-rtls/ Eliko], has been used to provide location estimates in the study area. Fig 1 shows a KIO beacon (1) and a KIO tag mounted on the robot (2).&lt;br /&gt;
&lt;br /&gt;
== Recording procedure ==&lt;br /&gt;
&lt;br /&gt;
Two predefined trajectories were set for the robot in the study area, as shown in Fig 2: a test trajectory (light gray line in Fig 2), used to build a dataset for training and testing the models; and a validation trajectory (dark gray line in Fig 2), used to generate a separate dataset for validating the models at a different location, ensuring generalization. In both cases the robot started at the point marked &amp;quot;0&amp;quot; and finished at the point marked &amp;quot;1&amp;quot;. Data were recorded while the remotely controlled Orbi-One robot moved through the apartment following the test and validation trajectories. We created a separate rosbag file every time Orbi-One completed a walk, saving the location estimates gathered by the KIO device for later analysis.&lt;br /&gt;
&lt;br /&gt;
[[File:apartamento_leon_robotics2017.jpg|thumb|'''Fig. 2''': Robotics mobile lab plane. Light gray line shows the test trajectory. Dark gray line shows the validation trajectory. Red dots show the location of anchors.]]&lt;br /&gt;
&lt;br /&gt;
We repeated the test and validation trajectories 10 times each, so 20 rosbag files were recorded. Orbi-One takes about 72 seconds to complete the test trajectory and about 40 seconds to complete the validation trajectory. On average, each test run yielded 270 location estimates, and each validation run 150.&lt;br /&gt;
&lt;br /&gt;
The runs were recorded in three different scenarios: without any attack (labeled WA), under a DoS attack (labeled A1), and under a Spoofing attack (labeled A2). DoS attacks were carried out by interrupting the signal of one or more radio beacons; Spoofing attacks were carried out by altering the signal of the radio beacons. The affected radio beacons were selected by looking for anchors with redundancy (A-anchors) and anchors without it (C- and D-anchors), at different locations.&lt;br /&gt;
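The scenario label, attacked anchor, route, and recording timestamp are all encoded in the rosbag file names listed below (e.g. A1-408a-validation_2017-03-07-09-32-49.bag). As an illustration only, a small Python parser for this naming convention could look like the sketch below; the convention is inferred from the file listings, and the function and pattern names are hypothetical, not part of any official dataset tooling.&lt;br /&gt;

```python
import re
from datetime import datetime

# Pattern inferred from the file listings on this page, e.g.
# "A1-408a-validation_2017-03-07-09-32-49.bag". Not an official tool.
NAME_RE = re.compile(
    r"^(?P<scenario>WA|A1|A2)"          # WA = no attack, A1 = DoS, A2 = Spoofing
    r"(?:-(?P<anchor>[0-9]{3}[a-d]))?"  # attacked anchor, e.g. 408a (absent for WA)
    r"(?P<validation>-validation)?"     # validation route vs. test route
    r"_(?P<stamp>\d{4}(?:-\d{2}){5})"   # recording timestamp YYYY-MM-DD-HH-MM-SS
    r"\.bag$"
)

def parse_bag_name(name):
    """Return a dict describing a rosbag file name, or None if it does not match."""
    m = NAME_RE.match(name)
    if m is None:
        return None
    return {
        "scenario": m.group("scenario"),
        "anchor": m.group("anchor"),
        "route": "validation" if m.group("validation") else "test",
        "recorded": datetime.strptime(m.group("stamp"), "%Y-%m-%d-%H-%M-%S"),
    }
```

Such a parser makes it straightforward to group the downloaded bag files by scenario and route before analysis.&lt;br /&gt;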
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
=== v1.0 ===&lt;br /&gt;
&lt;br /&gt;
==== Without attack (WA) ====&lt;br /&gt;
&lt;br /&gt;
===== Test route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-06-17-48-50.bag                    WA_2017-03-06-17-48-50.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-08-46-41.bag                    WA_2017-03-07-08-46-41.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-08-48-56.bag                    WA_2017-03-07-08-48-56.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-08-50-50.bag                    WA_2017-03-07-08-50-50.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-14-53-42.bag                    WA_2017-03-07-14-53-42.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-14-55-29.bag                    WA_2017-03-07-14-55-29.bag                ]&lt;br /&gt;
&lt;br /&gt;
===== Validation route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-06-18-05-27.bag         WA-validation_2017-03-06-18-05-27.bag     ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-07-08-53-28.bag         WA-validation_2017-03-07-08-53-28.bag     ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-07-08-54-40.bag         WA-validation_2017-03-07-08-54-40.bag     ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-07-08-55-48.bag         WA-validation_2017-03-07-08-55-48.bag     ]&lt;br /&gt;
&lt;br /&gt;
==== DoS attack (A1) ====&lt;br /&gt;
&lt;br /&gt;
===== Test route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-401d_2017-03-07-14-15-12.bag               A1-401d_2017-03-07-14-15-12.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408a_2017-03-06-17-53-12.bag               A1-408a_2017-03-06-17-53-12.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408b_2017-03-07-14-08-25.bag               A1-408b_2017-03-07-14-08-25.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-501c_2017-03-07-09-00-44.bag               A1-501c_2017-03-07-09-00-44.bag           ]&lt;br /&gt;
&lt;br /&gt;
===== Validation route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-401d-validation_2017-03-07-09-32-49.bag    A1-401d-validation_2017-03-07-09-32-49.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408a-validation_2017-03-06-18-03-44.bag    A1-408a-validation_2017-03-06-18-03-44.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408b-validation_2017-03-07-14-10-42.bag    A1-408b-validation_2017-03-07-14-10-42.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-501c-validation_2017-03-07-09-04-18.bag    A1-501c-validation_2017-03-07-09-04-18.bag]&lt;br /&gt;
&lt;br /&gt;
==== Spoofing attack (A2) ====&lt;br /&gt;
&lt;br /&gt;
===== Test route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-408a_2017-03-06-17-58-28.bag               A2-408a_2017-03-06-17-58-28.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-501c_2017-03-07-09-10-55.bag               A2-501c_2017-03-07-09-10-55.bag           ]&lt;br /&gt;
&lt;br /&gt;
===== Validation route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-401d-validation_2017-03-07-09-35-07.bag    A2-401d-validation_2017-03-07-09-35-07.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-408a-validation_2017-03-06-18-01-09.bag    A2-408a-validation_2017-03-06-18-01-09.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-501c-validation_2017-03-07-09-13-14.bag    A2-501c-validation_2017-03-07-09-13-14.bag]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5153</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5153"/>
				<updated>2017-10-16T08:53:50Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote style=&amp;quot;background-color:#ffe;border:1px solid #fb0;padding:5px 10px&amp;quot;&amp;gt;SciCrunch! reference: '''Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset that can be used to evaluate the performance of different approaches for detecting and tracking people using lidar sensors. The information contained in the dataset is especially suitable as training data for neural-network-based classifiers. Data were gathered in an indoor mock-up apartment, shown in Fig. 1 (B), located at the Robotics Lab of the University of León (Spain). An autonomous robot called Orbi-One, shown in Fig. 1 (A) and equipped with an on-board Real-Time Location System (RTLS), was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
== Materials ==&lt;br /&gt;
&lt;br /&gt;
Data gathered by the Orbi-One robot include:&lt;br /&gt;
* Lidar sensor measurements.&lt;br /&gt;
* Location estimates from two people trackers: '''ROS-LD''' and '''PeTra'''.&lt;br /&gt;
* People locations provided by a commercial RTLS, called '''KIO''', which can be used as ground truth.&lt;br /&gt;
* Other useful data gathered by the robot, such as map information, odometry, and transform data.&lt;br /&gt;
&lt;br /&gt;
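Because the KIO locations can serve as ground truth, a tracker's output can be scored against them, for example by pairing each estimate with the KIO sample closest to it in time and averaging the planar distance. The helper below is a minimal sketch of that idea, not part of the dataset tooling; plain (t, x, y) tuples stand in for the ROS messages.&lt;br /&gt;

```python
import math

def mean_tracking_error(estimates, ground_truth):
    """Mean Euclidean distance between each (t, x, y) estimate and the
    ground-truth sample nearest to it in time."""
    errors = []
    for t, x, y in estimates:
        # Pick the ground-truth sample whose timestamp is closest to t.
        _, gt_x, gt_y = min(ground_truth, key=lambda g: abs(g[0] - t))
        errors.append(math.hypot(x - gt_x, y - gt_y))
    return sum(errors) / len(errors)

# Tiny worked example: one estimate 0.3 m off, one exact.
estimates = [(0.0, 1.0, 2.0), (1.0, 2.0, 2.0)]
ground_truth = [(0.05, 1.0, 2.3), (0.95, 2.0, 2.0)]
print(mean_tracking_error(estimates, ground_truth))  # 0.15
```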
Additional information about Orbi-One and the devices and software packages used to gather the data is given below.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One, shown in Fig. 1 (A), is an assistant robot manufactured by [http://www.robotnik.es/manipuladores-roboticos-moviles/rb-one/ Robotnik]. The software that controls the robot hardware is based on [http://www.ros.org/ ROS].&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.png|frame|center|'''Fig. 1''': From left to right: (A) Orbi-One carrying a KIO tag, and a KIO anchor attached to the ceiling; (B) floor plan of the robotics mobile lab, where red dots show the location of KIO anchors; (C) occupancy map generated using lidar sensor measurements; and (D) network output.]]&lt;br /&gt;
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
The KIO RTLS, a commercial solution by [https://www.eliko.ee/products/kio-rtls/ Eliko], has been used to provide people locations in the study area.&lt;br /&gt;
&lt;br /&gt;
=== ROS Leg Detector (ROS-LD) ===&lt;br /&gt;
&lt;br /&gt;
ROS-LD is a ROS package that takes the messages published by a lidar sensor as input and uses a machine-learning-trained classifier to detect groups of laser readings as possible legs. The code is available in a public [http://wiki.ros.org/leg_detector repository], but is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a people-tracking tool developed by the Robotics Group at the University of León.&lt;br /&gt;
&lt;br /&gt;
== Recording procedure ==&lt;br /&gt;
&lt;br /&gt;
The data were gathered under 14 different scenarios. In all of them, Orbi-One stood still while one or more people, carrying a KIO tag, moved around it. Fig. 2 shows the 14 recognition scenarios recorded. The scenarios were chosen to reflect situations that may occur in robotics competitions such as [https://www.eu-robotics.net/robotics_league/ ERL] or [http://www.robocup.org/ RoboCup].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': recognition scenarios recorded.]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
A rosbag file was created for each scenario (several files for some scenarios, as enumerated below), recording lidar sensor measurements, location estimates from PeTra and ROS-LD, locations from the KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* Laser sensor messages (''sensor_msgs/LaserScan'') published on the ''/scan'' topic.&lt;br /&gt;
* Location estimates calculated by PeTra, published on the ''/person'' topic.&lt;br /&gt;
* Location estimates calculated by ROS-LD, published on the ''/people_tracker_measurements'' topic.&lt;br /&gt;
* Location estimates calculated by the KIO RTLS, published on the ''/kio/PointStamped/4037/out'' topic.&lt;br /&gt;
* Messages from the ''/map'', ''/odom'', and ''/tf'' topics, which include map information, odometry of the robot base, and transform information, respectively.&lt;br /&gt;
&lt;br /&gt;
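The bags can be read with the standard rosbag Python API (this assumes a ROS1 installation; the rosbag package does not ship with plain Python). A sketch that counts the messages recorded on each of the topics above, using an example filename from the v1.0 listing:&lt;br /&gt;

```python
TOPICS = ["/scan", "/person", "/people_tracker_measurements",
          "/kio/PointStamped/4037/out", "/map", "/odom", "/tf"]

def count_messages(bag_path, topics=tuple(TOPICS)):
    """Return a {topic: message count} dict for one rosbag file."""
    import rosbag  # deferred import: requires a ROS1 environment
    counts = {t: 0 for t in topics}
    with rosbag.Bag(bag_path) as bag:
        # read_messages yields (topic, message, timestamp) tuples.
        for topic, _msg, _stamp in bag.read_messages(topics=list(topics)):
            counts[topic] += 1
    return counts

# Example usage: counts = count_messages("test_01.bag")
```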
Different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Jul-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_04_4.bag test_04_4.bag]: '''duration:''' 1:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:'''  9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5152</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5152"/>
				<updated>2017-10-16T08:47:54Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;blockquote&amp;gt;SciCrunch! reference: '''Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743.'''&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset that can be used to evaluate the performance of different approaches for detecting and tracking people using lidar sensors. The information contained in the dataset is especially suitable as training data for neural-network-based classifiers. Data were gathered in an indoor mock-up apartment, shown in Fig. 1 (B), located at the Robotics Lab of the University of León (Spain). An autonomous robot called Orbi-One, shown in Fig. 1 (A) and equipped with an on-board Real-Time Location System (RTLS), was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
== Materials ==&lt;br /&gt;
&lt;br /&gt;
Data gathered by the Orbi-One robot include:&lt;br /&gt;
* Lidar sensor measurements.&lt;br /&gt;
* Location estimates from two people trackers: '''ROS-LD''' and '''PeTra'''.&lt;br /&gt;
* People locations provided by a commercial RTLS, called '''KIO''', which can be used as ground truth.&lt;br /&gt;
* Other useful data gathered by the robot, such as map information, odometry, and transform data.&lt;br /&gt;
&lt;br /&gt;
Additional information about Orbi-One and the devices and software packages used to gather the data is given below.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One, shown in Fig. 1 (A), is an assistant robot manufactured by [http://www.robotnik.es/manipuladores-roboticos-moviles/rb-one/ Robotnik]. The software that controls the robot hardware is based on [http://www.ros.org/ ROS].&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.png|frame|center|'''Fig. 1''': From left to right: (A) Orbi-One carrying a KIO tag, and a KIO anchor attached to the ceiling; (B) floor plan of the robotics mobile lab, where red dots show the location of KIO anchors; (C) occupancy map generated using lidar sensor measurements; and (D) network output.]]&lt;br /&gt;
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
The KIO RTLS, a commercial solution by [https://www.eliko.ee/products/kio-rtls/ Eliko], has been used to provide people locations in the study area.&lt;br /&gt;
&lt;br /&gt;
=== ROS Leg Detector (ROS-LD) ===&lt;br /&gt;
&lt;br /&gt;
ROS-LD is a ROS package that takes the messages published by a lidar sensor as input and uses a machine-learning-trained classifier to detect groups of laser readings as possible legs. The code is available in a public [http://wiki.ros.org/leg_detector repository], but is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a people-tracking tool developed by the Robotics Group at the University of León.&lt;br /&gt;
&lt;br /&gt;
== Recording procedure ==&lt;br /&gt;
&lt;br /&gt;
The data were gathered under 14 different scenarios. In all of them, Orbi-One stood still while one or more people, carrying a KIO tag, moved around it. Fig. 2 shows the 14 recognition scenarios recorded. The scenarios were chosen to reflect situations that may occur in robotics competitions such as [https://www.eu-robotics.net/robotics_league/ ERL] or [http://www.robocup.org/ RoboCup].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': recognition scenarios recorded.]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
A rosbag file was created for each scenario (several files for some scenarios, as enumerated below), recording lidar sensor measurements, location estimates from PeTra and ROS-LD, locations from the KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* Laser sensor messages (''sensor_msgs/LaserScan'') published on the ''/scan'' topic.&lt;br /&gt;
* Location estimates calculated by PeTra, published on the ''/person'' topic.&lt;br /&gt;
* Location estimates calculated by ROS-LD, published on the ''/people_tracker_measurements'' topic.&lt;br /&gt;
* Location estimates calculated by the KIO RTLS, published on the ''/kio/PointStamped/4037/out'' topic.&lt;br /&gt;
* Messages from the ''/map'', ''/odom'', and ''/tf'' topics, which include map information, odometry of the robot base, and transform information, respectively.&lt;br /&gt;
&lt;br /&gt;
Different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Jul-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_04_4.bag test_04_4.bag]: '''duration:''' 1:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:'''  9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5151</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5151"/>
				<updated>2017-10-16T08:46:32Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;SciCrunch! reference: '''Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743.'''&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset that can be used to evaluate the performance of different approaches for detecting and tracking people using lidar sensors. The information contained in the dataset is especially suitable as training data for neural-network-based classifiers. Data were gathered in an indoor mock-up apartment, shown in Fig. 1 (B), located at the Robotics Lab of the University of León (Spain). An autonomous robot called Orbi-One, shown in Fig. 1 (A) and equipped with an on-board Real-Time Location System (RTLS), was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
== Materials ==&lt;br /&gt;
&lt;br /&gt;
Data gathered by the Orbi-One robot include:&lt;br /&gt;
* Lidar sensor measurements.&lt;br /&gt;
* Location estimates from two people trackers: '''ROS-LD''' and '''PeTra'''.&lt;br /&gt;
* People locations provided by a commercial RTLS, called '''KIO''', which can be used as ground truth.&lt;br /&gt;
* Other useful data gathered by the robot, such as map information, odometry, and transform data.&lt;br /&gt;
&lt;br /&gt;
Additional information about Orbi-One and the devices and software packages used to gather the data is given below.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One, shown in Fig. 1 (A), is an assistant robot manufactured by [http://www.robotnik.es/manipuladores-roboticos-moviles/rb-one/ Robotnik]. The software that controls the robot hardware is based on [http://www.ros.org/ ROS].&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.png|frame|center|'''Fig. 1''': From left to right: (A) Orbi-One carrying a KIO tag, and a KIO anchor attached to the ceiling; (B) floor plan of the robotics mobile lab, where red dots show the location of KIO anchors; (C) occupancy map generated using lidar sensor measurements; and (D) network output.]]&lt;br /&gt;
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
The KIO RTLS, a commercial solution by [https://www.eliko.ee/products/kio-rtls/ Eliko], has been used to provide people locations in the study area.&lt;br /&gt;
&lt;br /&gt;
=== ROS Leg Detector (ROS-LD) ===&lt;br /&gt;
&lt;br /&gt;
ROS-LD is a ROS package that takes the messages published by a lidar sensor as input and uses a machine-learning-trained classifier to detect groups of laser readings as possible legs. The code is available in a public [http://wiki.ros.org/leg_detector repository], but is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a people-tracking tool developed by the Robotics Group at the University of León.&lt;br /&gt;
&lt;br /&gt;
== Recording procedure ==&lt;br /&gt;
&lt;br /&gt;
The data were gathered under 14 different scenarios. In all of them, Orbi-One stood still while one or more people, carrying a KIO tag, moved around it. Fig. 2 shows the 14 recognition scenarios recorded. The scenarios were chosen to reflect situations that may occur in robotics competitions such as [https://www.eu-robotics.net/robotics_league/ ERL] or [http://www.robocup.org/ RoboCup].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': recognition scenarios recorded.]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
A rosbag file was created for each scenario (several files for some scenarios, as enumerated below), recording lidar sensor measurements, location estimates from PeTra and ROS-LD, locations from the KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* Laser sensor messages (''sensor_msgs/LaserScan'') published on the ''/scan'' topic.&lt;br /&gt;
* Location estimates calculated by PeTra, published on the ''/person'' topic.&lt;br /&gt;
* Location estimates calculated by ROS-LD, published on the ''/people_tracker_measurements'' topic.&lt;br /&gt;
* Location estimates calculated by the KIO RTLS, published on the ''/kio/PointStamped/4037/out'' topic.&lt;br /&gt;
* Messages from the ''/map'', ''/odom'', and ''/tf'' topics, which include map information, odometry of the robot base, and transform information, respectively.&lt;br /&gt;
&lt;br /&gt;
Different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Jul-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_04_4.bag test_04_4.bag]: '''duration:''' 1:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:'''  9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5150</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5150"/>
				<updated>2017-10-16T08:45:25Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Grupo de Robótica Benchmark Dataset, '''RRID:SCR_015743'''.&lt;br /&gt;
&lt;br /&gt;
Range-based people tracker classifiers Benchmark Dataset, RRID:SCR_015743&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset which can be used to evaluate the performance of different approaches for detecting and tracking people using lidar sensors. The information contained in the dataset is especially suitable as training data for neural network-based classifiers. Data have been gathered in an indoor mock-up apartment, shown in Fig 1 (B), located at the Robotics Lab of the University of León (Spain). An autonomous robot called Orbi-One, shown in Fig 1 (A), with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
== Materials ==&lt;br /&gt;
&lt;br /&gt;
Data gathered by the Orbi-One robot include:&lt;br /&gt;
* Lidar sensor measurements.&lt;br /&gt;
* Location estimates from two people trackers: '''ROS-LD''' and '''PeTra'''.&lt;br /&gt;
* People locations provided by a commercial RTLS, called '''KIO''', which can be used as ground truth.&lt;br /&gt;
* Other useful data gathered by the Orbi-One robot, such as map information, odometry, and transform data.&lt;br /&gt;
&lt;br /&gt;
Additional information about Orbi-One and the devices/packages used to get data is given below.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-one ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One, shown in Fig 1 (A), is an assistant robot manufactured by [http://www.robotnik.es/manipuladores-roboticos-moviles/rb-one/ Robotnik]. The software to control the robot hardware is based on [http://www.ros.org/ ROS].&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.png|frame|center|'''Fig. 1''': From left to right: (A) Orbi-One carrying a KIO tag, and a KIO anchor attached to the ceiling; (B) robotics mobile lab floor plan, where red dots show the location of KIO anchors; (C) occupancy map generated using lidar sensor measurements; and (D) network output.]]&lt;br /&gt;
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
The KIO RTLS, a commercial solution by [https://www.eliko.ee/products/kio-rtls/ Eliko], has been used to provide people locations in the study area.&lt;br /&gt;
&lt;br /&gt;
=== ROS Leg Detector (ROS-LD) ===&lt;br /&gt;
&lt;br /&gt;
ROS-LD is a ROS package which takes messages published by a lidar sensor as input and uses a machine-learning-trained classifier to detect groups of laser readings as possible legs. The code is available in a public [http://wiki.ros.org/leg_detector repository], but is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a tool for detecting and tracking people, developed by the Robotics Group at the University of León.&lt;br /&gt;
&lt;br /&gt;
== Recording procedure ==&lt;br /&gt;
&lt;br /&gt;
The data were gathered under 14 different scenarios. In all of them, Orbi-One was standing still while one or more people, carrying a KIO tag, moved around it. Fig. 2 shows the 14 recognition scenarios recorded. These scenarios were chosen according to different situations that may occur in robotics competitions such as [https://www.eu-robotics.net/robotics_league/ ERL] or [http://www.robocup.org/ RoboCup].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': recognition scenarios recorded.]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
One rosbag file was created for each run of a scenario (scenarios 03, 04, 09, 12, and 13 were recorded several times), recording lidar sensor measurements, location estimates from PeTra and ROS-LD, locations from the KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* Laser sensor messages (''sensor_msgs/LaserScan'') published at the ''/scan'' topic.&lt;br /&gt;
* Location estimates calculated by PeTra, published at the ''/person'' topic.&lt;br /&gt;
* Location estimates calculated by ROS-LD, published at the ''/people_tracker_measurements'' topic.&lt;br /&gt;
* Location estimates calculated by the KIO RTLS, published at the ''/kio/PointStamped/4037/out'' topic.&lt;br /&gt;
* Messages from the ''/map'', ''/odom'', and ''/tf'' topics, which include map information, robot base odometry, and transform information, respectively.&lt;br /&gt;
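The topics above can be inspected programmatically. A minimal sketch, assuming the ROS 1 Python API (the counting helper itself only assumes the (topic, msg, t) tuple shape yielded by rosbag.Bag.read_messages()):&lt;br /&gt;

```python
from collections import Counter

def count_by_topic(messages):
    """Count messages per topic from an iterable of (topic, msg, t)
    tuples, i.e. the shape yielded by rosbag.Bag.read_messages() in ROS 1."""
    return Counter(topic for topic, _msg, _t in messages)

# With ROS installed, one could apply it to a downloaded bag, e.g.:
#   import rosbag
#   with rosbag.Bag('test_01.bag') as bag:
#       counts = count_by_topic(bag.read_messages(topics=['/scan', '/person']))
```

This is a sketch, not part of the dataset tooling; the helper is kept pure so it can be reused with any message iterator.&lt;br /&gt;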
&lt;br /&gt;
Different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Jul-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a first version of the dataset has been released. It includes measurements for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_04_4.bag test_04_4.bag]: '''duration:''' 1:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:'''  9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Datasets&amp;diff=5149</id>
		<title>Datasets</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Datasets&amp;diff=5149"/>
				<updated>2017-10-16T08:43:25Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This site summarizes different datasets gathered by the Robotics Group during its research.&lt;br /&gt;
&lt;br /&gt;
__FORCETOC__&lt;br /&gt;
&lt;br /&gt;
== Available datasets ==&lt;br /&gt;
&lt;br /&gt;
Currently the following datasets are available:&lt;br /&gt;
&lt;br /&gt;
=== (RRID:SCR_015756) Benchmark dataset for analysis of cyber-attacks to an indoor real time localization system for autonomous robots ===&lt;br /&gt;
&lt;br /&gt;
This dataset can be used to analyze cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Karen, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
[http://robotica.unileon.es/index.php/Benchmark_dataset_for_analysis_of_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots  Go to dataset]&lt;br /&gt;
&lt;br /&gt;
=== (RRID:SCR_015757) Benchmark dataset for training/testing of Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots ===&lt;br /&gt;
&lt;br /&gt;
This dataset can be used to train and test Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Orbi-One, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
[http://robotica.unileon.es/index.php/Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots  Go to dataset]&lt;br /&gt;
&lt;br /&gt;
=== (RRID:SCR_015743) Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots ===&lt;br /&gt;
&lt;br /&gt;
This dataset can be used to evaluate the performance of different approaches for detecting and tracking people using lidar sensors. The information contained in the dataset is especially suitable as training data for neural network-based classifiers.&lt;br /&gt;
&lt;br /&gt;
[http://robotica.unileon.es/index.php/Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots Go to dataset]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5148</id>
		<title>Benchmark dataset for training/testing of Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5148"/>
				<updated>2017-10-11T15:07:53Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: /* Validation route === */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This data report summarizes a benchmark dataset which can be used to train and test Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment, shown in Fig 2, located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Orbi-One and shown in Fig 1, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
== Materials ==&lt;br /&gt;
&lt;br /&gt;
Data gathered by the Orbi-One robot include:&lt;br /&gt;
* Orbi-One location estimates provided by a commercial RTLS, called '''KIO'''.&lt;br /&gt;
&lt;br /&gt;
Additional information about Orbi-One and the devices/packages used to get data is given below.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One robot ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One, shown in Fig 1, is an assistant robot manufactured by [http://www.robotnik.es/manipuladores-roboticos-moviles/rb-one/ Robotnik]. The software to control the robot hardware is based on [http://www.ros.org/ ROS].&lt;br /&gt;
&lt;br /&gt;
[[File:Orbi-Obe_and_KIO.jpg|thumb|'''Fig. 1''': Orbi-One and KIO RTLS.]]&lt;br /&gt;
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
The KIO RTLS, a commercial solution by [https://www.eliko.ee/products/kio-rtls/ Eliko], has been used to provide location estimates in the study area. Fig 1 shows a KIO beacon (1) and a KIO tag on the robot (2).&lt;br /&gt;
&lt;br /&gt;
== Recording procedure ==&lt;br /&gt;
&lt;br /&gt;
Two predefined trajectories were set for the robot in the study area, as shown in Fig 2: a test trajectory (light gray line in Fig 2), used to build a dataset for training and testing the models; and a validation trajectory (dark gray line in Fig 2), used to generate a separate dataset for validating the models in a different location, ensuring generalization. In both cases the robot started at the point marked &amp;quot;0&amp;quot; and finished at the point marked &amp;quot;1&amp;quot;. Data were recorded by the remotely controlled Orbi-One robot moving through the apartment along the test and validation trajectories respectively. A different rosbag file was created every time Orbi-One completed a route, saving the location estimates gathered by the KIO device for later analysis.&lt;br /&gt;
&lt;br /&gt;
[[File:apartamento_leon_robotics2017.jpg|thumb|'''Fig. 2''': Robotics mobile lab floor plan. The light gray line shows the test trajectory. The dark gray line shows the validation trajectory. Red dots show the location of anchors.]]&lt;br /&gt;
&lt;br /&gt;
We repeated the test and validation trajectories 10 times each, so 20 rosbag files were recorded. Orbi-One takes about 72 seconds to complete the test trajectory, and about 40 seconds to complete the validation trajectory. Each test run yielded about 270 location estimates on average, and each validation run about 150.&lt;br /&gt;
&lt;br /&gt;
The runs were recorded in three different scenarios: without any attack (labeled WA), under a DoS attack (labeled A1), and under a Spoofing attack (labeled A2). DoS attacks were carried out by interrupting the signal of one or more radio beacons. Spoofing attacks were carried out by forging the signal of the radio beacons. The affected radio beacons were selected by looking for anchors with redundancy (A-anchors) and anchors without it (C- and D-anchors), at different locations.&lt;br /&gt;
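The WA/A1/A2 labels are encoded in the bag filenames listed in the Data section. As a hypothetical helper (the naming scheme LABEL[-ANCHOR][-validation]_TIMESTAMP.bag is inferred from the file listing, not documented elsewhere), the label, affected anchor, and route can be recovered like this:&lt;br /&gt;

```python
def parse_bag_name(filename):
    """Recover the scenario label (WA, A1 or A2), the affected anchor id
    (e.g. 401d, if any) and whether the run followed the validation route,
    assuming the inferred naming scheme LABEL[-ANCHOR][-validation]_TIMESTAMP.bag."""
    stem = filename.split('_')[0]          # e.g. 'A1-401d-validation'
    parts = stem.split('-')
    extras = parts[1:]
    return {
        'attack': parts[0],
        'anchor': next((p for p in extras if p != 'validation'), None),
        'validation': 'validation' in extras,
    }

# parse_bag_name('A1-401d-validation_2017-03-07-09-32-49.bag')
#   yields attack='A1', anchor='401d', validation=True
```

This is only a convenience sketch for grouping the files by scenario when building training and validation sets.&lt;br /&gt;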
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
=== v1.0 ===&lt;br /&gt;
&lt;br /&gt;
==== Without attack (WA) ====&lt;br /&gt;
&lt;br /&gt;
===== Test route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-06-17-48-50.bag                    WA_2017-03-06-17-48-50.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-08-46-41.bag                    WA_2017-03-07-08-46-41.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-08-48-56.bag                    WA_2017-03-07-08-48-56.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-08-50-50.bag                    WA_2017-03-07-08-50-50.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-14-53-42.bag                    WA_2017-03-07-14-53-42.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-14-55-29.bag                    WA_2017-03-07-14-55-29.bag                ]&lt;br /&gt;
&lt;br /&gt;
===== Validation route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-06-18-05-27.bag         WA-validation_2017-03-06-18-05-27.bag     ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-07-08-53-28.bag         WA-validation_2017-03-07-08-53-28.bag     ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-07-08-54-40.bag         WA-validation_2017-03-07-08-54-40.bag     ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-07-08-55-48.bag         WA-validation_2017-03-07-08-55-48.bag     ]&lt;br /&gt;
&lt;br /&gt;
==== DoS attack (A1) ====&lt;br /&gt;
&lt;br /&gt;
===== Test route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-401d_2017-03-07-14-15-12.bag               A1-401d_2017-03-07-14-15-12.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408a_2017-03-06-17-53-12.bag               A1-408a_2017-03-06-17-53-12.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408b_2017-03-07-14-08-25.bag               A1-408b_2017-03-07-14-08-25.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-501c_2017-03-07-09-00-44.bag               A1-501c_2017-03-07-09-00-44.bag           ]&lt;br /&gt;
&lt;br /&gt;
===== Validation route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-401d-validation_2017-03-07-09-32-49.bag    A1-401d-validation_2017-03-07-09-32-49.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408a-validation_2017-03-06-18-03-44.bag    A1-408a-validation_2017-03-06-18-03-44.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408b-validation_2017-03-07-14-10-42.bag    A1-408b-validation_2017-03-07-14-10-42.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-501c-validation_2017-03-07-09-04-18.bag    A1-501c-validation_2017-03-07-09-04-18.bag]&lt;br /&gt;
&lt;br /&gt;
==== Spoofing attack (A2) ====&lt;br /&gt;
&lt;br /&gt;
===== Test route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-408a_2017-03-06-17-58-28.bag               A2-408a_2017-03-06-17-58-28.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-501c_2017-03-07-09-10-55.bag               A2-501c_2017-03-07-09-10-55.bag           ]&lt;br /&gt;
&lt;br /&gt;
===== Validation route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-401d-validation_2017-03-07-09-35-07.bag    A2-401d-validation_2017-03-07-09-35-07.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-408a-validation_2017-03-06-18-01-09.bag    A2-408a-validation_2017-03-06-18-01-09.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-501c-validation_2017-03-07-09-13-14.bag    A2-501c-validation_2017-03-07-09-13-14.bag]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5147</id>
		<title>Benchmark dataset for training/testing of Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5147"/>
				<updated>2017-10-11T15:07:44Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: /* v1.0 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This data report summarizes a benchmark dataset which can be used to train and test Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment, shown in Fig 2, located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Orbi-One and shown in Fig 1, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
== Materials ==&lt;br /&gt;
&lt;br /&gt;
Data gathered by the Orbi-One robot include:&lt;br /&gt;
* Orbi-One location estimates provided by a commercial RTLS, called '''KIO'''.&lt;br /&gt;
&lt;br /&gt;
Additional information about Orbi-One and the devices/packages used to get data is given below.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One robot ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One, shown in Fig 1, is an assistant robot manufactured by [http://www.robotnik.es/manipuladores-roboticos-moviles/rb-one/ Robotnik]. The software to control the robot hardware is based on [http://www.ros.org/ ROS].&lt;br /&gt;
&lt;br /&gt;
[[File:Orbi-Obe_and_KIO.jpg|thumb|'''Fig. 1''': Orbi-One and KIO RTLS.]]&lt;br /&gt;
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
The KIO RTLS, a commercial solution by [https://www.eliko.ee/products/kio-rtls/ Eliko], has been used to provide location estimates in the study area. Fig 1 shows a KIO beacon (1) and a KIO tag on the robot (2).&lt;br /&gt;
&lt;br /&gt;
== Recording procedure ==&lt;br /&gt;
&lt;br /&gt;
Two predefined trajectories were set for the robot in the study area, as shown in Fig 2: a test trajectory (light gray line in Fig 2), used to build a dataset for training and testing the models; and a validation trajectory (dark gray line in Fig 2), used to generate a separate dataset for validating the models in a different location, ensuring generalization. In both cases the robot started at the point marked &amp;quot;0&amp;quot; and finished at the point marked &amp;quot;1&amp;quot;. Data were recorded by the remotely controlled Orbi-One robot moving through the apartment along the test and validation trajectories respectively. A different rosbag file was created every time Orbi-One completed a route, saving the location estimates gathered by the KIO device for later analysis.&lt;br /&gt;
&lt;br /&gt;
[[File:apartamento_leon_robotics2017.jpg|thumb|'''Fig. 2''': Robotics mobile lab floor plan. The light gray line shows the test trajectory. The dark gray line shows the validation trajectory. Red dots show the location of anchors.]]&lt;br /&gt;
&lt;br /&gt;
We repeated the test and validation trajectories 10 times each, so 20 rosbag files were recorded. Orbi-One takes about 72 seconds to complete the test trajectory, and about 40 seconds to complete the validation trajectory. Each test run yielded about 270 location estimates on average, and each validation run about 150.&lt;br /&gt;
&lt;br /&gt;
The runs were recorded in three different scenarios: without any attack (labeled WA), under a DoS attack (labeled A1), and under a Spoofing attack (labeled A2). DoS attacks were carried out by interrupting the signal of one or more radio beacons. Spoofing attacks were carried out by forging the signal of the radio beacons. The affected radio beacons were selected by looking for anchors with redundancy (A-anchors) and anchors without it (C- and D-anchors), at different locations.&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
=== v1.0 ===&lt;br /&gt;
&lt;br /&gt;
==== Without attack (WA) ====&lt;br /&gt;
&lt;br /&gt;
===== Test route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-06-17-48-50.bag                    WA_2017-03-06-17-48-50.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-08-46-41.bag                    WA_2017-03-07-08-46-41.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-08-48-56.bag                    WA_2017-03-07-08-48-56.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-08-50-50.bag                    WA_2017-03-07-08-50-50.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-14-53-42.bag                    WA_2017-03-07-14-53-42.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-14-55-29.bag                    WA_2017-03-07-14-55-29.bag                ]&lt;br /&gt;
&lt;br /&gt;
===== Validation route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-06-18-05-27.bag         WA-validation_2017-03-06-18-05-27.bag     ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-07-08-53-28.bag         WA-validation_2017-03-07-08-53-28.bag     ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-07-08-54-40.bag         WA-validation_2017-03-07-08-54-40.bag     ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-07-08-55-48.bag         WA-validation_2017-03-07-08-55-48.bag     ]&lt;br /&gt;
&lt;br /&gt;
==== DoS attack (A1) ====&lt;br /&gt;
&lt;br /&gt;
===== Test route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-401d_2017-03-07-14-15-12.bag               A1-401d_2017-03-07-14-15-12.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408a_2017-03-06-17-53-12.bag               A1-408a_2017-03-06-17-53-12.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408b_2017-03-07-14-08-25.bag               A1-408b_2017-03-07-14-08-25.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-501c_2017-03-07-09-00-44.bag               A1-501c_2017-03-07-09-00-44.bag           ]&lt;br /&gt;
&lt;br /&gt;
===== Validation route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-401d-validation_2017-03-07-09-32-49.bag    A1-401d-validation_2017-03-07-09-32-49.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408a-validation_2017-03-06-18-03-44.bag    A1-408a-validation_2017-03-06-18-03-44.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408b-validation_2017-03-07-14-10-42.bag    A1-408b-validation_2017-03-07-14-10-42.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-501c-validation_2017-03-07-09-04-18.bag    A1-501c-validation_2017-03-07-09-04-18.bag]&lt;br /&gt;
&lt;br /&gt;
==== Spoofing attack (A2) ====&lt;br /&gt;
&lt;br /&gt;
===== Test route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-408a_2017-03-06-17-58-28.bag               A2-408a_2017-03-06-17-58-28.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-501c_2017-03-07-09-10-55.bag               A2-501c_2017-03-07-09-10-55.bag           ]&lt;br /&gt;
&lt;br /&gt;
===== Validation route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-401d-validation_2017-03-07-09-35-07.bag    A2-401d-validation_2017-03-07-09-35-07.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-408a-validation_2017-03-06-18-01-09.bag    A2-408a-validation_2017-03-06-18-01-09.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-501c-validation_2017-03-07-09-13-14.bag    A2-501c-validation_2017-03-07-09-13-14.bag]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5146</id>
		<title>Benchmark dataset for training/testing of Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5146"/>
				<updated>2017-10-11T15:06:52Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: /* v1.0 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This data report summarizes a benchmark dataset which can be used to train and test Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment, shown in Fig 2, located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Orbi-One and shown in Fig 1, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
== Materials ==&lt;br /&gt;
&lt;br /&gt;
Data gathered by the Orbi-One robot include:&lt;br /&gt;
* Orbi-One location estimates provided by a commercial RTLS, called '''KIO'''.&lt;br /&gt;
&lt;br /&gt;
Additional information about Orbi-One and the devices/packages used to gather the data is given below.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One robot ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One, shown in Fig 1, is an assistant robot manufactured by [http://www.robotnik.es/manipuladores-roboticos-moviles/rb-one/ Robotnik]. The software to control the robot hardware is based on [http://www.ros.org/ ROS].&lt;br /&gt;
&lt;br /&gt;
[[File:Orbi-Obe_and_KIO.jpg|thumb|'''Fig. 1''': Orbi-One and KIO RTLS.]]&lt;br /&gt;
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
The KIO RTLS, a commercial solution by [https://www.eliko.ee/products/kio-rtls/ Eliko], has been used to provide location estimates in the study area. Fig 1 shows a KIO beacon (1) and a KIO tag on the robot (2).&lt;br /&gt;
&lt;br /&gt;
== Recording procedure ==&lt;br /&gt;
&lt;br /&gt;
Two predefined trajectories were set for the robot in the study area, as shown in Fig 2: a test trajectory (light gray line in Fig 2), used to build the training dataset for training and testing the models; and a validation trajectory (dark gray line in Fig 2), used to generate a separate dataset for validating the models in a different location, ensuring generalization. In both cases the robot started at the point marked &amp;quot;0&amp;quot; and finished at the point marked &amp;quot;1&amp;quot;. Data were recorded by the Orbi-One robot moving through the apartment under remote control, following the test and validation trajectories respectively. A new rosbag file was created each time Orbi-One completed a route, saving the location estimates gathered by the KIO device for later analysis.&lt;br /&gt;
&lt;br /&gt;
[[File:apartamento_leon_robotics2017.jpg|thumb|'''Fig. 2''': Robotics mobile lab floor plan. The light gray line shows the test trajectory, the dark gray line shows the validation trajectory, and red dots show the location of anchors.]]&lt;br /&gt;
&lt;br /&gt;
We repeated the test and validation trajectories 10 times each, so 20 rosbag files were recorded. Orbi-One takes about 72 seconds to complete the test trajectory and about 40 seconds to complete the validation trajectory. Each test run yielded about 270 location estimates on average, and each validation run about 150.&lt;br /&gt;
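As a quick sanity check (ours, not part of the original data report), the run durations and estimate counts above both imply a KIO update rate of roughly 3.75 location estimates per second:&lt;br /&gt;

```python
# Consistency check on the figures quoted above (approximate
# averages taken from the text, not measured here).
test_estimates, test_seconds = 270, 72  # test route: ~270 estimates in ~72 s
val_estimates, val_seconds = 150, 40    # validation route: ~150 estimates in ~40 s

test_rate = test_estimates / test_seconds  # 3.75 estimates/s
val_rate = val_estimates / val_seconds     # 3.75 estimates/s

print("test route rate:       %.2f estimates/s" % test_rate)
print("validation route rate: %.2f estimates/s" % val_rate)
```

That both routes yield the same rate suggests the counts are consistent with a single fixed KIO sampling rate.&lt;br /&gt;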
&lt;br /&gt;
The runs were recorded under three different scenarios: without any attack (labeled WA), under a DoS attack (labeled A1), and under a spoofing attack (labeled A2). DoS attacks were carried out by interrupting the signal of one or more radio beacons; spoofing attacks were carried out by altering the signal of the radio beacons. The affected beacons were selected to cover anchors with redundancy (A-anchors) and anchors without it (C- and D-anchors), at different locations.&lt;br /&gt;
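The bag filenames listed in the Data section follow a regular pattern: a scenario label (WA, A1 or A2), the affected anchor for attack runs (e.g. 401d), an optional -validation suffix for validation-route runs, and the recording timestamp. A minimal Python sketch for splitting a filename into these fields (the function name and the returned tuple layout are ours, for illustration only):&lt;br /&gt;

```python
import re

# Matches names such as "WA_2017-03-06-17-48-50.bag" and
# "A1-401d-validation_2017-03-07-09-32-49.bag":
#   group 1: scenario label (WA, A1 or A2)
#   group 2: affected anchor (attack runs only, e.g. 401d)
#   group 3: "-validation" suffix for validation-route runs
#   group 4: recording timestamp
BAG_NAME = re.compile(
    r"^(WA|A1|A2)(?:-(\d{3}[a-z]))?(-validation)?_([\d-]+)\.bag$"
)

def parse_bag_name(name):
    """Split a bag filename into (scenario, anchor, route, timestamp)."""
    m = BAG_NAME.match(name)
    if m is None:
        raise ValueError("unrecognized bag name: " + name)
    scenario, anchor, validation, stamp = m.groups()
    route = "validation" if validation else "test"
    return scenario, anchor, route, stamp
```

For example, parse_bag_name("A1-401d-validation_2017-03-07-09-32-49.bag") returns ("A1", "401d", "validation", "2017-03-07-09-32-49"); for WA runs the anchor field is None.&lt;br /&gt;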
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
=== v1.0 ===&lt;br /&gt;
&lt;br /&gt;
==== Without attack (WA) ====&lt;br /&gt;
&lt;br /&gt;
===== Test route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-06-17-48-50.bag                    WA_2017-03-06-17-48-50.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-08-46-41.bag                    WA_2017-03-07-08-46-41.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-08-48-56.bag                    WA_2017-03-07-08-48-56.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-08-50-50.bag                    WA_2017-03-07-08-50-50.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-14-53-42.bag                    WA_2017-03-07-14-53-42.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-14-55-29.bag                    WA_2017-03-07-14-55-29.bag                ]&lt;br /&gt;
&lt;br /&gt;
===== Validation route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-06-18-05-27.bag         WA-validation_2017-03-06-18-05-27.bag     ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-07-08-53-28.bag         WA-validation_2017-03-07-08-53-28.bag     ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-07-08-54-40.bag         WA-validation_2017-03-07-08-54-40.bag     ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-07-08-55-48.bag         WA-validation_2017-03-07-08-55-48.bag     ]&lt;br /&gt;
&lt;br /&gt;
==== DoS attack (A1) ====&lt;br /&gt;
&lt;br /&gt;
===== Test route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-401d_2017-03-07-14-15-12.bag               A1-401d_2017-03-07-14-15-12.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408a_2017-03-06-17-53-12.bag               A1-408a_2017-03-06-17-53-12.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408b_2017-03-07-14-08-25.bag               A1-408b_2017-03-07-14-08-25.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-501c_2017-03-07-09-00-44.bag               A1-501c_2017-03-07-09-00-44.bag           ]&lt;br /&gt;
&lt;br /&gt;
===== Validation route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-401d-validation_2017-03-07-09-32-49.bag    A1-401d-validation_2017-03-07-09-32-49.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408a-validation_2017-03-06-18-03-44.bag    A1-408a-validation_2017-03-06-18-03-44.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408b-validation_2017-03-07-14-10-42.bag    A1-408b-validation_2017-03-07-14-10-42.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-501c-validation_2017-03-07-09-04-18.bag    A1-501c-validation_2017-03-07-09-04-18.bag]&lt;br /&gt;
&lt;br /&gt;
==== Spoofing attack (A2) ====&lt;br /&gt;
&lt;br /&gt;
===== Test route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-408a_2017-03-06-17-58-28.bag               A2-408a_2017-03-06-17-58-28.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-501c_2017-03-07-09-10-55.bag               A2-501c_2017-03-07-09-10-55.bag           ]&lt;br /&gt;
&lt;br /&gt;
===== Validation route =====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-401d-validation_2017-03-07-09-35-07.bag    A2-401d-validation_2017-03-07-09-35-07.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-408a-validation_2017-03-06-18-01-09.bag    A2-408a-validation_2017-03-06-18-01-09.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-501c-validation_2017-03-07-09-13-14.bag    A2-501c-validation_2017-03-07-09-13-14.bag]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5145</id>
		<title>Benchmark dataset for training/testing of Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5145"/>
				<updated>2017-10-11T15:05:43Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This data report summarizes a benchmark dataset which can be used to train and test Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment, shown in Fig 2, located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Orbi-One and shown in Fig 1, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
== Materials ==&lt;br /&gt;
&lt;br /&gt;
Data gathered by the Orbi-One robot include:&lt;br /&gt;
* Orbi-One location estimates provided by a commercial RTLS, called '''KIO'''.&lt;br /&gt;
&lt;br /&gt;
Additional information about Orbi-One and the devices/packages used to gather the data is given below.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One robot ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One, shown in Fig 1, is an assistant robot manufactured by [http://www.robotnik.es/manipuladores-roboticos-moviles/rb-one/ Robotnik]. The software to control the robot hardware is based on [http://www.ros.org/ ROS].&lt;br /&gt;
&lt;br /&gt;
[[File:Orbi-Obe_and_KIO.jpg|thumb|'''Fig. 1''': Orbi-One and KIO RTLS.]]&lt;br /&gt;
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
The KIO RTLS, a commercial solution by [https://www.eliko.ee/products/kio-rtls/ Eliko], has been used to provide location estimates in the study area. Fig 1 shows a KIO beacon (1) and a KIO tag on the robot (2).&lt;br /&gt;
&lt;br /&gt;
== Recording procedure ==&lt;br /&gt;
&lt;br /&gt;
Two predefined trajectories were set for the robot in the study area, as shown in Fig 2: a test trajectory (light gray line in Fig 2), used to build the training dataset for training and testing the models; and a validation trajectory (dark gray line in Fig 2), used to generate a separate dataset for validating the models in a different location, ensuring generalization. In both cases the robot started at the point marked &amp;quot;0&amp;quot; and finished at the point marked &amp;quot;1&amp;quot;. Data were recorded by the Orbi-One robot moving through the apartment under remote control, following the test and validation trajectories respectively. A new rosbag file was created each time Orbi-One completed a route, saving the location estimates gathered by the KIO device for later analysis.&lt;br /&gt;
&lt;br /&gt;
[[File:apartamento_leon_robotics2017.jpg|thumb|'''Fig. 2''': Robotics mobile lab floor plan. The light gray line shows the test trajectory, the dark gray line shows the validation trajectory, and red dots show the location of anchors.]]&lt;br /&gt;
&lt;br /&gt;
We repeated the test and validation trajectories 10 times each, so 20 rosbag files were recorded. Orbi-One takes about 72 seconds to complete the test trajectory and about 40 seconds to complete the validation trajectory. Each test run yielded about 270 location estimates on average, and each validation run about 150.&lt;br /&gt;
&lt;br /&gt;
The runs were recorded under three different scenarios: without any attack (labeled WA), under a DoS attack (labeled A1), and under a spoofing attack (labeled A2). DoS attacks were carried out by interrupting the signal of one or more radio beacons; spoofing attacks were carried out by altering the signal of the radio beacons. The affected beacons were selected to cover anchors with redundancy (A-anchors) and anchors without it (C- and D-anchors), at different locations.&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
=== v1.0 ===&lt;br /&gt;
&lt;br /&gt;
==== Without attack (test route) ====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-06-17-48-50.bag                    WA_2017-03-06-17-48-50.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-08-46-41.bag                    WA_2017-03-07-08-46-41.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-08-48-56.bag                    WA_2017-03-07-08-48-56.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-08-50-50.bag                    WA_2017-03-07-08-50-50.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-14-53-42.bag                    WA_2017-03-07-14-53-42.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-14-55-29.bag                    WA_2017-03-07-14-55-29.bag                ]&lt;br /&gt;
&lt;br /&gt;
==== Without attack (validation route) ====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-06-18-05-27.bag         WA-validation_2017-03-06-18-05-27.bag     ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-07-08-53-28.bag         WA-validation_2017-03-07-08-53-28.bag     ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-07-08-54-40.bag         WA-validation_2017-03-07-08-54-40.bag     ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-07-08-55-48.bag         WA-validation_2017-03-07-08-55-48.bag     ]&lt;br /&gt;
&lt;br /&gt;
==== DoS attack (test route) ====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-401d_2017-03-07-14-15-12.bag               A1-401d_2017-03-07-14-15-12.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408a_2017-03-06-17-53-12.bag               A1-408a_2017-03-06-17-53-12.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408b_2017-03-07-14-08-25.bag               A1-408b_2017-03-07-14-08-25.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-501c_2017-03-07-09-00-44.bag               A1-501c_2017-03-07-09-00-44.bag           ]&lt;br /&gt;
&lt;br /&gt;
==== DoS attack (validation route) ====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-401d-validation_2017-03-07-09-32-49.bag    A1-401d-validation_2017-03-07-09-32-49.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408a-validation_2017-03-06-18-03-44.bag    A1-408a-validation_2017-03-06-18-03-44.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408b-validation_2017-03-07-14-10-42.bag    A1-408b-validation_2017-03-07-14-10-42.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-501c-validation_2017-03-07-09-04-18.bag    A1-501c-validation_2017-03-07-09-04-18.bag]&lt;br /&gt;
&lt;br /&gt;
==== Spoofing attack (test route) ====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-408a_2017-03-06-17-58-28.bag               A2-408a_2017-03-06-17-58-28.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-501c_2017-03-07-09-10-55.bag               A2-501c_2017-03-07-09-10-55.bag           ]&lt;br /&gt;
&lt;br /&gt;
==== Spoofing attack (validation route) ====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-401d-validation_2017-03-07-09-35-07.bag    A2-401d-validation_2017-03-07-09-35-07.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-408a-validation_2017-03-06-18-01-09.bag    A2-408a-validation_2017-03-06-18-01-09.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-501c-validation_2017-03-07-09-13-14.bag    A2-501c-validation_2017-03-07-09-13-14.bag]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Datasets&amp;diff=5144</id>
		<title>Datasets</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Datasets&amp;diff=5144"/>
				<updated>2017-10-11T15:02:01Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This site summarizes the datasets gathered by the Robotics Group in the course of its research.&lt;br /&gt;
&lt;br /&gt;
__FORCETOC__&lt;br /&gt;
&lt;br /&gt;
== Available datasets ==&lt;br /&gt;
&lt;br /&gt;
Currently the following datasets are available:&lt;br /&gt;
&lt;br /&gt;
=== Benchmark dataset for analysis of cyber-attacks to an indoor real time localization system for autonomous robots ===&lt;br /&gt;
&lt;br /&gt;
This dataset can be used to analyze cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Karen, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
[http://robotica.unileon.es/index.php/Benchmark_dataset_for_analysis_of_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots  Go to dataset]&lt;br /&gt;
&lt;br /&gt;
=== Benchmark dataset for training/testing of Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots ===&lt;br /&gt;
&lt;br /&gt;
This dataset can be used to train and test Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Orbi-One, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
[http://robotica.unileon.es/index.php/Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots  Go to dataset]&lt;br /&gt;
&lt;br /&gt;
=== (RRID:SCR_015743) Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots ===&lt;br /&gt;
&lt;br /&gt;
This dataset can be used to evaluate the performance of different approaches for detecting and tracking people using lidar sensors. The information contained in the dataset is especially suitable as training data for neural network-based classifiers.&lt;br /&gt;
&lt;br /&gt;
[http://robotica.unileon.es/index.php/Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots Go to dataset]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Datasets&amp;diff=5143</id>
		<title>Datasets</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Datasets&amp;diff=5143"/>
				<updated>2017-10-11T15:00:45Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This site summarizes the datasets gathered by the Robotics Group in the course of its research.&lt;br /&gt;
&lt;br /&gt;
__FORCETOC__&lt;br /&gt;
&lt;br /&gt;
== Available datasets ==&lt;br /&gt;
&lt;br /&gt;
Currently the following datasets are available:&lt;br /&gt;
&lt;br /&gt;
=== [http://robotica.unileon.es/index.php/Benchmark_dataset_for_analysis_of_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots Benchmark dataset for analysis of cyber-attacks to an indoor real time localization system for autonomous robots] ===&lt;br /&gt;
&lt;br /&gt;
This dataset can be used to analyze cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Karen, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
=== [http://robotica.unileon.es/index.php/Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots Benchmark dataset for training/testing of Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots] ===&lt;br /&gt;
&lt;br /&gt;
This dataset can be used to train and test Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Orbi-One, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
=== (RRID:SCR_015743) Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots ===&lt;br /&gt;
&lt;br /&gt;
This dataset can be used to evaluate the performance of different approaches for detecting and tracking people using lidar sensors. The information contained in the dataset is especially suitable as training data for neural network-based classifiers.&lt;br /&gt;
&lt;br /&gt;
[http://robotica.unileon.es/index.php/Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots  Go to dataset]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Datasets&amp;diff=5142</id>
		<title>Datasets</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Datasets&amp;diff=5142"/>
				<updated>2017-10-11T14:58:50Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This site summarizes the datasets gathered by the Robotics Group in the course of its research.&lt;br /&gt;
__FORCETOC__&lt;br /&gt;
&lt;br /&gt;
== Available datasets ==&lt;br /&gt;
&lt;br /&gt;
Currently the following datasets are available:&lt;br /&gt;
&lt;br /&gt;
=== [http://robotica.unileon.es/index.php/Benchmark_dataset_for_analysis_of_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots Benchmark dataset for analysis of cyber-attacks to an indoor real time localization system for autonomous robots] ===&lt;br /&gt;
&lt;br /&gt;
This dataset can be used to analyze cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Karen, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
=== [http://robotica.unileon.es/index.php/Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots Benchmark dataset for training/testing of Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots] ===&lt;br /&gt;
&lt;br /&gt;
This dataset can be used to train and test Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Orbi-One, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
=== (RRID:SCR_015743) [http://robotica.unileon.es/index.php/Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots] ===&lt;br /&gt;
&lt;br /&gt;
This dataset can be used to evaluate the performance of different approaches for detecting and tracking people using lidar sensors. The information contained in the dataset is especially suitable as training data for neural network-based classifiers.&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Datasets&amp;diff=5141</id>
		<title>Datasets</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Datasets&amp;diff=5141"/>
				<updated>2017-10-11T14:52:36Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: /* Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This site summarizes the datasets gathered by the Robotics Group in the course of its research.&lt;br /&gt;
&lt;br /&gt;
__FORCETOC__&lt;br /&gt;
&lt;br /&gt;
== Available datasets ==&lt;br /&gt;
&lt;br /&gt;
Currently the following datasets are available:&lt;br /&gt;
&lt;br /&gt;
=== [http://robotica.unileon.es/index.php/Benchmark_dataset_for_analysis_of_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots Benchmark dataset for analysis of cyber-attacks to an indoor real time localization system for autonomous robots] ===&lt;br /&gt;
&lt;br /&gt;
This dataset can be used to analyze cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Karen, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
=== [http://robotica.unileon.es/index.php/Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots Benchmark dataset for training/testing of Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots] ===&lt;br /&gt;
&lt;br /&gt;
This dataset can be used to train and test Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Orbi-One, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
=== (RRID:SCR_015743) [http://robotica.unileon.es/index.php/Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots] ===&lt;br /&gt;
&lt;br /&gt;
This dataset can be used to evaluate the performance of different approaches for detecting and tracking people using lidar sensors. The information contained in the dataset is especially suitable as training data for neural network-based classifiers.&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5140</id>
		<title>Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots&amp;diff=5140"/>
				<updated>2017-10-11T11:36:54Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Grupo de Robótica Benchmark Dataset, '''RRID:SCR_015743'''.&lt;br /&gt;
&lt;br /&gt;
This data report summarizes a benchmark dataset which can be used to evaluate the performance of different approaches for detecting and tracking people by using lidar sensors. Information contained in the dataset is especially suitable to be used as training data for neural network-based classifiers. Data have been gathered in an indoor mock-up apartment, shown in Fig 1 (B), located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Orbi-One and shown in Fig 1 (A), with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
== Materials ==&lt;br /&gt;
&lt;br /&gt;
Data gathered by the Orbi-One robot include:&lt;br /&gt;
* Lidar sensor measures. &lt;br /&gt;
* Location estimates from two people trackers: '''ROS-LD''' and '''PeTra'''. &lt;br /&gt;
* People location provided by a commercial RTLS, called '''KIO''', which can be used as ground-truth.&lt;br /&gt;
* Some other useful data gathered by the Orbi-One robot such as map information, odometry, and transform data.&lt;br /&gt;
&lt;br /&gt;
Additional information about Orbi-One and the devices/packages used to get data is given below.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-one ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One, shown in Fig 1 (A), is an assistant robot manufactured by [http://www.robotnik.es/manipuladores-roboticos-moviles/rb-one/ Robotnik]. The software to control the robot hardware is based on [http://www.ros.org/ ROS].&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig1.png|frame|center|'''Fig. 1''': From left to right: (A) Orbi-One carrying a KIO tag, and a KIO anchor attached to the ceiling; (B) robotics mobile lab plan, red dots show the location of KIO anchors; (C) occupancy map generated using lidar sensor measures; and (D) network output.]]&lt;br /&gt;
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
The KIO RTLS commercial solution by [https://www.eliko.ee/products/kio-rtls/ Eliko] has been used to provide people's locations in the study area.&lt;br /&gt;
&lt;br /&gt;
=== ROS Leg Detector (ROS-LD) ===&lt;br /&gt;
&lt;br /&gt;
ROS-LD is a ROS package which takes messages published by a lidar sensor as input and uses a machine-learning-trained classifier to detect groups of laser readings as possible legs. The code is available in a public [http://wiki.ros.org/leg_detector repository], but is unsupported at this time.&lt;br /&gt;
&lt;br /&gt;
=== PeTra ===&lt;br /&gt;
&lt;br /&gt;
PeTra is a tool for detecting and tracking people, developed by the Robotics Group of the University of León.&lt;br /&gt;
&lt;br /&gt;
== Recording procedure ==&lt;br /&gt;
&lt;br /&gt;
The data were gathered under 14 different scenarios. In all of them, Orbi-One was standing still as one or more people, carrying a KIO tag, moved around it. Fig. 2 shows the 14 recognition scenarios recorded. These scenarios were chosen according to different situations that may occur in robotics competitions such as [https://www.eu-robotics.net/robotics_league/ ERL] or [http://www.robocup.org/ RoboCup].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PeopleTrackingFig2.png|frame|center|'''Fig. 2''': recognition scenarios recorded.]]&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
One or more rosbag files were created for each scenario (several short runs were recorded for some scenarios, as listed below), recording lidar sensor measures, location estimates from PeTra and ROS-LD, locations from the KIO RTLS, and other useful data. Specifically, the following data were included in the rosbag files:&lt;br /&gt;
&lt;br /&gt;
* Laser sensor messages (''sensor\_msgs/LaserScan'') published at the ''/scan'' topic.&lt;br /&gt;
* Location estimates calculated by PeTra published at the ''/person'' topic.&lt;br /&gt;
* Location estimates calculated by ROS-LD at the ''/people\_tracker\_measurements'' topic.&lt;br /&gt;
* Location estimates calculated by the KIO RTLS published at the ''/kio/PointStamped/4037/out'' topic. &lt;br /&gt;
* Messages from the ''/map'', ''/odom'', and ''/tf'' topics, which include map information, odometry of the robot base, and transform information, respectively.&lt;br /&gt;
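The tracker estimates and the KIO ground truth are published on separate topics at different rates, so evaluating PeTra or ROS-LD against KIO requires pairing each estimate with the ground-truth sample nearest in time. A minimal sketch of that pairing, assuming the ''(timestamp, position)'' series have already been extracted from the rosbag topics (the function and variable names are our own illustration, not part of the dataset tooling):

```python
from bisect import bisect_left

def nearest_time(sorted_times, t):
    """Return the timestamp in the non-empty sorted list closest to t."""
    i = bisect_left(sorted_times, t)
    candidates = sorted_times[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda s: abs(s - t))

def pair_with_ground_truth(estimates, ground_truth):
    """Pair each tracker estimate with the KIO ground-truth sample
    nearest in time.  Both arguments map timestamp -> (x, y)."""
    gt_times = sorted(ground_truth)
    return [(estimates[t], ground_truth[nearest_time(gt_times, t)])
            for t in sorted(estimates)]
```

For example, an estimate at t = 0.1 s is paired with a KIO sample at 0.0 s rather than one at 2.0 s; from such pairs a position error per estimate can then be computed.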
&lt;br /&gt;
Different versions of the dataset are enumerated below.&lt;br /&gt;
&lt;br /&gt;
=== v1.0 [Jul-2017] ===&lt;br /&gt;
&lt;br /&gt;
As a result of applying the recording method explained above, a first version of the dataset has been released. It includes measures for the scenarios defined in Fig. 2:&lt;br /&gt;
&lt;br /&gt;
* Scenario 01 (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_01.bag test_01.bag]: '''duration:''' 14:56s, '''size:''' 227.8 MB, '''start date/time:''' Jul 20, 2017 12:49:21.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 02  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_02.bag test_02.bag]: '''duration:''' 15:08s, '''size:''' 233.0 MB, '''start date/time:''' Jul 26, 2017 11:01:24.72&lt;br /&gt;
&lt;br /&gt;
* Scenario 03  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_03_1.bag test_03_1.bag]: '''duration:''' 39.9s, '''size:''' 10.4 MB, '''start date/time:''' Jul 20, 2017 13:27:25.50&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_03_2.bag test_03_2.bag]: '''duration:''' 39.1s, '''size:''' 10.2 MB, '''start date/time:''' Jul 20, 2017 13:28:56.41&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_03_3.bag test_03_3.bag]: '''duration:''' 40.5s, '''size:''' 10.5 MB, '''start date/time:''' Jul 20, 2017 13:30:04.94&lt;br /&gt;
&lt;br /&gt;
* Scenario 04  (4 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_04_1.bag test_04_1.bag]: '''duration:''' 58.3s, '''size:''' 15.0 MB, '''start date/time:''' Jul 25, 2017 10:39:52.62 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_04_2.bag test_04_2.bag]: '''duration:''' 57.2s, '''size:''' 14.7 MB, '''start date/time:''' Jul 25, 2017 10:41:16.31&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_04_3.bag test_04_3.bag]: '''duration:''' 50.5s, '''size:''' 13.0 MB, '''start date/time:''' Jul 25, 2017 10:42:44.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_04_4.bag test_04_4.bag]: '''duration:''' 1:01s, '''size:''' 15.7 MB, '''start date/time:''' Jul 25, 2017 10:43:52.44&lt;br /&gt;
&lt;br /&gt;
* Scenario 05  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_05.bag test_05.bag]: '''duration:''' 15:15s, '''size:''' 236.8 MB, '''start date/time:''' Jul 26, 2017 11:33:13.31&lt;br /&gt;
&lt;br /&gt;
* Scenario 06  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_06.bag test_06.bag]: '''duration:''' 09:18s, '''size:''' 143.3 MB, '''start date/time:''' Jul 26, 2017 12:25:45.12&lt;br /&gt;
&lt;br /&gt;
* Scenario 07  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_07.bag test_07.bag]: '''duration:''' 4:16s, '''size:''' 65.1 MB, '''start date/time:''' Jul 25, 2017 11:40:01.65&lt;br /&gt;
&lt;br /&gt;
* Scenario 08  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_08.bag test_08.bag]: '''duration:''' 03:39s, '''size:''' 55.8 MB, '''start date/time:''' Jul 25, 2017 12:25:29.22&lt;br /&gt;
&lt;br /&gt;
* Scenario 09  (5 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_1.bag test_09_1.bag]: '''duration:''' 22.9s, '''size:''' 6.1 MB, '''start date/time:''' Jul 25, 2017 10:50:02.95 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_2.bag test_09_2.bag]: '''duration:''' 20.8s, '''size:''' 5.6 MB, '''start date/time:''' Jul 25, 2017 10:51:02.62&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_3.bag test_09_3.bag]: '''duration:''' 34.3s, '''size:''' 9.0 MB, '''start date/time:''' Jul 25, 2017 10:51:45.96 &lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_4.bag test_09_4.bag]: '''duration:''' 29.3s, '''size:''' 7.9 MB, '''start date/time:''' Jul 25, 2017 10:52:51.24&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_09_5.bag test_09_5.bag]: '''duration:''' 36.9s, '''size:''' 9.7 MB, '''start date/time:''' Jul 25, 2017 10:54:00.13&lt;br /&gt;
&lt;br /&gt;
* Scenario 10  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_10.bag test_10.bag]: '''duration:''' 15:50s, '''size:''' 240.5 MB, '''start date/time:''' Jul 20, 2017 13:07:40.16&lt;br /&gt;
&lt;br /&gt;
* Scenario 11  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_11.bag test_11.bag]: '''duration:''' 03:50s, '''size:''' 58.7 MB, '''start date/time:''' Jul 25, 2017 11:48:33.90&lt;br /&gt;
&lt;br /&gt;
* Scenario 12  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_12_1.bag test_12_1.bag]: '''duration:''' 43.6s, '''size:''' 11.3 MB, '''start date/time:''' Jul 20, 2017 13:33:23.74&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_12_2.bag test_12_2.bag]: '''duration:''' 44.3s, '''size:''' 11.5 MB, '''start date/time:''' Jul 20, 2017 13:34:24.95&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_12_3.bag test_12_3.bag]: '''duration:''' 37.3s, '''size:'''  9.7 MB, '''start date/time:''' Jul 20, 2017 13:35:31.55&lt;br /&gt;
&lt;br /&gt;
* Scenario 13  (3 files):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_13_1.bag test_13_1.bag]: '''duration:''' 57.8s, '''size:''' 14.9 MB, '''start date/time:''' Jul 25, 2017 11:01:15.23&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_13_2.bag test_13_2.bag]: '''duration:''' 59.9s, '''size:''' 15.4 MB, '''start date/time:''' Jul 25, 2017 11:02:37.85&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_13_3.bag test_13_3.bag]: '''duration:''' 54.0s, '''size:''' 13.9 MB, '''start date/time:''' Jul 25, 2017 11:04:30.69&lt;br /&gt;
&lt;br /&gt;
* Scenario 14  (1 file):&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/LegTracking/v1.0/test_14.bag test_14.bag]: '''duration:''' 05:57s, '''size:''' 90.5 MB, '''start date/time:''' Jul 25, 2017 11:10:32.97&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5139</id>
		<title>Benchmark dataset for training/testing of Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5139"/>
				<updated>2017-10-11T11:26:17Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This data report summarizes a benchmark dataset which can be used to train and test Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment, shown in Fig 2, located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Orbi-One and shown in Fig 1, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
== Materials ==&lt;br /&gt;
&lt;br /&gt;
Data gathered by the Orbi-One robot include:&lt;br /&gt;
* Orbi-One location estimates provided by a commercial RTLS, called '''KIO'''.&lt;br /&gt;
&lt;br /&gt;
Additional information about Orbi-One and the devices/packages used to get data is given below.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One robot ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One, shown in Fig 1, is an assistant robot manufactured by [http://www.robotnik.es/manipuladores-roboticos-moviles/rb-one/ Robotnik]. The software to control the robot hardware is based on [http://www.ros.org/ ROS].&lt;br /&gt;
&lt;br /&gt;
[[File:Orbi-Obe_and_KIO.jpg|thumb|'''Fig. 1''': Orbi-One and KIO RTLS.]]&lt;br /&gt;
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
The KIO RTLS commercial solution by [https://www.eliko.ee/products/kio-rtls/ Eliko] has been used to provide the robot's location in the study area. Fig 1 shows a KIO beacon (1) and a KIO tag on the robot (2).&lt;br /&gt;
&lt;br /&gt;
== Recording procedure ==&lt;br /&gt;
&lt;br /&gt;
Two predefined trajectories were set for the robot in the study area, as shown in Fig 2: a test trajectory (light gray line) was used to build a dataset for training and testing the models, and a validation trajectory (dark gray line) was used to generate a separate dataset to validate the models in a different part of the apartment, ensuring generalization. In both cases the robot started at the point marked &amp;quot;0&amp;quot; and finished at the point marked &amp;quot;1&amp;quot;. Data were recorded by the remotely controlled Orbi-One robot moving through the apartment along the test and validation trajectories respectively. We created a separate rosbag file every time Orbi-One completed a run, saving the location estimates gathered by the KIO device for later analysis.&lt;br /&gt;
&lt;br /&gt;
[[File:apartamento_leon_robotics2017.jpg|thumb|'''Fig. 2''': Robotics mobile lab plan. Light gray line shows the test trajectory. Dark gray line shows the validation trajectory. Red dots show the location of anchors.]]&lt;br /&gt;
&lt;br /&gt;
We repeated the test and validation trajectories 10 times each, so that 20 rosbag files were recorded. Orbi-One takes about 72 seconds to complete the test trajectory and about 40 seconds to complete the validation trajectory. Each test run yielded about 270 location estimates on average, and each validation run about 150.&lt;br /&gt;
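As a consistency check, the run lengths and estimate counts quoted above imply the same KIO update rate on both routes (the numbers are taken from the paragraph above, not measured independently):

```python
# ~270 estimates in ~72 s on the test route,
# ~150 estimates in ~40 s on the validation route.
test_rate = 270 / 72          # estimates per second
validation_rate = 150 / 40
print(test_rate, validation_rate)  # prints 3.75 3.75, i.e. ~3.75 Hz
```

Both routes therefore suggest KIO publishes roughly 3.75 location estimates per second.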
&lt;br /&gt;
The runs were recorded in three different scenarios: without suffering any attack, suffering a DoS attack, and suffering a Spoofing attack. DoS attacks were carried out by interrupting the signal of one or more radio beacons. Spoofing attacks were carried out by changing the signal of the radio beacons. The affected radio beacons were selected by looking for anchors with redundancy (A-anchors) and anchors without (C- and D-anchors), at different locations.&lt;br /&gt;
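Each run's scenario is encoded in the rosbag file names listed in the Data section: a ''WA'' prefix marks runs without attack, ''A1'' a DoS attack, ''A2'' a Spoofing attack, and a ''-validation'' marker selects the validation route (the middle token, e.g. ''408a'', appears to identify the affected anchor). A small sketch for recovering labels from those names (the helper name is our own, not part of the dataset):

```python
def label_rosbag(filename):
    """Infer (attack, route) labels from a rosbag file name such as
    'A1-408a-validation_2017-03-06-18-03-44.bag'.
    Prefixes: 'WA' = no attack, 'A1' = DoS, 'A2' = Spoofing;
    a '-validation' marker means the validation route."""
    stem = filename.rsplit('/', 1)[-1]     # drop any URL/path part
    prefix = stem.split('_', 1)[0]         # e.g. 'WA', 'A1-408a-validation'
    attack = {'WA': 'none', 'A1': 'DoS', 'A2': 'Spoofing'}[prefix.split('-')[0]]
    route = 'validation' if 'validation' in prefix else 'test'
    return attack, route
```

For example, label_rosbag('WA_2017-03-06-17-48-50.bag') returns ('none', 'test').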
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
=== v1.0 ===&lt;br /&gt;
&lt;br /&gt;
==== Without attack (test route) ====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-06-17-48-50.bag                    WA_2017-03-06-17-48-50.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-08-46-41.bag                    WA_2017-03-07-08-46-41.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-08-48-56.bag                    WA_2017-03-07-08-48-56.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-08-50-50.bag                    WA_2017-03-07-08-50-50.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-14-53-42.bag                    WA_2017-03-07-14-53-42.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-14-55-29.bag                    WA_2017-03-07-14-55-29.bag                ]&lt;br /&gt;
&lt;br /&gt;
==== Without attack (validation route) ====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-06-18-05-27.bag         WA-validation_2017-03-06-18-05-27.bag     ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-07-08-53-28.bag         WA-validation_2017-03-07-08-53-28.bag     ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-07-08-54-40.bag         WA-validation_2017-03-07-08-54-40.bag     ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-07-08-55-48.bag         WA-validation_2017-03-07-08-55-48.bag     ]&lt;br /&gt;
&lt;br /&gt;
==== DoS attack (test route) ====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-401d_2017-03-07-14-15-12.bag               A1-401d_2017-03-07-14-15-12.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408a_2017-03-06-17-53-12.bag               A1-408a_2017-03-06-17-53-12.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408b_2017-03-07-14-08-25.bag               A1-408b_2017-03-07-14-08-25.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-501c_2017-03-07-09-00-44.bag               A1-501c_2017-03-07-09-00-44.bag           ]&lt;br /&gt;
&lt;br /&gt;
==== DoS attack (validation route) ====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-401d-validation_2017-03-07-09-32-49.bag    A1-401d-validation_2017-03-07-09-32-49.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408a-validation_2017-03-06-18-03-44.bag    A1-408a-validation_2017-03-06-18-03-44.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408b-validation_2017-03-07-14-10-42.bag    A1-408b-validation_2017-03-07-14-10-42.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-501c-validation_2017-03-07-09-04-18.bag    A1-501c-validation_2017-03-07-09-04-18.bag]&lt;br /&gt;
&lt;br /&gt;
==== Spoofing attack (test route) ====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-408a_2017-03-06-17-58-28.bag               A2-408a_2017-03-06-17-58-28.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-501c_2017-03-07-09-10-55.bag               A2-501c_2017-03-07-09-10-55.bag           ]&lt;br /&gt;
&lt;br /&gt;
==== Spoofing attack (validation route) ====&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-401d-validation_2017-03-07-09-35-07.bag    A2-401d-validation_2017-03-07-09-35-07.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-408a-validation_2017-03-06-18-01-09.bag    A2-408a-validation_2017-03-06-18-01-09.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-501c-validation_2017-03-07-09-13-14.bag    A2-501c-validation_2017-03-07-09-13-14.bag]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5138</id>
		<title>Benchmark dataset for training/testing of Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5138"/>
				<updated>2017-10-11T11:24:32Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This data report summarizes a benchmark dataset which can be used to train and test Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment, shown in Fig 2, located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Orbi-One and shown in Fig 1, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
== Materials ==&lt;br /&gt;
&lt;br /&gt;
Data gathered by the Orbi-One robot include:&lt;br /&gt;
* Orbi-One location estimates provided by a commercial RTLS, called '''KIO'''.&lt;br /&gt;
&lt;br /&gt;
Additional information about Orbi-One and the devices/packages used to get data is given below.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One robot ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One, shown in Fig 1, is an assistant robot manufactured by [http://www.robotnik.es/manipuladores-roboticos-moviles/rb-one/ Robotnik]. The software to control the robot hardware is based on [http://www.ros.org/ ROS].&lt;br /&gt;
&lt;br /&gt;
[[File:Orbi-Obe_and_KIO.jpg|thumb|'''Fig. 1''': Orbi-One and KIO RTLS.]]&lt;br /&gt;
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
The KIO RTLS commercial solution by [https://www.eliko.ee/products/kio-rtls/ Eliko] has been used to provide the robot's location in the study area. Fig 1 shows a KIO beacon (1) and a KIO tag on the robot (2).&lt;br /&gt;
&lt;br /&gt;
== Recording procedure ==&lt;br /&gt;
&lt;br /&gt;
Two predefined trajectories were set for the robot in the study area, as shown in Fig 2: a test trajectory (light gray line) was used to build a dataset for training and testing the models, and a validation trajectory (dark gray line) was used to generate a separate dataset to validate the models in a different part of the apartment, ensuring generalization. In both cases the robot started at the point marked &amp;quot;0&amp;quot; and finished at the point marked &amp;quot;1&amp;quot;. Data were recorded by the remotely controlled Orbi-One robot moving through the apartment along the test and validation trajectories respectively. We created a separate rosbag file every time Orbi-One completed a run, saving the location estimates gathered by the KIO device for later analysis.&lt;br /&gt;
&lt;br /&gt;
[[File:apartamento_leon_robotics2017.jpg|thumb|'''Fig. 2''': Robotics mobile lab plan. Light gray line shows the test trajectory. Dark gray line shows the validation trajectory. Red dots show the location of anchors.]]&lt;br /&gt;
&lt;br /&gt;
We repeated the test and validation trajectories 10 times each, so that 20 rosbag files were recorded. Orbi-One takes about 72 seconds to complete the test trajectory and about 40 seconds to complete the validation trajectory. Each test run yielded about 270 location estimates on average, and each validation run about 150.&lt;br /&gt;
&lt;br /&gt;
The runs were recorded in three different scenarios: without suffering any attack, suffering a DoS attack, and suffering a Spoofing attack. DoS attacks were carried out by interrupting the signal of one or more radio beacons. Spoofing attacks were carried out by changing the signal of the radio beacons. The affected radio beacons were selected by looking for anchors with redundancy (A-anchors) and anchors without (C- and D-anchors), at different locations.&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
=== v1.0 ===&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-06-17-48-50.bag                    WA_2017-03-06-17-48-50.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-08-46-41.bag                    WA_2017-03-07-08-46-41.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-08-48-56.bag                    WA_2017-03-07-08-48-56.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-08-50-50.bag                    WA_2017-03-07-08-50-50.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-14-53-42.bag                    WA_2017-03-07-14-53-42.bag                ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA_2017-03-07-14-55-29.bag                    WA_2017-03-07-14-55-29.bag                ]&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-06-18-05-27.bag         WA-validation_2017-03-06-18-05-27.bag     ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-07-08-53-28.bag         WA-validation_2017-03-07-08-53-28.bag     ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-07-08-54-40.bag         WA-validation_2017-03-07-08-54-40.bag     ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/WA-validation_2017-03-07-08-55-48.bag         WA-validation_2017-03-07-08-55-48.bag     ]&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-401d_2017-03-07-14-15-12.bag               A1-401d_2017-03-07-14-15-12.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408a_2017-03-06-17-53-12.bag               A1-408a_2017-03-06-17-53-12.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408b_2017-03-07-14-08-25.bag               A1-408b_2017-03-07-14-08-25.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-501c_2017-03-07-09-00-44.bag               A1-501c_2017-03-07-09-00-44.bag           ]&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-401d-validation_2017-03-07-09-32-49.bag    A1-401d-validation_2017-03-07-09-32-49.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408a-validation_2017-03-06-18-03-44.bag    A1-408a-validation_2017-03-06-18-03-44.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-408b-validation_2017-03-07-14-10-42.bag    A1-408b-validation_2017-03-07-14-10-42.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A1-501c-validation_2017-03-07-09-04-18.bag    A1-501c-validation_2017-03-07-09-04-18.bag]&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-408a_2017-03-06-17-58-28.bag               A2-408a_2017-03-06-17-58-28.bag           ]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-501c_2017-03-07-09-10-55.bag               A2-501c_2017-03-07-09-10-55.bag           ]&lt;br /&gt;
&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-401d-validation_2017-03-07-09-35-07.bag    A2-401d-validation_2017-03-07-09-35-07.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-408a-validation_2017-03-06-18-01-09.bag    A2-408a-validation_2017-03-06-18-01-09.bag]&lt;br /&gt;
# [http://robotica.unileon.es/~datasets/MLClassifiersTraining/v1.0/A2-501c-validation_2017-03-07-09-13-14.bag    A2-501c-validation_2017-03-07-09-13-14.bag]&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5134</id>
		<title>Benchmark dataset for training/testing of Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5134"/>
				<updated>2017-10-11T10:04:18Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This data report summarizes a benchmark dataset which can be used to train and test Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment, shown in Fig 2, located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Orbi-One and shown in Fig 1, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
== Materials ==&lt;br /&gt;
&lt;br /&gt;
Data gathered by the Orbi-One robot include:&lt;br /&gt;
* Orbi-One location estimates provided by a commercial RTLS, called '''KIO'''.&lt;br /&gt;
&lt;br /&gt;
Additional information about Orbi-One and the devices/packages used to get data is given below.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One robot ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One, shown in Fig 1, is an assistant robot manufactured by [http://www.robotnik.es/manipuladores-roboticos-moviles/rb-one/ Robotnik]. The software to control the robot hardware is based on [http://www.ros.org/ ROS].&lt;br /&gt;
&lt;br /&gt;
[[File:Orbi-Obe_and_KIO.jpg|thumb|'''Fig. 1''': Orbi-One and KIO RTLS.]]&lt;br /&gt;
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
The KIO RTLS commercial solution by [https://www.eliko.ee/products/kio-rtls/ Eliko] has been used to provide the robot's location in the study area. Fig 1 shows a KIO beacon (1) and a KIO tag on the robot (2).&lt;br /&gt;
&lt;br /&gt;
== Recording procedure ==&lt;br /&gt;
&lt;br /&gt;
Two predefined trajectories were set for the robot in the study area, as shown in Fig 2: a test trajectory (light gray line) was used to build a dataset for training and testing the models, and a validation trajectory (dark gray line) was used to generate a separate dataset to validate the models in a different part of the apartment, ensuring generalization. In both cases the robot started at the point marked &amp;quot;0&amp;quot; and finished at the point marked &amp;quot;1&amp;quot;. Data were recorded by the remotely controlled Orbi-One robot moving through the apartment along the test and validation trajectories respectively. We created a separate rosbag file every time Orbi-One completed a run, saving the location estimates gathered by the KIO device for later analysis.&lt;br /&gt;
&lt;br /&gt;
[[File:apartamento_leon_robotics2017.jpg|thumb|'''Fig. 2''': Robotics mobile lab plan. Light gray line shows the test trajectory. Dark gray line shows the validation trajectory. Red dots show the location of anchors.]]&lt;br /&gt;
&lt;br /&gt;
We repeated the test and validation trajectories 10 times each, so that 20 rosbag files were recorded. Orbi-One takes about 72 seconds to complete the test trajectory and about 40 seconds to complete the validation trajectory. Each test run yielded about 270 location estimates on average, and each validation run about 150.&lt;br /&gt;
&lt;br /&gt;
The runs were recorded in three different scenarios: without suffering any attack, suffering a DoS attack, and suffering a Spoofing attack. DoS attacks were carried out by interrupting the signal of one or more radio beacons. Spoofing attacks were carried out by changing the signal of the radio beacons. The affected radio beacons were selected by looking for anchors with redundancy (A-anchors) and anchors without (C- and D-anchors), at different locations.&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
=== v1.0 ===&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5133</id>
		<title>Benchmark dataset for training/testing of Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5133"/>
				<updated>2017-10-11T10:02:42Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This data report summarizes a benchmark dataset which can be used to train and test Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment, shown in Fig 2, located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Orbi-One and shown in Fig 1, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
== Materials ==&lt;br /&gt;
&lt;br /&gt;
Data gathered by the Orbi-One robot include:&lt;br /&gt;
* Orbi-One location estimates provided by a commercial RTLS, called '''KIO'''.&lt;br /&gt;
&lt;br /&gt;
Additional information about Orbi-One and the devices/packages used to gather the data is given below.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One robot ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One, shown in Fig 1, is an assistant robot manufactured by [http://www.robotnik.es/manipuladores-roboticos-moviles/rb-one/ Robotnik]. The software controlling the robot hardware is based on [http://www.ros.org/ ROS].&lt;br /&gt;
&lt;br /&gt;
[[File:Orbi-Obe_and_KIO.jpg|thumb|'''Fig. 1''': Orbi-One and KIO RTLS.]]&lt;br /&gt;
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
The KIO RTLS, a commercial solution by [https://www.eliko.ee/products/kio-rtls/ Eliko], was used to provide location estimates in the study area. Fig 1 shows a KIO beacon (1) and a KIO tag on the robot (2).&lt;br /&gt;
&lt;br /&gt;
== Recording procedure ==&lt;br /&gt;
&lt;br /&gt;
Two predefined trajectories were set for the robot in the study area, as shown in Fig 2: a test trajectory (light gray line in Fig 2) was used to build a training dataset for training and testing the models, and a validation trajectory (dark gray line in Fig 2) was used to generate a separate dataset to validate the models in a different location, ensuring generalization. In both cases the robot started at the point marked &amp;quot;0&amp;quot; and finished at the point marked &amp;quot;1&amp;quot;. Data were recorded while the Orbi-One robot, remotely controlled, moved through the apartment following the test and validation trajectories respectively. We created a separate rosbag file every time Orbi-One completed a run, saving the location estimates gathered by the KIO device for later analysis.&lt;br /&gt;
&lt;br /&gt;
We repeated the test and validation trajectories 10 times each, so 20 rosbag files were recorded. Orbi-One takes about 72 seconds to complete the test trajectory and about 40 seconds to complete the validation trajectory. Each test run yielded about 270 location estimates on average, and each validation run about 150.&lt;br /&gt;
&lt;br /&gt;
The runs were recorded in three different scenarios: without any attack, under a DoS attack, and under a Spoofing attack. DoS attacks were carried out by interrupting the signal of one or more radio beacons. Spoofing attacks were carried out by altering the signal of the radio beacons. The affected radio beacons were selected by choosing anchors with redundancy (A-anchors) and anchors without redundancy (C- and D-anchors), at different locations.&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
=== v1.0 ===&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5132</id>
		<title>Benchmark dataset for training/testing of Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots&amp;diff=5132"/>
				<updated>2017-10-11T09:58:53Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This data report summarizes a benchmark dataset which can be used to train and test Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment, shown in Fig 2, located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Orbi-One and shown in Fig 1, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
== Materials ==&lt;br /&gt;
&lt;br /&gt;
Data gathered by the Orbi-One robot include:&lt;br /&gt;
* Orbi-One location estimates provided by a commercial RTLS, called '''KIO'''.&lt;br /&gt;
&lt;br /&gt;
Additional information about Orbi-One and the devices/packages used to gather the data is given below.&lt;br /&gt;
&lt;br /&gt;
=== Orbi-One robot ===&lt;br /&gt;
&lt;br /&gt;
Orbi-One, shown in Fig 1 (A), is an assistant robot manufactured by [http://www.robotnik.es/manipuladores-roboticos-moviles/rb-one/ Robotnik]. The software controlling the robot hardware is based on [http://www.ros.org/ ROS].&lt;br /&gt;
&lt;br /&gt;
[[File:Orbi-Obe_and_KIO.jpg|thumb|'''Fig. 1''': Orbi-One and KIO RTLS.]]&lt;br /&gt;
&lt;br /&gt;
=== KIO RTLS ===&lt;br /&gt;
&lt;br /&gt;
The KIO RTLS, a commercial solution by [https://www.eliko.ee/products/kio-rtls/ Eliko], was used to provide location estimates in the study area. Fig 1 shows a KIO beacon (1) and a KIO tag on the robot (2).&lt;br /&gt;
&lt;br /&gt;
== Recording procedure ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
&lt;br /&gt;
=== v1.0 ===&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	<entry>
		<id>https://robotica.unileon.es/index.php?title=Datasets&amp;diff=5131</id>
		<title>Datasets</title>
		<link rel="alternate" type="text/html" href="https://robotica.unileon.es/index.php?title=Datasets&amp;diff=5131"/>
				<updated>2017-10-11T09:58:19Z</updated>
		
		<summary type="html">&lt;p&gt;Am6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This site summarizes different datasets gathered by the Robotics Group during its research.&lt;br /&gt;
&lt;br /&gt;
__FORCETOC__&lt;br /&gt;
&lt;br /&gt;
== Available datasets ==&lt;br /&gt;
&lt;br /&gt;
Currently the following datasets are available:&lt;br /&gt;
&lt;br /&gt;
=== [http://robotica.unileon.es/index.php/Benchmark_dataset_for_analysis_of_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots Benchmark dataset for analysis of cyber-attacks to an indoor real time localization system for autonomous robots] ===&lt;br /&gt;
&lt;br /&gt;
This dataset can be used to analyze cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Karen, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
=== [http://robotica.unileon.es/index.php/Benchmark_dataset_for_training/testing_of_Machine_Learning_Models_to_detect_cyber-attacks_to_an_indoor_real_time_localization_system_for_autonomous_robots Benchmark dataset for training/testing of Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots] ===&lt;br /&gt;
&lt;br /&gt;
This dataset can be used to train and test Machine Learning Models to detect cyber-attacks to an indoor real time localization system for autonomous robots. Data have been gathered in an indoor mock-up apartment located at the Robotics Lab of the University of León (Spain). An autonomous robot, called Orbi-One, with an on-board Real Time Location System (RTLS) was used to gather the data.&lt;br /&gt;
&lt;br /&gt;
=== [http://robotica.unileon.es/index.php/Benchmark_dataset_for_evaluation_of_range-based_people_tracker_classifiers_in_mobile_robots Benchmark dataset for evaluation of range-based people tracker classifiers in mobile robots] ===&lt;br /&gt;
&lt;br /&gt;
This dataset can be used to evaluate the performance of different approaches for detecting and tracking people using lidar sensors. The information contained in the dataset is especially suitable as training data for neural network-based classifiers.&lt;/div&gt;</summary>
		<author><name>Am6</name></author>	</entry>

	</feed>