THÖR is a public dataset of human motion trajectories, recorded in a controlled indoor experiment. It provides diverse and accurate social motion data of people sharing an indoor environment with each other and with a mobile robot.
The dataset was recorded in a spacious laboratory room of 8.4 x 18.8 m and an adjacent utility room, separated by a glass wall. The laboratory room, where the motion capture system is installed, is mostly empty to allow large groups to maneuver, but also includes several constrained areas where obstacle avoidance and the choice of homotopy class are necessary. Goal positions are placed to force navigation across the room and generate frequent interactions in its center, while the placement of obstacles prevents walking between goals in a straight line.
To track the motion of the agents, we used a Qualisys Oqus 7+ motion capture system with 10 infrared cameras mounted on the perimeter of the room. For people tracking, reflective markers were arranged in distinctive 3D patterns on bicycle helmets worn by the participants. There are 9 helmets in this dataset, numbered 2 to 10.
For recording gaze directions, we used a Tobii Pro Glasses mobile eye-tracking headset. It samples gaze at 50 Hz and includes a scene camera that records video at 25 fps. A gaze-overlaid version of this video is included in the dataset.
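Since the gaze stream (50 Hz) runs at exactly twice the scene-video frame rate (25 fps), gaze samples can be aligned with video frames by a simple index mapping. The sketch below illustrates only this rate relation; it assumes both streams start at t = 0, whereas the actual Tobii exports carry their own timestamps.

```python
# Hypothetical sketch: aligning 50 Hz gaze samples with 25 fps scene video.
# Assumes both streams start at t = 0; real Tobii exports include explicit
# timestamps, so treat this purely as an illustration of the rate relation.

GAZE_HZ = 50.0
VIDEO_FPS = 25.0

def gaze_sample_to_frame(sample_idx: int) -> int:
    """Map a gaze sample index to the scene-video frame shown at that time."""
    t = sample_idx / GAZE_HZ   # timestamp of the gaze sample in seconds
    return int(t * VIDEO_FPS)  # frame index at 25 fps
```

With these rates, two consecutive gaze samples fall on each video frame.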
A stationary 3D LiDAR was placed in a corner of the room at a height of 1.58 m, synchronized in time with the Qualisys system and the eye-tracking glasses. A stationary camera was placed in the opposite corner.
The robot used in our experiment is a small Linde CitiTruck forklift robot with a footprint of 1.56 x 0.55 m and a height of 1.17 m. It was programmed to move in a socially unaware manner, following a pre-defined path around the room and adjusting neither its speed nor its trajectory to account for surrounding people. The robot navigated at a maximum speed of 0.34 m/s and projected its current motion intent on the floor in front of it using a mounted projector.
In order to collect motion data relevant for a broad spectrum of research areas, we designed an experiment that encourages social interactions between individuals, groups of people and the robot. The interactive setup assigns social roles and tasks so as to imitate typical activities found in populated spaces such as offices, train stations, shopping malls or airports. Its goal is to motivate participants to engage in natural and purposeful motion behaviors and to create a rich variety of unscripted interactions. There are three social roles in our experiment.
Recorded data includes motion capture files as Matlab structures and CSV files, Velodyne sweeps, and ROS bags from the motion capture system. We also share Matlab scripts for loading, plotting and animating the motion capture data. We thoroughly inspected the motion capture data and manually cleaned it to remove occasional helmet ID switches and recover several lost tracks. Afterwards, we applied an automated procedure (included as a Matlab script) to restore the lost positions of the helmets from an incomplete set of recognized markers.
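The shared loading scripts are written in Matlab, but the CSV exports can also be read with standard tooling in other languages. The following Python sketch groups rows into per-helmet trajectories; the column names (time, helmet, x, y) are assumptions for illustration and may differ from the actual THÖR export layout.

```python
# Hypothetical loader sketch for the trajectory CSV files.
# The column names below (time, helmet, x, y) are assumptions; check the
# actual THÖR csv header before using this on real data.
import csv
import io

def load_tracks(csv_text: str):
    """Parse rows of (time, helmet, x, y) into per-helmet trajectories."""
    tracks = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        hid = int(row["helmet"])
        tracks.setdefault(hid, []).append(
            (float(row["time"]), float(row["x"]), float(row["y"]))
        )
    return tracks

# Tiny made-up sample in the assumed layout:
sample = "time,helmet,x,y\n0.0,2,1.0,2.0\n0.01,2,1.01,2.0\n0.0,3,4.0,5.0\n"
tracks = load_tracks(sample)
```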
The dataset includes 13 separate recordings in 3 variations:
Name | Details | Notes |
---|---|---|
THÖR - people tracks https://zenodo.org/record/3382145 | mat, bag, tsv | |
THÖR - point clouds https://zenodo.org/record/3405915 | bag. Contains 'velodyne_packets' from the Velodyne 3D LiDAR used in the experiments; the 3D point clouds need to be transformed to the Qualisys frame. | Use thor_velodyne_utils to visualize nicely. Due to technical errors, no Velodyne data is available for experiments 2.1 to 2.4. |
THÖR - eye-tracking https://zenodo.org/record/3460326 | mat/xlsx/tsv. Data of the participant wearing helmet number 9, recorded with the Tobii Pro Glasses. | |
THÖR - videos https://zenodo.org/record/3460125 | mp4. Videos recorded during data collection from the stationary camera and the eye tracker. | To obtain access to the data, please motivate your request and present your strategy to protect it according to GDPR standards. |
THÖR - extras https://github.com/OrebroUniversity/thor_extras | Scripts for visualizing Velodyne and Qualisys data in RViz. | |
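Bringing the LiDAR point clouds into the Qualisys frame amounts to applying a rigid-body (homogeneous) transform to each point. The sketch below shows the mechanics with a made-up calibration; the real sensor-to-Qualisys rotation and translation must come from the dataset and thor_velodyne_utils, not from these placeholder values.

```python
# Sketch of transforming LiDAR points into the Qualisys frame with a 4x4
# homogeneous transform. The yaw/translation values used below are made-up
# placeholders, NOT the actual THÖR calibration.
import math

def make_transform(yaw: float, tx: float, ty: float, tz: float):
    """Build a 4x4 homogeneous transform: rotation about z plus translation."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [
        [c, -s, 0.0, tx],
        [s,  c, 0.0, ty],
        [0.0, 0.0, 1.0, tz],
        [0.0, 0.0, 0.0, 1.0],
    ]

def transform_point(T, p):
    """Apply transform T to a 3D point p = (x, y, z)."""
    x, y, z = p
    return tuple(
        T[i][0] * x + T[i][1] * y + T[i][2] * z + T[i][3] for i in range(3)
    )

# Placeholder example: sensor 1.58 m above the floor, axes aligned with
# the Qualisys frame (yaw = 0) -- purely illustrative.
T = make_transform(yaw=0.0, tx=0.0, ty=0.0, tz=1.58)
```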
@article{thorDataset2019,
  title={TH{\"O}R: Human-Robot Navigation Data Collection and Accurate Motion Trajectories Dataset},
  author={Rudenko, Andrey and Kucner, Tomasz P and Swaminathan, Chittaranjan S and Chadalavada, Ravi T and Arras, Kai O and Lilienthal, Achim J},
  journal={IEEE Robotics and Automation Letters},
  volume={5},
  number={2},
  pages={676--682},
  year={2020},
  publisher={IEEE}
}