Coalescent Mobile Robotics

Trolley Detector

Motivation

Coalescent Mobile Robotics robots help supermarkets by moving trolleys to assist their staff. Specifically, our robots can move on their own, dock to trolleys, move docked trolleys and undock from them.

To dock a trolley, we need a very accurate estimate of the trolley's pose with respect to the robot: this is what allows the robot to approach the trolley and dock without colliding with the trolley's legs. In addition, while docking and moving a trolley, the robot must stay centred under the trolley so that it can be moved safely and predictably. Thus, a good estimate of the trolley pose relative to the robot is also needed while the robot is docked to the trolley.

Objective

Using computer vision and laser sensor data, maintain an accurate estimate of the relative pose of each trolley with respect to the robot. There may be multiple trolleys within the robot's field of view, so the solution must be reliable enough to estimate the relative pose of every trolley inside a region of interest within that field of view.
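For concreteness, the sketch below shows one possible output interface for such a detector. The names (TrolleyPoseEstimate, detect_trolleys), the (x, y, yaw) convention and the field layout are illustrative assumptions, not an existing API.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class TrolleyPoseEstimate:
        # Trolley pose expressed in the robot frame (illustrative convention).
        x: float                       # metres
        y: float                       # metres
        yaw: float                     # radians
        covariance: List[List[float]]  # 3x3 covariance over (x, y, yaw)

    def detect_trolleys(laser_points, rgbd_frames) -> List[TrolleyPoseEstimate]:
        """Return one relative pose estimate per trolley in the region of interest."""
        raise NotImplementedError  # to be provided by the candidate solution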

Problem Statement

Our robots have multiple sensors that can aid in the detection of the trolleys:

  • Laser: this sensor produces a 2D point cloud. The points lie in a plane nominally parallel to the ground, but the plane may have a slight orientation deviation that shall be accounted for. The advantages of this sensor are that its points directly provide a location in 3D space (at a fixed height) relative to the robot, and that it has a high sample frequency.
  • RGB-D cameras: we have two of these sensors per robot, facing the front with overlapping fields of view. Each provides three color channels plus depth. These sensors are richer than the laser, but they have a higher error and a lower sample frequency. Their error depends on the distance to the object as well as on how parallel the projecting ray is to the object's surface.
  • IMU and wheel encoders: these sensors are used to estimate the robot's odometry. For trolley detection it might be necessary to account for the odometry state when tracking trolleys; for instance, odometry errors can degrade the tracking of a trolley.


From these sensors, the system shall estimate the relative pose of the trolleys within the robot's field of view.
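As a first step on the laser side, the sketch below converts a scan into Cartesian points in the robot frame. The assumed scan format (evenly spaced bearings, ranges in metres) and the sensor mounting parameters are illustrative assumptions.

    import numpy as np

    def scan_to_robot_frame(ranges, angle_min, angle_increment,
                            sensor_xy=(0.0, 0.0), sensor_yaw=0.0):
        """Turn a 2D laser scan into (x, y) points expressed in the robot frame."""
        ranges = np.asarray(ranges, dtype=float)
        angles = angle_min + angle_increment * np.arange(len(ranges))
        valid = np.isfinite(ranges) & (ranges > 0.0)
        # Points in the sensor frame.
        xs = ranges[valid] * np.cos(angles[valid])
        ys = ranges[valid] * np.sin(angles[valid])
        # Rigid transform into the robot frame (2D rotation + translation
        # given by the assumed sensor mounting pose).
        c, s = np.cos(sensor_yaw), np.sin(sensor_yaw)
        x_r = c * xs - s * ys + sensor_xy[0]
        y_r = s * xs + c * ys + sensor_xy[1]
        return np.stack([x_r, y_r], axis=1)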

The following drawing illustrates an example of what we expect in terms of functionality. In the top-left corner is the robot, with its pose in orange. There are two trolleys in its field of view (Trolley 1 and Trolley 2). For each trolley, both the true (orange) pose and the predicted (green) pose are shown. In red is the translation error (on top of which there will also be an orientation error). In black are the positions of the trolleys with respect to the robot.

[Figure: Trolley detector — robot pose with true and predicted trolley poses]
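The error metrics implied by the drawing can be computed as sketched below, assuming each pose is given as (x, y, yaw) in the robot frame; this pose convention is an assumption for illustration.

    import math

    def pose_error(true_pose, pred_pose):
        """Translation and orientation error between two (x, y, yaw) poses."""
        dx = pred_pose[0] - true_pose[0]
        dy = pred_pose[1] - true_pose[1]
        translation_err = math.hypot(dx, dy)
        dyaw = pred_pose[2] - true_pose[2]
        # Wrap the heading difference to [-pi, pi] before taking its magnitude.
        orientation_err = abs(math.atan2(math.sin(dyaw), math.cos(dyaw)))
        return translation_err, orientation_err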

Additionally, the solution might report a standard deviation (uncertainty) region for the different predictions. For instance, if the predictions follow a multimodal distribution, there could be more than one candidate solution. This is important for analyzing how the variance is reduced as new predictions are incorporated, or how it increases in the presence of moving trolleys.
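To illustrate this variance behaviour (not a prescribed method), the scalar sketch below fuses successive predictions of one pose component with a precision-weighted update, and inflates the variance between updates when the trolley may have moved; all numbers are made up for the example.

    def fuse(mean, var, meas, meas_var):
        """Precision-weighted update of one pose component with a new prediction."""
        k = var / (var + meas_var)
        return mean + k * (meas - mean), (1.0 - k) * var

    def predict(mean, var, process_var):
        """Between updates, uncertainty grows if the trolley may have moved."""
        return mean, var + process_var

    x, P = 1.00, 0.05**2               # initial estimate and variance [m, m^2]
    x, P = fuse(x, P, 1.03, 0.04**2)   # new prediction: variance decreases
    x, P = predict(x, P, 0.02**2)      # trolley may be moving: variance increases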

There will also be an acceptance criterion on the error between the true and the estimated poses; for instance, a 1 cm position tolerance and a 0.5 degree orientation tolerance. The acceptance criteria shall take into account the mean, standard deviation, and maximum values of the error. These values shall be agreed upon with our robotics engineers to make sure the different tasks the robot performs can be achieved within the agreed tolerances.
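One possible form of this check is sketched below; the example tolerances and the choice to apply the same threshold to the mean, standard deviation and maximum are illustrative assumptions pending agreement with the robotics engineers.

    import math
    import numpy as np

    def passes_acceptance(trans_errors, orient_errors,
                          pos_tol=0.01, ang_tol=math.radians(0.5)):
        """Check per-sample errors against position [m] and orientation [rad] tolerances."""
        def ok(errors, tol):
            e = np.asarray(errors, dtype=float)
            # Assumption: mean, standard deviation and maximum must all stay below tol.
            return e.mean() <= tol and e.std() <= tol and e.max() <= tol
        return ok(trans_errors, pos_tol) and ok(orient_errors, ang_tol)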

Milestones

  1. Produce a validation dataset of trolley poses relative to the robot. This dataset will be used to measure the quality of the different solutions provided. The dataset shall include the following groups of data: partially occluded trolleys, multiple trolleys, robot docked to the trolley, and absence of trolleys. The results of the different solutions shall be reported in aggregate as well as per data group.
  2. Implement a solution for the trolley relative pose estimate that only takes into account the current sensor data (not previous measurements).
  3. Implement a solution that integrates the relative pose estimate over time (a minimal sketch of this idea follows this list).
  4. Summarize the final solution with quality metrics for both position and orientation. The solution shall be able to run in a container; if supervised learning is used, it shall also include automated scripts to train and refine the models, as well as scripts to update the training, validation and test datasets.
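As a minimal illustration of milestone 3, the sketch below first propagates the previous trolley pose (in the robot frame) through the robot's odometry increment and then blends it with the latest single-frame detection. The complementary-filter blend and the (x, y, yaw) conventions are simplifying assumptions, not the required method.

    import math

    def propagate_through_odometry(trolley_pose, odom_delta):
        """trolley_pose and odom_delta are (x, y, yaw); odom_delta is the robot's
        motion since the last update, expressed in its previous frame."""
        dx, dy, dyaw = odom_delta
        tx, ty, tyaw = trolley_pose
        # Express the trolley in the robot's new frame (inverse of the robot motion).
        c, s = math.cos(-dyaw), math.sin(-dyaw)
        x = c * (tx - dx) - s * (ty - dy)
        y = s * (tx - dx) + c * (ty - dy)
        return x, y, tyaw - dyaw

    def blend(predicted, measured, alpha=0.3):
        """Simple complementary filter between the propagated pose and a new detection."""
        def wrap(a):
            return math.atan2(math.sin(a), math.cos(a))
        return tuple(
            p + alpha * ((m - p) if i < 2 else wrap(m - p))
            for i, (p, m) in enumerate(zip(predicted, measured))
        )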

Apply for this Project

