Mobile-robot reflectance acquisition to detect plastic wrapping
Motivation and scope
Many applications in computer vision and field robotics require understanding what materials the objects in the environment are made of. State-of-the-art methods for 3D mapping and scene reconstruction can create detailed 3D models of the geometry of a place or an object, but faithfully modelling the appearance of surfaces (e.g., under different lighting conditions) and the materials they are made of is a harder problem.
Traditionally, reflectance acquisition has been performed in controlled setups, with a point-like light source and a camera placed at a set of known configurations. For a robot operating in uncontrolled environments, it would be useful to acquire material reflectance properties even when it is not possible to study each surface in detail, using only the available (uncontrolled) lighting.
In particular, this project is to be performed within the ILIAD Horizon 2020 project, where one of the tasks is to detect transparent plastic stretch wrapping on goods stacked on pallets in a warehouse environment. The goal is to enable a robotic lift truck to (a) determine whether a pallet it drives by is wrapped or not, and (b) if possible, detect the extent of the plastic sheet, in order to guide a robotic manipulator to the edge of the sheet and unwrap the pallet.
As an initial step, reflectance may be measured using the returned intensity of laser range measurements taken at different angles of incidence. For glossy plastic surfaces, the intensity should show a sharp peak at small angles. A similar setup can be created with an RGB-D camera based on projected-light triangulation. A more far-reaching goal would be to learn the reflectance properties without any active illumination, instead estimating the ambient light field and modelling surface reflectance simultaneously.
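To make the measurement principle concrete, here is a minimal C++ sketch (illustrative only; all names and the data layout are assumptions, not project code) of how an angle-resolved intensity profile could be accumulated from lidar returns, assuming each return carries a 3D point in the sensor frame, a locally estimated unit surface normal, and a sensor-reported intensity:

```cpp
#include <algorithm>
#include <array>
#include <cmath>
#include <vector>

struct Return {
  std::array<double, 3> point;   // hit point in the sensor frame [m]
  std::array<double, 3> normal;  // unit surface normal at the hit point
  double intensity;              // sensor-reported return intensity
};

// Angle between the beam direction (sensor origin -> hit point) and the
// surface normal, in [0, pi/2].
double incidenceAngle(const Return& r) {
  double sqNorm = 0.0, dot = 0.0;
  for (int i = 0; i < 3; ++i) {
    sqNorm += r.point[i] * r.point[i];
    dot += r.point[i] * r.normal[i];
  }
  // Sign-free cosine: locally estimated normals may point either way.
  double cosAng = std::fabs(dot) / std::sqrt(sqNorm);
  return std::acos(std::min(1.0, cosAng));
}

// Mean intensity per incidence-angle bin. For a glossy surface, the lowest
// bins (beam nearly parallel to the normal) should stand out sharply.
std::vector<double> intensityByAngle(const std::vector<Return>& returns,
                                     int numBins) {
  const double halfPi = std::acos(0.0);
  std::vector<double> sum(numBins, 0.0);
  std::vector<int> count(numBins, 0);
  for (const Return& r : returns) {
    int bin = std::min(numBins - 1,
                       static_cast<int>(incidenceAngle(r) / halfPi * numBins));
    sum[bin] += r.intensity;
    ++count[bin];
  }
  for (int b = 0; b < numBins; ++b)
    if (count[b] > 0) sum[b] /= count[b];
  return sum;
}
```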
Specific tasks
- Minimal requirements:
- Using prerecorded lidar data from end-user sites, construct a classifier that can label an object as "plastic" or "not plastic" (see the first sketch after this list).
- Compare performance with reference measurements from a dedicated reflectance measurement device.
- Integrate with existing (in-house) 3D mapping and localisation software.
- If time and interest permit:
- Construct a classifier that uses RGB-D data instead of lidar data.
- Classify surfaces per point, instead of per object.
- Using lidar or RGB-D data, estimate a parametric model of the reflectance properties (BRDF), as opposed to a binary classification (see the second sketch after this list).
- Render reconstructed scenes under different (artificial) lighting conditions, using learned material glossiness.
- Estimate parametric reflectance models (BRDFs) using ambient light instead of light projected from the robot's active sensors.
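For the first minimal requirement, one deliberately simple per-object decision rule would threshold the ratio between the near-normal part of the angle profile and the rest. This is a hedged sketch only: the feature, bin split, and default threshold are illustrative assumptions, and in practice the threshold (or a richer classifier) would be learned from the labelled end-user recordings:

```cpp
#include <numeric>
#include <vector>

// angleProfile: mean intensity per incidence-angle bin, e.g. from
// intensityByAngle() in the earlier sketch (bin 0 = near-normal incidence).
bool looksLikePlastic(const std::vector<double>& angleProfile,
                      double peakRatioThreshold = 2.0) {
  if (angleProfile.size() < 4) return false;   // not enough angular coverage
  std::size_t head = angleProfile.size() / 4;  // lowest-angle quarter
  double peak = std::accumulate(angleProfile.begin(),
                                angleProfile.begin() + head, 0.0) / head;
  double rest = std::accumulate(angleProfile.begin() + head,
                                angleProfile.end(), 0.0) /
                (angleProfile.size() - head);
  // Glossy wrapping: strong retro-reflective peak at small angles,
  // fast fall-off towards grazing incidence.
  return rest > 0.0 && peak / rest > peakRatioThreshold;
}
```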
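For the parametric-BRDF extension, one possible starting point (a simplification for illustration, not the prescribed method) is to fit a single Phong-like lobe, I(θ) ≈ kd·cos θ + ks·cosⁿ θ, to the intensity-vs-angle samples, exploiting that the lidar's emitter and receiver are nearly co-located so a single incidence angle parameterises the measurement:

```cpp
#include <cmath>
#include <limits>
#include <vector>

struct PhongFit { double kd, ks, n, residual; };

// Grid search over the lobe exponent n, with closed-form linear least
// squares for kd and ks at each n (normal equations of a 2-parameter fit).
PhongFit fitPhongLobe(const std::vector<double>& angles,  // radians
                      const std::vector<double>& intensities) {
  PhongFit best{0, 0, 1, std::numeric_limits<double>::infinity()};
  for (double n = 1; n <= 256; n *= 2) {
    double aa = 0, ab = 0, bb = 0, aI = 0, bI = 0;
    for (std::size_t i = 0; i < angles.size(); ++i) {
      double a = std::cos(angles[i]);      // diffuse basis function
      double b = std::pow(a, n);           // specular basis function
      aa += a * a; ab += a * b; bb += b * b;
      aI += a * intensities[i]; bI += b * intensities[i];
    }
    double det = aa * bb - ab * ab;
    if (std::fabs(det) < 1e-12) continue;  // degenerate, e.g. n == 1
    double kd = (bb * aI - ab * bI) / det;
    double ks = (aa * bI - ab * aI) / det;
    double res = 0;
    for (std::size_t i = 0; i < angles.size(); ++i) {
      double a = std::cos(angles[i]);
      double e = kd * a + ks * std::pow(a, n) - intensities[i];
      res += e * e;
    }
    if (res < best.residual) best = {kd, ks, n, res};
  }
  return best;
}
```

A large fitted ks relative to kd, together with a high exponent n, would indicate the glossy behaviour expected from stretch wrapping; a full BRDF estimate would additionally need calibration of the intensity channel and a more expressive model.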
Necessary skills
- Good experience with C++ programming.
- Experience with the ROS ecosystem would be helpful (but not required).
- Experience with computer graphics (BRDF modelling, rendering, path tracing) and statistics (Gaussian processes, Bayes filters) is also a bonus, but not required.
Contact: Martin Magnusson