# Mobile-robot reflectance acquisition to detect plastic wrapping
![](/contentassets/1d47b887a2954e3ebc03036517442bc9/iliad-concept-edited.jpeg?w=720)
## Motivation and scope
There are several applications of computer vision and field robotics where it is important to understand what materials the objects in the environment are made of. State-of-the-art methods for 3D mapping and scene reconstruction are capable of creating detailed 3D models of the geometry of a place or an object, but faithfully modelling the appearance of surfaces (e.g., under different lighting conditions) and the materials they are made of is a harder problem.
Traditionally, reflectance acquisition has been performed in controlled setups, with a point-like light source and a camera placed at a set of known configurations. For a robot operating in uncontrolled environments, it would be useful to acquire material reflectance properties also in cases where it is not possible to study each surface in detail, using only the available (uncontrolled) lighting.
As an initial step, reflectance may be measured using the returned intensity from laser range measurements at different angles. For glossy plastic surfaces, the intensity should have a sharp peak at small incidence angles. A similar setup can be created with an RGB-D camera using projected-light triangulation. A more far-reaching goal would be to learn the reflectance properties without any active illumination, simultaneously estimating the ambient light field and modelling surface reflectance.
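To illustrate the idea, the intensity-vs-angle behaviour described above can be captured by fitting a simple parametric reflectance model (a Lambertian diffuse term plus a specular lobe peaking at normal incidence) to per-return lidar intensities. This is only a sketch on synthetic data; the model form, parameter names, and the glossiness threshold are illustrative assumptions, not part of the project's actual pipeline.

```python
import numpy as np
from scipy.optimize import curve_fit

def reflectance_model(theta, k_d, k_s, n):
    """Diffuse (Lambertian) term plus a specular lobe that peaks at theta = 0.

    theta: incidence angle in radians; k_d, k_s: diffuse/specular weights;
    n: specular exponent (large n = narrow, sharp peak = glossy surface).
    """
    return k_d * np.cos(theta) + k_s * np.cos(theta) ** n

# Synthetic lidar intensities for a glossy surface: a sharp peak near
# normal incidence, plus a small amount of measurement noise.
rng = np.random.default_rng(0)
theta = np.linspace(0.0, 1.2, 50)
intensity = reflectance_model(theta, 0.3, 0.7, 40.0) + 0.01 * rng.normal(size=theta.size)

# Fit the model; bounds keep the optimiser in a physically plausible range.
params, _ = curve_fit(
    reflectance_model, theta, intensity,
    p0=[0.5, 0.5, 10.0], bounds=(0.0, [1.0, 1.0, 200.0]),
)
k_d, k_s, n = params

# A large specular exponent with non-negligible specular weight suggests
# a glossy (e.g. plastic-wrapped) surface; the thresholds are arbitrary here.
is_glossy = n > 10.0 and k_s > 0.1
```

In practice the incidence angle per return would come from the estimated surface normal in the 3D map rather than being known directly, and the fitted parameters could feed either a threshold rule like the one above or a learned classifier.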
## Specific tasks
- Minimal requirements:
  - Using prerecorded lidar data from end-user sites, construct a classifier that can label an object as "plastic" or "not plastic".
  - Compare performance with reference measurements from a dedicated reflectance measurement device.
  - Integrate with existing (in-house) 3D mapping and localisation software.
- If time and interest permit:
  - Construct a classifier that uses RGB-D data instead of lidar data.
  - Classify surfaces per point, instead of per object.
  - Using lidar or RGB-D data, estimate a parametric model of the reflectance properties (BRDF), as opposed to a binary classification.
  - Render reconstructed scenes under different (artificial) lighting conditions, using learned material glossiness.
  - Estimate parametric reflectance models (BRDFs) using ambient light instead of light projected from the robot's active sensors.
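The minimal "plastic / not plastic" task could, for example, be framed as per-object classification on simple summary statistics of the intensity-vs-angle distribution. The sketch below uses synthetic data and a logistic-regression baseline; the feature choices (mean intensity, near-normal intensity, and the near-normal/oblique ratio) are illustrative assumptions, not the project's prescribed method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def object_features(glossy, n_points=200):
    """Summary features of one object's lidar returns.

    Returns mean intensity, mean intensity at near-normal incidence,
    and the near-normal / oblique intensity ratio (high for glossy surfaces).
    """
    theta = rng.uniform(0.0, 1.2, n_points)  # incidence angles in radians
    specular = 0.7 * np.cos(theta) ** 40 if glossy else 0.0
    intensity = 0.3 * np.cos(theta) + specular + 0.02 * rng.normal(size=n_points)
    near = intensity[theta < 0.2].mean()
    oblique = intensity[theta > 0.8].mean()
    return [intensity.mean(), near, near / oblique]

# Synthetic training set: 30 glossy ("plastic") and 30 matte objects.
X = np.array([object_features(g) for g in [True] * 30 + [False] * 30])
y = np.array([1] * 30 + [0] * 30)

clf = LogisticRegression().fit(X, y)
```

With real prerecorded lidar data, the same structure applies: aggregate each segmented object's returns into features, then train and evaluate against the reference reflectance measurements mentioned above.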
## Necessary skills
- Good experience with C++ programming.
- Experience with the ROS ecosystem would be helpful (but not required).
- Experience with computer graphics (BRDF modelling, rendering, path tracing) and statistics (Gaussian processes, Bayes filters) is also a bonus, but not required.
Contact: Martin Magnusson