Research projects

SeSAM - Sensor-based safety for autonomous mobility

About this project

Project information

Project status

Completed

Contact

Dimiter Driankov

The main goal of the project is to develop and improve methods used in safety-critical applications, with a focus on autonomous machines, in order to increase the detection rate and perceptual range of safety sensors.

Currently used methods typically rely on the latest sensor readings to detect and track objects. Autonomous platforms that operate in a confined space and continuously re-observe the environment could instead exploit this repeated coverage to detect changes and deviations from a normal state. For example, a truck that transports goods in a warehouse traverses the same regions several times every day; floor-cleaning equipment covers the same area several times per week; and robotic systems used for transportation typically follow the same trajectories. By accumulating statistical properties of these repeated observations, significant deviations from the normal state can be determined, indicating regions where something has occurred that could pose a safety risk. However, regions where a lot of movement is statistically normal can also be dangerous, for example when many people move through the facility at certain times of the day. Regions that commonly contain such dynamics therefore need extra attention.
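As a loose illustration of this idea, the sketch below keeps per-cell occupancy statistics over repeated traversals and scores new observations against the learned normal state, while also flagging cells that are habitually dynamic. The grid model, cell resolution, thresholds and all names are assumptions made for illustration, not the methods developed in the project.

```python
# Minimal sketch (assumed model, not the project's method): every grid cell
# accumulates occupancy statistics over repeated traversals; new observations
# are then scored against the learned "normal state" of the environment.
import numpy as np


class NormalStateGrid:
    def __init__(self, shape):
        self.count = 0                   # number of traversals folded in
        self.mean = np.zeros(shape)      # mean occupancy per cell
        self.m2 = np.zeros(shape)        # running sum of squared deviations

    def update(self, occupancy):
        """Fold one observed occupancy grid (values in [0, 1]) into the model
        using Welford's online mean/variance update."""
        self.count += 1
        delta = occupancy - self.mean
        self.mean += delta / self.count
        self.m2 += delta * (occupancy - self.mean)

    def variance(self):
        return self.m2 / max(self.count - 1, 1)

    def deviation(self, occupancy):
        """Normalised deviation of a new observation from the normal state."""
        return np.abs(occupancy - self.mean) / np.sqrt(self.variance() + 1e-6)

    def dynamic_regions(self, var_threshold=0.15):
        """Cells whose occupancy varies a lot even under normal conditions,
        e.g. corridors crowded with people at certain times of day."""
        return self.variance() > var_threshold


if __name__ == "__main__":
    grid = NormalStateGrid(shape=(200, 200))
    for _ in range(20):                          # stand-in for repeated traversals
        grid.update(np.random.rand(200, 200) < 0.05)
    new_scan = np.random.rand(200, 200) < 0.05
    # Flag both strong deviations and habitually dynamic regions for attention.
    attention = (grid.deviation(new_scan) > 3.0) | grid.dynamic_regions()
    print("cells needing attention:", int(attention.sum()))
```

In such a scheme, strong deviations point to something unusual having appeared, while persistently high per-cell variance marks regions where extra caution is warranted even though the motion there is statistically "normal".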

This information can be used to improve existing safety sensors, or to apply the approach to regular sensors not typically used in safety applications, such as standard cameras. Within the project, a reflective vest detection system developed in the earlier project SAVIE will be used, but other sensors will be used as well, ranging from standard cameras up to 3D laser scanners. This is to ensure that the developed methods are generic and can be applied to different scenarios at different costs.

The developed system will consist of a mapping component that updates the internal representation of the environment. A difference detection function will use the internal map together with the sensor data to detect changes in the environment. The differences will then be used as a bias in the further processing steps, which validate whether a difference stems from a moving object and, if so, determine the type of object as well as its position and orientation.
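The pipeline below is a rough structural sketch of these three components, assuming a mapping step, a difference detector and a validation step wired together in sequence; the class and method names are hypothetical and do not correspond to actual interfaces from the project.

```python
# Rough structural sketch of the described components (hypothetical
# interfaces, not the project's actual design).
from dataclasses import dataclass


@dataclass
class Detection:
    object_type: str        # e.g. "person", "truck"
    position: tuple         # (x, y) in map coordinates
    orientation: float      # heading in radians


class SafetyPipeline:
    def __init__(self, mapper, detector, validator):
        self.mapper = mapper        # updates the internal environment map
        self.detector = detector    # compares sensor data against the map
        self.validator = validator  # classifies and localises moving objects

    def process(self, sensor_data):
        # 1. Keep the internal representation of the environment up to date.
        internal_map = self.mapper.update(sensor_data)
        # 2. Detect where the new observation deviates from the internal map.
        differences = self.detector.compare(internal_map, sensor_data)
        # 3. Use the differences as a bias: validation effort focuses on the
        #    deviating regions to decide whether they stem from a moving
        #    object and, if so, estimate its type, position and orientation.
        detections = [self.validator.validate(region, sensor_data)
                      for region in differences]
        return [d for d in detections if d is not None]
```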
