Safe and Efficient Collaborative Automation Systems (SECAS)
Research Project, 2022 – 2027

This project addresses the handling of uncertainty across perception, planning, and control for autonomous and collaborative robots in an industrial setting. Understanding and handling uncertainty is important for two reasons: (i) humans must be able to interact with the system in a safe way, and (ii) the system has to fulfill performance expectations. In this project we aim to integrate uncertainty estimates from perception into planning and control. We will consider two industrial use cases: (a) control of a fleet of mobile robots, guided by ceiling-mounted cameras, navigating in an environment where human operators are working; and (b) collaboration between a human and an industrial collaborative robot completing an assembly task together. Our vision is that by addressing these two use cases, and by integrating uncertainty into the methods for perception, planning, and control, we will be able to push the state of the art for human-in-the-loop logistics systems.

While the industrial use cases provided by AB Volvo are generic and relevant not only to the manufacturing industry but also to other logistics applications, for example in the food or pharmaceutical industries, they also have a few characteristics that distinguish them from other applications, including autonomous driving. First, the robots operate in a controlled environment with many static objects. Second, the system needs to handle on the order of tens of thousands of different types of parts, but the geometry of the parts is known. Third, manually annotated data will be scarce due to the large number of assembly stations and the large number of objects.

Recently, uncertainty quantification in deep learning has been studied in various perception tasks, such as object detection and semantic segmentation. Classifying uncertainty into aleatoric and epistemic uncertainty has provided both a better understanding of the model's output, for example in terms of confidence intervals, and performance gains through, for example, active learning. In the motion forecasting literature, however, uncertainty quantification is not well studied and robustness is rarely considered an important issue. Furthermore, how uncertainties of both the current and future state of the system should be handled in planning and control is also an open question that needs to be addressed to enable safe collaboration between robots and humans. While identifying the uncertainties can be enough to take precautionary action, such as stopping the system when encountering scenarios with too high uncertainty, sophisticated modelling of the uncertainties is needed to devise a planning and control scheme that can safely and efficiently navigate through uncertain environments. Against this background, we aim to address the following research questions in this project:

1. How can different types of uncertainty be identified, classified, and modelled for perception and motion forecasting tasks?

2. How can uncertainty in perception and motion forecasting models be reduced?       

3. How can modelled uncertainty be handled during planning and control of autonomous industrial robots?
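As a toy illustration of the decomposition behind RQ1, the predictive variance of an ensemble of probabilistic regressors is commonly split into an aleatoric part (the average predicted data variance) and an epistemic part (the disagreement between ensemble members). A minimal sketch with hypothetical numbers:

```python
import numpy as np

# Hypothetical outputs from an ensemble of four probabilistic regressors,
# each predicting a mean and a variance for the same forecast target.
means = np.array([1.9, 2.1, 2.0, 2.3])       # per-member predicted means
variances = np.array([0.4, 0.5, 0.45, 0.5])  # per-member predicted variances

# Aleatoric uncertainty: average of the predicted (data) variances.
aleatoric = variances.mean()

# Epistemic uncertainty: variance of the predicted means across members.
epistemic = means.var()

# Total predictive variance is the sum of the two components.
total = aleatoric + epistemic
print(f"aleatoric={aleatoric:.3f}, epistemic={epistemic:.3f}, total={total:.3f}")
```

High epistemic uncertainty flags inputs the model has not learned well (and is reducible with more data, cf. RQ2), while high aleatoric uncertainty reflects inherent noise in the scene.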

Towards RQ1, we plan to extend current methodologies for uncertainty quantification of perception tasks to the domain of motion forecasting. Methods to investigate include well-established ones, such as Monte Carlo Dropout inference and model ensembles, as well as more recent approaches that estimate uncertainty in a single forward pass. For RQ2, we will not limit ourselves to active learning, an established method for reducing epistemic uncertainty, but will also investigate semi-supervised and self-supervised approaches that leverage, for example, spatial and temporal consistency to also train on unlabelled data. Towards RQ3, we plan to further investigate receding horizon approaches with explicit handling of uncertainty in both planning and control. Throughout, we will address and take advantage of the unique characteristics of the presented industrial use cases, including the setup of static ceiling-mounted cameras with overlapping fields of view, prior knowledge of the environment and objects, and the limited amount of labelled data.
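As a minimal sketch of Monte Carlo Dropout inference (weights, sizes, and dropout rate are all hypothetical, not taken from the project), dropout is kept active at test time and repeated stochastic forward passes yield a predictive distribution whose spread serves as an epistemic-uncertainty proxy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer regression network with fixed, randomly chosen weights.
W1 = rng.normal(size=(4, 16))
W2 = rng.normal(size=(16, 1))

def forward(x, drop_p=0.5):
    """One stochastic forward pass with dropout left active at inference."""
    h = np.maximum(x @ W1, 0.0)          # ReLU hidden layer
    mask = rng.random(h.shape) > drop_p  # fresh random dropout mask per pass
    h = h * mask / (1.0 - drop_p)        # inverted-dropout scaling
    return (h @ W2).item()

x = rng.normal(size=(1, 4))
samples = np.array([forward(x) for _ in range(100)])

# Sample mean as the prediction; sample spread as the uncertainty estimate.
print(f"prediction={samples.mean():.3f}, uncertainty={samples.std():.3f}")
```

In practice the same idea applies to a trained forecasting network: the variance across stochastic passes grows on inputs far from the training distribution, which is what would trigger precautionary behaviour downstream.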

Participants

Knut Åkesson (contact)

Chalmers, Electrical Engineering, Systems and control

Kristofer Bengtsson

Chalmers, Electrical Engineering, Systems and control

Erik Brorsson

Chalmers, Electrical Engineering, Systems and control

Lennart Svensson

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Collaborations

Volvo Group

Gothenburg, Sweden

Funding

Wallenberg AI, Autonomous Systems and Software Program

(Funding period missing)

Related Areas of Advance and Infrastructure

Production

Latest update

9/11/2025