The overall aim is to develop a supportive tool that:
- can automatically predict human behaviour, considering dynamic effects, and analyse human work in manual workstations from a musculoskeletal viewpoint;
- can automatically optimise workstations where humans are supported by exoskeletons or collaborative robots;
- considers human diversity in terms of appearance, shape, clothing, and safety equipment used during work;
- can easily be understood, used, and manipulated, as well as adapted to recent developments in immersive VR and digital twin solutions.
At the end of the project, the demonstrator software will have:
- functionality for advanced modelling of human body shapes and appearances, and for the creation of personas;
- functionality to simulate muscle forces, enabling musculoskeletal assessments;
- functionality for assessing human-robot collaboration from an ergonomics and productivity point of view;
- functionality for automatic optimisation of workstations where workers are supported by collaborative robots, exoskeletons, or both;
- full VR support and an efficient mode of interaction.
The project is divided into a number of work packages that will lead it towards the final objectives. Each work package will produce new knowledge and sub-solutions, which will be implemented in a demonstrator. Throughout the project, the demonstrator will be tested and evaluated by industrial and academic partners, and cases from collaborating companies will be used to verify outcomes. A final demonstrator tool, combining the knowledge and solutions gained from the project, will be developed, allowing for easier further adaptation and implementation in companies.
Project leader: Researcher at Chalmers, Industrial and Materials Science, Production Systems
Full Professor at Chalmers, Industrial and Materials Science, Production Systems
Funding: Chalmers participation during 2019–2022