Control of visually guided event-based behaviors in industrial robotic systems
Licentiate thesis, 2017

Vision-based robotics is an ever-growing field within industrial automation. Demands for greater flexibility and higher quality motivate manufacturing companies to adopt these technologies for such tasks as material handling, assembly, and inspection. Manufacturing systems, as well as their control mechanisms, are typically modeled as discrete event systems, and off-the-shelf PLC hardware is used for the realization of sequential control systems. However, with the introduction of robots and machine vision solutions, it becomes harder to reason about the combined systems collectively. Vision algorithms require complex processing, and imaging setups have to be accurately calibrated with respect to other active systems such as robots, sensors, and material handling equipment.

This thesis considers the application area of machine vision, and particularly visual metrology systems, from the perspective of being part of larger cyber-physical systems. This includes, in addition to the traditional computer vision algorithms and estimation methods, considerations of distributed system architecture and behavioral characteristics expressed with discrete event semantics. The ultimate goal is to make the first steps towards a theory backing control of visually guided event-based behaviors in industrial robotic and automation systems.

The thesis approaches this goal through a combination of laboratory results, a case study of an industrial company, and formalization of modeling abstractions and architectural solutions. The practical results include a method for image analysis of star washers (small automotive parts), a feature engineering technique and machine learning experiments for classification of star washers’ orientation on a feeder, and a probabilistic analysis of the camera calibration process. In addition, to model and implement industrial vision systems in an event-driven environment, a formalism of Discrete Event Data Flow is formulated, and the EPypes software framework is developed.
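To illustrate the central idea of an event-driven data processing pipeline (the theme behind Discrete Event Data Flow and EPypes), the following is a minimal sketch in Python. All function names and the pipeline structure are assumptions for illustration only and do not reflect the actual EPypes API: an incoming event (e.g. a camera trigger) drives a chain of processing stages, and the final result is published as a new event for downstream consumers such as a robot controller.

```python
import queue
import threading

# Hypothetical stand-in stages; in a real vision pipeline these would be
# image acquisition, feature extraction, and classification routines.

def acquire(event_payload):
    # Simulated image acquisition: returns a dummy "image" as a 2D list.
    return [[1, 2, 3], [4, 5, 6]]

def extract_features(image):
    # Simulated feature extraction: mean pixel intensity.
    pixels = [p for row in image for p in row]
    return sum(pixels) / len(pixels)

def classify(feature):
    # Simulated classifier deciding part orientation on a feeder.
    return "face_up" if feature > 3.0 else "face_down"

PIPELINE = [acquire, extract_features, classify]

def run_pipeline(trigger_queue, result_queue):
    # Each incoming event triggers one pass through the processing chain;
    # the result is published as a new event for downstream consumers.
    while True:
        event = trigger_queue.get()
        if event is None:          # sentinel to stop the worker
            break
        data = event
        for stage in PIPELINE:
            data = stage(data)
        result_queue.put(data)

if __name__ == "__main__":
    triggers, results = queue.Queue(), queue.Queue()
    worker = threading.Thread(target=run_pipeline, args=(triggers, results))
    worker.start()

    triggers.put({"event": "image_ready"})   # simulated camera trigger event
    print(results.get())                      # -> "face_up"

    triggers.put(None)
    worker.join()
```

The sketch separates the event transport (queues) from the processing chain, so that the same chain of functions can be reused whether the trigger comes from a camera, a PLC signal, or a test harness.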

discrete event systems

cyber-physical systems

robotics middleware

camera calibration

machine learning

data flow

machine vision

Room EC, floor 5, Hörsalsvägen 11, Campus Johanneberg
Opponent: Assoc. Prof. Mikael Ekström, Mälardalen University

Author

Oleksandr Semeniuta

Chalmers, Signals and Systems, Systems and Control

Semeniuta, O., Falkman, P. EPypes: a framework for building event-driven data processing pipelines for cloud robotics and automation

Semeniuta, O., Dransfeld, S., Martinsen, K., Falkman, P. Towards increased intelligence and automatic improvement in industrial vision systems

Areas of Advance

Information and Communication Technology

Production

Subject Categories

Embedded Systems

Robotics and Automation

Computer Science

Publisher

Chalmers

More information

Created

2017-03-20