Optical flow estimation on image sequences with differently exposed frames
Journal article, 2015

Optical flow (OF) methods are used to estimate dense motion information between consecutive frames in image sequences. In addition to the specific OF estimation method itself, the quality of the input image sequence is of crucial importance to the quality of the resulting flow estimates. For instance, lack of texture in image frames caused by saturation of the camera sensor during exposure can significantly deteriorate performance. One approach to avoid this negative effect is to use different camera settings when capturing the individual frames. We provide a framework for OF estimation on such sequences that contain differently exposed frames. Information from multiple frames is combined into a total cost functional such that the lack of an active data term for saturated image areas is avoided. Experimental results demonstrate that using alternate camera settings to capture the full dynamic range of an underlying scene can clearly improve the quality of flow estimates. When saturation of image data is significant, the proposed methods show superior performance, in terms of lower endpoint errors of the flow vectors, compared to a set of baseline methods. Furthermore, we provide qualitative examples of how and when our method should be used.
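The idea of combining multiple differently exposed frames into a total cost functional, with the data term deactivated in saturated areas, can be sketched as follows. This is an illustrative variational form under our own assumptions, not the exact functional from the paper; the saturation mask \(m_k\), the exposure-compensation mapping \(g_k\), and the regularization weight \(\lambda\) are all assumed notation:

```latex
% Illustrative multi-frame OF energy (a sketch, not the paper's exact functional).
% u       : dense flow field over the image domain \Omega
% I_k     : frame k of the sequence
% g_k     : assumed mapping compensating for the exposure difference between frames
% m_k(x)  : per-pixel mask in [0,1] that switches the data term off where frame k
%           is saturated, so no spurious brightness-constancy penalty is incurred
% \psi    : robust penalty function (e.g., Charbonnier)
E(u) = \sum_{k} \int_{\Omega} m_k(\mathbf{x})\,
       \psi\!\big( I_{k+1}(\mathbf{x} + u(\mathbf{x})) - g_k(I_k(\mathbf{x})) \big)\,
       d\mathbf{x}
     \;+\; \lambda \int_{\Omega} \lVert \nabla u(\mathbf{x}) \rVert^{2}\, d\mathbf{x}
```

In saturated regions, where every mask \(m_k\) vanishes, the flow estimate is then driven entirely by the smoothness term, which is the behavior the abstract describes as avoiding an inactive data term.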

Keywords

Optical flow estimation, motion information, cost functionals, multiple cameras, multiple camera settings, high dynamic range, temporal regularization, baseline methods, estimation methods, computer vision

Authors

Tomas Bengtsson

Chalmers, Signals and Systems, Signal Processing and Biomedical Engineering

Tomas McKelvey

Chalmers, Signals and Systems, Signal Processing and Biomedical Engineering

Konstantin Lindström

Volvo Cars

Optical Engineering

0091-3286 (ISSN), 1560-2303 (eISSN)

Vol. 54, No. 9, Article Number: 093103

Subject categories

Signal processing

DOI

10.1117/1.OE.54.9.093103

More information

Last updated

2018-11-15