Progressive Batching for Efficient Non-linear Least Squares
Paper in proceedings, 2021

Non-linear least squares solvers are used across a broad range of offline and real-time model fitting problems. Most improvements of the basic Gauss-Newton algorithm tackle convergence guarantees or leverage the sparsity of the underlying problem structure for computational speedup. With the success of deep learning methods leveraging large datasets, stochastic optimization methods have recently received considerable attention. Our work borrows ideas from both stochastic machine learning and statistics, and we present an approach for non-linear least squares that guarantees convergence while significantly reducing the required amount of computation. Empirical results show that our proposed method achieves competitive convergence rates compared to traditional second-order approaches on common computer vision problems, such as image alignment and essential matrix estimation, with very large numbers of residuals.
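The record does not reproduce the paper's algorithm, but the progressive-batching idea described in the abstract can be sketched on a toy problem. The sketch below is an illustrative assumption, not the authors' method: the function name, the geometric batch-growth rule, and the sampled acceptance test stand in for the statistical batch-size tests such methods typically use. It applies Gauss-Newton steps computed from a random subset of residuals and grows the subset when a step fails to reduce the sampled cost.

```python
import numpy as np

def gauss_newton_progressive(residual_jac, x0, n_residuals, rng,
                             batch0=32, growth=2.0, iters=50, tol=1e-10):
    """Gauss-Newton where each step uses a random subset of residuals.

    Hypothetical sketch: `residual_jac(x, idx)` must return the residual
    vector and Jacobian restricted to the residuals indexed by `idx`.
    The batch grows geometrically when a sampled step is rejected.
    """
    x = np.asarray(x0, dtype=float).copy()
    m = batch0
    for _ in range(iters):
        idx = rng.choice(n_residuals, size=min(m, n_residuals), replace=False)
        r, J = residual_jac(x, idx)
        # Solve the damped normal equations on the sampled subset only.
        step = np.linalg.solve(J.T @ J + 1e-9 * np.eye(x.size), J.T @ r)
        x_new = x - step
        # Toy acceptance test: compare the sampled cost before/after.
        r_new, _ = residual_jac(x_new, idx)
        if r_new @ r_new < r @ r:
            x = x_new                      # progress: accept the step
        else:
            m = int(np.ceil(m * growth))   # too noisy/coarse: grow the batch
        if np.linalg.norm(step) < tol:
            break
    return x

# Toy usage: fit y = a * exp(b * t) on 1000 noiseless samples,
# so the full-batch optimum is (a, b) = (2.0, -1.5).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 1000)
y = 2.0 * np.exp(-1.5 * t)

def residual_jac(x, idx):
    a, b = x
    e = np.exp(b * t[idx])
    return a * e - y[idx], np.stack([e, a * t[idx] * e], axis=1)

x_fit = gauss_newton_progressive(residual_jac, np.array([1.5, -1.0]),
                                 t.size, rng)
```

Because each iteration touches only the sampled residuals, the per-step cost scales with the batch size rather than the full residual count; the growth rule trades that saving for reliability as the iterate approaches the optimum.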

Non-convex optimization, non-linear least squares

Authors

Huu Le

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Christopher Zach

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Edward Rosten

Snap Inc

Oliver Woodford

Snap Inc

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

0302-9743 (ISSN), 1611-3349 (eISSN)

Vol. 12624 LNCS, pp. 506-522
978-3-030-69534-7 (ISBN)

15th Asian Conference on Computer Vision (ACCV 2020)
Kyoto, Japan (held virtually online)

Subject categories

Computer Vision and Robotics (Autonomous Systems)

DOI

10.1007/978-3-030-69535-4_31

More information

Last updated

2023-03-21