Single-shot self-supervised object detection in microscopy
Journal article, 2022

Object detection is a fundamental task in digital microscopy, where machine learning has made great strides in overcoming the limitations of classical approaches. The training of state-of-the-art machine-learning methods almost universally relies on vast amounts of labeled experimental data or on the ability to numerically simulate realistic datasets. However, experimental data are often challenging to label and cannot be easily reproduced numerically. Here, we propose a deep-learning method, named LodeSTAR (Localization and detection from Symmetries, Translations And Rotations), that learns to detect microscopic objects with sub-pixel accuracy from a single unlabeled experimental image by exploiting the inherent roto-translational symmetries of this task. We demonstrate that LodeSTAR outperforms traditional methods in terms of accuracy, even when analyzing challenging experimental data containing densely packed cells or noisy backgrounds. Furthermore, by exploiting additional symmetries, we show that LodeSTAR can measure other properties, e.g., vertical position and polarizability in holographic microscopy.
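The self-supervision principle described in the abstract can be illustrated with a toy sketch: translating the input image should translate the predicted object position by the same amount, which provides a training signal without labels. The `detect` function below is a hypothetical stand-in (an intensity-weighted centroid), not LodeSTAR's actual network; it only demonstrates the equivariance-consistency idea.

```python
import numpy as np

def detect(image):
    # Toy "detector": intensity-weighted centroid of the image.
    # A stand-in for a neural network's predicted position (illustrative
    # only; not LodeSTAR's architecture).
    ys, xs = np.indices(image.shape)
    w = image / image.sum()
    return np.array([(ys * w).sum(), (xs * w).sum()])

def equivariance_loss(image, shift):
    # Self-supervised consistency: translating the input by `shift`
    # should translate the predicted position by the same amount.
    shifted = np.roll(image, shift, axis=(0, 1))
    pred_original = detect(image)
    pred_shifted = detect(shifted)
    return np.linalg.norm(pred_shifted - (pred_original + np.array(shift)))

# Single bright spot on a dark background (no wrap-around for this shift).
img = np.zeros((32, 32))
img[10, 14] = 1.0
loss = equivariance_loss(img, shift=(3, -2))
```

For the perfectly equivariant toy detector the loss is zero; in LodeSTAR, the analogous consistency requirement over translations and rotations is what the network is trained to satisfy.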

Authors

Benjamin Midtvedt

University of Gothenburg

Jesús Pineda

University of Gothenburg

Fredrik Skärberg

University of Gothenburg

Erik Olsén

Chalmers, Physics, Nano and Biophysics

Harshith Bachimanchi

University of Gothenburg

Emelie Vilhelmsson Wesén

Chalmers, Biology and Biological Engineering, Chemical Biology

Elin Esbjörner Winters

Chalmers, Biology and Biological Engineering, Chemical Biology

Erik Selander

University of Gothenburg

Fredrik Höök

Chalmers, Physics, Nano and Biophysics

Daniel Midtvedt

University of Gothenburg

Giovanni Volpe

University of Gothenburg

Nature Communications

2041-1723 (ISSN), 2041-1723 (eISSN)

Vol. 13, Issue 1, Article no. 7492

Subject Categories

Other Computer and Information Science

Robotics

Computer Vision and Robotics (Autonomous Systems)

DOI

10.1038/s41467-022-35004-y

PubMed

36470883

More information

Latest update

10/27/2023