Rethinking Long-Tailed Visual Recognition with Dynamic Probability Smoothing and Frequency Weighted Focusing
Paper in proceedings, 2023

Deep learning models trained on long-tailed (LT) datasets often exhibit bias toward high-frequency head classes. This paper highlights the limitations of existing solutions that naively combine class- and instance-level re-weighting losses. Specifically, we demonstrate that such solutions overfit the training set, significantly harming the rare classes. To address this issue, we propose a novel loss function that dynamically reduces the influence of outliers and assigns class-dependent focusing parameters. We also introduce a new long-tailed dataset, ICText-LT, featuring varied image quality and greater realism than artificially sampled datasets. Our method proves effective, outperforming existing methods on the CIFAR-LT, Tiny ImageNet-LT, and our new ICText-LT datasets. The source code and new dataset are available at \url{https://github.com/nwjun/FFDS-Loss}.
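The abstract's two ingredients — class-dependent focusing parameters and damping the influence of outliers — can be illustrated with a focal-style loss. The sketch below is an assumption-laden illustration, not the paper's exact FFDS formulation: the linear per-class gamma schedule, the additive probability smoothing, and all parameter names (`gamma_min`, `gamma_max`, `smooth_eps`) are illustrative choices.

```python
import numpy as np

def frequency_weighted_focal_loss(logits, targets, class_counts,
                                  gamma_min=1.0, gamma_max=3.0, smooth_eps=0.1):
    """Illustrative sketch (NOT the paper's exact FFDS loss): a focal-style
    loss whose focusing parameter grows for rarer classes, combined with a
    simple probability smoothing that damps the pull of hard outliers."""
    counts = np.asarray(class_counts, dtype=float)
    freq = counts / counts.sum()
    # Assumed linear schedule: most frequent class -> gamma_min,
    # rarest class -> gamma_max (class-dependent focusing parameter).
    rel = (freq.max() - freq) / (freq.max() - freq.min() + 1e-12)
    gammas = gamma_min + (gamma_max - gamma_min) * rel
    # Softmax probability assigned to each sample's target class.
    z = logits - logits.max(axis=1, keepdims=True)
    p = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    pt = p[np.arange(len(targets)), targets]
    # "Probability smoothing" (assumed additive form): lift pt away from 0
    # so extreme outliers cannot dominate the loss and its gradient.
    pt_s = pt * (1.0 - smooth_eps) + smooth_eps
    g = gammas[targets]
    return float(np.mean(-((1.0 - pt_s) ** g) * np.log(pt_s)))
```

With this schedule, misclassified samples from tail classes receive a stronger focusing term (larger gamma), while the smoothing floor keeps a single noisy outlier from overwhelming the average loss.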

Long-tailed Classification

Weighted-loss

Authors

Wan Jun Nah

Universiti Malaya

Chun Chet Ng

Universiti Malaya

Che-Tsung Lin

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Yeong Khang Lee

Centre of Excellence, ViTrox Corporation Berhad

Jie Long Kew

Universiti Malaya

Zhi Qin Tan

Universiti Malaya

Chee Seng Chan

Universiti Malaya

Christopher Zach

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Shang Hong Lai

National Tsing Hua University

Proceedings - International Conference on Image Processing, ICIP

1522-4880 (ISSN)

435-439

978-1-7281-9835-4 (ISBN)

30th IEEE International Conference on Image Processing, ICIP 2023, Kuala Lumpur, Malaysia

Subject Categories

Electrical Engineering, Electronic Engineering, Information Engineering

DOI

10.1109/ICIP49359.2023.10222779

More information

Latest update

2/5/2024