Building efficient CNNs using Depthwise Convolutional Eigen-Filters (DeCEF)
Journal article, 2024

Deep Convolutional Neural Networks (CNNs) have been widely used in various domains due to their impressive capabilities. These models are typically composed of a large number of 2D convolutional (Conv2D) layers with numerous trainable parameters. To manage the complexity of such networks, compression techniques can be applied, which typically rely on the analysis of trained deep learning models. However, in certain situations, training a new CNN from scratch may be infeasible due to resource limitations. In this paper, we propose an alternative parameterization of Conv2D filters with significantly fewer parameters that does not rely on compressing a pre-trained CNN. Our analysis reveals that the effective rank of the vectorized Conv2D filters decreases with increasing depth in the network. This leads to the development of the Depthwise Convolutional Eigen-Filter (DeCEF) layer, a low-rank version of the Conv2D layer with significantly fewer trainable parameters and floating point operations (FLOPs). Our definition of effective rank differs from previous work and is easy to implement and interpret. Applying this technique is straightforward: any standard convolutional layer in a CNN can simply be replaced with a DeCEF layer. To evaluate the effectiveness of DeCEF layers, experiments are conducted on the benchmark datasets CIFAR-10 and ImageNet for various network architectures. The results show similar or higher accuracy while using about 2/3 of the original parameters and 2/3 of the FLOPs of the base network. Additionally, analyzing the patterns in the effective rank provides insights into the inner workings of CNNs and highlights opportunities for future research.
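To illustrate the idea of replacing a dense Conv2D layer with a low-rank, depthwise-based alternative, below is a minimal PyTorch sketch of one plausible DeCEF-style layer: each input channel is filtered with a small bank of spatial "eigen-filters" via a depthwise convolution, and a 1x1 convolution combines the responses. This is a sketch based only on the abstract; the exact parameterization in the paper may differ, and the class name DeCEFSketch, the rank argument, and the initialization are assumptions.

import torch
import torch.nn as nn


class DeCEFSketch(nn.Module):
    """Low-rank substitute for nn.Conv2d(in_ch, out_ch, k): a depthwise
    bank of `rank` spatial filters per channel, then a 1x1 combination.
    This is an illustrative sketch, not the paper's exact layer."""

    def __init__(self, in_ch: int, out_ch: int, k: int = 3, rank: int = 4,
                 stride: int = 1, padding: int = 1):
        super().__init__()
        # rank spatial filters per input channel (groups=in_ch -> depthwise)
        self.depthwise = nn.Conv2d(in_ch, in_ch * rank, kernel_size=k,
                                   stride=stride, padding=padding,
                                   groups=in_ch, bias=False)
        # 1x1 convolution mixes the rank responses per channel into out_ch maps
        self.pointwise = nn.Conv2d(in_ch * rank, out_ch, kernel_size=1,
                                   bias=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.pointwise(self.depthwise(x))


if __name__ == "__main__":
    x = torch.randn(1, 64, 32, 32)
    layer = DeCEFSketch(in_ch=64, out_ch=128, k=3, rank=4)
    print(layer(x).shape)  # torch.Size([1, 128, 32, 32])

    # Rough parameter comparison against a dense Conv2d of the same shape
    dense = nn.Conv2d(64, 128, kernel_size=3, padding=1)
    count = lambda m: sum(p.numel() for p in m.parameters())
    print(count(layer), "vs", count(dense))  # 35200 vs 73856

With these assumed settings the factorized layer uses roughly half the parameters of the dense layer; the actual savings depend on how the rank is chosen per layer, which in the paper is guided by the effective rank of the vectorized filters.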

Low rank approximation

Deep learning

Subspace method

Efficient network

Convolutional neural network

Network complexity

Authors

Yinan Yu

Chalmers, Computer Science and Engineering, Functional Programming

Samuel Scheidegger

Lumilogic

Asymptotic AI

Tomas McKelvey

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Neurocomputing

0925-2312 (ISSN), 1872-8286 (eISSN)

Vol. 609, Art. no. 128461

Subject categories

Electrical Engineering and Electronics

DOI

10.1016/j.neucom.2024.128461

More information

Last updated

2024-09-17