SC2: A statistical compression cache scheme
Paper in proceedings, 2014

Low utilization of on-chip cache capacity limits performance and wastes energy because of the long latency, limited bandwidth, and energy consumption associated with off-chip memory accesses. Value replication is an important source of low capacity utilization. While prior cache compression techniques manage to code frequent values densely, they trade off a high compression ratio for low decompression latency, thus missing opportunities to utilize capacity more effectively. This paper presents, for the first time, a detailed design-space exploration of caches that utilize statistical compression. We show that more aggressive approaches like Huffman coding, which have been neglected in the past due to the high processing overhead for (de)compression, are suitable techniques for caches and memory. Based on our key observation that value locality varies little over time and across applications, we first demonstrate that the overhead of statistics acquisition for code generation is low because new encodings are needed rarely, making it possible to off-load code generation to software routines. We then show that the high compression ratio obtained by Huffman coding makes it possible to realize the performance benefits of a 4X larger last-level cache at about 50% lower power consumption than such a larger cache.
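
The scheme rests on classical Huffman coding over the observed frequency of data values: frequent values (such as zero words) receive short codewords, rare values receive long ones, and, per the abstract, statistics acquisition and code regeneration are infrequent enough to be handled in software. The minimal Python sketch below illustrates that idea only; the function name, the 32-bit word size, and the sample values are assumptions for illustration, not the paper's hardware design.

```python
import heapq
from collections import Counter

def build_huffman_codes(words):
    """Build Huffman codewords from the frequency of sampled cache-line words:
    frequent values get short codes, rare values get long ones.
    (Illustrative sketch; not the SC2 hardware implementation.)"""
    freq = Counter(words)
    # Heap entries: (weight, tie_breaker, tree); a tree is either a leaf
    # value or a (left, right) pair of subtrees.
    heap = [(count, i, value) for i, (value, count) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                     # degenerate case: one distinct value
        return {heap[0][2]: "0"}
    next_id = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)
        w2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, next_id, (left, right)))
        next_id += 1
    codes = {}
    def assign(tree, prefix):
        if isinstance(tree, tuple):        # internal node
            assign(tree[0], prefix + "0")
            assign(tree[1], prefix + "1")
        else:                              # leaf: an actual word value
            codes[tree] = prefix
    assign(heap[0][2], "")
    return codes

# Hypothetical value sample in which zero words dominate, as is common in cache traces.
sample = [0x00000000] * 60 + [0xFFFFFFFF] * 25 + [0x00000001] * 10 + [0xDEADBEEF] * 5
codes = build_huffman_codes(sample)
compressed = sum(len(codes[w]) for w in sample)
print(codes)
print(f"{compressed} bits compressed vs {32 * len(sample)} bits uncompressed")
```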

Authors

Angelos Arelakis

Chalmers, Computer Science and Engineering, Computer Engineering

Per Stenström

Chalmers, Computer Science and Engineering, Computer Engineering

Conference Proceedings - Annual International Symposium on Computer Architecture, ISCA

1063-6897 (ISSN)

145-156 (pages)

Subject categories

Computer Engineering

Areas of Advance

Information and Communication Technology

DOI

10.1109/ISCA.2014.6853231

ISBN

978-147994396-8

More information

Last updated

2023-08-10