On-demand Memory Compression of Stream Aggregates through Reinforcement Learning
Paper in proceedings, 2025

Stream Aggregates are crucial in digital infrastructures for transforming continuous data streams into actionable insights. However, state-of-the-art Stream Processing Engines lack mechanisms to effectively balance performance with memory consumption - a capability that is especially crucial in environments with fluctuating computational resources and data-intensive workloads. This paper tackles this gap by introducing a novel on-demand adaptive memory compression scheme for Stream Aggregates. Our approach uses Reinforcement Learning (RL) to dynamically adapt how a Stream Aggregate compresses its state, balancing performance and memory utilization under a given processing latency threshold. We develop a model that incorporates the application- and data-specific nuances of Stream Aggregates and create a framework to train RL Agents to adjust memory compression levels in real time. Additionally, we shed light on a trade-off between the timeliness of an RL Agent's training and its resulting behavior, defining several policies to account for this trade-off. Through extensive evaluation, we show that the proposed RL Agent effectively supports on-demand memory compression. We also study the effects of our policies - providing guidance on their role in RL applied to Stream Aggregates - and show that our framework supports lean execution of such RL jobs.

reinforcement learning

stream aggregates

memory compression

Authors

Jingyu Liu

University of Gothenburg

Chalmers, Computer Science and Engineering (Chalmers), Networks and Systems (Chalmers)

Vincenzo Massimiliano Gulisano

Chalmers, Computer Science and Engineering (Chalmers), Networks and Systems (Chalmers)

University of Gothenburg

ICPE 2025: Proceedings of the 16th ACM/SPEC International Conference on Performance Engineering

240-252 (pages)
9798400710735 (ISBN)

16th ACM/SPEC International Conference on Performance Engineering, ICPE 2025
Toronto, Canada

Subject Categories (SSIF 2025)

Computer Sciences

Computer Systems

DOI

10.1145/3676151.3719369

More information

Latest update

6/13/2025