Compressed computations using wavelets for hidden Markov models with continuous observations
Journal article, 2023

Compression as an accelerant of computation is increasingly recognized as an important component in engineering fast real-world machine learning methods for big data; cf. its impact on genome-scale approximate string matching. Previous work showed that compression can accelerate algorithms for hidden Markov models (HMMs) with discrete observations, both for the classical frequentist HMM algorithms—Forward Filtering, Backward Smoothing and Viterbi—and for Gibbs sampling in Bayesian HMMs. For Bayesian HMMs with continuous-valued observations, compression was shown to greatly accelerate computations for specific types of data. For instance, data from large-scale experiments interrogating structural genetic variation can be assumed to be piecewise constant with noise or, equivalently, to be generated by HMMs with dominant self-transition probabilities. Here we extend the compressive computation approach to the classical frequentist HMM algorithms on continuous-valued observations, providing the first compressive approach for this problem. In a large-scale simulation study, we demonstrate empirically that in many settings compressed HMM algorithms clearly outperform the classical algorithms, with no or only a negligible effect on the computed probabilities and inferred maximum-likelihood state paths. This provides an efficient approach to big data computations with HMMs. An open-source implementation of the method is available from https://github.com/lucabello/wavelet-hmms.
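The core idea behind compressive HMM computation—treating a run of (near-)constant observations as a single block whose repeated forward updates collapse into one matrix power—can be sketched as follows. This is an illustrative reconstruction under simplifying assumptions (exact run-length encoding of identical observations, Gaussian emissions, no numerical rescaling), not the authors' wavelet-based implementation; all function and variable names here are hypothetical.

```python
import numpy as np
from itertools import groupby

def gauss_pdf(x, mus, sigmas):
    """Gaussian emission densities b_j(x) for all states j at once."""
    return np.exp(-0.5 * ((x - mus) / sigmas) ** 2) / (sigmas * np.sqrt(2 * np.pi))

def forward_naive(obs, pi, A, mus, sigmas):
    """Classical Forward algorithm: one update per observation."""
    alpha = pi * gauss_pdf(obs[0], mus, sigmas)
    for x in obs[1:]:
        # alpha_{t+1}[j] = b_j(x) * sum_i alpha_t[i] * A[i, j]
        alpha = (alpha @ A) * gauss_pdf(x, mus, sigmas)
    return alpha.sum()  # likelihood P(obs)

def forward_compressed(obs, pi, A, mus, sigmas):
    """Block-wise Forward: one matrix power per run of identical observations."""
    runs = [(v, len(list(g))) for v, g in groupby(obs)]
    v0, n0 = runs[0]
    alpha = pi * gauss_pdf(v0, mus, sigmas)
    # M[i, j] = A[i, j] * b_j(v): a full forward step for observation v,
    # so n repeats of v amount to multiplying by M^n.
    M = A * gauss_pdf(v0, mus, sigmas)
    alpha = alpha @ np.linalg.matrix_power(M, n0 - 1)
    for v, n in runs[1:]:
        M = A * gauss_pdf(v, mus, sigmas)
        alpha = alpha @ np.linalg.matrix_power(M, n)
    return alpha.sum()
```

Both routines compute the same likelihood, but the compressed version performs one matrix power per block rather than one update per observation, which is where the speedup on piecewise-constant data comes from; wavelet compression generalizes this by approximating noisy blocks with constant values.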

Authors

Luca Bello

University of Gothenburg

John Wiedenhöft

University Medical Center Göttingen

Alexander Schliep

University of Gothenburg

Brandenburg University of Technology

PLoS ONE

1932-6203 (ISSN); 1932-6203 (eISSN)

Vol. 18, Issue 6, June 2023, e0286074

Subject Categories

Other Computer and Information Science

Computational Mathematics

Probability Theory and Statistics

Computer Science

DOI

10.1371/journal.pone.0286074

PubMed

37279196

More information

Latest update

7/18/2023