Interframe Quantization for Noisy Channels
The need for efficient transmission and storage of, for example, speech signals places high demands on signal compression techniques. This thesis deals with vector quantization (VQ), a powerful technique for signal compression. More specifically, the focus is on memory-based or, synonymously, interframe vector quantization. These terms refer to schemes that exploit the inherent memory in the input signal in order to obtain performance gains.
Memory-based techniques have often been judged inferior to memoryless VQ in the presence of channel noise. The main objective of this work has been to improve the noisy-channel performance of memory-based VQ schemes. We present a novel technique, the safety-net method, which is capable of improving the performance of memory-based VQ over noisy as well as noise-free channels. This is accomplished by combining a memory-based VQ with a fixed memoryless VQ, where the memoryless VQ acts as a safety net for the memory-based one. Furthermore, a predictive VQ scheme optimized for noisy-channel performance is proposed and thoroughly investigated. Several other methods for improving the noisy-channel performance of memory-based VQ are also treated.
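The safety-net principle described above can be illustrated with a minimal sketch. Everything here is invented for illustration: the toy codebooks, the simple first-order scalar predictor, and the function names are assumptions, not the codebooks or predictor structure used in the thesis. The essential idea is only that each input vector is encoded with both a memory-based (predictive) quantizer and a fixed memoryless quantizer, and the branch with lower distortion is selected, at the cost of one flag bit per frame.

```python
import numpy as np

def nearest(codebook, x):
    """Return (index, codevector) of the codebook entry nearest to x."""
    d = np.sum((codebook - x) ** 2, axis=1)
    i = int(np.argmin(d))
    return i, codebook[i]

def safety_net_encode(x, prev, mem_cb, fixed_cb, a=0.8):
    """Encode x given the previous reconstruction prev (the memory).

    mem_cb holds residual codevectors for the memory-based branch,
    fixed_cb holds direct codevectors for the memoryless safety net.
    The predictor coefficient a is a hypothetical choice.
    """
    # Memory-based branch: predict from prev, quantize the residual.
    pred = a * prev
    i_mem, cv_mem = nearest(mem_cb, x - pred)
    rec_mem = pred + cv_mem
    # Memoryless safety-net branch: quantize x directly.
    i_fix, rec_fix = nearest(fixed_cb, x)
    # One extra flag bit selects the branch with lower distortion.
    if np.sum((x - rec_mem) ** 2) <= np.sum((x - rec_fix) ** 2):
        return ("mem", i_mem), rec_mem
    return ("fix", i_fix), rec_fix
```

Because the memoryless branch never depends on past reconstructions, selecting it limits the propagation of channel-induced errors into subsequent frames, which is what makes the combination attractive over noisy channels.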
The main application of memory-based VQ in this thesis is quantization of LPC parameters, an important component of many speech coding algorithms. The methods presented here are, however, not restricted to this application, as demonstrated in one of the included papers.
Performance results from numerous simulations are presented. Specifically, it is demonstrated that the most successful schemes proposed in this work need 4-5 bits per frame fewer than memoryless VQ to achieve comparable performance for LPC spectrum quantization, regardless of the channel error probability. This not only reduces the number of bits that must be transmitted, but also brings about a significant reduction in complexity and storage requirements.