Subsample distribution distance and McMC convergence
Journal article, 2005

A new measure based on a comparison of empirical distributions, between subsequences or parallel runs and the full sequence of Markov chain Monte Carlo simulations, is proposed as a criterion of stability or convergence. The measure is also put forward as a loss function when the design of a Markov chain is optimized. The comparison is based on a Kullback-Leibler (KL) type distance over value sets defined by the output data. The leading term in a series expansion gives an interpretation in terms of the relative uncertainty of cell frequencies. The validity of this term is studied by simulation in two analytically tractable cases with Markov dependency. The agreement between the leading term and the KL measure is close, in particular when the simulations are extensive enough for stable results. Comparisons with established criteria turn out favourably in the examples studied.
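The abstract's measure can be illustrated with a minimal sketch: a KL-type distance between the empirical distribution of a subsample (or one parallel run) and that of the full MCMC output, computed over cells defined by the output data. The function name `subsample_kl`, the quantile-based cell construction, and the cell count are illustrative assumptions; the paper's exact definition may differ.

```python
import numpy as np

def subsample_kl(full_chain, sub_chain, n_cells=20):
    """KL-type distance between the empirical distribution of a subsample
    (or one parallel run) and that of the full MCMC output.

    Cells are defined by quantiles of the full output, so every full-chain
    cell frequency is positive. Illustrative sketch only: the paper's exact
    cell construction and any weighting may differ.
    """
    full_chain = np.asarray(full_chain, dtype=float)
    sub_chain = np.asarray(sub_chain, dtype=float)
    # Cell boundaries: equiprobable cells under the full-chain empirical law
    edges = np.quantile(full_chain, np.linspace(0.0, 1.0, n_cells + 1))
    # Clip so subsample values outside the full-chain range land in end cells
    sub_clipped = np.clip(sub_chain, edges[0], edges[-1])
    p = np.histogram(full_chain, bins=edges)[0] / len(full_chain)
    q = np.histogram(sub_clipped, bins=edges)[0] / len(sub_chain)
    mask = q > 0  # empty subsample cells contribute zero to the sum
    return float(np.sum(q[mask] * np.log(q[mask] / p[mask])))
```

For a converged, well-mixed chain the subsample cell frequencies agree with the full-chain frequencies and the distance is near zero; the abstract's leading series term then reads this as the relative uncertainty of the cell frequencies.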

Markov chain Monte Carlo simulation

parallel chains

single chain

Kullback-Leibler distance

convergence diagnostics

proposal distribution


Urban Hjorth

Chalmers, Matematiska vetenskaper

Göteborgs universitet

A. Vadeby

Scandinavian Journal of Statistics

0303-6898 (ISSN) 1467-9469 (eISSN)

Vol. 32, No. 2, pp. 313-326


Probability Theory and Statistics
