We present a theoretical analysis of a class of particle filters (PFs) which differ from conventional PFs in that, at each time step, the $N$ particles are sampled dependently according to an MCMC chain rather than being generated conditionally independently. These "sequential MCMC" methods (abbreviated here as "MCMC-PFs") were proposed in Golightly & Wilkinson (2006); Septier et al. (2009) and extended in Septier & Peters (2016); Finke et al. (2016). They may also be viewed as the "unconditional" counterpart to the embedded hidden Markov models proposed in Shestopaloff & Neal (2016).
We derive a strong law of large numbers and a central limit theorem for MCMC-PFs. Compared to conventional PFs, the asymptotic variance of MCMC-PFs includes extra penalty terms related to the autocorrelations of the MCMC chain generating the particles. This implies that a standard PF always outperforms the corresponding MCMC-PF if both algorithms target the same distribution flow. However, MCMC-PFs have a potential advantage over conventional PFs: the former can often target a more efficient (e.g. fully adapted auxiliary-PF-type) flow even when the latter can only target a relatively inefficient (e.g. bootstrap-PF-type) flow, and the resulting efficiency gains may compensate for the variance increase due to the increased correlation of the particles.
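To make the mechanism concrete, the following is a minimal sketch (not the authors' implementation) of a single time step of an MCMC-PF: rather than drawing the $N$ new particles conditionally independently, one runs a Markov chain of length $N$ whose stationary distribution is the target at the current time step and takes each chain state as a particle. The random-walk Metropolis kernel, the step size, and all function names here are illustrative assumptions.

```python
import numpy as np

def mcmc_pf_step(particles, weights, log_target, rng, step=0.5):
    """One time step of an MCMC-PF (illustrative sketch).

    Instead of sampling N particles conditionally independently, run a
    random-walk Metropolis chain of length N targeting the (unnormalised)
    density exp(log_target) and take each chain state as a particle.
    Successive particles are therefore correlated, which is the source of
    the autocorrelation penalty terms in the asymptotic variance.
    """
    N = len(particles)
    # Initialise the chain with a draw from the current weighted particle set.
    x = particles[rng.choice(N, p=weights)]
    new_particles = np.empty(N)
    for i in range(N):
        proposal = x + step * rng.standard_normal()   # random-walk proposal
        # Metropolis accept/reject step for the unnormalised target.
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        new_particles[i] = x                          # each state is a particle
    # The chain targets the flow directly, so the output is equally weighted.
    return new_particles, np.full(N, 1.0 / N)

# Toy usage: one step targeting a standard normal density.
rng = np.random.default_rng(0)
init = rng.standard_normal(1000)
init_w = np.full(1000, 1.0 / 1000)
new, new_w = mcmc_pf_step(init, init_w, lambda x: -0.5 * x**2, rng)
```

A conventional PF would replace the Metropolis loop with $N$ conditionally independent draws; the sketch makes clear where the dependence between particles enters.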
This is joint work with Arnaud Doucet and Adam M. Johansen.