I’m pleased to announce that my paper on Importance Nested Sampling, in collaboration with Farhan Feroz & Mike Hobson (&, of course, Tony Pettitt), has at last been submitted to JCGS and is available on the astro-ph preprint server: http://arxiv.org/abs/1306.2144 . As it stands, I believe MultiNest with INS is one of the most efficient out-of-the-box routines for marginal likelihood computation available today. Looking ahead to future work, I suspect that nested sampling may well prove rather effective for marginal likelihood computation in Wiener measure spaces (where direct importance sampling, for instance, seems infeasible), though obviously with MCMC moves to find each new point within the likelihood constraint rather than ellipse-based sampling (which doesn’t make sense in that context). To make progress with this idea, though, I will need to extend the measure-theoretic work we present in Appendix C of the INS paper much further …


This is pretty much the same idea as Diffusive Nested Sampling. Would appreciate a citation. Thanks!

I really don’t think it is, Brendan! Diffusive NS (which I have indeed been aware of for some time) still estimates the X_i’s (sampled via MCMC) so as to sum up the likelihoods via the one-dimensional integration of ordinary nested sampling, whereas INS uses a ‘losing the labels’ strategy to combine our L_i’s (sampled uniformly within ellipses; INS is thereby nested sampling only in its exploration of the posterior, not in its marginal likelihood estimation given those samples). If INS and DNS were so similar, then I imagine your paper would include citations to Geyer et al. 1994, Vardi 1985, Kong et al. 1994, Veach & Guibas 1995, and so on! 🙂
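To make the distinction concrete, here is a toy sketch of my own (not code from the paper, nor from MultiNest, and glossing over the adaptive dependence of the sampling regions on earlier draws): a uniform prior on [0, 1] with a Gaussian-shaped likelihood, where the constrained region at each iteration is just an interval standing in for MultiNest’s ellipses. The first estimator is the ordinary NS sum of L_i × dX_i over the shrinking prior volume; the second pools exactly the same draws, ‘loses the labels’ recording which region each came from, and weights them against the pooled mixture of the uniform sampling densities.

```python
import math
import random

random.seed(1)

# Toy problem: uniform prior on [0, 1], Gaussian-shaped likelihood peaked at 0.
SIGMA = 0.1

def loglike(x):
    return -0.5 * (x / SIGMA) ** 2

# Analytic evidence for reference: Z = integral_0^1 L(x) dx.
Z_TRUE = SIGMA * math.sqrt(math.pi / 2) * math.erf(1.0 / (SIGMA * math.sqrt(2)))

# --- Ordinary nested sampling: one-dimensional integral over prior volume X ---
N = 100                                    # number of live points
live = [random.random() for _ in range(N)]
draws = [(x, 1.0) for x in live]           # (sample, width of the uniform region it came from)

Z_ns, X_prev = 0.0, 1.0
for i in range(1, 2000):
    j = min(range(N), key=lambda k: loglike(live[k]))    # lowest-likelihood live point
    x_worst = live[j]
    X_i = math.exp(-i / N)                               # expected prior volume after i deaths
    Z_ns += math.exp(loglike(x_worst)) * (X_prev - X_i)  # accumulate L_i * dX_i
    X_prev = X_i
    x_new = random.uniform(0.0, x_worst)   # exact draw from the constrained region [0, x_worst)
    live[j] = x_new
    draws.append((x_new, x_worst))
Z_ns += X_prev * sum(math.exp(loglike(x)) for x in live) / N  # remaining live-point mass

# --- INS-style 'losing the labels': pool every draw, forget which region produced it,
# and weight by likelihood * prior over the pooled mixture of the sampling densities.
# The prior density is 1 on [0, 1], so it drops out of the weights. ---
widths = [w for _, w in draws]
Z_ins = sum(
    math.exp(loglike(x)) / sum(1.0 / w for w in widths if x < w)
    for x, _ in draws
)

print(Z_TRUE, Z_ns, Z_ins)
```

Both estimators should land near the analytic Z here; the point is only that the second never touches the X_i’s at all — the evidence comes from the pooled importance weights alone, which is the sense in which INS departs from the one-dimensional integration of ordinary (or diffusive) NS.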

My mistake.

I’ll give this paper a proper read now. 🙂

We have an appendix (A) where I’ve tried to reconstruct the heritage of INS. Appendix C probably needs some clarification, but is potentially the most interesting, as I suspect NS [or DNS] (rather than INS, which isn’t applicable in the following) could be quite useful for marginal likelihood estimation in infinite-dimensional MCMC problems (like those of Beskos & Stuart 2010 for Wiener measure; but [to clarify] not those concerning the Dirichlet process, where it’s easier to integrate out the infinite-dimensional aspects of the problem via the Polya urn Gibbs sampling construction and compute marginal likelihoods via reverse logistic regression; e.g. Doss 2010).