I just arXived the final version of our technical note “Approximate Bayesian parameterization of a process-based tropical forest model” (coauthored with Claudia Dislich, Thorsten Wiegand and Andreas Huth) that is now accepted and will appear in Biogeosciences soon. There have been a few minor changes in response to reviewer comments on a discussion version of this paper that was also published by Biogeosciences and that I blogged about last year.

To give you the story in a nutshell: if you have a stochastic simulation and you want to fit it to data, you can use the stochasticity of the model outputs to approximate the probability of obtaining the observed data conditional on your model / parameters, and you can then use this probability (the likelihood) to compare different models and parameters (see our 2011 review in EL). One popular way to do this is Approximate Bayesian Computation (ABC), which I explained with an example in a recent post. Another option is the method of synthetic likelihoods used by Wood (Nature, 2010). The idea of the latter method is that you run the stochastic model many times, fit a distribution to the model outputs, and evaluate the probability density of this distribution at your observed data to get the likelihood. We called this a “parametric approximation” in our review, which still seems the more appropriate name to me, but I guess the term “synthetic likelihood” is better known by now and will stick for the moment.
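A minimal sketch of that recipe, assuming a made-up one-parameter toy model in place of the forest model (the functions `simulate` and `summaries`, and all parameter values, are hypothetical illustrations, not from the paper):

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(1)

def simulate(theta, n_obs=50):
    """Hypothetical stochastic model: n_obs draws whose location and
    spread depend on the single parameter theta."""
    return rng.normal(loc=theta, scale=1.0 + 0.1 * theta, size=n_obs)

def summaries(x):
    """Reduce one model run to summary statistics (mean, log-variance)."""
    return np.array([x.mean(), np.log(x.var())])

def synthetic_loglik(theta, s_obs, n_sim=200):
    """Run the model n_sim times, fit a multivariate Gaussian to the
    simulated summaries, and evaluate the observed summaries under it."""
    S = np.array([summaries(simulate(theta)) for _ in range(n_sim)])
    mu, cov = S.mean(axis=0), np.cov(S, rowvar=False)
    return multivariate_normal(mu, cov).logpdf(s_obs)

# "Observed" summaries generated at theta = 2; the synthetic likelihood
# then prefers the true value over a poor one.
s_obs = summaries(simulate(2.0))
print(synthetic_loglik(2.0, s_obs) > synthetic_loglik(5.0, s_obs))  # True
```

The fitted Gaussian is what makes the approximation "parametric": instead of accepting/rejecting simulations as in ABC-rejection, you get an explicit (approximate) likelihood value that can go straight into an MCMC or model comparison.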

What we show in the paper is that the method of synthetic likelihoods works pretty well for fitting a comparatively complex forest model to data. That may sound rather trivial, but I have to admit that I was quite surprised it works so well; I didn’t expect this when we started the project, and it is, to my knowledge, the first application of such a method to a rather complex ecological model.

Among other things, I see this paper as an encouragement for ecological modelers to try this method out. Synthetic likelihoods, although noted by statisticians and the ABC crowd, are still pretty much undiscovered by the larger ecological modeling community (as is ABC, although to a lesser extent), which is unfortunate because these approaches offer a tremendous potential to embed simulation models much more tightly in empirical data.

Addition: a pdf of the final paper is here.


Are you familiar with indirect inference? http://www.unige.ch/ses/metri/cahiers/2001_01.pdf

Yes, we note indirect inference as one possibility to obtain summary statistics in our review on simulation-based inference http://onlinelibrary.wiley.com/doi/10.1111/j.1461-0248.2011.01640.x/full .

We reviewed the indirect inference literature at the time, but I have to say that I’m still a bit confused about what exactly the “specifications” of indirect inference are. My current understanding is that people typically:

1) Obtain summary stats from moments or from statistical models fitted to the data

2) Optimize? (and here is my problem) the parameters with respect to a distance measure based on these summary stats

What I never understood about step 2) is whether people who use indirect inference actually take the step of interpreting the distance based on the summary stats as a likelihood, so that indirect inference could be used in a Bayesian setting. It seems to me that most of the literature concentrates on the quality of the estimator obtained by 2), not on posteriors or CIs.
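To make my reading of steps 1) and 2) concrete, here is a toy sketch (the exponential toy model, the Gaussian auxiliary model, and all names are made up for illustration; the observed summaries are the fitted auxiliary parameters, and estimation minimizes a distance in that auxiliary-parameter space):

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

def model(theta, n=200):
    """Hypothetical stochastic simulation with one parameter theta."""
    return rng.exponential(scale=theta, size=n)

def auxiliary_fit(x):
    """Step 1: fit a simple auxiliary model (here a Gaussian) and
    return its parameter estimates as the summary statistics."""
    return np.array([x.mean(), x.std()])

beta_obs = auxiliary_fit(model(3.0))  # auxiliary estimates on the "data"

def distance(theta, n_sim=20):
    """Step 2: distance between auxiliary estimates on simulated and
    on observed data, averaged over repeated simulations."""
    betas = np.array([auxiliary_fit(model(theta)) for _ in range(n_sim)])
    return np.sum((betas.mean(axis=0) - beta_obs) ** 2)

est = minimize_scalar(distance, bounds=(0.5, 10.0), method="bounded").x
print(est)  # typically lands near the true value 3.0
```

This gives a point estimate, which is exactly my question: nothing in step 2) by itself turns the distance into a likelihood you could plug into Bayes’ formula.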

If someone is an expert on indirect inference, I would find a discussion of the differences to ABC and synthetic likelihoods helpful.

Sorry, I recently left academia for greener pastures, so I only had the 2010 paper for reference… it’s much harder to get papers from behind a paywall.

Drovandi has a more comprehensive paper on the various flavors of indirect inference within the ABC framework: http://eprints.qut.edu.au/63767/3/63767.pdf. As far as I can tell, the closest concept to the synthetic likelihood is described in section 4.2. Good discussion and reply here as well: https://xianblog.wordpress.com/2014/01/31/bayesian-indirect-inference/

Also, Chopin introduced the concept of Expectation-Propagation ABC, which is similar in spirit: http://arxiv.org/pdf/1107.5959v2.pdf.

Thanks for the links … the Drovandi paper looks interesting at first glance. I’ll have to read it in more detail.

I’m still a bit puzzled about whether indirect inference is just a way to create summary statistics for ABC, or whether I could use it to directly create an approximation of the likelihood from repeated simulations as with synthetic likelihoods.

But OK, maybe I should read a bit first. If anyone can give more explanations though, I’d be interested.

I think the answer is: yes. If the auxiliary model were a Gaussian, then the obvious summary statistics, i.e. the first two moments, would be sufficient statistics and tell you all about the likelihood. But if you were to introduce a more complicated auxiliary model, like an HMM or a state space model, then one would use an indirect likelihood. But I do agree with you that the indirect inference literature is obtuse… probably because it’s mostly written by economists 😀
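The Gaussian case in this comment can be made explicit with a toy check (all numbers made up): fitting a Gaussian auxiliary model to repeated simulations just means estimating its two moments, and the auxiliary log-likelihood of an observed summary under those moments is term-by-term the same quantity the synthetic-likelihood approach evaluates.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

# Summaries (here: the mean) of 500 repeated runs of some hypothetical
# stochastic model at a fixed parameter value.
sims = rng.normal(loc=1.0, scale=0.5, size=(500, 30)).mean(axis=1)

# "Fitting" the Gaussian auxiliary model = estimating its two moments.
mu_hat, sd_hat = sims.mean(), sims.std()

# Synthetic log-likelihood of an observed summary, written out ...
s_obs = 1.1
manual = (-0.5 * np.log(2 * np.pi * sd_hat**2)
          - (s_obs - mu_hat) ** 2 / (2 * sd_hat**2))

# ... equals the log-density of the fitted auxiliary Gaussian.
print(np.isclose(manual, norm(mu_hat, sd_hat).logpdf(s_obs)))  # True
```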