A comment on a recent post gave me the motivation to try out the new EasyABC package for R, developed by Franck Jabot, Thierry Faure and Nicolas Dumoulin, and maintained by Nicolas Dumoulin.

Approximate Bayesian Computation (ABC) is a relatively new method that allows treating any stochastic model (IBM, stochastic population model, …) in a statistical framework by generating "approximate" likelihood values through simulating from the model. We provide a gentle introduction to ABC and some alternative approaches in our recent Ecology Letters review on "statistical inference for stochastic simulation models".
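The core idea — simulate from the model, keep the parameter values whose simulated summaries fall close to the observed ones — can be sketched in a few lines of base R. This is a toy rejection sampler for the mean of a normal distribution; the data, the tolerance and the choice of summary statistic are illustrative assumptions, not part of the original post:

```r
set.seed(1)

# "observed" data with unknown mean (sd treated as known for simplicity)
obs <- rnorm(20, mean = 4, sd = 1)
obs_summary <- mean(obs)

n_sim <- 100000  # number of simulations from the prior
tol   <- 0.1     # acceptance tolerance on the summary statistic

# draw candidate means from a uniform prior, simulate data for each,
# and accept candidates whose simulated summary is close to the observed one
candidates  <- runif(n_sim, min = 0, max = 10)
sim_summary <- sapply(candidates, function(mu) mean(rnorm(20, mean = mu, sd = 1)))
posterior   <- candidates[abs(sim_summary - obs_summary) < tol]

hist(posterior, main = "Approximate posterior for the mean")
```

The accepted values approximate draws from the posterior; shrinking the tolerance improves the approximation at the cost of a lower acceptance rate.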

The EasyABC package, available from CRAN, implements a number of algorithms for the three main sampling strategies used in ABC, namely rejection sampling, Sequential Monte Carlo (SMC) and Markov Chain Monte Carlo (MCMC). All of these are also discussed in our review. The use of the package is relatively straightforward. As a demonstration, I reimplemented in EasyABC the parameter inference for a normal distribution with the ABC-MCMC algorithm proposed by Marjoram et al., which I had coded by hand in my previous post on ABC. The EasyABC solution is provided below.

I think that this is an interesting package that will hopefully contribute to making ABC more widely known and used in ecology. As we say in our review, ABC has huge potential as a solution for many typical ecological problems, but its wider adoption is currently hindered by the fact that you have to code everything by hand, which excludes a large number of potential users. Thanks to the authors for their work!

library(EasyABC)

# assuming the data are 10 samples of a normal distribution
# with mean 5.3 and sd 2.7
data = rnorm(10, mean = 5.3, sd = 2.7)

# we want to use ABC to infer the parameters that were used.
# we sample from the same model and use mean and standard deviation
# as summary statistics for the model and the data.

# observed summary statistics
summarydata = c(mean(data), sd(data))

# stochastic model generates a sample for given par and returns summary statistics
model <- function(par){
  samples <- rnorm(10, mean = par[1], sd = par[2])
  return(c(mean(samples), sd(samples)))
}

# call to EasyABC with the ABC-MCMC algorithm of Marjoram, P.; Molitor, J.;
# Plagnol, V. & Tavare, S. (2003) Markov chain Monte Carlo without likelihoods.
# Proc. Natl. Acad. Sci. USA, 100, 15324-15328,
# with some automatic adjustment options
ABC_Marjoram_original <- ABC_mcmc(method = "Marjoram", model = model,
                                  prior = list(c("unif", 0, 10), c("unif", 1, 5)),
                                  summary_stat_target = summarydata,
                                  n_rec = 10000)

str(ABC_Marjoram_original)

par(mfrow = c(2, 1))
hist(ABC_Marjoram_original$param[5000:10000, 1], main = "Posterior for mean")
hist(ABC_Marjoram_original$param[5000:10000, 2], main = "Posterior for standard deviation")
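For comparison, the same inference can be run with EasyABC's plain rejection sampler via ABC_rejection, the simplest of the three strategies mentioned above. The number of simulations and the tolerance below are illustrative assumptions; as in the MCMC example, the accepted parameter sets are in the $param component of the result:

```r
library(EasyABC)

set.seed(42)

# observed data and summary statistics, as above
data <- rnorm(10, mean = 5.3, sd = 2.7)
summarydata <- c(mean(data), sd(data))

# same stochastic model: simulate a sample and return its summaries
model <- function(par){
  samples <- rnorm(10, mean = par[1], sd = par[2])
  c(mean(samples), sd(samples))
}

# rejection sampling: simulate nb_simul parameter sets from the prior and
# keep the tol fraction whose summaries are closest to the observed ones
ABC_rej <- ABC_rejection(model = model,
                         prior = list(c("unif", 0, 10), c("unif", 1, 5)),
                         nb_simul = 20000,
                         summary_stat_target = summarydata,
                         tol = 0.01)

par(mfrow = c(2, 1))
hist(ABC_rej$param[, 1], main = "Posterior for mean")
hist(ABC_rej$param[, 2], main = "Posterior for standard deviation")
```

Rejection sampling wastes simulations compared to ABC-MCMC or ABC-SMC, but the accepted draws are independent, which makes the output easy to interpret.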

Hello,

I am trying to implement an ABC MCMC scheme in R similar to your example above. I notice when I try to run your code above, I receive the following errors:

1) Prior must be in the form of a list.

2) Tab Normalization must have the same dimensions as the observed summary statistics.

Do you find the same errors?

Best,

Henry


Hi Henry,

yes, thanks for letting me know – when I wrote this post, the package was brand new, and I think Franck and the others must have changed the syntax since then. The priors are more flexible now; I think when I wrote the post, only uniform priors were possible.

I have corrected the code; the example should run again!

Florian


Pingback: Explaining the ABC-Rejection Algorithm in R | theoretical ecology

Hi Florian,

Do you know if the EasyABC package can be used for inference in Dynamic Bayes Nets?

Thanks for these blogs. They’re very useful.

Paul


Hi Paul,

as long as you can simulate from your model, you can always use ABC. The question is whether that's efficient. In a hierarchical dependency structure, I would think that some kind of iterative filtering through the network is more efficient (assuming you have data on intermediate nodes). But it depends on your problem and data; it's hard to say in general.


Great, and thanks, that makes sense. If your dependencies are governed by unknown and non-linear processes (i.e. they come from a generative simulation but you don’t know the corresponding likelihood functions), then you’d be stuck with some type of ABC method, right? Or do you know of a potentially more efficient technique in that circumstance?

Thanks again.


yes, but you would probably still want to sample across the net with ABC, i.e. use an ABC particle filter or a particle MCMC rather than an ABC-MCMC.


Excellent and useful post. Thank you very much.
