Is it possible to do posterior predictive checks when using Random Forest for Bayesian parameter inference?

by SimonLL   Last Updated April 18, 2018 23:19 PM

The Random Forest algorithm has recently been proposed for estimating parameter values within the context of Approximate Bayesian Computation (Raynal et al. 2017). The idea is to train regression trees on a reference table composed of the parameter values (response variables) that were used to generate pseudo-data, from which summary statistics (predictors) were computed. Once the trees are trained, the expected parameter value can be predicted from the summary statistics computed on the observed data.
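To make the setup concrete, here is a minimal sketch of that pipeline on a toy model, using scikit-learn's `RandomForestRegressor` rather than the ABC-specific implementation from the paper; the prior, simulator, and summary statistics below are illustrative assumptions, not taken from Raynal et al.:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Build the reference table: draw parameter values from the prior,
# simulate a pseudo-dataset for each draw, and reduce each dataset
# to a small set of summary statistics.
n_sim, n_obs = 5000, 50
theta = rng.uniform(0.0, 10.0, size=n_sim)          # prior draws (response)
pseudo = rng.normal(loc=theta[:, None], scale=1.0,
                    size=(n_sim, n_obs))            # simulated datasets
summaries = np.column_stack([pseudo.mean(axis=1),
                             pseudo.std(axis=1)])   # predictors

# Train the regression forest on (summary statistics, parameter).
rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(summaries, theta)

# "Observed" data simulated with a true theta of 4.2; predict the
# posterior expectation of theta from its summary statistics.
observed = rng.normal(loc=4.2, scale=1.0, size=n_obs)
obs_summary = np.array([[observed.mean(), observed.std()]])
print(rf.predict(obs_summary))
```

The forest's point prediction approximates the posterior mean of the parameter given the observed summaries, which is exactly the "expectation only" limitation the question asks about.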

As I'm new to the Random Forest algorithm, I would like to know whether there is some way to access the full posterior distribution, and not only the expectation, of the parameter of interest. I'm particularly interested in doing posterior predictive checks.
