Greetings
I have been experimenting with sampling from posterior distributions in R.
Suppose I have the following observations from a normal distribution,
together with an unnormalised joint likelihood function:

normsamples <- rnorm(1000, mean = 8, sd = 3)

joint_likelihood <- function(observations, mean, sigma) {
    # unnormalised normal likelihood: drops the (2 * pi)^(-n / 2) factor
    sigma^(-length(observations)) *
        exp(-0.5 * sum((observations - mean)^2) / sigma^2)
}

The joint likelihood deliberately omits the normalising constant
(2 * pi)^(-n / 2), which is what I want, because I have been experimenting
with some crude sampling methods. The problem is that, with the samples
above, the joint likelihood underflows to 0 in double precision very quickly.
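For instance, with the 1000 samples above (an illustrative call, assuming the
function exactly as written):

joint_likelihood(normsamples, 8, 3)
# returns 0: sigma^(-1000) = 3^(-1000) is roughly 1e-477, far below
# .Machine$double.xmin (about 2.2e-308), so the product underflows to 0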
I want to experiment with tens of thousands of observations, but without
some transformation that can cope with such tiny values my sampling
algorithms will not work.
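The usual transformation is to work with the logarithm of the unnormalised
likelihood, which stays finite; a minimal sketch (the name
joint_log_likelihood is mine, purely for illustration):

joint_log_likelihood <- function(observations, mean, sigma) {
    # log of the unnormalised likelihood above; finite even for very large n
    -length(observations) * log(sigma) -
        0.5 * sum((observations - mean)^2) / sigma^2
}

joint_log_likelihood(normsamples, 8, 3)   # a moderate negative number

Likelihood ratios, which are all a Metropolis-type sampler actually needs,
then become differences of logs (see the sketch after the next paragraph).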

This is part of an attempt to use sampling algorithms for Bayesian parameter
estimation. I do not want to resort to conjugacy, since I am developing this
to handle non-conjugate scenarios; I just wanted to test it on a conjugate
scenario first, but I quickly realised that I am in trouble because the
likelihood reaches 0 so quickly.
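For concreteness, here is a minimal random-walk Metropolis sketch of the kind
of sampler I mean, written entirely on the log scale so the acceptance test
never touches the raw (underflowing) likelihood. The implied flat prior, the
starting values and the proposal scale are arbitrary choices for illustration
only:

# random-walk Metropolis for (mean, sigma), illustration only
n_iter <- 5000
draws  <- matrix(NA_real_, nrow = n_iter, ncol = 2,
                 dimnames = list(NULL, c("mean", "sigma")))
cur    <- c(mean(normsamples), sd(normsamples))   # start at sample estimates
cur_lp <- joint_log_likelihood(normsamples, cur[1], cur[2])

for (i in seq_len(n_iter)) {
    prop <- cur + rnorm(2, sd = 0.1)              # symmetric proposal
    if (prop[2] > 0) {                            # sigma must stay positive
        prop_lp <- joint_log_likelihood(normsamples, prop[1], prop[2])
        # accept with probability exp(prop_lp - cur_lp), computed in log space
        if (log(runif(1)) < prop_lp - cur_lp) {
            cur    <- prop
            cur_lp <- prop_lp
        }
    }
    draws[i, ] <- cur
}

colMeans(draws[-(1:1000), ])   # rough posterior means after burn-in

Because everything goes through joint_log_likelihood, the same loop keeps
working with tens of thousands of observations, where the raw likelihood is
identically 0.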

Any feedback would be appreciated. It also makes me wonder how JAGS/BUGS
handle this problem.

Best regards
Seref
