Priors in this Bayesian Analysis

hlsmith

Less is more. Stay pure. Stay poor.
I have done the following, based on code from the link below:

Code:
n1 = 321 # Pre
y1 = 313 # Out-of-Range
n2 = 356 # Post
y2 = 49   # Out-of-Range

# SIMULATION
I = 10000 # simulations
theta1 = rbeta(I, y1+1, (n1-y1)+1)
theta1
theta2 = rbeta(I, y2+1, (n2-y2)+1)
theta2
diff = theta1-theta2  # simulated diffs
diff
https://lingpipe-blog.com/2009/10/1...t-to-fisher-exact-test-on-contingency-tables/

Is the uniform prior coming from the "+1" in the theta lines? If so, and I want to propose another prior, can I just substitute out the 1's?

Thanks.

@Dason any input?
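For reference, the "+1" terms are the Beta(1, 1) (uniform) prior's parameters: with a Beta(a, b) prior and y out-of-range results in n cases, the conjugate posterior is Beta(y + a, n - y + b), so substituting other values for the 1's is exactly how another beta prior is proposed. A minimal sketch, using an illustrative Beta(2, 2) prior in place of the uniform:

```r
# Conjugate beta-binomial update: a Beta(a, b) prior plus y successes
# in n trials gives a Beta(y + a, n - y + b) posterior.
n1 = 321  # Pre
y1 = 313  # Out-of-Range
a = 2; b = 2        # illustrative prior, swapped in for the uniform's 1's
I = 10000           # simulations
set.seed(42)
theta1 = rbeta(I, y1 + a, (n1 - y1) + b)
mean(theta1)              # close to the exact posterior mean below
(y1 + a) / (n1 + a + b)   # exact posterior mean, ~0.969
```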


hlsmith

So if I wanted to say there's going to be a 50% change (improvement) between the groups (a Beta(7.5, 2.5) prior for one group and Beta(2.5, 7.5) for the other), the following code would do this. And this actually pulls the estimate down a bit, since the data showed about an 84% change and I used priors centered around a 50% change.

Code:
I = 10000 # simulations
theta1 = rbeta(I, y1 + 7.5, (n1-y1) + 2.5)
theta1
theta2 = rbeta(I, y2 + 2.5, (n2-y2) + 7.5)
theta2
diff = theta1-theta2  # simulated diffs
diff
hist(diff)
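To see how much the informative priors move things, the posterior means are available in closed form (posterior mean is (y + a) / (n + a + b) under a Beta(a, b) prior), so the pull can be checked without simulation. A quick sketch using the counts from the first post:

```r
n1 = 321; y1 = 313  # Pre, out-of-range
n2 = 356; y2 = 49   # Post, out-of-range
# Posterior mean under a Beta(a, b) prior
post_mean = function(y, n, a, b) (y + a) / (n + a + b)
diff_flat = post_mean(y1, n1, 1, 1) - post_mean(y2, n2, 1, 1)
diff_info = post_mean(y1, n1, 7.5, 2.5) - post_mean(y2, n2, 2.5, 7.5)
diff_flat  # ~0.832 under the uniform priors
diff_info  # ~0.828 under the informative priors
```

With n in the hundreds, a prior worth an effective sample size of 10 shifts the posterior mean of the difference only modestly.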

hlsmith

This is a good reference on the topic and provides code to plot the prior, likelihood, and posterior via triplot.

https://rdrr.io/cran/LearnBayes/src/R/triplot.R

https://alexanderetz.com/2015/07/

Of note, the core of triplot is:

Code:
x = seq(.001, .999, .001) # grid for creating the distributions
y1 = dbeta(x, PS, PF) # data for prior curve
y3 = dbeta(x, PS + k, PF + n - k) # data for posterior curve
y2 = dbeta(x, 1 + k, 1 + n - k) # data for likelihood curve, plotted as the posterior from a Beta(1,1)
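Applied to the Post group's counts from above, that snippet becomes a runnable sketch (PS/PF are the prior's success/failure parameters, k and n the observed successes and trials; the Beta(2.5, 7.5) prior is the one proposed earlier for this group):

```r
PS = 2.5; PF = 7.5   # prior parameters for the Post group
k = 49; n = 356      # observed out-of-range count and total
x  = seq(.001, .999, .001)           # grid for the distributions
y1 = dbeta(x, PS, PF)                # prior curve
y3 = dbeta(x, PS + k, PF + n - k)    # posterior curve
y2 = dbeta(x, 1 + k, 1 + n - k)      # likelihood, as posterior from Beta(1,1)
plot(x, y3, type = "l", xlab = expression(theta), ylab = "Density")
lines(x, y2, lty = 2)
lines(x, y1, lty = 3)
legend("topright", c("posterior", "likelihood", "prior"), lty = 1:3)
```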
