Arithmetic proof of predicted values

#1
Maybe the title of this thread should be:

Arithmetic proof of predicted values or Why you shouldn't let stats lie dormant in your life for 7 years! :D

I'm boning up on stats and reviewing some introductory texts that looked interesting. One that I'm working through is Doing Bayesian Data Analysis by Kruschke.

In the text (~p. 52) we're dealing with a coin we have some suspicion of being biased. We are most confident it is a fair coin, apportioning our belief of fair vs. biased as follows: fair -> Θ=.5, with equal suspicion that it is biased low or high, Θ=.25 or Θ=.75. If D is our data and the likelihood function is p(y=Heads | Θ), then given that in the simple coin model p(D=Heads | Θ) = Θ, and given Bayes' rule for discrete variables, p(y|x) = p(x|y)p(y) / ∑_y p(x|y)p(y), I would think (just as the author points out) that we could back into the prior belief's predicted probability of getting a Heads, i.e. the marginal probability of Heads:

p(D=Heads) = ∑_Θ p(D=Heads | Θ) p(Θ)
going something like this:
p(y=H) = p(y=H|Θ=.25)p(Θ=.25)
+ p(y=H|Θ=.50)p(Θ=.50)
+ p(y=H|Θ=.75)p(Θ=.75)
= (.25*.25) + (.5*.5) + (.75*.75)
= .875

But here is where my understanding breaks down.

1) My calculation does not deliver the .5 I was looking for, rats!
2) Turning the page, I discover I'm almost there, but the author has done this:
p(y=H) = (.25*.25) + (.5*.5) + (.75*.25) <-- huh?

Granted, the sum is .5, but why the .25?

I can give you more info if needed. If you could help me understand this I'd be only too grateful.
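
In case my notation above is hard to read, here's the same arithmetic as a quick Python sketch (Python is just my scratchpad here, nothing from the book):

```python
# Marginal probability of heads: p(y=H) = sum over Θ of p(y=H | Θ) * p(Θ).
# In the simple coin model, p(y=H | Θ) = Θ.
thetas = [0.25, 0.50, 0.75]  # candidate coin biases
priors = [0.25, 0.50, 0.75]  # the prior masses I used for each Θ

p_heads = sum(t * p for t, p in zip(thetas, priors))
print(p_heads)  # 0.875, not the 0.5 I was expecting
```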
 

Dason

Ambassador to the humans
#2
We'll play the zoom game and see if you can recognize it before we get to the end.

p(D=Heads) = ∑_Θ p(D=Heads | Θ) p(Θ)
going something like this:
p(y=H) = p(y=H|Θ=.25)p(Θ=.25)
+ p(y=H|Θ=.50)p(Θ=.50)
+ p(y=H|Θ=.75)p(Θ=.75)
= (.25*.25)+(.5*.5)+(.75*.75)
The problem lies somewhere in there.










... + p(y=H|Θ=.75)p(Θ=.75)
= ... + (.75*.75)
Getting closer.










What is that value? What did you write it as?
 
#3
*smacks forehead*

Thank you. OF COURSE!

What I love most about statistics is its quiet calibrating effect. What I hate most is the process of submission one must endure. I suspected I was being careless, riding roughshod over territory I felt confident I "knew backwards and forwards". *shakes head*

At least I've given up on being surprised by my carelessness. And I can repent of mistakes more quickly now than I could 7 years ago.
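
For anyone who lands on this thread later, here's the corrected check, with the prior mass on Θ=.75 set to .25 so the prior actually sums to 1 (again, just my scratchpad sketch, not code from Kruschke):

```python
# Correct prior: p(Θ=.5) = .5, and p(Θ=.25) = p(Θ=.75) = .25.
thetas = [0.25, 0.50, 0.75]
priors = [0.25, 0.50, 0.25]  # note: these sum to 1, as a prior must

p_heads = sum(t * p for t, p in zip(thetas, priors))
print(p_heads)  # 0.5, matching the author's result
```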