Now someone picks coin A or coin B with equal probability, tosses it, and reports the result. How do I calculate the probability that coin A was tossed versus the probability that coin B was tossed?

- Thread starter tal goldberg
- Tags coin flips


1. The parameters p_a = Prob(heads in one toss if the coin is A) and p_b = Prob(heads in one toss if the coin is B) can be estimated from the first round of tosses. This is done via either

___ 1.1. the method of maximum likelihood

or

___ 1.2. Bayesian analysis where p_a and p_b have uniform prior on [0, 1].

2. Once (p_a, p_b) have been estimated, the probability that the second-round coin is A or B, given the reported result, can be calculated using Bayes' rule for conditional probability.
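The two steps above can be sketched in a few lines of Python. The first-round counts here are made-up numbers purely for illustration; step 1 uses the maximum-likelihood estimate (option 1.1), and step 2 applies Bayes' rule with the 1/2–1/2 prior from the even coin choice.

```python
# Hypothetical first-round counts (illustrative only).
h_a, n_a = 7, 10   # heads and tosses observed for coin A
h_b, n_b = 3, 10   # heads and tosses observed for coin B

# Step 1: maximum-likelihood estimates of the head probabilities.
p_a = h_a / n_a    # 0.7
p_b = h_b / n_b    # 0.3

# Step 2: Bayes' rule for the second-round toss, given an observed "heads".
# The coin was picked with probability 1/2 each.
prior_a = prior_b = 0.5
post_a = prior_a * p_a / (prior_a * p_a + prior_b * p_b)
post_b = 1.0 - post_a
# For these counts: post_a = 0.5*0.7 / (0.5*0.7 + 0.5*0.3) ≈ 0.7
```

If the reported result were tails, the same formula applies with (1 - p_a) and (1 - p_b) in place of p_a and p_b.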


These are two separate problems. The choice of the method for one problem is separate from the choice of the method for the other problem.

I would personally use a Bayesian approach for the entire problem

I'm not convinced of that. The uncertainty in the original estimation needs to be accounted for in the second part when calculating the probability that the observed coin is A or B.
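One way to account for that uncertainty is the fully Bayesian route (option 1.2): with a uniform Beta(1,1) prior, the posterior for p_a is Beta(h_a+1, n_a-h_a+1), and integrating p_a out gives the posterior-predictive probability of heads, (h_a+1)/(n_a+2). A sketch with made-up counts, comparing this against the plug-in (maximum-likelihood) answer:

```python
# Hypothetical first-round counts (illustrative only).
h_a, n_a = 1, 5    # 1 head in 5 tosses of coin A
h_b, n_b = 4, 5    # 4 heads in 5 tosses of coin B

# Posterior-predictive probability of heads under each coin,
# with the uniform Beta(1,1) prior integrated out analytically.
pred_a = (h_a + 1) / (n_a + 2)   # P(heads | coin A, first-round data)
pred_b = (h_b + 1) / (n_b + 2)   # P(heads | coin B, first-round data)

# Second-round toss comes up heads; each coin was picked with probability 1/2.
post_a = 0.5 * pred_a / (0.5 * pred_a + 0.5 * pred_b)

# Plug-in answer using the raw maximum-likelihood estimates.
mle_a = 0.5 * (h_a / n_a) / (0.5 * (h_a / n_a) + 0.5 * (h_b / n_b))
# post_a ≈ 0.286 vs mle_a = 0.2: the Bayesian answer is pulled toward 1/2,
# reflecting how little the 5-toss sample actually pins down p_a and p_b.
```

With small samples the two answers differ noticeably; as the first-round counts grow, the Beta posterior concentrates and the two converge.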


How would you estimate if/when the uncertainty should be taken into account? If we toss coin A 100 times and get 20 heads, then assuming p_a = 0.2 seems reasonable; however, if we toss it 5 times and get 1 head, it does not. Is there a rule of thumb to use here (i.e., if we tossed the coin at least X times, we can assume using p_a = heads/tosses is good enough)?
