Rolling two dice - what is the probability that a 1 will show on at least one die? I can prove that the following method is incorrect, but I can't understand why: for each die, P(1) = 1/6, and since the two dice are independent, the probability doubles to 2/6 (or 1/3). The correct answer is 11/36, which is less than 1/3. What's wrong with the first way of thinking? Intuitively it makes sense. (The proof that it's wrong: if I threw six dice, the same reasoning would give a probability of 6/6 = 1, which is clearly false.)
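For what it's worth, the 11/36 figure can be checked by brute-force enumeration of the 36 equally likely outcomes (a quick Python sketch, not part of the original reasoning):

```python
from itertools import product

# All 36 equally likely outcomes of rolling two dice
outcomes = list(product(range(1, 7), repeat=2))

# Outcomes where at least one die shows a 1
hits = [o for o in outcomes if 1 in o]

print(len(hits), "/", len(outcomes))   # 11 / 36
print(len(hits) / len(outcomes))       # about 0.3056, not 1/3
```

Counting directly: 6 outcomes where the first die is a 1, plus 6 where the second is, minus the double-counted (1, 1), gives 6 + 6 - 1 = 11.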

Thanks!