Let's say there's data about how many car accidents happen per month (and assume each month is exactly 30 days to keep the calculation simple). Assume the probability of a car accident is the same every day, and that car accidents are independent events (one accident doesn't change the probability of the next).

The average number of car accidents per month is 60. That means 60/30 = 2 car accidents per day on average. But it's not "exactly 2 car accidents per day"; it's "2 car accidents per day on average". So the number of car accidents in one day can be any non-negative integer: 0, 1, 2, …
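A quick simulation illustrates this (a minimal sketch using numpy with an arbitrary seed; the variable names are my own): the daily counts are non-negative integers, and only their average is about 2.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulate 100,000 days, each with an average rate of 2 accidents/day
daily_counts = rng.poisson(lam=2, size=100_000)

print(daily_counts.min())   # counts never go below 0
print(daily_counts.mean())  # close to 2, but individual days vary (0, 1, 2, 3, ...)
```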

Then I use the Poisson distribution formula to find the probability of one car accident per day:

P(*x*; μ) = (e^−μ)(μ^*x*) / *x*!

P(1; 2) = (e^−2)(2^1) / 1! ≈ 0.27067
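To check the arithmetic, here is a minimal sketch in Python (the helper name `poisson_pmf` is my own, not a library function):

```python
import math

def poisson_pmf(x, mu):
    """Poisson PMF: P(x; mu) = e^-mu * mu^x / x!, for a non-negative integer x."""
    return math.exp(-mu) * mu ** x / math.factorial(x)

print(poisson_pmf(1, 2))  # ≈ 0.27067
```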

If the average number of car accidents per day is 2, then the average per hour is 2/24 = 1/12 ≈ 0.0833.

So I tried to compute the probability of one car accident in one hour by dividing everything by 24:

P(1/24; 2/24) = (e^-(2/24)) ((2/24)^(1/24)) / (1/24)! = Calculation error
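Trying this directly in Python (a sketch of my attempt, using `math.factorial` for the factorial) shows where the calculation breaks down: the factorial of 1/24 cannot even be evaluated.

```python
import math

mu = 2 / 24   # average accidents per hour
x = 1 / 24    # attempting to scale the count the same way

try:
    p = math.exp(-mu) * mu ** x / math.factorial(x)
except (ValueError, TypeError) as e:
    # math.factorial rejects non-integer arguments
    print("calculation error:", e)
```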

What's wrong with the calculation?

How can we correctly calculate the probability of one car accident per day, or per hour, when we know the average per day is 2?