So this is a homework question.

There were 40 chess matches at the World Chess Championships. If we assume that the average attendance for all those games was 750 spectators:

What is the maximum number of games in which there could have been at least 1,500 spectators if the standard deviation of the attendance was 250?

I can plot the data, no problem. I can find the probability, no problem. What I can't seem to work out is *how* to find the maximum number of games that could have had at least 1,500 spectators, given a standard deviation of 250. I've still got training wheels on my statistical bike, so if anyone could give me a shove in the right direction, I'd very much appreciate it.

Thanks in advance!

EDIT: I did some math, and I could be way off. But it looks like a single game with 1,500 spectators (when the mean is 750) adds roughly 118 to the standard deviation. So I *think* the answer might be 2, since two such games would add about 236. Though I'm not certain — I may be adding standard deviations directly when it's really the variances that add.

Thanks again!
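EDIT 2: Here's a quick numeric check of my idea. This is just a sketch under two assumptions of mine: the other games all share the single attendance value that keeps the overall mean at 750, and "standard deviation" means the population SD (dividing by N, not N − 1).

```python
import statistics

N, MEAN, TARGET = 40, 750, 1500

# For k games at 1,500 spectators, put the remaining N - k games at the
# one value that keeps the overall mean at exactly 750, then see what
# population standard deviation that forces.
for k in range(1, 6):
    rest = (N * MEAN - k * TARGET) / (N - k)
    data = [TARGET] * k + [rest] * (N - k)
    sd = statistics.pstdev(data)
    print(f"k = {k}: SD = {sd:.1f}")
```

Under those assumptions, k = 4 lands the SD exactly on 250 and k = 5 overshoots it, which makes me doubt my guess of 2 — so corrections welcome.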
