# For a known variance, what mean maximizes Shannon entropy?

#### winestat

##### New Member
It's easy to show that a normal PDF maximizes Shannon entropy. My data are discrete, so I have a discrete normal PMF. I do know the variance of the data, but have only one to three actual observations. Can I infer a mean that maximizes entropy? It is NOT just the mean of the observations. For example, if known variance is infinite then the mean has to be zero regardless of the observations. Ideas will be much appreciated.

#### Dason

If the known variance is infinite, it doesn't really make sense to use a normal distribution.

#### winestat

##### New Member
> If known variance is infinite it doesn't really make sense to use a normal distribution
That is only a boundary condition and an example. Do you have a solution or idea?

#### winestat

##### New Member
Another idea here. The PMF takes care of the boundary at variance = infinity; that limit turns the bell shape into a uniform distribution's flat line over a bounded interval. The problem of which mean maximizes entropy remains. It is the midpoint of the bounded interval if you have no information at all. Easy up to that point. Once you have an observation, what is the mean that maximizes entropy?
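A small numerical sketch of the no-information case described above, under assumptions I'm adding for illustration: the "discrete normal" is taken to be a normal density evaluated on an integer support and renormalized (one common construction, not the only one), the bounded interval is {0, ..., 20}, and the known standard deviation is an arbitrary 3.0. Scanning candidate means, the entropy of the truncated-and-normalized PMF peaks at the midpoint of the interval, consistent with the symmetric, no-data intuition:

```python
import numpy as np

def discrete_normal_pmf(mu, sigma, support):
    # Normal weights on an integer support, renormalized to a proper PMF.
    # This truncation-and-renormalization "discrete normal" is an assumption
    # for illustration; other discretizations exist.
    w = np.exp(-(support - mu) ** 2 / (2.0 * sigma ** 2))
    return w / w.sum()

def shannon_entropy(p):
    # Entropy in nats; zero-probability terms contribute nothing.
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

support = np.arange(0, 21)      # bounded interval {0, ..., 20}
sigma = 3.0                     # hypothetical "known" standard deviation
means = np.linspace(0.0, 20.0, 201)
entropies = [shannon_entropy(discrete_normal_pmf(m, sigma, support)) for m in means]
best_mean = means[int(np.argmax(entropies))]
print(best_mean)                # entropy-maximizing mean, near the midpoint 10
```

Pushing `sigma` toward infinity makes the PMF flat over the support, so the entropy approaches `log(21)` for every candidate mean, which matches the observation that the infinite-variance boundary case carries no information about the mean.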

#### winestat

##### New Member
Catching up. It may be just that I have to embrace maximum-entropy estimation (MEE) instead of maximum-likelihood estimation (MLE). Any thoughts or advice?