For a known variance, what mean maximizes Shannon entropy?

#1
It's easy to show that, for a given variance, a normal PDF maximizes Shannon entropy. My data are discrete, so I have a discrete normal PMF. I do know the variance of the data, but I have only one to three actual observations. Can I infer a mean that maximizes entropy? It is NOT simply the mean of the observations. For example, if the known variance is infinite, then the mean has to be zero regardless of the observations. Ideas will be much appreciated.
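A minimal numerical sketch of the setup, assuming the discrete normal PMF is the normal density sampled at the integers and renormalized (the function name and truncation window are my own): on an unbounded integer support, shifting the mean by an integer merely relabels the support points, so the entropy is the same for mean 0 and mean 5. Entropy alone therefore cannot distinguish integer shifts of the mean in the unbounded case.

```python
import numpy as np

def discrete_normal_entropy(mu, sigma, lo=-200, hi=200):
    """Shannon entropy (nats) of a discrete normal PMF on integers lo..hi.

    The PMF is proportional to the normal density sampled at the
    integers, then normalized; lo/hi truncate the infinite support.
    """
    k = np.arange(lo, hi + 1)
    w = np.exp(-0.5 * ((k - mu) / sigma) ** 2)
    p = w / w.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Shifting the mean by an integer relabels the support points,
# so the entropy is unchanged (up to negligible truncation error).
h0 = discrete_normal_entropy(0.0, 3.0)
h5 = discrete_normal_entropy(5.0, 3.0)
```

Here `h0` and `h5` agree to floating-point precision, which is one way to see that on an unbounded support the entropy carries no information about the mean.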
 
#4
Another idea here. The PMF takes care of the boundary case variance = infinity: there the bell shape flattens into a uniform distribution over a bounded interval. The problem of which mean maximizes entropy remains. With no information at all, it is the midpoint of the bounded interval; easy up to that point. Once you have an observation, what is the mean that maximizes entropy?