Indefinite covariance matrix

Hi all.

I have a situation regarding maximum likelihood estimation. As usual, I maximize the likelihood function, but the Hessian comes out with one or two small negative eigenvalues. I know that in principle this means I found a saddle point, but my estimates seem reasonable (and I have spent a lot of time trying to find a true maximum, with no success).

The thing is, the inverse of the average Hessian becomes my covariance matrix. Since the Hessian is not positive definite, some of the estimated variances are negative and the corresponding standard errors come out imaginary. So, does anyone have a solution?
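To illustrate what I mean with a toy example (the matrix below is made up for illustration, not my actual Hessian):

```python
import numpy as np

# Hypothetical symmetric "average Hessian" with one small
# negative eigenvalue, standing in for my real one.
H = np.array([[2.0, 0.5, 0.0],
              [0.5, 1.0, 0.9],
              [0.0, 0.9, 0.8]])

print(np.linalg.eigvalsh(H))  # one eigenvalue is negative

# Inverting it gives a covariance matrix with negative
# diagonal entries, so the square root produces NaN/imaginary
# standard errors:
cov = np.linalg.inv(H)
se = np.sqrt(np.diag(cov))
print(se)
```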

I tried doing the following. I did a spectral decomposition of the covariance matrix, then took the absolute values of the eigenvalues. This seems to work reasonably well. I haven't done a complete Monte Carlo study yet, but from my preliminary inspection the standard errors seem to be of the right magnitude.
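A minimal NumPy sketch of what I tried, again with a made-up toy matrix in place of my real Hessian:

```python
import numpy as np

# Same kind of toy indefinite "Hessian" as above (made-up numbers).
H = np.array([[2.0, 0.5, 0.0],
              [0.5, 1.0, 0.9],
              [0.0, 0.9, 0.8]])
cov = np.linalg.inv(H)

# Spectral decomposition of the covariance matrix,
# then replace each eigenvalue with its absolute value:
w, V = np.linalg.eigh(cov)
cov_abs = V @ np.diag(np.abs(w)) @ V.T

# The repaired matrix is positive definite, so the
# standard errors are now real and finite:
se = np.sqrt(np.diag(cov_abs))
print(se)
```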

I guess this is somehow related to the singular value decomposition? I'm not sure exactly how, though. Does anyone have experience with something like this? Does it make sense? If so, why?

Thanks all.