Hi. Here is my understanding of it.
Think of the covariance matrix of the model parameter estimates. The software has to compute it by inverting the Hessian, and it's what gives you standard errors, hypothesis tests, and a sense of how related the parameters are to each other (i.e., correlated or redundant).
Generally speaking, the more correlated ("collinear") the model parameters are with each other, the harder that inverse is to compute, or at least the more "unstable" the solution is. As in any matrix algebra, we're in trouble if the matrix we're trying to invert isn't of full rank, which is exactly what the "not positive definite" warning is telling you.
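To make that concrete, here's a minimal sketch in Python/NumPy (made-up data, not your model) of how two nearly identical predictors make the matrix that has to be inverted nearly singular:

import numpy as np

rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(scale=1e-6, size=100)   # x2 is almost a copy of x1 (collinear)
X = np.column_stack([np.ones(100), x1, x2])  # design matrix with an intercept

XtX = X.T @ X                    # same structure the Hessian is built from
print(np.linalg.eigvalsh(XtX))   # one eigenvalue is essentially zero
print(np.linalg.cond(XtX))       # enormous condition number -> unstable inverse

An eigenvalue at (or numerically below) zero is exactly the "not positive definite" situation.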
In short, it sounds to me like your predictor variables may have some collinearity. Check to see if any of them can be eliminated.
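If you can export the predictors from SPSS (e.g., to a CSV), one quick way to see which ones are redundant is the variance inflation factor. This is just a sketch using Python's statsmodels with a hypothetical file name and column names; as a rough rule of thumb, VIFs above about 5-10 flag predictors worth dropping or combining:

import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

df = pd.read_csv("predictors.csv")            # hypothetical export of your predictors
X = sm.add_constant(df[["x1", "x2", "x3"]])   # hypothetical column names
for i, name in enumerate(X.columns[1:], start=1):   # skip the constant term
    print(name, variance_inflation_factor(X.values, i))

SPSS's linear regression procedure can also produce collinearity diagnostics (tolerance/VIF) if you'd rather stay inside SPSS.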
------------------------------
Rick Marcantonio
Quality Assurance
IBM
------------------------------
Original Message:
Sent: Wed March 01, 2023 08:48 AM
From: Ian Hobbs
Subject: Hessian Matrix is not positive definite although all convergence criteria are satisfied. Validity of model fit is uncertain.
Hello,
I ran a GLMM in SPSS using a negative binomial regression with the log link function and received this warning. What exactly does this mean? Is there something wrong with this model?

Thank you!
Ian
------------------------------
Ian Hobbs
------------------------------