tensorflow / probability

Probabilistic reasoning and statistical analysis in TensorFlow

Home Page: https://www.tensorflow.org/probability/

Normal Inverse Gaussian Outputs Positive log_prob

i418c opened this issue · comments

While training a model with a NIG output, I noticed that the negative log likelihood loss would go negative. At first I thought that perhaps the values of `verify` in the model were outside the support of the NIG distribution, but fixing those values to what (I think) should be valid still yielded a negative loss.

Any help on why this occurs and how to fix it would be appreciated.
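For context, `log_prob` for a continuous distribution is the log of a probability density, not a probability, and a density can exceed 1 wherever the distribution is tightly concentrated, so a negative log-likelihood loss can legitimately drop below zero. A minimal sketch with a plain Normal (the parameter values are illustrative):

```python
import tensorflow_probability as tfp

tfd = tfp.distributions

# A sharply concentrated density exceeds 1 near its mode, so
# log_prob is positive and NLL = -log_prob is negative.
dist = tfd.Normal(loc=0.0, scale=0.1)
print(dist.log_prob(0.0))  # ~1.384 > 0
```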

While creating an example, I noticed that the quickest way to trigger this was to set the `verify` values to a single number. A gist reproducing the issue can be found here.
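That behavior is consistent with the density interpretation: when the targets are all the same value, the model can shrink its predictive scale toward zero, and the density at that point grows without bound. A sketch of the effect with `tfd.NormalInverseGaussian` (the parameter values below are hypothetical, not taken from the gist; note that `|skewness|` must be less than `tailweight` for a valid NIG):

```python
import tensorflow_probability as tfp

tfd = tfp.distributions

# A small scale concentrates the NIG mass near loc, pushing the
# density at the mode well above 1.
dist = tfd.NormalInverseGaussian(
    loc=0.0, scale=0.01, tailweight=1.0, skewness=0.0)

print(dist.log_prob(0.0))  # positive, so the NLL is negative
```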

Let me ask a different question then. If the loss of my model can be negative, how would I know if it's anywhere near optimal? Experimentation?
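Since a continuous NLL has no fixed floor at zero, its achievable minimum depends on the data, and the raw value alone doesn't indicate optimality. One practical check is to compare the model's mean NLL against a simple baseline distribution fit to the marginal of the targets; a sketch, assuming eager TensorFlow and hypothetical target values:

```python
import numpy as np
import tensorflow_probability as tfp

tfd = tfp.distributions

# Hypothetical targets; substitute your training labels.
y = np.random.normal(loc=2.0, scale=0.5, size=1000).astype(np.float32)

# Baseline: one Normal fit to the marginal of the targets. A model
# producing per-example predictive distributions should beat this,
# i.e. achieve a lower (possibly more negative) mean NLL.
baseline = tfd.Normal(loc=y.mean(), scale=y.std())
print(-baseline.log_prob(y).numpy().mean())
```

A held-out NLL that keeps improving relative to such a baseline is a more meaningful signal than the sign of the loss.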

Thanks for the help. It may be helpful for the project to have a tutorial or example to highlight that a negative loss isn't abnormal when training with NLL. Those of us who did poorly in statistics would be grateful.