learningmatter-mit / NeuralForceField

Neural Network Force Field based on PyTorch

Meaning of 'energy_grad'

Nokimann opened this issue · comments

I'm reading your tutorials. Thank you for the tutorials.

Is 'energy_grad' the same as the 'force' data?

The 'force' data seems to be a 3-dimensional vector.
Does 'energy_grad' have the same dimension and the same meaning as the 'force' property for molecules?

Can I ask further about the tutorials?

Same as the forces, but with a negative sign

Thank you!

So the force can be calculated as minus the energy derivative, by taking the gradient of the trained model's energy output with respect to the atomic coordinates.
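As a minimal sketch of that idea (assuming a generic PyTorch energy model standing in for the trained network, not this repo's actual API), the forces come out of autograd as the negative gradient of the predicted scalar energy:

```python
import torch

# Hypothetical stand-in for a trained energy model: an MLP mapping the
# flattened coordinates of a 5-atom molecule to a single scalar energy.
torch.manual_seed(0)
model = torch.nn.Sequential(
    torch.nn.Linear(15, 32),
    torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

xyz = torch.randn(5, 3, requires_grad=True)  # atomic coordinates

energy = model(xyz.reshape(1, -1)).sum()     # scalar energy E(xyz)

# energy_grad = dE/d(xyz); the force is its negative.
(energy_grad,) = torch.autograd.grad(energy, xyz, create_graph=True)
force = -energy_grad

print(force.shape)  # -> torch.Size([5, 3]), one 3-vector per atom
```

Like the 'force' data, `energy_grad` has one 3-vector per atom; the two differ only by sign.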

Is it better to use the force predicted directly by the model, instead of calculating it from the derivative of the model function?

That's definitely an option, and the ForceNet model does exactly that (https://arxiv.org/abs/2103.01436). The benefit of differentiating the energy is that you guarantee the force is the gradient of a scalar, i.e. that it has no curl, and that it rotates with the coordinate system (it is equivariant).

The benefit of predicting the forces directly is that you only have to do a forward pass, not a backward pass, so it's faster. Also, during inference you don't have to store the computation graph from the forward pass, so you can fit a lot more molecules in memory. Note that, as in ForceNet, you can augment the training data with rotated copies to make the model approximately rotationally equivariant.
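A hedged sketch of that augmentation idea, assuming a training sample of `(xyz, forces)` arrays; the rotation helper and all names are illustrative, not ForceNet's or this repo's code. The key point is that coordinates and force labels must be rotated by the same matrix:

```python
import torch

def random_rotation():
    # Draw a random proper rotation via QR decomposition of a Gaussian matrix.
    q, r = torch.linalg.qr(torch.randn(3, 3))
    # Fix column signs so the factorization is unique (positive diag of r) ...
    q = q * torch.sign(torch.diagonal(r))
    # ... and flip one column if needed so det(q) = +1 (rotation, not reflection).
    if torch.det(q) < 0:
        q[:, 0] = -q[:, 0]
    return q

# One training sample: coordinates and force labels for a 5-atom molecule.
xyz = torch.randn(5, 3)
forces = torch.randn(5, 3)

# Rotate both with the same matrix, so the direct-force model sees a
# physically consistent rotated copy of the same configuration.
R = random_rotation()
xyz_rot = xyz @ R.T
forces_rot = forces @ R.T
```

Training on such rotated copies only encourages approximate equivariance; unlike the gradient-of-energy approach, nothing enforces it exactly.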

The accuracy of the two approaches probably depends on the specific model architecture. ForceNet wasn't tested on MD-17, so I don't know how it compares to PaiNN. But also, nobody has trained a PaiNN model to produce the forces directly, and compared it to PaiNN with forces as the gradient of the energy. So I'm not sure we know which approach is more accurate in general.

You have guided me well enough. Thank you so much!