Is it possible to see gradient function?
Samuel-Bachorik opened this issue · comments
Hi, when I use autograd, is it possible to see its gradient function? Or, in other words, is it possible to see the derivative of that function? Or is it possible to see the computational graph?
For example, I want to see the grad_tanh function:
import autograd.numpy as np  # Thinly-wrapped numpy
from autograd import grad    # The only autograd function you may ever need

def tanh(x):  # Define a function
    y = np.exp(-2.0 * x)
    return (1.0 - y) / (1.0 + y)

grad_tanh = grad(tanh)  # Obtain its gradient function
Thank you
Have you got an answer? I have the same problem: I need to get the formula of the first derivative of the function.
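For context: autograd's grad(tanh) returns a Python function that evaluates the derivative at a given point, not a symbolic formula; the graph is built by tracing the function each time it is called. A minimal sketch (standard library only, no autograd) checking the known analytic derivative tanh'(x) = 1 - tanh(x)^2 against a central finite difference, to illustrate what a gradient function computes:

```python
import math

def tanh(x):
    # Same function as in the issue, written with math.exp
    y = math.exp(-2.0 * x)
    return (1.0 - y) / (1.0 + y)

def tanh_prime(x):
    # Analytic derivative: d/dx tanh(x) = 1 - tanh(x)**2
    return 1.0 - tanh(x) ** 2

def finite_difference(f, x, h=1e-6):
    # Central-difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2.0 * h)

x = 1.0
print(tanh_prime(x))               # analytic derivative value at x
print(finite_difference(tanh, x))  # numerical approximation, should agree closely
```

The two printed values agree to several decimal places, which is exactly the kind of pointwise value grad_tanh produces; recovering a closed-form formula would require a symbolic tool such as SymPy instead.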