Lately, I've had a lot of fun thinking about how to vectorize functions. Here's a function that applies the central difference method to multivariable functions:

import numpy as np

def compute_gradient(func, x, epsilon=1e-6):
    x = np.asarray(x)
    dim = x.shape[0]
    # Each row of h perturbs one coordinate by epsilon.
    h = np.identity(dim) * epsilon
    # After transposing, column i of u is x + epsilon*e_i
    # and column i of l is x - epsilon*e_i.
    u = (x + h).T
    l = (x - h).T

    # One vectorized call per side: func must accept a 2D array of
    # points stacked as columns and return one value per column.
    gradient = (func(u) - func(l)) / (2 * epsilon)

    return gradient
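
For instance, here's a minimal usage sketch, assuming func is written to take a 2D array whose columns are the perturbed points and to return one value per column (the quadratic test function is just an illustration):

def sphere(points):
    # points has shape (dim, batch); return one scalar per column,
    # i.e. f(x) = sum_i x_i^2 evaluated at each column.
    return np.sum(points ** 2, axis=0)

x0 = np.array([1.0, 2.0, 3.0])
grad = compute_gradient(sphere, x0)
print(grad)  # approximately [2., 4., 6.], matching the analytic gradient 2*x

Because the perturbations are stacked into one array, func is called only twice no matter how many input dimensions there are, which is the whole point of vectorizing it.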
