
I want to calculate the gradient of the following function: h(x) = 0.5 * x.T * A * x + b.T * x.

For now I set A to be just a (2, 2) matrix.

def function(x):
    return 0.5 * np.dot(np.dot(np.transpose(x), A), x) + np.dot(np.transpose(b), x)

where

A = np.zeros((2, 2))
n = A.shape[0]
A[range(n), range(n)] = 1

a (2, 2) matrix with ones on the main diagonal, and

b = np.ones(2) 

For a given point x = (1, 1), numpy.gradient returns an empty list.

x = np.ones(2)  
result = np.gradient(function(x))

However, shouldn't I get something like this: grad(h)((1, 1)) = (x1 + 1, x2 + 1) = (2, 2)?
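
For reference, this is how I would check that expected value directly from the closed-form gradient A * x + b (which holds here since A is symmetric); a minimal sketch with plain NumPy:

import numpy as np

A = np.eye(2)          # identity matrix, same as above
b = np.ones(2)
x = np.ones(2)

# gradient of 0.5 * x.T @ A @ x + b.T @ x for symmetric A is A @ x + b
print(A @ x + b)       # [2. 2.]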

Appreciate any help.

1 Answer

It seems like you want to perform symbolic differentiation or automatic differentiation, which np.gradient does not do: np.gradient estimates a numerical gradient from an array of sampled values, not from a Python function. sympy is a package for symbolic math, and autograd is a package for automatic differentiation for NumPy. For example, to do this with autograd:

import autograd.numpy as np
from autograd import grad

def function(x):
    return 0.5 * np.dot(np.dot(np.transpose(x), A), x) + np.dot(np.transpose(b), x)

A = np.zeros((2, 2))
n = A.shape[0]
A[range(n), range(n)] = 1
b = np.ones(2)
x = np.ones(2)
grad(function)(x)

Outputs:

array([2., 2.])
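
If you prefer symbolic differentiation instead, a minimal sketch of the same calculation with sympy (the symbol names here are just for illustration) might look like:

import sympy as sp

x1, x2 = sp.symbols('x1 x2')
x = sp.Matrix([x1, x2])
A = sp.eye(2)                      # same identity matrix as above
b = sp.Matrix([1, 1])

# h(x) = 0.5 * x.T * A * x + b.T * x, taken as a scalar expression
h = sp.Rational(1, 2) * (x.T * A * x)[0, 0] + (b.T * x)[0, 0]
grad_h = [sp.diff(h, v) for v in (x1, x2)]          # [x1 + 1, x2 + 1]
print([g.subs({x1: 1, x2: 1}) for g in grad_h])     # [2, 2]

The difference is that sympy gives you the gradient as an expression, (x1 + 1, x2 + 1), which you can then evaluate at any point, whereas autograd evaluates the gradient numerically at a given x.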