
I am trying to perform minimization of the following function:

def mvqr(P, y, x, c):
    s = 0
    for i in xrange(1, len(y)):
        summation = numpy.linalg.norm(numpy.dot(numpy.linalg.inv(P), (y[i, :] - numpy.dot(beta, x[i, :])))) \
                    + numpy.dot(numpy.dot(c.T, numpy.linalg.inv(P)), (y[i, :] - numpy.dot(beta, x[i, :])))
        s = s + summation
        return s

These are the lines of the main file:

fun = lambda beta: mvqr(E, Y_x, X_x, v)
result = minimize(fun, beta0, method = 'BFGS')

beta is the unknown variable of the function mvqr() and beta0 is the initial guess, a (2,2) array I have previously calculated.

I got an error:

NameError: global name 'beta' is not defined.

For anyone wondering whether the file containing mvqr() is already located in the Python packages directory: yes, it is.

I think the problem is with beta in the mvqr() function and with the use of the lambda function.

Any help?

EDIT

Thanks to pv. the code now runs with no error, but the minimization does not iterate: minimize reports 'Optimization terminated successfully.' yet simply returns the initial guess.

  status: 0
 success: True
    njev: 1
    nfev: 6
hess_inv: array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 1, 0],
                 [0, 0, 0, 1]])
     fun: 1.2471261924040662e+31
       x: array([  3.44860608e+13,  -4.10768809e-02,  -1.42222910e+15,
                  -1.22803296e+00])
 message: 'Optimization terminated successfully.'
     jac: array([ 0.,  0.,  0.,  0.])

I have also tried scipy.optimize.fmin_bfgs, but the result is much the same:

Optimization terminated successfully.
Current function value: 937385449919245008057547138533569682802290504082509386481664.000000
            Iterations: 0
  Function evaluations: 6
  Gradient evaluations: 1

It could be that beta0 is unfortunately a local minimum, or at least a stationary point, since jac == [0, 0, 0, 0] holds and the algorithm therefore terminates. But it looks strange to me that the initial guess would be a minimum of the function (even a local one). Does anyone have an idea of how to avoid this?
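
To check whether the gradient is genuinely zero at beta0 or only numerically flat, note that BFGS approximates the gradient with finite differences of step about sqrt(machine eps) when no analytic jac is supplied; with fun(beta0) of order 1e+31, such tiny steps may not change the value at all in floating point. A quick probe (a sketch only; it reuses the fixed fun and the flattened beta0 from above):

import numpy as np

# Probe fun around beta0 with the same step size BFGS uses for its
# finite-difference gradient. If none of the probes change the value,
# jac comes out exactly zero and BFGS stops at iteration 0 even when
# beta0 is not a true minimum.
b0 = beta0.ravel()
f0 = fun(b0)
eps = np.sqrt(np.finfo(float).eps)  # scipy's default step, ~1.49e-08
for k in range(b0.size):
    b = b0.copy()
    b[k] += eps
    print(k, fun(b) - f0)  # all zeros -> numerically flat; consider rescaling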

Any help would be appreciated.

  • Is this a typo?: result = minimize(fun, beta0, method = 'BFGS') should be result = minimize(fun, beta, method = 'BFGS'), e.g. beta without the 0? Commented Jan 20, 2015 at 15:56
  • @EdChum Actually beta0 is the initial guess for scipy.optimize.minimize. Or x0 = beta0 to be more precise. And as I said, I have previously calculated it. Commented Jan 20, 2015 at 15:59
  • Can you post the full traceback? Commented Jan 20, 2015 at 16:01
  • 1
    Aside: this isn't your issue, but your mvqr function probably doesn't do what you want -- it returns on the very first step of the loop, when i == 1, and so the loop doesn't really accomplish anything. Commented Jan 20, 2015 at 16:05
  • @mgilson The traceback is quite a long one and has some data files loaded with pandas. My concern is that I am missing something in defining the unknown variable. Note that the optimization is unconstrained, so I don't know all the possible values beta can take; I only have the starting point beta0. Commented Jan 20, 2015 at 16:08

1 Answer


Change the definition to def mvqr(beta, P, y, x, c):, use fun = lambda beta: mvqr(beta.reshape(2, 2), E, Y_x, X_x, v), and call minimize(fun, beta0.ravel()) if you wish to optimize a beta that is a 2x2 matrix.
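
Spelled out, the whole fix could look like this (a minimal sketch; E, Y_x, X_x, v and beta0 are assumed to be defined in your main file as in the question):

import numpy
from scipy.optimize import minimize

def mvqr(beta, P, y, x, c):
    # beta is now an explicit parameter instead of an undefined global
    s = 0
    P_inv = numpy.linalg.inv(P)  # invert once instead of on every pass
    for i in xrange(1, len(y)):  # note: starting at 1 skips the first row
        r = y[i, :] - numpy.dot(beta, x[i, :])  # residual of sample i
        s += numpy.linalg.norm(numpy.dot(P_inv, r)) + numpy.dot(numpy.dot(c.T, P_inv), r)
    return s  # return after the loop, not inside it

# minimize works on a flat vector, so reshape inside the objective and
# flatten the (2, 2) initial guess:
fun = lambda beta: mvqr(beta.reshape(2, 2), E, Y_x, X_x, v)
result = minimize(fun, beta0.ravel(), method='BFGS')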

After that, consider reading a Python tutorial, esp. on global and local variables.


4 Comments

I have tried your answer but it returns: status: 0 success: True njev: 1 nfev: 6 hess_inv: array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]) fun: 9.3738544991924501e+59 x: array([ 3.44860608e+13, -4.10768809e-02, -1.42222910e+15, -1.22803296e+00]) message: 'Optimization terminated successfully.' jac: array([ 0., 0., 0., 0.]). So basically it doesn't perform any iterations and returns the initial value.
I'm not an expert, but if the inverse of the Hessian (hess_inv) is the identity matrix, and the Jacobian (jac) is zeros then the iteration will not go anywhere. See e.g. en.wikipedia.org/wiki/Newton%27s_method_in_optimization
@FuzzyDuck exactly! I have studied lots of optimization material, and if jac == 0 that is a stationary point, and since the Hessian is positive semi-definite it is a minimum as well. My guess is it could be a local minimum, but it seems a bit strange to me that it is the starting point of the algorithm.
Have you tried other starting points? Your function evaluates to rather a large number. Furthermore, your array of x values seems to span a wide range of magnitudes. Makes me speculate that it might be something numerical.
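
Following up on the last comment, a multi-start check is cheap to sketch (hypothetical; it reuses fun and beta0 from the question and merely perturbs the starting point):

import numpy as np
from scipy.optimize import minimize

# Hypothetical multi-start: randomly perturb beta0 and keep the best
# result, to see whether BFGS moves at all from other starting points.
rng = np.random.RandomState(0)
best = None
for _ in range(10):
    start = beta0.ravel() * (1.0 + 0.5 * rng.randn(beta0.size))
    res = minimize(fun, start, method='BFGS')
    if best is None or res.fun < best.fun:
        best = res
print(best.fun)
print(best.x)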
