I am trying to perform minimization of the following function:
def mvqr(P, y, x, c):
    s = 0
    for i in xrange(1, len(y)):
        summation = numpy.linalg.norm(numpy.dot(numpy.linalg.inv(P), (y[i,:] - numpy.dot(beta, x[i,:])))) \
                    + numpy.dot(numpy.dot(c.T, linalg.inv(P)), (y[i,:] - numpy.dot(beta, x[i,:])))
        s = s + summation
        return s
These are the relevant lines of the main file:
fun = lambda beta: mvqr(E, Y_x, X_x, v)
result = minimize(fun, beta0, method = 'BFGS')
beta is the unknown variable of the function mvqr() and beta0 is the initial guess, a (2,2) array I have previously calculated.
I got an error:
NameError: global name 'beta' is not defined.
For anyone wondering: yes, the file containing the mvqr() function is already in the Python packages directory. I think the problem lies with beta in the mvqr() function and with how the lambda function is used.
Any help?
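For what it's worth, one way to remove the NameError is to make beta an explicit argument of mvqr() and have the lambda reshape the flat parameter vector that minimize works with. This is only a sketch of that idea; the data below (E, X_x, Y_x, v) are toy placeholders, not the question's actual arrays:

```python
import numpy as np
from scipy.optimize import minimize

def mvqr(beta, P, y, x, c):
    """Objective from the question, with beta passed in explicitly."""
    P_inv = np.linalg.inv(P)
    s = 0.0
    for i in range(len(y)):
        r = y[i, :] - np.dot(beta, x[i, :])   # residual for observation i
        s += np.linalg.norm(np.dot(P_inv, r)) + np.dot(np.dot(c.T, P_inv), r)
    return s

# Toy placeholder data; the question's E, Y_x, X_x, v are not shown.
rng = np.random.default_rng(0)
E = np.eye(2)
X_x = rng.normal(size=(50, 2))
true_beta = np.array([[1.0, 2.0], [3.0, 4.0]])
Y_x = X_x.dot(true_beta.T)
v = np.zeros(2)

beta0 = np.zeros((2, 2))
# minimize optimizes a flat 1-D vector, so reshape it back inside the lambda.
fun = lambda b: mvqr(b.reshape(2, 2), E, Y_x, X_x, v)
result = minimize(fun, beta0.ravel(), method='BFGS')
beta_hat = result.x.reshape(2, 2)
```

The reshape/ravel pair matters because minimize flattens whatever x0 it is given and hands the objective a 1-D array of 4 entries, not a (2, 2) matrix.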
EDIT
Thanks to pv., the code now runs without errors, but the minimization does not iterate: minimize reports 'Optimization terminated successfully.' yet simply returns the initial guess:
status: 0
success: True
njev: 1
nfev: 6
hess_inv: array([[1, 0, 0, 0],
[0, 1, 0, 0],
[0, 0, 1, 0],
[0, 0, 0, 1]])
fun: 1.2471261924040662e+31
x: array([ 3.44860608e+13, -4.10768809e-02, -1.42222910e+15,
-1.22803296e+00])
message: 'Optimization terminated successfully.'
jac: array([ 0., 0., 0., 0.])
I have also tried scipy.optimize.fmin_bfgs, but the result is much the same:
Optimization terminated successfully.
Current function value: 937385449919245008057547138533569682802290504082509386481664.000000
Iterations: 0
Function evaluations: 6
Gradient evaluations: 1
It could be that beta0 unfortunately is a local minimum, or at least a stationary point, since jac == [0, 0, 0, 0] holds and the algorithm therefore terminates. But it seems strange to me that the initial guess would be the minimum of the function (even a local one). Does anyone have an idea of how to avoid this?
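A hedged side note on the zero Jacobian: with objective values around 1e31, the change in f over a finite-difference step can be smaller than the floating-point spacing at that magnitude, so the numerical gradient comes out exactly zero and BFGS stops at the starting point. A toy illustration (not the question's data):

```python
import numpy as np
from scipy.optimize import minimize

# A function dominated by a huge constant: in float64, 1e31 + 9.0 == 1e31,
# so f is numerically constant everywhere near its minimum at x = 3.
huge = 1e31

def f(x):
    return huge + (x[0] - 3.0) ** 2

bad = minimize(f, np.array([0.0]), method='BFGS')
# The finite-difference gradient of a (numerically) constant function is
# zero, so BFGS terminates "successfully" without taking a single step:
# bad.nit == 0 and bad.jac == array([0.])

# Reformulating the objective so the huge constant never enters the
# arithmetic restores a usable gradient.
good = minimize(lambda x: (x[0] - 3.0) ** 2, np.array([0.0]), method='BFGS')
# good.x is close to 3.
```

If something similar is happening here, rescaling or re-deriving the objective so its values stay at a moderate magnitude would let the gradient, and hence the iteration, survive.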
Any help would be appreciated.
Comments:

"result = minimize(fun, beta0, method = 'BFGS') should be result = minimize(fun, beta, method = 'BFGS'), i.e. beta without the 0?"

"beta0 is the initial guess for scipy.optimize.minimize, or x0 = beta0 to be more precise. And as I said, I have previously calculated it."

"The mvqr function probably doesn't do what you want -- it returns on the very first step of the loop, when i == 1, so the loop doesn't really accomplish anything."

"My concern is that I am missing something in defining the unknown variable. Note that the optimization is unconstrained, so I don't know all the possible values of beta; I only have the starting point beta0."