I am running an optimization with scipy.optimize.minimize
import numpy as np
import scipy.optimize as opt

sig_init = 2
b_init = np.array([0.2, 0.01, 0.5, -0.02])
# minimize() needs a flat 1-D parameter vector, so append the scalar
# to the coefficient array rather than nesting them (np.array([b_init, sig_init])
# would create a ragged object array)
params_init = np.append(b_init, sig_init)
mle_args = (y, x)
results = opt.minimize(crit, params_init, args=mle_args)
The problem is that I need to put a bound on sig_init. opt.minimize() expects a bounds sequence with one (min, max) pair per parameter, but part of my parameter vector comes from a numpy array rather than individual scalars.
How can I specify the bounds when one of my inputs is a numpy array?
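A sketch of one common approach, assuming crit is a normal-model negative log-likelihood (the actual crit isn't shown, so this version is hypothetical): flatten everything into one 1-D vector, give bounds one (min, max) pair per element of that flat vector with (None, None) for unbounded entries, and unpack the pieces inside crit. A bounds-aware method such as 'L-BFGS-B' is required.

```python
import numpy as np
import scipy.optimize as opt

# Hypothetical data, for illustration only
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 3))
y = rng.normal(size=100)

def crit(params, y, x):
    # Unpack the flat parameter vector back into its pieces
    b = params[:-1]    # coefficient vector (intercept + slopes)
    sig = params[-1]   # scale parameter
    X = np.column_stack([np.ones(len(x)), x])
    resid = y - X @ b
    # Negative log-likelihood of a normal linear model (up to a constant)
    n = len(y)
    return n * np.log(sig) + np.sum(resid**2) / (2 * sig**2)

b_init = np.array([0.2, 0.01, 0.5, -0.02])
sig_init = 2.0
params_init = np.append(b_init, sig_init)

# One (min, max) pair per element of the flat vector:
# coefficients unbounded, sigma strictly positive
bounds = [(None, None)] * len(b_init) + [(1e-8, None)]

results = opt.minimize(crit, params_init, args=(y, x),
                       method='L-BFGS-B', bounds=bounds)
```

The key point is that minimize never sees b and sig as separate arguments: it only ever works with the flat params vector, so the bounds list just has to line up with that vector element by element.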