
I am confused by this OpenMDAO error. Why is it being raised? Can I somehow tell OpenMDAO that I don't have gradients and that it should use finite differences? And why is it raised for childWeight but not for eta?

I can get past this problem by initializing all my variables as floating point (e.g. root.add('childWeight', IndepVarComp('x', 100)) -> root.add('childWeight', IndepVarComp('x', 100.0))), but I would like to understand why this error was raised.

from openmdao.api import Component, Group, Problem, ScipyOptimizer, IndepVarComp

class gym(Component):
    def __init__(self):
        super(gym, self).__init__()
        self.add_param('eta', 0.01)
        self.add_param('childWeight', 240)

        self.add_output('acc', 1)

    def solve_nonlinear(self, params, unknowns, resids):
        <...... parameters are used to produce objective "acc" ...>
        unknowns["acc"] = ....

top = Problem()
root = top.root = Group()
root.add('gym', gym())
top.driver = ScipyOptimizer()
top.driver.options['optimizer'] = 'BFGS'

root.add('eta', IndepVarComp('x', 0.01))
root.add('childWeight', IndepVarComp('x', 100))

root.connect('eta.x', 'gym.eta')
root.connect('childWeight.x', 'gym.childWeight')

top.driver.add_desvar('eta.x', 0, 1.0)
top.driver.add_desvar('childWeight.x', 0, 1000)

top.driver.add_objective('gym.acc')
top.setup()
top.run()

This raises the following error:

  File "script.py", line 98, in <module>
    top.setup()
  File "/usr/local/lib/python2.7/site-packages/openmdao/core/problem.py", line 694, in setup
    self.driver._setup()
  File "/usr/local/lib/python2.7/site-packages/openmdao/drivers/scipy_optimizer.py", line 91, in _setup
    super(ScipyOptimizer, self)._setup()
  File "/usr/local/lib/python2.7/site-packages/openmdao/core/driver.py", line 115, in _setup
    (item_name, name, oname))
RuntimeError: Parameter 'childWeight.x' is a 'pass_by_obj' variable and can't be used with a gradient based driver of type 'BFGS'.

2 Answers


The problem is this line:

root.add('childWeight', IndepVarComp('x', 100))

You've created an integer variable. Try this instead:

root.add('childWeight', IndepVarComp('x', 100.))

If you want to use finite-differences you will also want:

top.root.fd_options['force_fd'] = True
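The comment below about why integer variables can't be used with a gradient-based driver can be sketched with a plain forward difference (forward_diff is a hypothetical helper for illustration, not part of OpenMDAO; finite differencing perturbs the variable by a tiny step, which only makes sense for continuously varying values):

```python
def forward_diff(f, x, h=1e-6):
    # Forward finite difference: (f(x+h) - f(x)) / h.
    # x must vary continuously -- perturbing an integer design
    # variable by h=1e-6 has no meaning, which is why gradient-based
    # drivers reject integer (pass_by_obj) variables.
    return (f(x + h) - f(x)) / h

# d/dw (0.5 * w**2) = w, so the slope at w=3.0 is about 3.0
slope = forward_diff(lambda w: 0.5 * w**2, 3.0)
print(slope)
```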

2 Comments

So is there no way to add an integer variable to a gradient-based driver?
By definition, no. A gradient based optimizer requires gradients (derivatives), which are not defined for integer variables.

I believe this error is being raised because the default value for childWeight is provided as the integer 240 instead of the float 240.0. If you ensure that your design variables always have floating-point default values, they will not be categorized as pass_by_obj variables.
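The categorization described above can be sketched with a simple type check (a minimal sketch of the assumed rule, mirroring the behavior both answers describe; the predicate name is mine, not OpenMDAO's API):

```python
import numpy as np

def is_pass_by_obj(val):
    # Sketch of the assumed rule: a value that is not a Python float
    # or a NumPy array is treated as pass_by_obj, and a gradient-based
    # driver refuses to differentiate it.
    return not isinstance(val, (float, np.ndarray))

print(is_pass_by_obj(100))    # int -> treated as pass_by_obj
print(is_pass_by_obj(100.0))  # float -> fine for a gradient-based driver
```

This is why IndepVarComp('x', 100) triggers the error while IndepVarComp('x', 0.01) does not.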

