
Hoping this has a straightforward answer that I missed in my reading of the docs. The problem is as follows:

  1. I have a module loaded on all ipengine(s) on startup
  2. I've since made changes to the module
  3. I want these changes propagated to the remote ipengine(s), i.e. I want the module to be reloaded in all the remote instances

How can this be accomplished?

3 Answers


You can also turn on the IPython autoreload feature on the engines with the following:

%px %load_ext autoreload
%px %autoreload 2

Note that this solution and calling reload() with dview.execute() share a problem when new engines come online later (as when using a batch scheduler on a cluster): they only run on the engines present at the time.
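One way around that limitation is to poll for newly registered engines and re-run the setup on just those. This is a minimal sketch, not tested against a live cluster; it assumes a Client object `rc` whose `ids` attribute lists currently registered engine ids, and the polling loop itself is shown only in comments:

```python
def unseen_engines(seen, current):
    """Return engine ids present now but not in the previously seen set."""
    return sorted(set(current) - set(seen))

# Sketch of a polling loop (assumes an IPython.parallel/ipyparallel
# Client named `rc`; `rc.ids` lists currently registered engines):
#
#     seen = set()
#     while True:
#         new = unseen_engines(seen, rc.ids)
#         if new:
#             rc[new].execute("%load_ext autoreload\n%autoreload 2")
#             seen.update(new)
#         time.sleep(30)
```

The helper is trivial, but keeping it separate makes the "which engines still need setup" decision easy to test without a cluster.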

One other wrinkle: you may want deep (recursive) reloading. See this option to ipengine:

--ZMQInteractiveShell.deep_reload=<CBool>
Default: False
Enable deep (recursive) reloading by default. IPython can use the
deep_reload module which reloads changes in modules recursively (it replaces
the reload() function, so you don't need to change anything to use it).
deep_reload() forces a full reload of modules whose code may have changed,
which the default reload() function does not.  When deep_reload is off,
IPython will use the normal reload(), but deep_reload will still be
available as dreload().

1 Comment

Does that actually work? I'm using that but nothing gets reloaded nor deep_reloaded

This is the answer I found; I'm not sure it is the best way:

from IPython.parallel import Client
rc = Client(profile='ssh')
dview = rc[:]
# reload() is a builtin on Python 2; on Python 3 use importlib.reload
dview.execute('reload(<module>)', block=True)
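A caveat: on Python 3 the builtin reload() is gone, so the string you send via dview.execute() would need importlib.reload instead. The following sketch demonstrates the reload semantics locally with a throwaway module (the name demo_mod is hypothetical); bumping the mtime forces Python to recompile rather than reuse cached bytecode:

```python
import importlib
import os
import pathlib
import sys
import tempfile

# Write a throwaway module (hypothetical name) to a temp directory.
tmp = pathlib.Path(tempfile.mkdtemp())
mod_path = tmp / "demo_mod.py"
mod_path.write_text("VALUE = 1\n")
sys.path.insert(0, str(tmp))

import demo_mod
assert demo_mod.VALUE == 1

# Rewrite the source and bump the mtime so cached bytecode is stale.
mod_path.write_text("VALUE = 2\n")
os.utime(mod_path, (os.path.getmtime(mod_path) + 10,) * 2)
importlib.invalidate_caches()

importlib.reload(demo_mod)
assert demo_mod.VALUE == 2
```

The same importlib.reload call is what you would ship to the engines in place of the Python 2 reload() builtin.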



I ran into the same problem while working on a module that I wanted to test on remote engines, but I didn't want to commit my changes to git and then pull them on the engine machines before each remote reload.

There might be a better way to do this, but my solution was to write a simple helper module which makes it easy to shuttle in-progress code over to the engines via scp.

I will copy the usage example here:

import IPython.parallel
import ishuttle
import my_module as mm

# Create a client for the cluster with remote engines
rc = IPython.parallel.Client(profile='remote_ssh_engines')
dview = rc[:]

# Create a shuttle object; the engines' working directory
# is switched to '/Remote/engine/server/scratch' after its creation
s = ishuttle.Shuttle(rc, '/Remote/engine/server/scratch')

# Make my_module available on all engines as mm. This code scp's the
# module over, imports it as mm, then reloads it.
s.remote_import('my_module', import_as='mm')

# Apply our favourite function from our favourite module
dview.apply_sync(mm.my_func, 'favourite argument for my_func')
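remote_import is part of the author's helper module, so its internals aren't shown here. The core trick it relies on, materializing or refreshing a module object from source text once that text has reached the engine, can be sketched with the stdlib alone (ishuttle ships the file over scp; this sketch skips the transport step, and the names are hypothetical):

```python
import sys
import types

def load_module_from_source(name, source):
    """Create or refresh a module object from source text.

    On an engine this would run after the updated source has been
    shipped over (ishuttle uses scp); transport is omitted here.
    """
    mod = sys.modules.get(name)
    if mod is None:
        mod = types.ModuleType(name)
    # Executing into the existing __dict__ refreshes the module in
    # place, so references held elsewhere see the new definitions.
    exec(compile(source, f"<{name}>", "exec"), mod.__dict__)
    sys.modules[name] = mod
    return mod

m1 = load_module_from_source("inflight_mod", "def answer():\n    return 1\n")
m2 = load_module_from_source("inflight_mod", "def answer():\n    return 2\n")
assert m1 is m2          # same module object, refreshed in place
assert m1.answer() == 2
```

Refreshing the existing module object (rather than replacing it) matters on the engines, because code that already holds a reference to the module keeps seeing the updated functions.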

