I ran into the same problem: I was working on a module that I wanted to test on remote engines, but I didn't want to have to commit my changes to git and pull them on each engine machine before every remote reload.
There might be a better way to do this, but my solution was to write a simple helper module that makes it easy to shuttle in-progress code over to the engines via scp.
I will copy the usage example here:
import IPython.parallel
import ishuttle
import my_module as mm
# Create a client for the cluster with remote engines
rc = IPython.parallel.Client(profile='remote_ssh_engines')
dview = rc[:]
# Create a shuttle object; the engines' working directory
# is switched to '/Remote/engine/server/scratch' after its creation
s = ishuttle.Shuttle(rc, '/Remote/engine/server/scratch')
# Make my_module available on all engines as mm. This code scp's the
# module over, imports it as mm, then reloads it.
s.remote_import('my_module', import_as='mm')
# Apply our favourite function from our favourite module
dview.apply_sync(mm.my_func, 'favourite argument for my_func')
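In case it helps to see the mechanics, the core of what remote_import does, ignoring the scp and cluster plumbing, amounts to roughly the following. This is a hypothetical local sketch (shuttle_import and its parameters are made up for illustration, and a plain file copy stands in for scp), not the helper's actual code:

```python
import importlib.util
import os
import sys

def shuttle_import(source_path, scratch_dir, import_as):
    # Copy the module source into the scratch directory; the real helper
    # performs this step with scp into each engine's working directory.
    dest = os.path.join(scratch_dir, os.path.basename(source_path))
    with open(source_path) as src, open(dest, 'w') as dst:
        dst.write(src.read())

    # Import the copied file and execute it fresh, so repeated calls
    # pick up your latest edits -- the equivalent of the reload step.
    name = os.path.splitext(os.path.basename(dest))[0]
    spec = importlib.util.spec_from_file_location(name, dest)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)

    # Register under the requested alias so later code can refer to it
    # by that name (e.g. 'mm').
    sys.modules[import_as] = module
    return module
```

The real helper just runs the import/reload half of this on every engine (via the DirectView) after the scp step completes, which is why the engines' working directory matters.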