
I have been using this as a reference, but have not been able to accomplish exactly what I need: Calling an external command in Python

I was also reading this: http://www.python.org/dev/peps/pep-3145/

For our project, we have 5 svn checkouts that need to be updated before we can deploy our application. In my dev environment, where fast deployments matter a bit more for productivity than they do in production, I have been working on speeding up the process.

I have a bash script that has been working decently but has some limitations. I fire up multiple 'svn updates' with the following bash command:

(svn update /repo1) & (svn update /repo2) & (svn update /repo3) &

These all run in parallel and it works pretty well. I also use this pattern in the rest of the build script for firing off each ant build, then moving the wars to Tomcat.

However, I have no control over stopping deployment if one of the updates or a build fails.

I'm rewriting my bash script in Python so that I have more control over branches and the deployment process.

I am using subprocess.call() to fire off the 'svn update /repo' commands, but each one runs sequentially. If I try '(svn update /repo) &', they all fire off at once, but the result code returns immediately, so I have no way to determine whether a particular command failed when running asynchronously.

import subprocess

# each call blocks until svn finishes, so these run one after another
subprocess.call('svn update /repo1', shell=True)
subprocess.call('svn update /repo2', shell=True)
subprocess.call('svn update /repo3', shell=True)

I'd love to find a way to have Python fire off each Unix command, and if any of the calls fails at any time the entire script stops.

1 Answer


Don't use shell=True. It will needlessly invoke the shell just to call your svn program, and that will give you the shell's return code instead of svn's.
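For illustration, here is a quick sketch of why the backgrounded shell form loses the exit status (the repo path is just a hypothetical one that will make svn fail):

import subprocess

# the shell backgrounds the job and exits immediately with status 0,
# so the return code says nothing about whether svn succeeded
code = subprocess.call('svn update /does-not-exist &', shell=True)
print(code)  # prints 0 even though the svn command will fail

Launching svn directly with Popen, as below, avoids this.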

import subprocess

repos = ['/repo1', '/repo2', '/repo3']
# launch three svn updates asynchronously:
procs = [subprocess.Popen(['svn', 'update', repo]) for repo in repos]
# wait for all of them to finish:
for proc in procs:
    proc.wait()
# check the results:
if any(proc.returncode != 0 for proc in procs):
    print('Something failed')
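
If you also want the whole script to stop when any update fails, as the question asks, a minimal sketch building on the code above could look like this. It checks the processes in launch order, so a failure in a later repo is only noticed after the earlier ones have finished, and it assumes that exiting with a non-zero status is an acceptable way to abort the rest of the deployment.

import subprocess
import sys

repos = ['/repo1', '/repo2', '/repo3']
procs = [subprocess.Popen(['svn', 'update', repo]) for repo in repos]

# wait for each update and abort on the first failure
for repo, proc in zip(repos, procs):
    if proc.wait() != 0:
        print('svn update failed for %s, aborting deployment' % repo)
        sys.exit(1)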

2 Comments

That's just what I was looking for.
Thank you, it works. Is there any way to run the subprocess in the background without showing its output in the terminal? (Currently I see a shell window open with the running command.)
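
Regarding the second comment: one way to keep the command's output out of the terminal is to redirect it to the null device. A small sketch, assuming Python 3.3+ where subprocess.DEVNULL is available:

import subprocess

# discard svn's output instead of letting it write to the terminal
proc = subprocess.Popen(['svn', 'update', '/repo1'],
                        stdout=subprocess.DEVNULL,
                        stderr=subprocess.DEVNULL)
proc.wait()
print(proc.returncode)  # the exit status is still available for error checking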
