
I have made 100 .py files, and now I want to run them all at the same time. They all have the same function; only the argument changes. Each file can take 20 minutes to run completely, so I need them to run simultaneously. How can I do that?

  • what shell are you using? Commented Jul 7, 2020 at 7:20
  • this seems like a pretty general question - right now it sounds like you want to have a look at multithreading or multiprocessing. This of course depends on what you actually want to do... Commented Jul 7, 2020 at 7:21
  • 1
    Is this something that can be restructured and run with a single file? What is your end goal? Commented Jul 7, 2020 at 7:26

4 Answers

1

You just need one file and multithreading: an array with the 100 arguments and the method, and then a for-loop that starts a new thread for each argument.
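A minimal sketch of that idea, where `work` is a hypothetical stand-in for your shared function (note that plain threads only overlap well for I/O-bound work; for CPU-bound jobs the `multiprocessing` module mentioned in the comments is a better fit):

```python
import threading

results = []
lock = threading.Lock()

def work(argument):
    # stand-in for the real 20-minute function
    with lock:
        results.append(argument * 2)

# one thread per argument, started in a for-loop
arguments = list(range(100))
threads = [threading.Thread(target=work, args=(a,)) for a in arguments]
for t in threads:
    t.start()
for t in threads:
    t.join()  # block until every thread has finished
print(len(results))
```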

Or else you can make the for-loop in a bash/zsh shell script. Note that each command must be put in the background with `&` (otherwise the scripts run one after another), and `wait` blocks until they have all finished:

#!/bin/bash
for filename in *.py; do
    python "$filename" &
done
wait

Note: the script has to be in the same directory as the Python files to work. Otherwise you have to give the path where I wrote *.py, e.g. /home/user/python_project/*.py



0

It depends on the shell you are using. In a Linux terminal you can easily do so with:

python file1.py & python file2.py & python file3.py &

and so on. The trailing & puts each command in the background, so they all run concurrently.

Either way, instead of typing the names manually, you can write a bash script that generates this pattern.
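For instance, a short script along these lines starts file1.py through file100.py in the background and waits for them all (assuming the files sit in the current directory and follow that naming pattern):

```shell
#!/bin/bash
# start each script as a background job (&) so they run concurrently
for i in $(seq 1 100); do
    python "file$i.py" &
done
wait  # block until every background job has exited
```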


0

You can make a call from first.py to second.py and so on and so forth


0

First off: if all your files are exactly the same and only a single argument changes, then creating 100 copies of the file is the wrong approach. What you need instead is to pass that argument when you run the script.

So here's the roundup:

Given:

  • Your script which could look like this

    import sys

    if __name__ == "__main__":
        # this line catches your argument
        your_argument = sys.argv[1]
        print(your_argument)
    
  • Another script which will be in charge of running 100 instances of your script

    import os
    import subprocess
    import sys

    if __name__ == "__main__":
        # fill this list with all the arguments (ie: ['arg1', 'arg2', ...])
        aListOfArguments = []
        for argument in aListOfArguments:
            # sys.executable runs the script with the current Python
            # interpreter, so the file needs no shebang or executable bit
            subprocess.Popen([sys.executable, 'path/to/your/script/script.py', argument],
                             stdout=subprocess.DEVNULL,
                             stderr=subprocess.DEVNULL,
                             shell=False,
                             close_fds=False,
                             preexec_fn=os.setpgrp)
    

The code above automatically starts a separate process for each entry in the list, passing a different argument to each instance of your script.

However, should you still want to use your solution - 100 separate scripts with the argument hard-coded each time - this will work.

NOTE: this works under the assumption that your 100 python scripts are inside the same folder

import os
import subprocess
import sys

if __name__ == "__main__":
    folder = '/path/to/scripts/folder'
    scriptList = os.listdir(folder)
    for script in scriptList:
        # launch each script with the current interpreter, detached from
        # this process group and with its output silenced
        subprocess.Popen([sys.executable, os.path.join(folder, script)],
                         stdout=subprocess.DEVNULL,
                         stderr=subprocess.DEVNULL,
                         shell=False,
                         close_fds=False,
                         preexec_fn=os.setpgrp)

This code simply builds a list with the names of all the files in the folder; after that you call Popen once per script, just like before.

