
Python running multiple processes

I have a script (a.py) that needs to call another script (b.py) multiple times, and each execution of b.py takes 1 minute. If I run it in a for loop, this takes a lot of time.

I want to know how I can optimize this, to reduce the time. Any help or suggestions would be very helpful.

Source code:

# a.py
import os

if __name__ == '__main__':
    inputs = ['file1', 'file2', 'file3', 'file4']
    script_path = 'b.py'

    # Some logging statements. 
    for infile in inputs:
        os.system("{} -i {}".format(script_path, infile))


# b.py
# take arguments
# perform some operations
# log the execution

So far, I have been using os.system to call the other script. How can I call the script b.py n times in parallel?

Asked by R George, Nov 15 '25

1 Answer

You may use multiprocessing.Process to run the calls in parallel:

import os
from multiprocessing import Process

def run(script, name):
    # assumes b.py is executable; otherwise use "python {} -i {}"
    os.system("{} -i {}".format(script, name))

if __name__ == '__main__':
    inputs = ['file1', 'file2', 'file3', 'file4']
    script_path = 'b.py'
    processes = []
    for infile in inputs:
        p = Process(target=run, args=(script_path, infile))
        p.start()
        processes.append(p)
    # wait for all processes, not just the last one
    for p in processes:
        p.join()

Note: Executing a Python script from a Python script with os.system is not very elegant. You should modify b.py so that it also works as a module, exposing its main functionality through functions or classes. Then you can import b and call those functions directly.
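As a sketch of that refactoring: suppose b.py exposed its work as a function (the name process_file here is hypothetical; the original b.py only hints at its operations). Then a.py could use multiprocessing.Pool to run it over all inputs in parallel, which also caps the number of concurrent workers. The stand-in below keeps everything in one file for illustration:

```python
from multiprocessing import Pool

# Stand-in for the functionality b.py would expose after refactoring.
# The name process_file is hypothetical; replace its body with the
# real operations from b.py and import it via `from b import process_file`.
def process_file(infile):
    # perform some operations on infile
    return "processed {}".format(infile)

if __name__ == '__main__':
    inputs = ['file1', 'file2', 'file3', 'file4']
    # run at most 4 workers at once; results come back in input order
    with Pool(processes=4) as pool:
        results = pool.map(process_file, inputs)
    print(results)
```

Compared with spawning one Process per input, Pool avoids launching an unbounded number of processes when the input list grows, and it returns the results directly instead of leaving them in log files.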

Answered by clemens, Nov 17 '25
