 

Python multiprocessing pool OSError: Too many files open

I need to measure how much time do_something() takes in total over a pair of lists containing 30k elements each. Below is my code:

import datetime

def run(a, b, data):
    p = datetime.datetime.now()
    val = do_something(a, b, data[0], data[1])
    q = datetime.datetime.now()
    # total_seconds() gives the full elapsed time; .microseconds alone would
    # return only the sub-second component of the timedelta
    res = (q - p).total_seconds() * 1e6
    return res

Next, I call this using the following code:

import functools
import multiprocessing

import numpy as np

func = functools.partial(run, a, b)

x = np.linspace(500, 1000, 30000).tolist()
y = np.linspace(20, 500, 30000).tolist()

data = zip(x, y)

# the guard is required when the start method is spawn (Windows/macOS)
if __name__ == '__main__':
    with multiprocessing.Pool(processes=multiprocessing.cpu_count()) as pool:
        d = pool.map(func, data)
    res = sum(d)

Whenever I run this, I keep getting OSError: [Errno 24] Too many open files. How do I fix this?
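
For reference, the limit that [Errno 24] refers to can be inspected from inside Python with the standard resource module (available on Linux and macOS, not Windows):

import resource

# soft and hard caps on open file descriptors for the current process
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(soft, hard)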

Somnath Rakshit asked Oct 23 '25


1 Answer

You can use ulimit -u 2048 to raise the per-user process limit, but note that [Errno 24] (EMFILE) specifically means the process has hit its limit on open file descriptors; that one is raised with, for example, ulimit -n 2048. Each pool worker consumes file descriptors for its pipes, so with many workers either limit can be the bottleneck.

Use ulimit -a to check the current limits.
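
If you cannot (or prefer not to) change the shell limit, a possible alternative is to raise the soft file-descriptor limit for the current process from Python before creating the pool. A minimal sketch using the standard resource module (POSIX-only; the soft limit can only be raised up to the hard limit, and 2048 below is just an illustrative value):

import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
# lift the soft limit as far as the hard limit allows
resource.setrlimit(resource.RLIMIT_NOFILE, (min(2048, hard), hard))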

smanna answered Oct 25 '25


