Airflow task: OSError: [Errno 23] Too many open files in system

Tags: ubuntu, airflow
Has anyone had this error with the S3PrefixSensor?

OSError: [Errno 23] Too many open files in system: '/usr/local/lib/python3.6/dist-packages/botocore/data/endpoints.json'

I get this error when the scheduler runs 12 tasks with that operator at the same time. If I rerun them manually, they work fine.

I tried increasing the ulimit as suggested by the answer to this question, but it didn't work for me: Errno 24: Too many open files. But I am not opening files?
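For context, the per-process open-file limit can be inspected and raised from Python as well (a minimal sketch; raising the soft limit above the hard limit requires root):

import resource

# Inspect the current per-process open-file limits (soft, hard)
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"soft={soft}, hard={hard}")

# Raise the soft limit up to the hard limit for this process only
resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))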

It's odd that this error comes up, since I'm only running 12 tasks at the same time. Is it an issue with the S3 sensor operator?
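Roughly, the setup looks like this (the DAG id, bucket, and prefixes below are placeholders, not my real values):

from datetime import datetime

from airflow import DAG
from airflow.sensors.s3_prefix_sensor import S3PrefixSensor  # old import path

with DAG(
    dag_id="wait_for_s3_prefixes",   # placeholder DAG id
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # 12 sensors poking S3 at the same time, as described above
    for i in range(12):
        S3PrefixSensor(
            task_id=f"wait_for_prefix_{i}",
            bucket_name="my-bucket",          # placeholder bucket
            prefix=f"incoming/source_{i}/",   # placeholder prefix
            poke_interval=60,
        )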

asked by ebertbm
1 Answer

You mentioned in the comments that you use airflow.sensors.s3_prefix_sensor.S3PrefixSensor. This is an outdated version of the sensor.

The updated version contains a PR which caches the hook and prevents creating a new connection every time the sensor pokes.
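Conceptually, the change looks something like this (a simplified sketch of the caching idea, not the actual provider code):

class ExpensiveHook:
    """Stand-in for an S3 hook: creating one opens files/connections."""
    def check_for_prefix(self, bucket, prefix):
        return True

class NaiveSensor:
    def poke(self, context):
        # old behaviour: a fresh hook (and connection) on every poke
        return ExpensiveHook().check_for_prefix("my-bucket", "incoming/")

class CachingSensor:
    def __init__(self):
        self._hook = None

    @property
    def hook(self):
        # updated behaviour: build the hook once, reuse it on later pokes
        if self._hook is None:
            self._hook = ExpensiveHook()
        return self._hook

    def poke(self, context):
        return self.hook.check_for_prefix("my-bucket", "incoming/")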

For Airflow<2.0.0, install the backport provider:

pip install apache-airflow-backport-providers-amazon

For Airflow>=2.0.0, install the provider package:

pip install apache-airflow-providers-amazon

Then import the sensor via:

from airflow.providers.amazon.aws.sensors.s3_prefix import S3PrefixSensor
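A minimal usage sketch with the updated import (bucket and prefix are placeholders):

from airflow.providers.amazon.aws.sensors.s3_prefix import S3PrefixSensor

wait_for_data = S3PrefixSensor(
    task_id="wait_for_data",
    bucket_name="my-bucket",          # placeholder bucket
    prefix="incoming/2021-01-01/",    # placeholder prefix
    poke_interval=60,
)

Because the hook is cached, repeated pokes should reuse the same connection instead of opening a new one each time.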
answered by Elad Kalif

