I have a PHP script processing a large amount of data from a database. It takes rows from one table in batches of N (I tried values from 100 to 100000) and inserts them into another table. max_execution_time is set to 0. Every iteration is wrapped in a transaction, and I select each portion with pg_query(). But after 1-2 hours the script fails with "Maximum execution time of 0 seconds exceeded", with the error message pointing to the line containing pg_query(). Did anyone have this issue? Any cure?
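Roughly, the loop looks like this (simplified; table and column names are placeholders, not my real schema):

```php
<?php
$conn = pg_connect("host=localhost dbname=mydb user=me");
set_time_limit(0); // same effect as max_execution_time = 0

$batchSize = 10000;
$offset = 0;

while (true) {
    // each iteration is wrapped in its own transaction
    pg_query($conn, "BEGIN");

    $result = pg_query($conn, sprintf(
        "SELECT id, payload FROM source_table ORDER BY id LIMIT %d OFFSET %d",
        $batchSize, $offset
    ));

    if (pg_num_rows($result) === 0) {
        pg_query($conn, "COMMIT");
        break; // no more rows to copy
    }

    while ($row = pg_fetch_assoc($result)) {
        pg_query_params($conn,
            "INSERT INTO target_table (id, payload) VALUES ($1, $2)",
            [$row['id'], $row['payload']]
        );
    }

    pg_query($conn, "COMMIT");
    $offset += $batchSize;
}
```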
UPD:
I tried the answer proposed here -- setting max_input_time to -1 -- and still have no luck. The error moved from the pg_query line to another, seemingly random, line. So I guess pg_query has nothing to do with it, and neither does max_input_time.
Where do you get that setting from? From a php.ini file? If so, search your project code for ini_set() calls -- they take priority over php.ini. I bet one silently crept in.
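A quick way to verify what is actually in effect at the point of failure -- ini_get() reports the current value, including any ini_set() overrides made so far:

```php
<?php
// Reports the limit currently in effect, not what php.ini says.
var_dump(ini_get('max_execution_time'));

// To find stray overrides in the codebase, grep for the directive
// (shell command, not PHP):
//   grep -rn "max_execution_time" /path/to/project
```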
max_execution_time = 0 means run forever.
However, other things may still stop your script. For example, Apache has a default request timeout of 5 minutes.
see this: Is ini_set('max_execution_time', 0) a bad idea?
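One defensive workaround, if something is changing the limit mid-run: set_time_limit() restarts the timeout counter from zero every time it is called, so you can reassert it on each iteration. A sketch, where fetchNextBatch() and processBatch() are placeholders for your own loop body:

```php
<?php
while ($batch = fetchNextBatch()) { // placeholder for your SELECT step
    // Re-arm the unlimited timeout on every pass, in case some code
    // path changed max_execution_time after startup.
    set_time_limit(0);
    processBatch($batch);           // placeholder for your INSERT step
}
```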