I'm writing a PHP CLI script for a client that runs on shared hosting. It logs to a file using a simple function like:
function log_entry($msg) {
    global $log_file, $log_handle;
    $msg = "[".date('Y-m-d H:i:s')."] ".$msg."\n";
    echo $msg;
    $log_handle = fopen($log_file, 'a');
    fwrite($log_handle, $msg);
}
And I get this error:
PHP Warning: fopen(./logs/sync.20130410.log)
[<a href='function.fopen'>function.fopen</a>]: failed to open stream:
Too many open files in ./functions.php on line 61
I thought the problem was reusing the same handle, so I changed it to:
function log_entry($msg) {
    global $log_file;
    $msg = "[".date('Y-m-d H:i:s')."] ".$msg."\n";
    echo $msg;
    $log_handle = fopen($log_file, 'a');
    fwrite($log_handle, $msg);
    fclose($log_handle);
}
But that didn't work; I still get the error, always at the same log line. `ulimit -n` reports 1024, but that shouldn't matter because I never open more than one file at a time. Ideas?
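For reference, a leak-free variant of the logger would open the file once and reuse the handle across calls. This is only a sketch; it assumes `$log_file` is set globally, as above:

```php
<?php
// Sketch of a leak-free logger: open the file once, reuse the handle.
// Assumes $log_file is set globally, as in the question.
function log_entry($msg)
{
    global $log_file;
    static $log_handle = null;

    if ($log_handle === null) {
        $log_handle = fopen($log_file, 'a');
        // Close the handle once, when the script shuts down.
        register_shutdown_function(function () use (&$log_handle) {
            if (is_resource($log_handle)) {
                fclose($log_handle);
            }
        });
    }

    $msg = "[" . date('Y-m-d H:i:s') . "] " . $msg . "\n";
    echo $msg;
    fwrite($log_handle, $msg);
    fflush($log_handle); // make the entry visible immediately
}
```

This keeps the descriptor count constant no matter how many entries are logged.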
Spotted the issue. I'm answering this in case anyone Googles the same error, even though the cause wasn't apparent from the question.
I'm using the BigCommerce API client, and it turns out it was opening a temporary file handle per request without ever closing it, which eventually crashed my script. Here is how I fixed it:
BigCommerce/API/Connection.php:354-365:
public function put($url, $body)
{
    $this->addHeader('Content-Type', $this->getContentType());
    if (!is_string($body)) {
        $body = json_encode($body);
    }
    $this->initializeRequest();
    $handle = tmpfile();
    fwrite($handle, $body);
    fseek($handle, 0);
    curl_setopt($this->curl, CURLOPT_INFILE, $handle);
    curl_setopt($this->curl, CURLOPT_INFILESIZE, strlen($body));
    curl_setopt($this->curl, CURLOPT_URL, $url);
    curl_setopt($this->curl, CURLOPT_PUT, true);
    curl_exec($this->curl);
    fclose($handle); // Added this line
    return $this->handleResponse();
}
(I added the fclose($handle); line.)
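If you want to confirm a leak like this yourself, on Linux you can count the process's open descriptors through /proc/self/fd. This is a sketch: /proc/self/fd is Linux-specific, and open_fd_count() is a hypothetical helper, not part of any library:

```php
<?php
// Sketch: count this process's open file descriptors via /proc/self/fd.
// Linux-specific; open_fd_count() is a hypothetical helper, not a library function.
function open_fd_count(): int
{
    // scandir() lists "." and ".." too, so subtract those two entries.
    return count(scandir('/proc/self/fd')) - 2;
}

$before = open_fd_count();

// Simulate the leak: tmpfile() opens a new descriptor on every call,
// just like the unpatched put() did on every request.
$leaked = [];
for ($i = 0; $i < 5; $i++) {
    $leaked[] = tmpfile();
}

echo "Leaked descriptors: " . (open_fd_count() - $before) . "\n";

// The fix is the same as in put(): close what you open.
foreach ($leaked as $handle) {
    fclose($handle);
}
```

Run this between API calls and you'll see the count climb toward the ulimit -n of 1024 if handles aren't being closed.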