I have two million text files on a server that is accessible to internet users. I was asked to make a change (a string replace operation) to these files as soon as possible. I was thinking of running str_replace on every text file on the server. However, I don't want to tie up the server and make it unreachable for internet users.
Do you think the following is a good idea?
<?php
ini_set('max_execution_time', 1000);
$path = realpath('/dir/');
$objects = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($path, FilesystemIterator::SKIP_DOTS),
    RecursiveIteratorIterator::SELF_FIRST
);
foreach ($objects as $name => $object) {
    if (!$object->isFile()) {
        continue; // skip directories
    }
    set_time_limit(100); // reset the time limit for each file
    // do str_replace stuff on the file
}
Use find, xargs and sed from the shell:
cd /dir
find . -type f -print0 | xargs -0 sed -i 's/OLD/NEW/g'
This will search all files recursively (hidden ones included) inside the current directory and replace OLD with NEW using sed.
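If only the text files need to change, the same pipeline can be narrowed with find's -name test; the *.txt pattern below is just an assumption about how your files are named:

cd /dir
# restrict the replacement to files matching *.txt; adjust the pattern to the real naming scheme
find . -type f -name '*.txt' -print0 | xargs -0 sed -i 's/OLD/NEW/g'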
Why -print0?
From man find:
If you are piping the output of find into another program and there is the faintest possibility that the files which you are searching for might contain a newline, then you should seriously consider using the '-print0' option instead of '-print'.
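A quick way to see the problem for yourself (the file name below is a made-up example, using bash's $'...' quoting):

# hypothetical test file whose name contains a space and a newline
touch $'bad name\nnewline.txt'
# with plain -print, xargs splits the name on whitespace and passes three bogus arguments
find . -type f -print | xargs ls -l
# with -print0 / -0, names are NUL-delimited and arrive intact
find . -type f -print0 | xargs -0 ls -l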
Why xargs?
From man find:
The specified command is run once for each matched file.
That is, if there are 2000 files in /dir, then find ... -exec ... will result in 2000 invocations of sed; whereas find ... | xargs ... will only invoke sed once or twice.
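To make that concrete, here is a sketch of the two forms side by side (OLD/NEW are the same placeholders used above):

# -exec with \; starts one sed process per file: 2000 files means 2000 sed invocations
find . -type f -exec sed -i 's/OLD/NEW/g' {} \;

# xargs passes the file names to sed in large batches, so sed is started only once or twice
find . -type f -print0 | xargs -0 sed -i 's/OLD/NEW/g'

Note that find's -exec ... {} + form also batches arguments, similar to xargs.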