I have a file allreport.txt with content like below:
11:22:33:456 Script started Running: first_script
11:22:34:456 GetData - Read Read 12 Bytes
11:22:34:456 SetData - Write Write 12 Bytes
11:32:33:456 Script started Running: second_script
11:32:34:456 GetData - Read Read 12 Bytes
11:32:34:456 SetData - Write Write 12 Bytes
11:42:33:456 Script started Running: third_script
11:42:34:456 GetData - Read Read 12 Bytes
11:42:34:456 SetData - Write Write 12 Bytes
11:52:33:456 Script started Running: fourth_script
My requirement is to extract the lines that appear in between the 'Running: ...' script lines. I tried something like below:
grep 'Running:' allreport.txt | sed 's/[^ ]* //' | cut -d":" -f2 | tr '\n' ' ' | awk -v col=1 '/$col/,/$($col+1)/ allreport.txt > $col'
But after executing the command I see no output, and no result files are created either.
How can I achieve this? The expected output is files named first_script, second_script, and so on, with each file containing the logs of its own run. For example, first_script should contain only these lines:
11:22:34:456 GetData - Read Read 12 Bytes
11:22:34:456 SetData - Write Write 12 Bytes
Similarly, second_script should contain the lines below, and so on:
11:32:34:456 GetData - Read Read 12 Bytes
11:32:34:456 SetData - Write Write 12 Bytes
With your shown samples only, please try the following. As output it will create 3 files named first_script, second_script and third_script from the shown samples.
awk -F': ' '/Running/{close(outFile);outFile=$2;next} {print > (outFile)}' Input_file
Explanation: A simple explanation would be: set the field separator to ': ' (colon followed by a space), then check whether the line contains Running; if it does, set the output file name to the 2nd field. If the line does NOT contain Running, print it into the current output file. Also make sure to close each output file once it is no longer needed, to avoid a "too many open files" error.
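For readability, the same logic can also be written as a standalone awk script (a sketch equivalent to the one-liner above; the file name split.awk is just an example):

# split.awk -- split the log into one file per script (sketch of the one-liner above)
# Run as: awk -f split.awk allreport.txt
BEGIN { FS = ": " }            # split fields on colon-space
/Running/ {                    # a "Script started Running: name" line
    close(outFile)             # close the previous script's file, if any
    outFile = $2               # 2nd field is the script name, e.g. first_script
    next                       # do not write the "Running" line itself
}
{ print > outFile }            # every other line goes into the current script's file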
You may also try this awk:
awk '$(NF-1) == "Running:" {close(fn); fn = $NF; next} {print > fn}' file
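For example, with the sample allreport.txt shown in the question (substitute your actual file name for the placeholder file), you can check the result like this:

awk '$(NF-1) == "Running:" {close(fn); fn = $NF; next} {print > fn}' allreport.txt
cat first_script
# expected output:
# 11:22:34:456 GetData - Read Read 12 Bytes
# 11:22:34:456 SetData - Write Write 12 Bytes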