I have a directory with several sub-directories; these sub-directories contain many files, and I'm interested in the *.txt files. I want to go into every sub-directory, read the *.txt file, and print any line matching a "pattern".
I would prefer to have it as a one-liner.
Here is the command I tried:
for i in $(ls -d *_fastqc); do cd $i; awk '/FAIL/ {print $0}' ls -l su*.txt; done
I get this error:
awk: cmd. line:1: fatal: cannot open file `-rw-rw-r--' for reading (No such file or directory)
What might be going wrong here?
awk is not the right tool for this; also see why you shouldn't be parsing ls output. awk treats every argument after the program as a file name to open, so here the words of the ls -l listing (starting with the permissions field `-rw-rw-r--`) are being handed to awk as files, which is exactly the error you see.
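If you want to keep the loop structure, a minimal sketch that iterates with a shell glob instead of parsing ls, and passes the files to awk directly (the directory and file names below are hypothetical, mirroring the layout in the question):

```shell
# Build a small demo tree (hypothetical names for illustration).
mkdir -p demo/sample1_fastqc demo/sample2_fastqc
printf 'PASS Basic Statistics\nFAIL Per base sequence quality\n' > demo/sample1_fastqc/summary.txt
printf 'PASS Overrepresented sequences\n' > demo/sample2_fastqc/summary.txt
cd demo

# Iterate over the directories via a glob (no ls, no cd into each one)
# and give awk the matching files as arguments.
for d in *_fastqc/; do
    awk '/FAIL/' "$d"su*.txt
done
# prints: FAIL Per base sequence quality
```

Note that if a directory has no file matching `su*.txt`, awk will complain about the unexpanded glob; the find-based approach below avoids that entirely.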
Instead, use GNU find to list the files matching your criterion, xargs to pass the delimited output from find on to another command, and grep for the pattern matching:
find . -type f -name "*.txt" -print0 | xargs -0 grep "FAIL"
-print0 (a GNU find extension) terminates each file name with a NUL character instead of a newline, which safely handles names containing whitespace or special characters, and xargs -0 splits its stdin on that same \0 delimiter. grep then prints every matching line from the files it receives, prefixed with the file name.
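If GNU grep is available, its recursive mode can replace the find | xargs pipeline entirely (`-r` and `--include` are GNU extensions; the demo tree below uses hypothetical names):

```shell
# Build a small demo tree (hypothetical names for illustration).
mkdir -p fq/s1_fastqc
printf 'PASS ok\nFAIL per tile quality\n' > fq/s1_fastqc/summary.txt

# Recurse under fq/, restrict the search to *.txt files,
# and print matching lines prefixed with the file name.
grep -r --include='*.txt' 'FAIL' fq
# prints: fq/s1_fastqc/summary.txt:FAIL per tile quality
```

This stays safe with odd file names because grep walks the tree itself, so no file names ever pass through a pipe.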