I have a file Query.txt with data like this:
# Query: Name_ID
# 2 hits found
# Query: Name_ID
# 20 hits found
# Query: Name_ID
# 0 hits found
When I awk or grep it for a pattern, I get output like this:
grep "0 hits found" Query.txt | head
# 20 hits found
# 0 hits found
# 140 hits found
# 70 hits found
Two questions: first, how do I get only the "0 hits found" lines and not the 20, 140, or 70? Second, how do I use awk to create another file, Query2.txt, with the format below?
# Query: Name_ID # 2 hits found
# Query: Name_ID # 20 hits found
# Query: Name_ID # 0 hits found
To get only the lines with 0 hits found, match that exact string but require a non-digit just before the zero:
awk '$0 ~ /[^[:digit:]]0 hits found/' infile
Assuming a test input file (infile) like:
# Query: Name_ID1
# 2 hits found
# Query: Name_ID2
# 20 hits found
# Query: Name_ID3
# 0 hits found
# Query: Name_ID4
# 140 hits found
# Query: Name_ID5
# 0 hits found
# Query: Name_ID6
# 60 hits found
It yields:
# 0 hits found
# 0 hits found
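Since you also mention grep, the same idea works there. A sketch against your Query.txt; -w treats the pattern as a whole word, so the 0 cannot be part of 20 or 140:
grep -w "0 hits found" Query.txt
or, spelling out the non-digit explicitly:
grep "[^0-9]0 hits found" Query.txt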
For the second question, use getline inside the main loop to read the following line (the hits line) and print both on the same output line:
awk '{ getline hits_line; printf "%s %s\n", $0, hits_line }' infile
Using the same test file as before, it yields:
# Query: Name_ID1 # 2 hits found
# Query: Name_ID2 # 20 hits found
# Query: Name_ID3 # 0 hits found
# Query: Name_ID4 # 140 hits found
# Query: Name_ID5 # 0 hits found
# Query: Name_ID6 # 60 hits found
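If you prefer to avoid getline, an equivalent sketch pairs the lines by record number, assuming the file strictly alternates Query and hits lines:
awk 'NR % 2 { printf "%s ", $0; next } 1' infile
or, letting paste join every two consecutive lines with a space:
paste -d' ' - - < infile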
As an extra, I like to try this kind of task with vim too, so here is a solution with it:
Content of script.vim:
set backup
for n in range( 1, line('$') / 2 )
execute "normal Jj"
endfor
saveas! Query2.txt
q!
Run it like this:
vim -S script.vim infile
That will generate a Query2.txt file with content:
# Query: Name_ID1 # 2 hits found
# Query: Name_ID2 # 20 hits found
# Query: Name_ID3 # 0 hits found
# Query: Name_ID4 # 140 hits found
# Query: Name_ID5 # 0 hits found
# Query: Name_ID6 # 60 hits found
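The loop can also be replaced by a single :global command that joins every Query line with the line after it, and the whole thing can run non-interactively through ex. A sketch, assuming every other line starts with "# Query":
ex -s -c 'g/^# Query/j' -c 'w! Query2.txt' -c 'q!' infile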
Alternatively (why match "hits found" at all?), it is enough to look for a 0 that is not part of a larger number:
sed '/\s0/!d' file
awk '/\y0/' file
Note that \s in sed and the \y word boundary in awk are GNU extensions, so these need GNU sed and gawk respectively.
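If you want something that does not depend on GNU extensions, a sketch that compares the hit count as a field works too (it assumes the number is always the second whitespace-separated field, as in "# 0 hits found"):
awk '$2 == 0 && /hits found/' file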