
I am trying to output, into a new file, the name of every file whose last line contains "#FIXME".

find -type f -print0 | while IFS= read -rd '' file; do
    tail -n1 "$file" | grep -q FIXME && echo "$file" > newlog.log
done

This only outputs one file name to newlog.log, when there are actually multiple files ending in "#FIXME".
Could anyone tell me what is going wrong?

2 Comments

  • Are you sure the FIXME comments are always the last line in the file — no stray blank lines at the end of a file? If the FIXME lines aren't reliably the last line, you could simplify your life with find . -type f -exec grep -l -e FIXME {} + > newlog.log. Commented Feb 28, 2020 at 6:45
  • Only the last file is found in the log afterwards, isn't it? Wouldn't it be necessary to use >> newlog.log to also keep all the previous output? Commented Feb 28, 2020 at 7:17

2 Answers


You have set up a loop which executes this line for many files:

tail -n1 "$file" | grep -q FIXME && echo "$file" > newlog.log

However, this line creates and writes newlog.log on every iteration, overwriting the existing content each time with a single file name. That gave you the impression that only one file name was written to the log file; it happens to be the last one.

To get a list of all the matching file names, i.e. to keep all of the content, you need to append to the existing log file instead of overwriting it. To do that, use >> instead of >.

tail -n1 "$file" | grep -q FIXME && echo "$file" >> newlog.log

This means you now need to consider the content of the log file before executing the script. Either you are fine with also keeping the output of previous runs — which I guess you probably are not — or you must make sure the file is empty beforehand.

You can either delete the file and let the script recreate it, or intentionally use > once at the start. The latter even lets you write something of a headline for the list, e.g. containing the date of creation and the path you executed the script in.
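For example, a sketch of that idea (the headline text is illustrative; excluding newlog.log itself keeps the log from matching its own headline):

```shell
# Write a headline once with > (this truncates any previous log).
echo "FIXME scan of $(pwd) on $(date)" > newlog.log

# Then append each matching file name with >>.
find . -type f ! -name newlog.log -print0 | while IFS= read -rd '' file; do
    tail -n1 "$file" | grep -q FIXME && echo "$file" >> newlog.log
done
```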


3 Comments

I always think that detailed explanation is appreciated. Thanks for confirming.
Rather than using >> newlog.log inside the loop, it's probably better to redirect the loop as a whole to newlog.log, by putting the > newlog.log after the done.
@ruakh That is an interesting approach. May I recommend to make an answer of it?
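A sketch of the suggestion from the comments above: a single redirection after done truncates the log once and captures every echo from the loop, so no >> is needed inside the body.

```shell
# The redirection applies to the whole loop, so newlog.log is opened
# (and truncated) exactly once; every echo inside is collected into it.
find . -type f ! -name newlog.log -print0 | while IFS= read -rd '' file; do
    tail -n1 "$file" | grep -q FIXME && echo "$file"
done > newlog.log
```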

You need to append to the file, not overwrite it. And this

tail -n1 "$file" | grep -q FIXME && echo "$file" > newlog.log 

should be

tail -n1 "$file" | grep -q FIXME && echo "$file" >> newlog.log 

2 Comments

Thank you! I looked up the difference between > and >> and for some reason I didn't see it.
@comearound > writes to the file from the beginning (erasing previous content); >> appends to the existing content of the file.
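A quick demonstration of the difference:

```shell
echo first  > demo.log    # > truncates: demo.log now contains only "first"
echo second > demo.log    # truncates again: "first" is gone
echo third >> demo.log    # >> appends: "second" and "third" remain
cat demo.log
```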
