I have a directory structure with several levels, and some of the directories contain files named "file1.txt".
I am trying to write a script to find the files and write their paths (separated by spaces instead of '/'), as well as grep some information from inside each file.
I thought a combination of find, while loops and read commands was the way to go, but now I'm not so sure.
The idea was to find the files, and write the list to a variable.
Then go through that variable one line at a time, get the file path and the text from inside, and write them to an output file.
My best attempt is:
#!/bin/bash
outputf='output.dat'
printf "some_text1 some_text2 some_text3 some_text4\n" > "$outputf"
FILE="$(find . -iname "file1.txt" -print0)"
while IFS= read -r -d '' LINE ; do
    while IFS="/" read -ra PARTS ; do
        for i in "${PARTS[@]}" ; do
            printf '%s ' "$i" >> "$outputf"
        done
        # grep "some text" $FILE >> "$outputf"
        printf "\n" >> "$outputf"
    done <<< "$LINE"
done <<< "$FILE"
Unfortunately, I only get the output from the first printf command.
I was expecting to get something like the following:
some_text1 some_text2 some_text3 some_text4
. path_to_file1 file1.txt sometext
. path_to_file2 file1.txt sometext
. path_to_file3 file1.txt sometext
. path_to_file4 file1.txt sometext
However, I can't even get the file path written to the output file (which is why the grep command is commented out).
One suggested fix is to skip the read loops entirely and let awk do the splitting and the matching, after printing the header line:

printf "some_text1 some_text2 some_text3 some_text4\n" > output.dat
find . -name file1.txt | xargs awk '/some text/ { f=FILENAME; gsub("/"," ",f); print f,$0 }' >> output.dat

It was also pointed out that nested read loops need to use different file descriptors.
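Putting those hints together: the likely reason the loop body never runs is that command substitution cannot store the NUL bytes that -print0 emits, so read -d '' never finds its delimiter and the while condition fails immediately. Below is a minimal sketch of the loop-based version, assuming bash and GNU find; the lowercase names file and parts are illustrative, not from the original. It drops the intermediate variable and gives the outer read its own file descriptor (the point raised above), so the loop body keeps stdin free:

#!/bin/bash
outputf='output.dat'
printf "some_text1 some_text2 some_text3 some_text4\n" > "$outputf"

# Read the NUL-delimited list from find on fd 3, leaving stdin untouched
# for anything the loop body might want to read.
while IFS= read -r -d '' -u 3 file ; do
    # Split the path on '/' into an array: ./a/b/file1.txt -> (. a b file1.txt)
    IFS='/' read -ra parts <<< "$file"
    # Print the components separated by spaces (no trailing newline yet).
    printf '%s ' "${parts[@]}" >> "$outputf"
    # Append the matching text from inside this file; grep supplies the newline.
    grep "some text" "$file" >> "$outputf"
done 3< <(find . -iname "file1.txt" -print0)

If the awk one-liner fits your needs it is the shorter route, but note that a plain find | xargs pipeline splits file names on whitespace; if any path could contain spaces, use find . -name file1.txt -print0 | xargs -0 awk '...' instead.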