
Let's say I have a bunch of *.tar.gz files located in a hierarchy of folders. What would be a good way to find those files, and then execute multiple commands on each of them?

I know if I just need to execute one command on the target file, I can use something like this:

$ find . -name "*.tar.gz" -exec tar xvzf {} \; 

But what if I need to execute multiple commands on the target file? Must I write a bash script here, or is there any simpler way?

Samples of commands that need to be executed on an A.tar.gz file:

$ tar xvzf A.tar.gz   # assume it untars to folder logs
$ mv logs logs_A
$ rm A.tar.gz
  • If you want the "A" in logs_A to correspond to the A in A.tar.gz, you'd have to write a script (inline or in a file). Commented Jul 14, 2015 at 23:11
  • Thanks - is there a way to get and assign each filename from the find command? (so I can apply multiple commands on it later on). Commented Jul 14, 2015 at 23:23
  • You can use a while-read loop or -exec bash -c 'multiple; commands;' ... Commented Jul 14, 2015 at 23:30
  • You can also just use multiple -exec arguments. find ... -exec echo {} \; -exec echo "[{}]" \; Commented Jul 14, 2015 at 23:36
  • @EtanReisner - yeah that multiple usage of exec is new to me - thanks Commented Jul 15, 2015 at 5:21
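Both patterns from the comments can be sketched end to end. The setup below is hypothetical (a single archive A.tar.gz that extracts to a folder named logs, matching the question):

```shell
# Hypothetical demo setup: one archive A.tar.gz that extracts to "logs"
cd "$(mktemp -d)"
mkdir logs && echo data > logs/f.txt
tar czf A.tar.gz logs && rm -rf logs

# Pattern 1: several -exec clauses run left to right for each match
find . -name "*.tar.gz" -exec echo "found: {}" \; -exec tar tzf {} \;

# Pattern 2: hand each match to an inline bash script, so multiple
# commands can share one shell and the same positional parameter
find . -name "*.tar.gz" -exec bash -c '
    tar xvzf "$1"                    # assume it untars to folder "logs"
    name=${1##*/}                    # strip any leading path: A.tar.gz
    mv logs logs_"${name%.tar.gz}"   # logs -> logs_A
    rm "$1"
' _ {} \;
```

The `_` fills bash's `$0` slot, so `{}` lands in `$1` and can be quoted safely inside the inline script.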

2 Answers


Here's what works for me (thanks to Etan Reisner's suggestions):

    #!/bin/bash
    # The target folder (to search for tar.gz files in) is the first command-line argument.
    find "$1" -name "*.tar.gz" -print0 | while IFS= read -r -d '' file; do
        echo "$file"                      # `file` holds each tar.gz path in turn,
        tar xvzf "$file"                  # so everything below can use it
        # mv untar_folder "$file".suffix  # untar_folder is the folder's name after untarring
        rm "$file"
    done

As suggested, the array way is unsafe if a file name contains spaces, and it also didn't work properly in this case.
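The question's mv logs logs_A step fits naturally into this loop by deriving the suffix from each archive's name. A sketch with a hypothetical demo setup (archives A.tar.gz and B.tar.gz, each extracting to a folder literally named logs):

```shell
# Hypothetical demo setup: two archives in a subfolder, each of which
# extracts to a folder named "logs"
cd "$(mktemp -d)"
for n in A B; do
    mkdir -p sub/logs && echo "$n" > sub/logs/f.txt
    tar czf "sub/$n.tar.gz" -C sub logs && rm -rf sub/logs
done

# Same find | while-read pipeline, with the suffix derived per archive:
find sub -name "*.tar.gz" -print0 | while IFS= read -r -d '' file; do
    dir=$(dirname "$file")
    tar -xzf "$file" -C "$dir"                    # untars to "$dir/logs"
    base=${file##*/}                              # e.g. A.tar.gz
    mv "$dir"/logs "$dir"/logs_"${base%.tar.gz}"  # logs -> logs_A
    rm "$file"
done
```

Renaming happens inside the loop, so each extracted logs folder is moved aside before the next archive is untarred into the same place.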




Writing a shell script is probably easiest. Take a look at sh for loops. You could collect the output of a find command into an array, and then loop over that array to perform a set of commands on each element.

For example,

arr=( $(find . -name "*.tar.gz") )
for i in "${arr[@]}"; do
    # $i now holds each of the filenames output by find
    tar xvzf "$i"
    mv "$i" "$i".suffix   # rename the archive (an rm "$i" here would fail after the mv)
    # etc., etc.
done

3 Comments

+1 for the links and a sample. But it's not quite working for me yet. Here's the simplest case, where I just need to print out the names of all files using the arr syntax you recommended: it prints out the first file only (i.e. it doesn't really go through all the items in the array). Did I miss something? Have you tried it on your PC? #!/bin/bash arr=( $(find . -type f) ); for i in $arr; do echo $i; done
Collecting the results of find into an array like this is not safe for file names that can contain whitespace or other shell glob/metacharacters.
$arr does not expand to the array contents; it expands to the first value in the array (it is the same as ${arr[0]}). You need "${arr[@]}" to get all the array values (but with this code the quotes there don't matter, as you will have already blown up any files with spaces/etc. as per my first comment). See Bash FAQ 001 for ways to safely read lines of data in the shell, specifically the example that uses find's -print0 argument.
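For completeness, bash 4.4's mapfile -d '' (the array-friendly relative of the -print0 approach in Bash FAQ 001) gives a whitespace-safe variant of this answer's array idea. A sketch with hypothetical file names:

```shell
# bash >= 4.4: mapfile -d '' -t reads NUL-delimited find output into an
# array without word splitting or globbing (hypothetical demo files)
cd "$(mktemp -d)"
touch plain.tar.gz "name with spaces.tar.gz"

mapfile -d '' -t arr < <(find . -name "*.tar.gz" -print0)
for i in "${arr[@]}"; do
    echo "archive: $i"    # always quote "$i"; spaces survive intact
done
```

The -t flag strips the trailing NUL delimiter from each element; without it every array entry would end in a stray NUL byte.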
