
I have a zsh shell scripting problem :-( There is a file containing a list with 4 columns:

   NAME    SURNAME     OLD     TOWN
   DOE     John        30      London
   CALAS   Maria       50      Athens
   ...

I want to process only some "elements" of each line. I don't know if that's possible, but it should look something like:

for user,livesIn in `cat MyFile | awk '{print $2 $4}'`
    echo "My friend $user lives in $livesIn"
done

Of course this code is wrong, and I couldn't find how to write it correctly. Does someone know if that's possible?

Thanks in advance for your help.

1 Comment
  • Additionally, awk does string concatenation just by placing strings side by side: print $2 $4 will not separate the fields with a space. You want either print $2, $4 (which uses the OFS implicitly) or, explicitly, print $2 " " $4.
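
A quick way to see the difference for yourself (the sample line is taken from the question's data):

echo 'DOE John 30 London' | awk '{print $2 $4}'    # prints: JohnLondon
echo 'DOE John 30 London' | awk '{print $2, $4}'   # prints: John London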

4 Answers


When you want to loop through the output of awk, you can try

awk '{print $2, $4}' MyFile | while read -r user livesIn; do
   echo "${user} was last seen in ${livesIn}."
done

In this case awk is not needed:

while read -r field1 user field3 livesIn; do
   echo "${user} was last seen in ${livesIn}."
done < MyFile

The constructions above will fail when a field contains a space, like New York.
Take a good look at how the fields in your MyFile are separated. Fixed width? A TAB character? With a TAB you are lucky:

while IFS=$'\t' read -r field1 user field3 livesIn; do
   echo "${user} was last seen in ${livesIn}."
done < MyFile

1 Comment

Using dummy placeholders is fine if there are few fields. But what if you want to use the 20th and 25th fields instead? Then you change your mind and really want the 9th and 27th fields. It's easier to change a number or two than to count and add placeholders.
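
To illustrate the comment's point, with the awk pipeline from the answer only the field numbers change (the 9th/27th fields here are just the comment's hypothetical example, not columns that exist in MyFile):

awk '{print $9, $27}' MyFile | while read -r user livesIn; do
   echo "${user} was last seen in ${livesIn}."
done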

awk processes each line one at a time, so there is no need for a for loop. There is also no need for cat, since awk takes the file name as an argument. Try this:

awk '{print "My friend "$2" lives in "$4}' MyFile

3 Comments

Thanks for your answer... You're too quick, guys :o)
"My friend DOE lives in John"?
Yeah, I think you mean $2 and $4, not $1 and $2.

If you are already using awk, why don't you process everything with awk? For example:

$ awk '{print "My friend", $2, "lives in", $4}' MyFile

That gets the output you are looking for, unless there is something else not stated in the question.

2 Comments

But I realize that I've probably oversimplified my example... In fact my processing is a bit more complicated than a print. I need to make some tests and assignments in the "for" loop.
@Marcoounet are you sure about that? I bet whatever it is you think you need to do with the awk output would be better served by being done in the awk script.

It is usually simpler and more efficient to do both iterating and processing in AWK rather than trying to divide the task by iterating in the shell and processing in AWK. AWK is well designed for iterating through input, and it also has its own loop structures (e.g. for). If at all possible, do all your processing in AWK.
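
For instance, tests and assignments can sit entirely inside the AWK program; here is a rough sketch (the condition and the counter are purely illustrative, borrowing the "John" test used further down):

awk 'NR > 1 {
    if ($2 == "John") {                  # a test
        print $2 " is not my friend, but lives in " $4
    } else {
        friends++                        # an assignment
        print "My friend " $2 " lives in " $4
    }
}
END { print friends " friend(s) found" }' MyFile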

That said, it seems that your problem also requires access to the input fields in the shell, and so full processing in AWK may not be possible in your case. (It is worth noting that AWK can also execute shell commands, but this may be just another level of complication.)
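
(As a tiny sketch of that last point, awk's system() function can run a shell command per line; the mkdir below is an arbitrary illustration, not something the question asks for.)

awk 'NR > 1 { system("mkdir -p towns/" $4) }' MyFile   # one directory per town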

Other answers use AWK to iterate through the file and print something with columns 2 and 4, like

$ awk '{print "My friend", $2, "lives in", $4}' MyFile

which is fine if you can iterate and process with AWK like this. As an addition to this type of solution, you might want to skip the first line (which seems to have column headers instead of actual data) with

$ awk 'NR>1{print "My friend", $2, "lives in", $4}' MyFile

Your comment

In fact my treatment is a little bit more complicated than a print. I need to make some tests and assignments in the "for" loop.

suggests that what you really want is access to the fields in your shell. You can get this by using AWK to pick out the fields (as before), but piping the values into the shell:

    awk 'NR>1{print $2,$4}' MyFile | while read user livesIn; do
        echo "My friend $user lives in $livesIn"
    done

This gives you $user and $livesIn in the shell, and so you can do more complicated shell processing with it. For example:

    awk 'NR>1{print $2,$4}' MyFile | while read user livesIn; do
        if [[ "$user" == "John" ]]; then
            echo "$user is not my friend, but lives in $livesIn"
        else
            echo "My friend $user lives in $livesIn"
            echo "$user" >> friends.txt
        fi
    done

Be careful with the format of your input file, since AWK splits on whitespace by default.
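
If the file turns out to be tab-separated rather than space-separated (just a guess about your format), you can tell AWK explicitly which separator to use:

    awk -F'\t' 'NR>1{print $2,$4}' MyFile | while read -r user livesIn; do
        echo "My friend $user lives in $livesIn"
    done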

3 Comments

When writing shell loops to read input, always include IFS= and -r (e.g. while IFS= read -r user livesIn; do) unless you have a specific purpose in mind by leaving them out and fully understand all of the implications. Also see unix.stackexchange.com/q/169716/133219.
@EdMorton This is a good point and a good link. The performance hit alone would convince me not to do this. But in this case, setting an empty input field separator with IFS= makes read treat the line as one field instead of splitting on the delimiter (default space) printed by AWK. So it becomes one field, not two, and we lose the livesIn value. Would it be appropriate to use some obscure character (such as '\v', or just '\t') and match the OFS of AWK with the IFS of the shell?
Good point that you can't set IFS= in this case, unless you want to split the variable after it's read as a single line. In this case, since awk is already stripping all whitespace from around its output, I guess you may as well leave the shell IFS as-is, but generally yes, matching the shell IFS to the awk OFS would make sense.
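
A sketch of that OFS/IFS matching, using a tab as the agreed separator (any character that cannot appear in the data would work):

    awk -v OFS='\t' 'NR>1{print $2, $4}' MyFile | while IFS=$'\t' read -r user livesIn; do
        echo "My friend $user lives in $livesIn"
    done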
