
I need to pass multiple lines from one file into a script as one comma separated argument. Whenever I try to use the output of processing the file as a single string, the commas become separators. How can I do this?

Test case:

[user@host]$ #Here is a word list, my target words are on lines starting with "1,"
[user@host]$ cat word_list_numbered.txt
1,lakin
2,chesterfield
3,sparkplug
4,unscrawling
5,sukkah
1,girding
2,gripeful
3,historied
4,hypoglossal
5,nonmathematician
1,instructorship
2,loller
3,containerized
4,duodecimally
5,oligocythemia
1,nonsegregation
2,expecter
3,enterrologist
4,tromometry
5,salvia
[user@host]$ #Here is a mock operation, it just outputs the number of args, I want all selected words as one argument
[user@host]$ cat operation.sh
echo "This script has $# arguments"
[user@host]$ #Here is a script that outputs the needed words as comma delimited
[user@host]$ grep '^1,' word_list_numbered.txt | tr -d '1,' | tr '\n' ',' | sed 's/,$//'
lakin,girding,instructorship,nonsegregation[user@host]$
[user@host]$ #Here is the operation script receiving that comma delimited list
[user@host]$ ./operation.sh $(grep '^1,' word_list_numbered.txt | tr -d '1,' | tr '\n' ',' | sed 's/,$//')
This script has 4 arguments
[user@host]$ #oops, too many arguments
[user@host]$ ./operation.sh foo,bar
This script has 1 arguments
[user@host]$ ./operation.sh foo bar
This script has 2 arguments
[user@host]$

Details:

  • The needed words are in lines starting with 1,
  • All words should be passed to operation.sh as one comma-delimited argument
  • I don't have control over the format of word_list_numbered.txt or the need for operation.sh to take all words as one comma-delimited argument
  • It is not optimal to run operation.sh many times; I'm asking this question so I don't have to do that
  • Your foo,bar test would seem to cover this possibility but since I can't reproduce this, what is the value of $IFS on your system? Commented Sep 11, 2015 at 18:15
  • @swornabsent Yes, I did go down the $IFS route. It is apparently set to newline: Commented Sep 11, 2015 at 19:11
  • @msw I'm attempting to: select the words in all lines starting with "1,"; concatenate them delimited by commas; apply them as an argument for operation.sh Commented Sep 11, 2015 at 19:13
  • Why did every answer here get a down vote? Commented Sep 12, 2015 at 0:01
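For context on the $IFS discussion in the comments, the splitting behavior can be reproduced with a minimal sketch (hypothetical, not from the original post). With the default IFS (space, tab, newline) a comma-joined string stays one word; only when IFS contains a comma does the unquoted expansion split:

```shell
#!/bin/sh
# Minimal demo of how IFS controls field splitting of an unquoted expansion.
list="lakin,girding,instructorship"

set -- $list                 # default IFS: commas do not split
echo "default IFS: $# args"  # prints: default IFS: 1 args

IFS=','                      # add comma to the field separators
set -- $list                 # now the unquoted expansion splits on commas
echo "comma IFS: $# args"    # prints: comma IFS: 3 args
unset IFS                    # restore default field splitting
```

This suggests that if `./operation.sh foo,bar` reports 1 argument but the command substitution reports 4, something other than a comma in IFS is splitting the substitution's output, which is why the commenter could not reproduce it.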

4 Answers


How about a combination of awk and xargs?

 awk -F, -v ORS=, '$1==1{print $2}' file | xargs ./operation.sh

Or if you mind the trailing comma:

 awk -F, -v ORS=, '$1==1{print $2}' file | sed 's/,$//' | xargs ./operation.sh

Test:

$ cat operation.sh 
echo "This script has $# arguments"
echo "$@"

$ awk -F, -v ORS=, '$1==1{print $2}' file | sed 's/,$//' | xargs ./operation.sh 
This script has 1 arguments
lakin,girding,instructorship,nonsegregation

$ cat file
1,lakin
2,chesterfield
3,sparkplug
4,unscrawling
5,sukkah
1,girding
2,gripeful
3,historied
4,hypoglossal
5,nonmathematician
1,instructorship
2,loller
3,containerized
4,duodecimally
5,oligocythemia
1,nonsegregation
2,expecter
3,enterrologist
4,tromometry
5,salvia

Without xargs it would be:

./operation.sh "$(awk -F, -v ORS=, '$1==1{print $2}' file | sed 's/,$//')"
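The double quotes are what make this work: they suppress field splitting of the command substitution, so the whole comma-joined string arrives as one argument. A pure-bash sketch of the same idea (hypothetical, assuming bash 4+ for `mapfile`) reads the matching words into an array and joins them with a temporary IFS:

```shell
#!/bin/bash
# Hypothetical variant: collect matching words into an array, then join
# them with commas via "${words[*]}", which uses the first character of IFS.
mapfile -t words < <(awk -F, '$1==1{print $2}' word_list_numbered.txt)
joined="$(IFS=,; printf '%s' "${words[*]}")"
./operation.sh "$joined"   # quoted, so it stays one argument
```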

1 Comment

Thank you for thinking about this. I think the key part of your response is either piping into xargs or putting double quotes around the parsing command; either seems to force the result to be treated as one string. I can pipe or put double quotes around my grep/tr/sed pipeline and get one argument too. Excellent, thanks!

Given:

$ echo "$tgt"
1,lakin
2,chesterfield
3,sparkplug
4,unscrawling
5,sukkah
1,girding
2,gripeful
3,historied
4,hypoglossal
5,nonmathematician
1,instructorship
2,loller
3,containerized
4,duodecimally
5,oligocythemia
1,nonsegregation
2,expecter
3,enterrologist
4,tromometry
5,salvia

In Perl:

$ echo "$tgt" | perl -F',' -lane '$A[++$#A]=$F[1] if $F[0]=="1"; END{ print join(",", @A) }'
lakin,girding,instructorship,nonsegregation

1 Comment

Thank you for thinking about this problem. Wow, if I could think in Perl I'd certainly be more effective.

An alternative to awk is to use command substitution in bash twice: once to fill an array with the contents of your file, and again to join all lines into a single comma-separated string to pass to operation.sh:

#!/bin/bash

## function simulating operation.sh
operation() { printf "%s\n" "$1"; }

a=( $(<word_list_numbered.txt) )
b="${a[0]}$(printf ",%s" ${a[@]:1} )"

operation $b

exit 0

Output

$ bash csvlist.sh
1,lakin,2,chesterfield,3,sparkplug,4,unscrawling, ..<snip>.. 5,salvia
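Note that the sketch above joins every line of the file, as the output shows. To match the question's requirement (only the words on lines starting with `1,`), the array fill would need a filter first; a hypothetical variant:

```shell
#!/bin/bash

## function simulating operation.sh
operation() { printf "%s\n" "$1"; }

# fill the array only with the second field of lines whose first field is 1
a=( $(awk -F, '$1==1{print $2}' word_list_numbered.txt) )
b="${a[0]}$(printf ",%s" "${a[@]:1}")"

operation "$b"   # quoted, so the joined string stays one argument
```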



Filter and extract the words with awk, then join them with paste:

$ awk -F ',' '$1==1{print $2}' word_list_numbered.txt  | paste -s -d ',' -
lakin,girding,instructorship,nonsegregation
$ ./operation.sh "$(awk -F ',' '$1==1{print $2}' word_list_numbered.txt  | paste -s -d ',' - )"
This script has 1 arguments
$

UPDATE: enclosed the command substitution in double quotes.

1 Comment

Thanks for thinking about this problem! That's a very nice use of paste; I'm always on the lookout for a new way to use it. Unfortunately, it didn't work for me until I put the output in double quotes as in @user000001's answer.
