I'm trying to implement trivial parallelization where lines of commands are distributed among separate processes. For that purpose I wrote this script that I named jobsel:
(only "#! /bin/bash" and help message is omitted)
slots=$1
sel=$2
[[ $slots -gt 0 ]] || die_usage
[[ $sel -lt $slots ]] || die_usage
i=0
while read line
do
    (( i % slots == sel )) && eval $line
    i=$(( i + 1 ))
done
# in case the last line does not end with EOL
if [[ $line != "" ]]; then
    (( i % slots == sel )) && eval $line
    i=$(( i + 1 ))
fi
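For context, the intended use is to start one jobsel process per slot, all reading the same command file, roughly like this (the slot count of 22 and the file name cmds are just the ones from my test below):

for s in $(seq 0 21)
do
    "$HOME/util/jobsel" 22 "$s" < cmds &
done
wait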
I put eval there because, without it, I couldn't use redirections or pipes in the commands.
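For example, with a line like the ones in my command file:

line='echo 0 >> out'
$line         # runs echo with the literal arguments 0, >> and out, printing: 0 >> out
eval $line    # the shell re-parses the line, so 0 is appended to the file out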
When I run this as $HOME/util/jobsel 22 0 < cmds in a terminal emulator, where cmds is a file containing lines like echo 0 >> out, echo 1 >> out and so on with increasing numbers, it outputs, as expected, 0, 22, 44... on separate lines. Good so far.
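Concretely, the test looked roughly like this (the line count of 100 is just an example):

for n in $(seq 0 99); do echo "echo $n >> out"; done > cmds
"$HOME/util/jobsel" 22 0 < cmds
cat out    # 0, 22, 44, 66, 88 on separate lines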
So I put this to work. But when I ran it over a secure shell connection, I submitted it through at, with backgrounding (ending each line with &). Then there was a problem: although I entered 8 lines, 21 processes started! ps -AFH showed processes with identical commands and different PIDs. All the worker processes were at the same level, directly under init, and my script does not create child processes in the first place.
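The at job looked roughly like this (the slot count of 8 and the here-document form are only illustrative; the important part is that every worker line ends with &):

at now <<'EOF'
"$HOME/util/jobsel" 8 0 < cmds &
"$HOME/util/jobsel" 8 1 < cmds &
"$HOME/util/jobsel" 8 2 < cmds &
"$HOME/util/jobsel" 8 3 < cmds &
"$HOME/util/jobsel" 8 4 < cmds &
"$HOME/util/jobsel" 8 5 < cmds &
"$HOME/util/jobsel" 8 6 < cmds &
"$HOME/util/jobsel" 8 7 < cmds &
EOF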
Puzzled, I ran the echo 0 >> out command file through at, and the output did contain duplicate lines. Still finding it hard to believe, and thinking that simultaneous appending might have caused the anomaly, I used other methods to confirm that some lines really were run multiple times.
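For instance, a check as simple as this shows the repeats (just one way to verify, not necessarily the one I used):

sort -n out | uniq -d    # prints every number that occurs more than once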
Moreover, there was no such anomaly when everything was run in a terminal, or when I created a separate at job for each worker process.
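By a separate at job for each worker I mean something like this (numbers again illustrative):

for s in 0 1 2 3 4 5 6 7
do
    echo "$HOME/util/jobsel 8 $s < cmds" | at now
done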
But how can this happen? Is something wrong with my script? Does at/atd have some bug?
Comments:

- Instead of [[ $(( i % slots )) -eq $sel ]], do (( i % slots == sel )). Please see Process Management and BashFAQ/050.
- at seemed to do fine when the final & was omitted. But that is not always the case. So I turned to nohup.
- Similarly, the argument checks can be written as (( slots > 0 )) and (( sel < slots )).