I am trying to read all the files ending in .env inside the config_files folder and export their contents as environment variables.
So far I have tried:
#! /bin/bash
for file in "$(find ~/config_files -maxdepth 3 -name '*.env')"; do export $(grep -v '^#' $file | xargs); done
and
#! /bin/bash
for file in "$(find ~/config_files -regex '.*/.*\.\(env\)$')"; do export $(xargs < $file); done
Both always end up printing the existing environment as declare -x lines, e.g.:
declare -x COLORTERM="truecolor"
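For what it's worth, that declare -x dump happens whenever the command substitution expands to nothing (an empty .env file, or find matching no files): export then runs with no arguments, and bash's export builtin responds by listing the current environment in declare -x form. A minimal reproduction:

```shell
# When $(...) expands to nothing, `export` is called with no arguments,
# and bash lists the environment as `declare -x` lines.
bash -c 'export $(printf "")' | head -n 3
```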
I have also tried adding -print to the find command, like:
for file in "$(find ~/.ros/PS_AD/config_files -maxdepth 3 -name '*.env' -print)"; do export $(grep -v '^#' $file | xargs); done
But then I got:
./script: line 3: export: `/home/imr/config_files/docker-image/docker_specs.env:random=1': not a valid identifier
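That error comes from the quoting: "$(find ...)" hands the loop all the paths as a single string, and the unquoted $file inside the loop then word-splits it into several filename arguments for grep. With more than one file argument, grep prefixes every match with its filename, which is exactly what turns random=1 into /home/.../docker_specs.env:random=1. A small demo (the temp directory and file names are made up):

```shell
# grep prefixes matches with "filename:" when given multiple files,
# producing "path:var=value" lines that `export` rejects as identifiers.
demo=$(mktemp -d)
printf 'random=1\n' > "$demo/a.env"
printf 'other=2\n'  > "$demo/b.env"
grep -v '^#' "$demo"/*.env
# lines look like: /tmp/.../a.env:random=1
```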
The *.env files look like:
random=1
What am I missing?
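Part of the answer to that: quoting the command substitution in for file in "$(find ...)" collapses find's entire output into one word, so the loop body runs exactly once with all the paths mashed together. A quick way to see it, using a throwaway directory:

```shell
# With the quotes, the for loop sees a single word containing newlines,
# so it iterates once instead of once per file.
demo=$(mktemp -d)
touch "$demo/a.env" "$demo/b.env"
count=0
for f in "$(find "$demo" -name '*.env')"; do
  count=$((count + 1))
done
echo "iterations: $count"   # iterations: 1
```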
EDIT:
Now I am using the proper way ... with IFS and read
find ~/config_files -name '*.env' -print0 |
while IFS= read -r -d '' line; do
echo $line
set -a
. "$line"
set +a
done
but the environment variables are still not set
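A likely culprit for this version: every stage of a pipeline runs in a subshell (in bash's default configuration, without shopt -s lastpipe), so whatever the while loop sources disappears when the loop's subshell exits. Minimal illustration:

```shell
# The while loop runs in a subshell because it is on the right-hand side
# of a pipe; demo_var is gone once the pipeline finishes.
printf 'something\n' | while IFS= read -r line; do
  demo_var="set inside the loop"
done
echo "${demo_var:-unset}"   # prints: unset
```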
EDIT: Using the process-substitution workaround doesn't solve the issue either:
while IFS= read -r -d '' p
do
set -a; . "$p"; set +a
done < <(find ~/config_files -name '*.env' -type f -print0)
A comment suggested set -a or set -o allexport, and piping find into the loop with while ...; done < <(find ~/config_files -name '*.env' -print0) -- that way the while loop, and thus the env, runs in the main shell instance, not a subshell, so the variables stay set after the loop has exited. I also tried mkfifo my_pipe, but then it would get stuck, and if I try to run it again it complains that the file my_pipe already exists.
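With the process-substitution loop, the remaining problem is usually not the loop at all but how the script is invoked: running ./script starts a child bash, and a child process can never modify its parent's environment, so the variables are set and then discarded when the child exits. The script has to be sourced instead. A sketch, where export_envs.sh is a hypothetical filename:

```shell
#! /bin/bash
# export_envs.sh -- must be *sourced* (`. ./export_envs.sh`), not executed;
# a child process cannot change its parent's environment.
set -a                                   # auto-export every variable assigned
while IFS= read -r -d '' f; do
  . "$f"                                 # source each .env file
done < <(find ~/config_files -name '*.env' -type f -print0)
set +a
```

After source ./export_envs.sh the variables from the .env files are visible in the calling shell; ./export_envs.sh would only set them inside the short-lived child process.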