Using Bash, I have an indexed array that represents a list of files:
a=("1.json" "2.json" "3.json" ... "5309.json")
I am parsing two data fields from those JSON files into two associative arrays:
declare -A idArr
declare -A valueArr
for i in "${a[@]}"; do
jqId="$(jq -M ".fileId" <"${i}")"
jqValue="$(jq -M ".value" <"${i}")"
# If there are already items in the associative array, add the new items separated by a newline
idArr[${i}]="${idArr[${i}]}${idArr[${i}]:+$'\n'}${jqId}"
valueArr[${jqId}]="${valueArr[${jqId}]}${valueArr[${jqId}]:+$'\n'}${jqValue}"
done
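To illustrate the append idiom in the loop: when two files produce the same `fileId`, the `:+` expansion inserts a newline only between existing and new entries, so values accumulate one per line. A minimal, self-contained demonstration (with made-up values standing in for the `jq` output):

```shell
declare -A valueArr
jqId="shared-id"
# Simulate two files that parse to the same fileId but different values.
for jqValue in 10 20; do
  # Append: existing entries, then a newline only if entries already exist,
  # then the new value.
  valueArr[$jqId]="${valueArr[$jqId]}${valueArr[$jqId]:+$'\n'}$jqValue"
done
printf '%s\n' "${valueArr[shared-id]}"   # prints "10" then "20"
```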
Since I'm iterating through one file at a time, it takes a considerable amount of time to process all the files. But I need the associative arrays created within the loop to persist beyond its scope, after the loop has finished.
Is there a method, such as parallel processing or some other approach, that would allow me to process multiple array items concurrently and still have them add data to the associative arrays?
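One direction that seems plausible (a sketch, not a tested solution): since background jobs run in subshells and cannot modify the parent's arrays, have the parallel workers emit plain lines and do the array merging serially in the parent shell. Here `xargs -P` fans the files out across concurrent `jq` processes, each printing one `filename<TAB>fileId<TAB>value` line per file, and a `while read` over a process substitution builds the arrays in the current shell. The sample files and the batch/parallelism numbers are placeholders:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Stand-in sample data (in the real case, a=("1.json" ... "5309.json")).
dir=$(mktemp -d)
printf '{"fileId":"id1","value":"v1"}' > "$dir/1.json"
printf '{"fileId":"id2","value":"v2"}' > "$dir/2.json"
a=("$dir/1.json" "$dir/2.json")

declare -A idArr valueArr

# Each jq process handles a batch of up to 100 files and emits one TSV line
# per file; xargs -P 4 runs up to four such processes concurrently.
# The while loop runs in the current shell, so the arrays persist.
while IFS=$'\t' read -r file id value; do
  idArr[$file]="${idArr[$file]-}${idArr[$file]:+$'\n'}$id"
  valueArr[$id]="${valueArr[$id]-}${valueArr[$id]:+$'\n'}$value"
done < <(printf '%s\0' "${a[@]}" |
  xargs -0 -n 100 -P 4 jq -r \
    '[input_filename, (.fileId|tostring), (.value|tostring)] | @tsv')

echo "${idArr[$dir/1.json]}"   # id1
echo "${valueArr[id2]}"        # v2
```

This also halves the per-file cost by extracting both fields in a single `jq` invocation instead of two. One caveat: output lines from concurrent workers can interleave in any order, which is fine here because each line is self-describing, but the accumulation order within a multi-valued array entry is no longer deterministic.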