I'm new to bash scripting, and I am trying to transform a file that:
- Is a CSV
- Has random line breaks that break the format (bad input typed by users into the CSV)
The outcome needs to be a processed file where the lines are separated correctly (each line holding a set number of elements/columns), with the random breaks removed; see the made-up example below.
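To make that concrete, here is a made-up example (not my real data). Input, with a stray break inside the third field of the second row:

name,colour,size,price
apple,red,med
ium,0.50
pear,green,small,0.40

Desired output, four fields per line:

name,colour,size,price
apple,red,medium,0.50
pear,green,small,0.40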
My first approach was to remove all the line breaks by using
variable=$(cat $1)
tr -d "\n" $variable > $variable
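(Rereading the tr man page, I suspect that second line is wrong on my side: tr reads standard input and writes standard output, so I think what I actually meant was something closer to

variable=$(tr -d '\n' < "$1")

but please correct me if that is also off.)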
and then I thought of reading the result character by character, counting commas, and adding a line break after every set number of them. While trying to do this, I found info on while IFS= read and tried the following:
while IFS=, read -r col1 col2 col3 col4 <<< $variable; do
echo "$col1" "$col2" "$col3" "$col4" '\n'
done
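As a sanity check, I believe a single read with a comma IFS does split the way I expect, e.g.

IFS=',' read -r a b c d <<< "one,two,three,four"
echo "$b"    # prints: two

so I don't think the comma splitting itself is the problem.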
I'm clearly missing something in that loop (I even believe the while loop should already take care of the line breaks; please correct me if that's wrong), and I'm not sure how to keep going. I'm not looking for a bunch of code that does the job for me; I'm hoping someone here can point me in the right direction (maybe I'm not understanding IFS, or I need to process something prior to that step... whatever it could be).
Thanks in advance.
Edit: I removed the line break I was adding in the echo, and now it prints the same col1 col2 col3 col4 values over and over and never ends, so I'm clearly not managing to link each element from the file to the variables before printing.
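My current guess, and it is only a guess, is that the here-string should feed the whole loop rather than the read itself, something like

while IFS=',' read -r col1 col2 col3 col4; do
    echo "$col1" "$col2" "$col3" "$col4"
done <<< "$variable"

although with the raw file contents in $variable that still wouldn't merge the broken lines back together, so I'd still appreciate a pointer on the overall approach.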