I have a list of remote files that I can access with curl or Ansible. The list looks like this:
sqldbd_12_30_2020_14_29_30.tar.gz
sqldbd_12_30_2020_14_52_32.tar.gz
sqldbd_12_31_2020_14_57_49.tar.gz
sqldbp_12_31_2020_16_00_40.tar.gz
sqldbd_01_01_2021_16_29_57.tar.gz
sqldbd_01_02_2021_18_00_35.tar.gz
sqldbd_01_03_2021_20_00_33.tar.gz
sqldbd_01_03_2021_22_00_37.tar.gz
sqldbf_01_04_2021_00_00_37.tar.gz
sqldbc_01_06_2021_02_00_33.tar.gz
The date format is %m_%d_%Y_%H_%M_%S (a custom format). I would like to run curl DELETE only against files older than 7 days. So far I have managed to write:
#!/bin/bash
auth_token=$1
storage_url=$2
compare_date=$(date -d '-7 days' '+%m_%d_%Y_%H_%M_%S')

for line in $(cat backup_list.txt); do
    timestamp=$(echo "${line%.tar.gz}" | cut -d '_' -f 2-7)
    if [[ "$timestamp" < "$compare_date" ]]; then
        curl -k -X DELETE "${storage_url}/sql_backups/${line}" -H "X-Auth-Token: ${auth_token}"
    fi
done
Unfortunately, that doesn't work as expected and eventually all the files get deleted. I suspect the string comparison is the problem: the month-first %m_%d_%Y format doesn't sort chronologically, so for example 01_01_2021 compares as less than 12_30_2020. I guess I could force the backup script to create the date string in a different format, but I'd rather handle the existing file names.
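One way to avoid the format change entirely is to stop comparing strings and compare epoch seconds instead. The sketch below assumes GNU date (for -d) and reorders the custom fields into an ISO-like string that date can parse; the curl call is left as a comment and the file list is inlined here for demonstration, where the real script would read backup_list.txt:

```shell
#!/bin/bash
# Convert the custom %m_%d_%Y_%H_%M_%S stamp to epoch seconds.
# Assumes GNU date; reorders fields into "YYYY-MM-DD HH:MM:SS".
to_epoch() {
    local m d y H M S
    IFS=_ read -r m d y H M S <<< "$1"
    date -d "${y}-${m}-${d} ${H}:${M}:${S}" '+%s'
}

cutoff=$(date -d '-7 days' '+%s')

while IFS= read -r line; do
    stamp=${line#*_}        # strip the sqldb? prefix
    stamp=${stamp%.tar.gz}  # strip the extension
    if (( $(to_epoch "$stamp") < cutoff )); then
        echo "older than 7 days: $line"
        # curl -k -X DELETE "${storage_url}/sql_backups/${line}" -H "X-Auth-Token: ${auth_token}"
    fi
done <<'LIST'
sqldbd_12_30_2020_14_29_30.tar.gz
sqldbc_01_06_2021_02_00_33.tar.gz
LIST
```

Numeric comparison of epoch seconds is immune to the field ordering, and the prefix stripping with ${line#*_} works for all the sqldbd/sqldbp/sqldbf/sqldbc variants since they all end at the first underscore.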