I tweaked the Python script a little, and used supervisord instead of tini (see the supervisord.conf sketch after the two scripts below).
backup.sh
#!/bin/sh
echo "$(date): backup process started"
echo "$(date): pg_dump started for ${POSTGRES_DB}"
export BACKUP_ROOT=/backups
FILE=$BACKUP_ROOT/$POSTGRES_DB-$(date +%FT%H-%M-%S).sql.gz
pg_dump | /bin/gzip > "$FILE"
echo "$(date): pg_dump completed"
python3 del.py
echo "$(date): deleted similar files / cleaned up old files"
del.py
import os
import filecmp
from datetime import datetime, timedelta

os.chdir(os.environ["BACKUP_ROOT"])

# Filenames embed ISO timestamps, so a reverse lexicographic sort is newest-first
latest, *recent = sorted(os.listdir())[::-1]
latest_ctime = datetime.fromtimestamp(os.stat(latest).st_ctime)
previous_file_ctime = latest_ctime

for f in recent:
    ctime = datetime.fromtimestamp(os.stat(f).st_ctime)
    if latest_ctime - ctime < timedelta(days=7):
        # shallow=False compares file contents, not just os.stat() metadata
        if filecmp.cmp(latest, f, shallow=False):
            os.unlink(f)  # Remove duplicates of the latest backup
    elif previous_file_ctime - ctime < timedelta(days=7):
        os.unlink(f)  # Remove backups less than 1 week apart
    elif latest_ctime - ctime > timedelta(days=180):
        os.unlink(f)  # Remove backups older than 180 days
    previous_file_ctime = ctime
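And since I mentioned supervisord: the config is basically just running crond in the foreground under it. A minimal sketch (the log settings and crond flags here are assumptions, not necessarily what you'd want in production):
supervisord.conf
[supervisord]
nodaemon=true
logfile=/dev/null
logfile_maxbytes=0

[program:crond]
command=crond -f -l 8
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
stderr_logfile=/dev/stderr
stderr_logfile_maxbytes=0
nodaemon=true keeps supervisord in the foreground so it can act as PID 1, and BusyBox crond's -f flag does the same for the cron process.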
I am also considering backing up to S3-compatible storage on DigitalOcean, but not yet; I'm waiting until we've launched to production...
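If I do go that route, the upload step should be small. A sketch with boto3 (the Space name, region, endpoint, and env var names are placeholders; DO Spaces speaks the S3 protocol, so the standard S3 client works once pointed at the Spaces endpoint):
import os
import boto3

# Placeholder region/endpoint/bucket -- substitute your own Space
session = boto3.session.Session()
client = session.client(
    "s3",
    region_name="nyc3",
    endpoint_url="https://nyc3.digitaloceanspaces.com",
    aws_access_key_id=os.environ["SPACES_KEY"],
    aws_secret_access_key=os.environ["SPACES_SECRET"],
)

# Push the newest dump from BACKUP_ROOT
backup_root = os.environ["BACKUP_ROOT"]
latest = sorted(os.listdir(backup_root))[-1]
client.upload_file(os.path.join(backup_root, latest), "my-backup-space", latest)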
That's an interesting approach. I also wanted to add logic to keep only one backup per week, but it wasn't important at the time, so I skipped it. My intention in using Alpine was to reduce the image size. I was even considering setting up the cron job outside the image, on the host.
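For that host-side variant, the crontab entry would be something along these lines (container name, script path, and log file are illustrative):
0 3 * * * docker exec postgres /backup.sh >> /var/log/pg-backup.log 2>&1
The advantage is that the image stays a plain postgres image; the trade-off is that the schedule now lives on the host instead of in the image.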
I have created this image to take backups to GCS, and I believe it will work with any object storage system that supports the S3 protocol, since we perform only one operation: uploading the zip.
I have created one more image that zips the folder and uploads it to DO Spaces using Go and MinIO. Feel free to check it out: github.com/thakkaryash94/docker-sp...