

Managing a database on your own can be problematic. As someone who is not that into DevOps, setting up a database and managing it is a difficult task. I currently host two of my side projects, Cartella and Compito, on a barebones Ubuntu VPS, and there can be a lot of problems when you are new to this whole deployment scene.

I recently faced an issue with Cartella where all the data was lost somehow. I had hosted a Postgres database in a Docker container, which was used by the Node application running in another Docker container. One day I realized I was not able to log in to Cartella, so I checked the PM2 dashboard to see if the server was running fine or not. The API, which is a Node application, was running fine. So next I checked the DB, and to my surprise, there was no data. The application was just running as a demo, so I hadn't bothered to set up the DB and backups properly while I was experimenting with deployments and stuff. That day I realized how important it is to have backups. Even if you don't get all the data back, having at least some of it is much better, if you ask me.

With managed DB services, we don't have to worry about any of this. Everything is already taken care of for us: timely backups, restoration, guaranteed uptime, etc. The only downside is that it can be expensive. For instance, DigitalOcean provides a managed Postgres DB for $15/month, while you can self-host a Postgres DB in one of their droplets for $5/month.

Backing up data is still important, though. So once I got hold of how to deploy a DB and set up connections, it was finally time to learn how to back up the data.

## Uploading the backup file to Cloud Storage

I'm gonna be using Object Storage from Oracle Cloud (similar to AWS S3). Oracle Cloud Object Storage has S3 compatibility, so we can easily use any S3-compatible client to access it. For interacting with our bucket, we can use the MinIO client. On Ubuntu, install it from the Debian package with `dpkg -i mcli_20210902092127.0.0_amd64.deb`; check the official docs for other OSes: min.io/download

The MinIO client provides us with a CLI that can be used to do all kinds of operations on our object storage. Connect to the storage with `mcli alias set`, then verify the connection by running `mcli ls` to list the available buckets. This should list all the buckets in the storage. (Ref: MinIO Client Docs)
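For example, wiring up the Oracle bucket could look like the sketch below. The alias name, endpoint format, and credentials are placeholders, so substitute the values from your own tenancy:

```bash
# Point the "oracle" alias at the S3-compatible endpoint
# (endpoint and keys are placeholders: use your tenancy's values)
mcli alias set oracle https://NAMESPACE.compat.objectstorage.REGION.oraclecloud.com ACCESS_KEY SECRET_KEY

# Verify the connection: this should print the buckets in the storage
mcli ls oracle
```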
## Creating the backup script

We'll create a script that creates the backup and uploads it to the object storage. A very small bash script can do this; see the sketch below.
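The constants are the only part you should need to touch. The `USER`, `DATABASE`, and `HOST` values here are placeholders for my setup, so swap in your own:

```bash
#!/bin/bash

# Constants
BACKUP_DIRECTORY="/home/ubuntu/backups"
USER="postgres"          # placeholder: your Postgres user
DATABASE="compito"       # placeholder: your database name
HOST="localhost"         # placeholder: your database host

# Date stamp (formatted YYYYMMDD) for tagging the backup files
CURRENT_TIME=$(date +%Y%m%d)

# Create the backup using pg_dump and compress it with gzip
pg_dump -U $USER $DATABASE -h $HOST | gzip - > $BACKUP_DIRECTORY/$DATABASE\_$CURRENT_TIME.sql.gz

# Copy the compressed file to the cloud bucket using the MinIO CLI
mcli cp $BACKUP_DIRECTORY/$DATABASE\_$CURRENT_TIME.sql.gz oracle/compito-backup
```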

What the script does is self-explanatory:

1. Set up the constants like the user name, database name, etc.
2. Get the current time so that we can use it for tagging the backup files.
3. Create the backup using the `pg_dump` command and compress it with `gzip`.
4. Use the MinIO CLI (`mcli cp`) to copy the file to the cloud bucket.

You can save the code as a bash file -> `backup.sh`
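To try it out, make the file executable and run it once by hand. Note that `pg_dump` may prompt for a password (depending on your Postgres auth setup) unless you supply one via the `PGPASSWORD` environment variable or a `~/.pgpass` file:

```bash
# Make the script executable and run it once manually
chmod +x backup.sh
./backup.sh

# Confirm the dump landed in the bucket
mcli ls oracle/compito-backup
```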

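A backup is only useful if you can restore it, so it's worth testing the reverse direction too. A minimal sketch, assuming a dump named like the ones the script produces (the file name here is hypothetical):

```bash
# Copy the dump back from the bucket (file name is hypothetical)
mcli cp oracle/compito-backup/compito_20210902.sql.gz .

# Decompress and replay the SQL into the database
gunzip -c compito_20210902.sql.gz | psql -U postgres -h localhost compito
```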
