 

Automated BigTable backups

A BigTable table can be backed up through GCP for up to 30 days. (https://cloud.google.com/bigtable/docs/backups)

Is it possible to have a custom automatic backup policy?

i.e. trigger automatic backups every X days and keep up to 3 copies at a time.

Mirodinho asked Oct 24 '25 17:10

2 Answers

As mentioned in the comments, the linked guide provides a solution that uses the following GCP products:

  • Cloud Scheduler: trigger tasks with a cron-based schedule

  • Cloud Pub/Sub: pass the message request from Cloud Scheduler to Cloud Functions

  • Cloud Functions: initiate an operation for creating a Cloud Bigtable backup

  • Cloud Logging and Monitoring (optional).

The full guide can also be found on GitHub.

This is a good fit for your requirement, because the retention part (keeping only 3 copies at a time) has to be implemented with the client libraries: Bigtable has no built-in API that caps the number of backups.
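The pruning logic that the Cloud Function would need can be sketched without touching the Bigtable API at all. The helper below is hypothetical (not part of any Google library); it just decides, given `(backup_id, start_time)` pairs, which backups exceed the 3-copy limit and should be deleted:

```python
from datetime import datetime, timedelta, timezone

def backups_to_delete(backups, keep=3):
    """Given (backup_id, start_time) pairs, return the ids of every
    backup except the newest `keep` ones."""
    ordered = sorted(backups, key=lambda b: b[1], reverse=True)
    return [backup_id for backup_id, _ in ordered[keep:]]

# Example: five daily backups; only the newest three should survive.
now = datetime.now(timezone.utc)
backups = [(f"backup-{i}", now - timedelta(days=i)) for i in range(5)]
print(backups_to_delete(backups))  # → ['backup-3', 'backup-4']
```

In the real function you would feed this from a list-backups call on the cluster and issue a delete for each returned id; that part depends on the client library and is covered by the linked guide.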

For simpler use cases, however, such as just triggering a backup every X days, you can instead call backups.create directly from a Cloud Scheduler HTTP job, similar to what's done in this answer.
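That direct approach might look like the following gcloud sketch, which creates a Cloud Scheduler job that POSTs to the Bigtable Admin API's backups.create method every 3 days. The project, instance, cluster, table, schedule, and service account names are all placeholders:

```shell
# Sketch only: every name below is a placeholder, not a real resource.
gcloud scheduler jobs create http bigtable-backup-job \
  --schedule="0 3 */3 * *" \
  --http-method=POST \
  --uri="https://bigtableadmin.googleapis.com/v2/projects/my-project/instances/my-instance/clusters/my-cluster/backups?backupId=scheduled-backup" \
  --message-body='{"sourceTable": "projects/my-project/instances/my-instance/tables/my-table", "expireTime": "2025-12-31T00:00:00Z"}' \
  --oauth-service-account-email=backup-sa@my-project.iam.gserviceaccount.com
```

Note the limitation: Cloud Scheduler sends a static request body, so the backupId and expireTime are the same on every run, which is one reason the Cloud Functions approach above is needed for anything beyond the simplest case.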

Donnald Cucharo answered Oct 26 '25 08:10


Here is another thought on a solution:

Instead of wiring up three GCP products, if you are already running Kubernetes (e.g. on GKE) you can replace all of this with a single Kubernetes CronJob: put the Bigtable API calls in a container and run it on a schedule.
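A minimal CronJob manifest for that idea might look like this; the image, schedule, and service account are hypothetical, and the container is assumed to hold a script that creates the backup and prunes old copies:

```yaml
apiVersion: batch/v1
kind: CronJob
metadata:
  name: bigtable-backup
spec:
  schedule: "0 3 */3 * *"        # every 3 days at 03:00 UTC
  concurrencyPolicy: Forbid      # don't overlap backup runs
  jobTemplate:
    spec:
      template:
        spec:
          # Kubernetes service account bound to a GCP service account
          # (e.g. via Workload Identity) with Bigtable admin permissions.
          serviceAccountName: bigtable-backup
          containers:
            - name: backup
              image: gcr.io/my-project/bigtable-backup:latest  # hypothetical image
          restartPolicy: OnFailure
```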

In my opinion, it is a simpler solution if you are already using Kubernetes.

john_mwood answered Oct 26 '25 07:10