Automating back-up of your Google Drive on S3

Problem statement: I work with a lot of freelancers on various small creative and content-related tasks, and they usually share the completed work as a folder in Google Drive. Keep in mind that the freelancer owns this folder. After a week or so, the freelancer is no longer engaged with us and may decide to free up their Drive by deleting the folder.
I want to mitigate this problem. There are various ways to approach this:
1. Ask the freelancer to transfer ownership, then keep the folder safe.
2. Download and back up the shared folder whenever you receive a submission.
3. Automate the backup process to secondary storage.
Of course, ‘automation’ wins: not only does it remove the hassle for both parties, it also gives you a secondary backup for long-term retrieval and safekeeping.

Let’s get to it.

First, set up a VM on Google Cloud (you can use AWS or any other provider instead). I used GCP because the f1-micro instance (0.6 GB memory, 1 shared vCPU) is always free. I'm not using Google's Storage because they haven't added a GUI for it yet.

In the GCP Console, go to the VM Instances page and launch an instance.
Follow this quickstart guide for starting the VM. https://cloud.google.com/compute/docs/quickstart-linux
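As a sketch, both the VM creation and the backup job can be driven from the command line. The instance name, zone, bucket, and folder names below are placeholders, and rclone is my assumption for the Drive-to-S3 sync tool; the full post may use something different:

```shell
# Create an always-free f1-micro instance (always-free eligibility is
# limited to certain US regions, e.g. us-east1).
gcloud compute instances create drive-backup-vm \
    --machine-type=f1-micro \
    --zone=us-east1-b \
    --image-family=debian-11 \
    --image-project=debian-cloud

# SSH into the instance once it is up.
gcloud compute ssh drive-backup-vm --zone=us-east1-b

# On the VM: rclone can talk to both Google Drive and S3.
# `rclone config` walks you through creating the two remotes interactively.
sudo apt-get update && sudo apt-get install -y rclone
rclone config   # create a "gdrive" remote and an "s3" remote

# One-off sync of a shared folder into the backup bucket.
rclone sync "gdrive:Freelancer Submissions" s3:my-backup-bucket/freelancer-submissions

# Schedule the same sync daily at 02:00 via cron.
(crontab -l 2>/dev/null; echo '0 2 * * * rclone sync "gdrive:Freelancer Submissions" s3:my-backup-bucket/freelancer-submissions') | crontab -
```

The cron entry is what makes this "set and forget": the tiny VM just wakes up nightly and mirrors whatever the freelancers have shared.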

GitLab Runner config to auto-deploy your static site on S3

If you didn’t know, you can easily set up a static website and host it on S3. Just follow these simple instructions and you are done.
https://docs.aws.amazon.com/AmazonS3/latest/dev/website-hosting-custom-domain-walkthrough.html
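If you prefer the CLI over the console walkthrough, the hosting setup boils down to a few AWS CLI calls; this is a sketch, with the bucket name and content directory as placeholders:

```shell
# Create the bucket and enable static website hosting on it.
aws s3 mb s3://my-static-site-bucket
aws s3 website s3://my-static-site-bucket \
    --index-document index.html --error-document error.html

# Upload the site content.
aws s3 sync ./public s3://my-static-site-bucket
```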

Now you might want to version-control your hosted website and avoid uploading the content manually again and again.
With the help of GitLab Runner, you can remove the hassle of uploading your updated code.
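A minimal `.gitlab-ci.yml` for this kind of deploy might look like the following sketch. The bucket name, region, and output directory are placeholders, and the AWS credentials are assumed to be set as CI/CD variables (`AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`) in the project settings, never committed to the repository:

```yaml
# .gitlab-ci.yml — sketch of an S3 deploy job; names are placeholders.
image: python:3.11-slim

deploy:
  stage: deploy
  before_script:
    - pip install awscli
  script:
    # --delete removes objects from the bucket that no longer exist locally,
    # so the bucket always mirrors the committed site.
    - aws s3 sync ./public s3://my-static-site-bucket --delete
  variables:
    AWS_DEFAULT_REGION: us-east-1
  only:
    - master
```

With this in place, every push to `master` re-syncs the site, so the "upload again and again" step disappears entirely.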