What is the proper way to use Google Cloud IAM and gsutil to upload data into Google Cloud Storage?
It sounds like a simple task, but as a developer or an operations engineer you will face this kind of task more and more often, mainly because of the new data-driven age.
We all know that BigQuery plays a big role in the big data space. For BigQuery to consume data effectively, you have to make sure the dataset has been safely uploaded into Cloud Storage.
Without further ado, let’s get started.
Least privilege: go ahead and create a service account dedicated to the upload action, and assign it a custom role that has only the upload permission.
gcloud iam roles create CloudStorageUpload --project [yourprojectname] \
  --title CloudStorageUpload \
  --description "The role has the permission to upload files into GCS" \
  --permissions storage.objects.create
You can also use the describe command to show the role details.
gcloud iam roles describe --project [yourprojectname] CloudStorageUpload
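Equivalently, the role definition can be kept in a file and passed to gcloud with the --file flag, which is handy when the role lives in version control. A minimal sketch (the filename role.yaml is my own choice):

```yaml
# role.yaml — custom role with only the object-create permission
title: CloudStorageUpload
description: The role has the permission to upload files into GCS
stage: GA
includedPermissions:
- storage.objects.create
```

Then create it with: gcloud iam roles create CloudStorageUpload --project [yourprojectname] --file role.yaml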
Create a Service Account:
gcloud iam service-accounts create cloud-storage-upload-account
Create a policy binding between the service account and the role:
gcloud projects add-iam-policy-binding [yourprojectname] \
  --member serviceAccount:cloud-storage-upload-account@[yourprojectname].iam.gserviceaccount.com \
  --role projects/[yourprojectname]/roles/CloudStorageUpload
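The --member flag follows the pattern serviceAccount:&lt;name&gt;@&lt;project&gt;.iam.gserviceaccount.com, which is easy to get subtly wrong when scripting. A small sketch that assembles it (the helper name is my own, not part of any Google SDK):

```python
def service_account_member(name: str, project: str) -> str:
    """Build the IAM --member string for a service account."""
    return f"serviceAccount:{name}@{project}.iam.gserviceaccount.com"

# Example with a hypothetical project id:
member = service_account_member("cloud-storage-upload-account", "my-project")
print(member)
# serviceAccount:cloud-storage-upload-account@my-project.iam.gserviceaccount.com
```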
Create a service account key (json file):
gcloud iam service-accounts keys create ~/key.json \
  --iam-account cloud-storage-upload-account@[yourprojectname].iam.gserviceaccount.com
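The generated key.json is a small JSON document containing, among other fields, the key type, project id, and the service account's email. A quick sanity check can be sketched in Python; the key material below is an obviously fake placeholder (a real key file is generated by gcloud and must never be committed to source control):

```python
import json

# Placeholder key material for illustration only.
sample_key = """{
  "type": "service_account",
  "project_id": "my-project",
  "private_key_id": "abc123",
  "client_email": "cloud-storage-upload-account@my-project.iam.gserviceaccount.com"
}"""

def check_key(raw: str) -> dict:
    """Parse a service-account key file and verify the fields tools rely on."""
    key = json.loads(raw)
    for field in ("type", "project_id", "client_email"):
        if field not in key:
            raise ValueError(f"missing field: {field}")
    if key["type"] != "service_account":
        raise ValueError("not a service-account key")
    return key

key = check_key(sample_key)
print(key["client_email"])
```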
By now, you have everything you need from the prep work. Next, create the bucket and start using the key to upload the file.
Create a bucket:
gsutil mb -c regional -l us-east1 gs://some-bucket
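Keep in mind that bucket names are globally unique and constrained (lowercase letters, digits, dashes, underscores and dots; 3 to 63 characters; starting and ending with a letter or digit). A rough pre-flight check, not the full official rule set:

```python
import re

# Rough approximation of the documented GCS bucket-naming rules.
_BUCKET_RE = re.compile(r"^[a-z0-9][a-z0-9._-]{1,61}[a-z0-9]$")

def looks_like_valid_bucket_name(name: str) -> bool:
    """Return True if the name passes a basic GCS naming check (not exhaustive)."""
    return bool(_BUCKET_RE.match(name))

print(looks_like_valid_bucket_name("some-bucket"))  # True
print(looks_like_valid_bucket_name("Some_Bucket"))  # False: uppercase is not allowed
```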
Authenticate with your JSON key file in the gcloud SDK:
gcloud auth activate-service-account --key-file ~/key.json
Start the upload…
gsutil cp *.txt gs://some-bucket
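If you prefer the client library over gsutil, the same upload can be sketched in Python with the google-cloud-storage package. The bucket and key-file names below are assumptions carried over from the commands above, and the actual upload call is commented out because it needs the real key and network access:

```python
import glob
import os

def dest_blob_name(local_path: str) -> str:
    """Object name in the bucket: just the file's base name, like `gsutil cp`."""
    return os.path.basename(local_path)

def upload_txt_files(bucket_name: str, key_path: str) -> None:
    """Upload every local *.txt file into the given bucket."""
    # Imported lazily so the helpers above work without the package installed.
    from google.cloud import storage  # pip install google-cloud-storage
    client = storage.Client.from_service_account_json(key_path)
    bucket = client.bucket(bucket_name)
    for path in glob.glob("*.txt"):
        bucket.blob(dest_blob_name(path)).upload_from_filename(path)

# Example call (requires the key and network access):
# upload_txt_files("some-bucket", "key.json")
```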