I am new to Google Cloud services and I am trying to set up an automated build of my production image that requires downloading a large file.
I would like to download a file from a dedicated Google Storage bucket inside the Docker build process. To do so, I have added the following line to my Dockerfile:
RUN curl https://storage.cloud.google.com/[bucketname]/[filename] -o [filename]
Since files in this bucket shouldn't be publicly accessible, I disabled object-level permissions and granted the member [ProjectID]@cloudbuild.gserviceaccount.com the Storage Object Viewer role.
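For reference, granting that role on the bucket from the command line looks roughly like this ([bucketname] and [ProjectID] are the same placeholders as above):

gsutil iam ch serviceAccount:[ProjectID]@cloudbuild.gserviceaccount.com:roles/storage.objectViewer gs://[bucketname]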
But when the Dockerfile step runs, the downloaded file is empty:
Step 7/9 : RUN curl https://storage.cloud.google.com/[bucketname]/[filename] -o [filename]
---> Running in 5d1a5a1bbe87
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
Removing intermediate container 5d1a5a1bbe87
---> 42938a9cc8d1
Step 8/9 : RUN ls -l [filename]
---> Running in 34ac112051a1
-rw-r--r-- 1 root root 0 Jun 15 00:37 [filename]
This link works perfectly well if I log in to the Google Cloud console and access it through my browser.
I tried changing the permission settings and ended up granting the Cloud Build account Storage Legacy Bucket Reader, Storage Legacy Object Reader, and Storage Object Viewer together, without much success.
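To double-check which bindings actually ended up on the bucket, the policy can be dumped with (same [bucketname] placeholder as above):

gsutil iam get gs://[bucketname]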
I am obviously doing something wrong, but it's not clear to me what.
Thanks for your help :)
After a lot of research and trial and error, I managed to find a good way to do it. Here is the recipe for those who may need to reproduce a similar setup, and for my future self.
1. Create a key for the service account in Cloud Shell (https://cloud.google.com/iam/docs/creating-managing-service-account-keys#iam-service-account-keys-create-gcloud):

gcloud iam service-accounts keys create ~/key.json --iam-account [SA-NAME]@[PROJECT-ID].iam.gserviceaccount.com

2. From your docker file, you will add:

RUN curl https://sdk.cloud.google.com | bash > /dev/null
ENV PATH="${PATH}:/root/google-cloud-sdk/bin"
RUN ./myscript.sh

3. In myscript.sh, activate the service account (https://cloud.google.com/sdk/gcloud/reference/auth/activate-service-account), get an access token (https://cloud.google.com/sdk/gcloud/reference/auth/print-access-token), and download the file through the Cloud Storage JSON API:

gcloud auth activate-service-account [ACCOUNT] --key-file=~/key.json
TOKEN=`gcloud auth print-access-token [ACCOUNT]`
curl -L -H "Authorization: Bearer $TOKEN" "https://www.googleapis.com/storage/v1/b/[bucketname]/o/[objectname]?alt=media" -o [filename]
I did not use gsutil cp gs://[bucketname]/[filename] ./ because Python 2 was not available in my Docker image and Python 3 wasn't supported by gsutil at the time. Congratulate yourself with a chocolate cake or a sugary treat. (:
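For completeness, here is a minimal sketch of how these pieces could fit together in the Dockerfile. The COPY lines and destination paths are my assumption (Cloud Build does not put key.json into the build context for you); it assumes key.json, downloaded from Cloud Shell, and myscript.sh sit next to the Dockerfile:

# excerpt: base image and unrelated steps omitted; it needs curl and a Python interpreter for the SDK installer
RUN curl https://sdk.cloud.google.com | bash > /dev/null
ENV PATH="${PATH}:/root/google-cloud-sdk/bin"
# copy the key created in Cloud Shell so that ~/key.json resolves inside the image (RUN executes as root)
COPY key.json /root/key.json
COPY myscript.sh ./myscript.sh
RUN chmod +x ./myscript.sh && ./myscript.sh

Keep in mind that copying key.json into the image bakes the credential into a layer, so keep that image private or drop the key in a multi-stage build.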
Bonus: if, like me, your Docker build times out, you will have to add a cloudbuild.yaml next to your Dockerfile. Here is a generic file I use for my builds:
steps:
- name: 'gcr.io/cloud-builders/docker'
  args: [ 'build', '-t', 'gcr.io/$PROJECT_ID/$REPO_NAME:$BUILD_ID', '.' ]
images:
- 'gcr.io/$PROJECT_ID/$REPO_NAME:$BUILD_ID'
timeout: 900s
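With that file in the repository root, the build trigger just needs to point at cloudbuild.yaml instead of the Dockerfile. For a manual run, something along these lines should work; since $REPO_NAME is only filled in automatically for triggered builds, it has to be supplied explicitly ([reponame] is a placeholder):

gcloud builds submit --config cloudbuild.yaml --substitutions=REPO_NAME=[reponame] .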