Google ADC with short-lived credentials

In this blog we will go through the steps of using GCP ADC (Application Default Credentials) to impersonate another service account and generate an access token.

1. Create a VM
gcloud compute instances create instance-1 --machine-type=f1-micro --zone=us-central1-a --scopes=https://www.googleapis.com/auth/cloud-platform

Validate that the Cloud API access scope is "Allow full access to all Cloud APIs".

Take note of the service account attached to the created VM; we will use it in the next step.

The default service account of a Compute Engine instance is {project number}-compute@developer.gserviceaccount.com
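
To double-check the attached service account without opening the console, you can describe the instance. This is just a convenience sketch, reusing the instance name and zone from the command above:

# Print the email of the service account attached to the VM
gcloud compute instances describe instance-1 --zone=us-central1-a \
  --format="value(serviceAccounts[].email)"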

2. Assign the role roles/iam.serviceAccountTokenCreator to this service account, so it is allowed to impersonate other service accounts.
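
One way to do this from the command line is a project-level binding. This is a sketch; {project id} and {project number} are placeholders for your own values, and the role could also be granted on individual target service accounts instead of project-wide:

# Allow the VM's default service account to mint tokens for other service accounts
gcloud projects add-iam-policy-binding {project id} \
  --member="serviceAccount:{project number}-compute@developer.gserviceaccount.com" \
  --role="roles/iam.serviceAccountTokenCreator"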

3. Create two service accounts for testing purposes

Create the service account gcs-only@goingdelit.iam.gserviceaccount.com

- a service account for storage operations only: gcs-only
- assign it the two roles Storage Admin and Storage Object Admin
- grant the service account of the VM access to it

Repeat the process to create another service account, bigquery-only@goingdelit.iam.gserviceaccount.com, but don't grant it any role; we will use it to test the access token later.
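
If you prefer the command line over the console, the same setup looks roughly like this. The project ID "goingdelit" is assumed from the service account emails above, so substitute your own:

# Storage-only service account with the two storage roles
gcloud iam service-accounts create gcs-only --display-name="gcs-only"
gcloud projects add-iam-policy-binding goingdelit \
  --member="serviceAccount:gcs-only@goingdelit.iam.gserviceaccount.com" \
  --role="roles/storage.admin"
gcloud projects add-iam-policy-binding goingdelit \
  --member="serviceAccount:gcs-only@goingdelit.iam.gserviceaccount.com" \
  --role="roles/storage.objectAdmin"

# Second service account, deliberately left without any role
gcloud iam service-accounts create bigquery-only --display-name="bigquery-only"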

4. Install "gcloud" in the VM and connect it to the GCP project

Follow this guide to install it.

https://cloud.google.com/sdk/install

Don't forget the last step: connect to GCP via "gcloud init".
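
For a generic Linux VM, the interactive installer from that guide looks roughly like this (a sketch; package-manager installs are also covered in the guide):

# Download and run the interactive installer, reload the shell, then initialise
curl https://sdk.cloud.google.com | bash
exec -l $SHELL
gcloud init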

5. Get the access token of the ADC

This step has to be run inside the VM.

curl "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/token" -H "Metadata-Flavor: Google"

The output is the access token of the service account running this VM. It is similar to the output of the gcloud command "gcloud auth application-default print-access-token".
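
For illustration (token truncated), the metadata server responds with a JSON body roughly like this:

{
  "access_token": "ya29.c.Kn_ZB1Iu...",
  "expires_in": 3599,
  "token_type": "Bearer"
}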

6. Enable "IAM Service Account Credentials API"

Open the URL below and enable the API.

https://console.developers.google.com/apis/api/iamcredentials.googleapis.com/overview
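
Alternatively, the API can be enabled from the command line, assuming gcloud is already pointed at the right project:

gcloud services enable iamcredentials.googleapis.com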

7. Generate the access token for the service account "gcs-only"

This step can be run anywhere. From our local machine, make an HTTP request to this endpoint to generate an access token for the service account "gcs-only".

POST https://iamcredentials.googleapis.com/v1/projects/-/serviceAccounts/{email of the service account}:generateAccessToken
curl --location --request POST 'https://iamcredentials.googleapis.com/v1/projects/-/serviceAccounts/gcs-only@goingdelit.iam.gserviceaccount.com:generateAccessToken' \
--header 'Authorization: Bearer ya29.c.Kn_ZB1Iucwnl1R8u5OJtjMioQE94-_TPv8RaEbzr8AAXG1TeBZT4WY_G9xCDwIFFkoLdYgcOz6AgRN1hpXDP7uQSmhEQocZH9-sJa52DnZXvT39PFdclhNvej1dDxTw9WgiXpI_e74xge1Ss4i2ZfBt2EPxsMjR9jWNVY5ptZdDD' \
--header 'Content-Type: application/json' \
--data-raw '{
  "scope": [
      "https://www.googleapis.com/auth/cloud-platform"
  ]
}'

Note that the Bearer token is the one obtained in step #5. This call generates the access token for the service account "gcs-only".
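
If the call succeeds, the response body is JSON shaped roughly like this (values are illustrative and truncated):

{
  "accessToken": "ya29.c.Kl6iB...",
  "expireTime": "2020-05-03T12:34:56Z"
}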

Finally we have the access token for the service account "gcs-only"; we will use it to perform a Storage API test in the next step.

8. Test the generated access token

Create a bucket named storage_for_gcs_only and upload some files to it.
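
With the gsutil tool that ships with the Cloud SDK, that could look like this (the file name is just an example):

# Create the bucket and upload a sample object
gsutil mb gs://storage_for_gcs_only
gsutil cp some-local-file.txt gs://storage_for_gcs_only/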

Make a GET request to this endpoint to list all objects in the bucket

GET https://storage.googleapis.com/storage/v1/b/{bucket name}/o
Use Postman or curl to test the endpoint; note that the token is the one generated in the previous step #7.
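
With curl, that looks like this (substitute the accessToken value returned in step #7):

# List the objects in the bucket using the impersonated service account's token
curl "https://storage.googleapis.com/storage/v1/b/storage_for_gcs_only/o" \
  --header "Authorization: Bearer {access token from step #7}"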

Repeat step #7 to generate an access token for the service account "bigquery-only" and test it the same way - it won't work, as the impersonated service account doesn't have the appropriate roles to perform operations on Storage objects (look back at step #3), and the request fails with a 403 error.

References

Create short-lived service account credentials

OAuth 2.0 Scopes for Google APIs

ADC - Application default credentials

Generate Access Token API

Access Instance Metadata Endpoint