Why are allowed scopes limited for credentials_gce? #161

@samterfa

Description

Thank you so much for your work on this package! I had struggled to pass credentials to my publicly available Plumber API running on Google Cloud Run without a bunch of gcloud commands, YAML files, or just baking the credentials into the application (not a good idea, I know). credentials_gce() lets me grab secrets from Google Secret Manager super easily with a quick API call during the build.
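For context, the build-time call I make is roughly the following (a minimal sketch; the project and secret names are placeholders, and the direct httr call to the Secret Manager REST API is my own approach, not part of gargle):

```r
library(gargle)
library(httr)

# Fetch a token from the Cloud Run metadata server; this works because
# "cloud-platform" is among the instance's granted scopes.
token <- credentials_gce(
  scopes = "https://www.googleapis.com/auth/cloud-platform"
)

# Hypothetical project and secret names, for illustration only.
url <- paste0(
  "https://secretmanager.googleapis.com/v1/",
  "projects/my-project/secrets/my-secret/versions/latest:access"
)

resp <- GET(url, add_headers(
  Authorization = paste("Bearer", token$credentials$access_token)
))

# The secret payload comes back base64-encoded.
secret <- rawToChar(jsonlite::base64_dec(content(resp)$payload$data))
```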

I'm wondering if there's a compelling reason that credentials_gce() limits the scopes for which a token can be fetched to Google Cloud and BigQuery. The Secret Manager API call I use works only because the scope it needs, "https://www.googleapis.com/auth/cloud-platform", happens to be on the blessed list of scopes.

  # We add a special case for the cloud-platform -> bigquery scope implication.
  if ("https://www.googleapis.com/auth/cloud-platform" %in% instance_scopes) {
    instance_scopes <- c(
      "https://www.googleapis.com/auth/bigquery",
      instance_scopes
    )
  }
  if (!all(scopes %in% instance_scopes)) {
    return(NULL)
  }

By adding "https://www.googleapis.com/auth/pubsub" to the instance_scopes allowed in credentials_gce(), and granting my default compute service account in Cloud Run the Pub/Sub Publisher role, I was able to make Pub/Sub API calls without passing any credentials to the application. As I understand it, something like this is Google's preferred method of supplying Google Cloud API credentials to an application, based on this page:

> "If your application runs inside a Google Cloud environment that has a default service account, your application can retrieve the service account credentials to call Google Cloud APIs. Such environments include Compute Engine, Google Kubernetes Engine, App Engine, Cloud Run, and Cloud Functions. We recommend using this strategy because it is more convenient and secure than manually passing credentials."
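Concretely, with the pubsub scope allowed, a publish call can be as simple as the sketch below (the project and topic names are placeholders, and I'm hitting the Pub/Sub REST API directly with httr rather than using any wrapper package):

```r
library(gargle)
library(httr)

# This only works if credentials_gce() permits the pubsub scope
# (or accepts cloud-platform as implying it).
token <- credentials_gce(
  scopes = "https://www.googleapis.com/auth/pubsub"
)

# Pub/Sub expects message data as a base64-encoded string.
body <- list(messages = list(list(
  data = jsonlite::base64_enc(charToRaw("hello from Cloud Run"))
)))

POST(
  "https://pubsub.googleapis.com/v1/projects/my-project/topics/my-topic:publish",
  add_headers(Authorization = paste("Bearer", token$credentials$access_token)),
  body = jsonlite::toJSON(body, auto_unbox = TRUE),
  content_type_json()
)
```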

I do understand that I could generate a JSON credential file, upload it to Secret Manager, and then load it via the same API calls I'm already making, but that's less convenient. Maybe it's better practice, though.
