
https://cloud.google.com/solutions/authentication-in-http-cloud-functions

The document suggests setting up a Google Cloud Storage bucket, and then granting the service account the "storage.buckets.get" permission on that bucket.

This permission is then used to authenticate access to the HTTP Google Cloud Functions.

We are talking about authenticating HTTP Cloud Functions, but we are borrowing a permission from Google Cloud Storage. This seems like a hack to me.

It would be great if we could just set up permissions directly on each Cloud Function through the Google Cloud Console.

Are you using the authentication solution suggested by Google in the document above, or do you have better approaches?

To set up "storage.buckets.get", does that mean I grant the service account the "Storage Object Viewer" role?


  • "If we can just set up permissions right at each Cloud Function through the Google Cloud Console, that will be great" - that sounds like a feature request. You can send feature requests directly to the Firebase team here. Commented Jan 31, 2018 at 11:50

2 Answers


The solution proposed in the link you shared is indeed one of the ways. In fact, you can use any other Google Cloud Platform product (not only Storage buckets) to check the chosen account's permissions against.

An alternative that can work is:

  1. Prepare a Cloud Function that has the authorized users' emails listed.
  2. The Cloud Function retrieves the 'Authorization' header of the incoming HTTP request, which contains the token generated for the account that made the request.
  3. The function calls the tokeninfo endpoint with the token from that header to retrieve the email of the account (from the JSON response body). The URL returning the email looks like this:
     url = "https://www.googleapis.com/oauth2/v1/tokeninfo?fields=email&access_token=" + token_from_the_request_header;
  4. The function verifies that the returned email is in the list of authorized ones ...
  5. ... and if yes, executes the function's logic.
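The steps above can be sketched roughly as follows in a Python HTTP Cloud Function. This is only an illustration of the flow, not Google's reference implementation; the allowlist emails and the `handler` name are placeholders.

```python
import json
import urllib.request

# Step 1: hardcoded allowlist (placeholder emails for illustration).
AUTHORIZED_EMAILS = {"alice@example.com", "bob@example.com"}

TOKENINFO_URL = (
    "https://www.googleapis.com/oauth2/v1/tokeninfo?fields=email&access_token="
)

def email_from_token(token):
    """Step 3: ask the tokeninfo endpoint which account the token belongs to."""
    with urllib.request.urlopen(TOKENINFO_URL + token) as resp:
        return json.load(resp).get("email")

def is_authorized(email):
    """Step 4: check the returned email against the allowlist."""
    return email in AUTHORIZED_EMAILS

def handler(request):
    # Step 2: read the 'Authorization: Bearer <token>' header of the request.
    auth = request.headers.get("Authorization", "")
    token = auth[len("Bearer "):] if auth.startswith("Bearer ") else ""
    if not token or not is_authorized(email_from_token(token)):
        return ("Forbidden", 403)
    # Step 5: the function's real logic would run here.
    return ("OK", 200)
```

Note that the tokeninfo call adds network latency on every request; caching the result per token would be a reasonable refinement.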

1 Comment

Say I have a Cloud Function (API) https://us-central1-gcf.cloudfunctions.net/myapi and want to use it from a few of my own applications. I don't think you meant that every physical user must sign into Google in order to use it, right? In a corporate non-cloud environment, the typical practice is to have a generic LDAP account for all apps that wish to call this function. For Google Cloud we could replace that with a generic Gmail account. After reading about the "tokeninfo endpoint" above, my question is: how do I specify that this generic Gmail account, say [email protected], can use my Cloud Function?

To use Cloud Functions you need to put your modules in buckets. By granting a service account the 'storage.buckets.get' permission on the bucket, you authorize that service account to trigger your HTTP Cloud Function; similarly, you revoke authorization by removing the 'storage.buckets.get' permission from a service account.

To set up the 'storage.buckets.get' permission you can either select "Storage Admin" from the standard roles, pick 'storage.legacyBucketReader' or 'storage.legacyBucketWriter' from the legacy roles, or define a custom role containing the 'storage.buckets.get' permission.
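For reference, granting such a role from the command line might look like the sketch below. The project, bucket, and service-account names are placeholders, and the exact role shorthand accepted by `gsutil iam ch` should be checked against the current docs.

```shell
# Option 1: bind a predefined legacy role (includes storage.buckets.get)
# directly on the bucket.
gsutil iam ch \
  serviceAccount:my-sa@my-project.iam.gserviceaccount.com:legacyBucketReader \
  gs://my-bucket

# Option 2: define a custom role containing only storage.buckets.get,
# then bind it at the project level.
gcloud iam roles create bucketGetter \
  --project=my-project \
  --permissions=storage.buckets.get
gcloud projects add-iam-policy-binding my-project \
  --member=serviceAccount:my-sa@my-project.iam.gserviceaccount.com \
  --role=projects/my-project/roles/bucketGetter
```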

1 Comment

This shouldn't be the case; I can set a trigger on a bucket, topic, or Pub/Sub subscription, as long as it is a valid value for the --trigger flag of the gcloud functions deploy command.
