Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Google Ad Manager Reporting API Authentication via Databricks

ellaine
New Contributor III

Hello - I was wondering whether anyone has had any experience fetching data through GAM's Reporting API via Databricks.

The Reporting API requires installation of the "googleads" library, as well as a googleads.yaml file. I was able to find some documentation online showing that you can either load the YAML file from storage by specifying its location, or load it from a YAML string.

ad_manager:
  application_name: INSERT_APPLICATION_NAME_HERE
  network_code: INSERT_NETWORK_CODE_HERE
  path_to_private_key_file: INSERT_PATH_TO_FILE_HERE

The YAML file (or string) also requires a path to the specific JSON private key file, but I'm not sure how to specify that location relative to Databricks. The private key file also contains credentials that I want to limit user access to, so a DBFS upload may not work in this case, since DBFS is shared across the organization.

I was wondering if anyone has insight into connecting to Google Ad Manager Reporting via a service account/YAML from Databricks, and how I can go about successfully authenticating the ad manager client.

14 REPLIES

Debayan
Esteemed Contributor III

Hi, when you mention GAM's Reporting API via Databricks, could you please confirm how Databricks is involved (a rough idea of the workflow would be helpful)?

Kaniz_Fatma
Community Manager

Hi @Ellaine Ho, we haven't heard from you since the last response from @Debayan Mukherjee, and I was checking back to see if his suggestions helped you.

If you have found a solution, please share it with the community, as it can be helpful to others.

Also, please don't forget to click the "Select As Best" button whenever the information provided helps resolve your question.

ellaine
New Contributor III

Hi @Kaniz Fatma - I haven't gotten an answer to the above issue, but I have provided some more info below. Would you be able to look into it for me?

Kaniz_Fatma
Community Manager

Hi @Ellaine Ho, yes, I will look into it and get back to you. Thanks

ellaine
New Contributor III

Hi @Debayan Mukherjee - I am using Python code, and the overall workflow should be the following:

  • GAM uses OAuth2 authentication for all of its API requests.
    • Credentials are stored in a YAML file with the following params:
ad_manager:
  application_name: INSERT_APPLICATION_NAME_HERE
  network_code: INSERT_NETWORK_CODE_HERE
  path_to_private_key_file: INSERT_PATH_TO_FILE_HERE
  • Additionally, it requires a path to the private key, which is stored in a JSON file.
    • When a request is made, the library reads the YAML file from a default location on the local machine (~/googleads.yaml) and also looks up the path to the key file to authenticate the request.
    • This process is referenced here: https://developers.google.com/ad-manager/api/start
  • Once authenticated, use the client object to fetch saved report queries from GAM (see the sketch after this list) and use Databricks to transform/store this data: https://developers.google.com/ad-manager/api/reporting
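For context, here is a minimal sketch of that report-fetching step, loosely based on the googleads library's saved-query examples (the API version string and saved_query_id are placeholder assumptions, not values from this thread):

from googleads import ad_manager

# Assumes `client` is an already-authenticated AdManagerClient (see the
# LoadFromString() discussion later in this thread).
saved_query_id = 123456789  # placeholder: the ID of a saved query in GAM

report_service = client.GetService('ReportService', version='v202305')
statement = (ad_manager.StatementBuilder(version='v202305')
             .Where('id = :id')
             .WithBindVariable('id', saved_query_id)
             .Limit(1))
response = report_service.getSavedQueriesByStatement(statement.ToStatement())
saved_query = response['results'][0]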

The trouble I am having here is twofold:

  • For the YAML file, I didn't really see any documentation online on how to host this file on Databricks other than DBFS, which is a shared drive. Since the YAML file contains some credential information, I am not sure where I should be storing it.
    • Some further research pointed me towards parsing the YAML as a string in the Python code and using a LoadFromString method to authenticate the client, which could work in this case, but I have not fully tested it out.
  • Even if I am able to parse the string, I'm still not sure how I can get it to load the JSON private key file, because I need to specify a path for it, as shown above.

I'm looking to get advice on how I can work around this and use Databricks to make API calls and fetch reports using the ad manager client.

ellaine
New Contributor III

Hi @Debayan Mukherjee - an update on the YAML string: I tested this out locally, and you can pass in a YAML string and use LoadFromString() to authenticate. However, I would like support on how to pass the secret key file JSON object through Databricks.
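For reference, a minimal sketch of the LoadFromString() approach (the placeholder values stand in for real GAM settings, and the key-file path question is still open at this point):

from googleads import ad_manager

# Placeholders; substitute your real GAM settings.
yaml_string = """
ad_manager:
  application_name: INSERT_APPLICATION_NAME_HERE
  network_code: INSERT_NETWORK_CODE_HERE
  path_to_private_key_file: INSERT_PATH_TO_FILE_HERE
"""

client = ad_manager.AdManagerClient.LoadFromString(yaml_string)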

Debayan
Esteemed Contributor III
Esteemed Contributor III

Hi @Ellaine Ho, you can pass the secret JSON file through the Secrets API in Databricks: https://docs.databricks.com/dev-tools/api/latest/secrets.html

Also, you can use dbutils.secrets in a notebook or job to read a secret. Reference: https://docs.databricks.com/security/secrets/secrets.html#read-a-secret
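A minimal sketch of reading a secret in a notebook (the scope and key names here are hypothetical; create them first via the Secrets API or CLI):

# Hypothetical scope/key; returns the secret value as a string.
key_json = dbutils.secrets.get(scope="gam", key="private-key-json")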

ellaine
New Contributor III

Thanks @Debayan Mukherjee - will give that a try and follow up if I have more questions.

ellaine
New Contributor III

Hi @Debayan Mukherjee - I reviewed the docs, and I'm not sure those resources fit my needs. It seems like the Secrets API and secrets manager allow you to load the JSON object as key-value pairs, but access is based on using dbutils.secrets to reference the scope and key.

For the purposes of this API, I need the result to be a PATH that I can reference in the YAML string, for example "path_to_file.json", and pass into the authentication process. Any insights on the above? I feel like the FileStore can address my concern about the format, but is there another way to upload JSON files without public access?
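One possible pattern, offered as an untested assumption rather than a confirmed solution: read the key from a secret and write it to a driver-local path, then reference that path in the YAML string (scope/key names are hypothetical):

# Hypothetical secret scope/key holding the raw JSON key-file contents.
key_json = dbutils.secrets.get(scope="gam", key="private-key-json")

# Write to a driver-local path (not the shared DBFS root) so the YAML
# string's path_to_private_key_file can point at it.
key_path = "/tmp/gam_private_key.json"
with open(key_path, "w") as f:
    f.write(key_json)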

ellaine
New Contributor III

Hi @Debayan Mukherjee @Kaniz Fatma - any insight on the above? Thanks for both of your help!

Debayan
Esteemed Contributor III

Hi @Ellaine Ho, sorry for the delay! The JSON key file can be uploaded to your DBFS, and that path_to_file can be referenced. (https://docs.databricks.com/administration-guide/workspace/dbfs-ui-upload.html)

Please let us know if you need more clarification on the same.

ellaine
New Contributor III

Hi @Debayan Mukherjee - is it publicly accessible if you upload through FileStore? The private key file should not be publicly accessible based on my needs.

Debayan
Esteemed Contributor III

Hi @Ellaine Ho,

In cases like yours, the key file can be encoded to Base64. There are multiple ways to work with Base64-encoded keys in Databricks.

If you are uploading it to DBFS, you can encode it with an online Base64 encoder (https://www.base64encode.org/), upload it to DBFS, and reference the path.

The best practice can be to use secrets. You can use the above tool to Base64-encode the contents of your JSON key file, create a secret in a Databricks-backed scope, and then copy and paste the Base64-encoded text into your secret value. After that, you can reference your secret with the following Spark config on your cluster:

spark.conf.set("credentials", base64_string)

If it is for a cluster configuration:

You can encode the private key file with an online Base64 encoder (https://www.base64encode.org/) and set the config above in the Spark config. (For Base64 encoding, you have to encode the entire file; example below.)

{
  "type": "service_account",
  "project_id": "anything",
  "private_key_id": "***",
  "private_key": "-----BEGIN PRIVATE KEY-----\nngkzh\nVWx+z/ISmhN5x9xLINbU5IA+anbH0/lbw5s=\n-----END PRIVATE KEY-----\n",
  "client_email": "***@xyz.com",
  "client_id": "19208260",
  "auth_uri": "https://accounts.googleapis.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/cloud-sql-proxy%40fe-dev-sandbox.iam.com"
}
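Completing that Base64 workflow, here is a short, hedged sketch of reading the encoded secret back in a notebook (the secret scope/key names are hypothetical):

import base64

# Read back the Base64 text stored in the secret and decode it to
# recover the original key-file contents.
encoded = dbutils.secrets.get(scope="gam", key="private-key-b64")
key_file_contents = base64.b64decode(encoded).decode("utf-8")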

Please let us know if you need further clarification on the same. We are more than happy to assist you.

Anonymous
Not applicable

Hi @Ellaine Ho

Hope all is well!

Just wanted to check in to see if you were able to resolve your issue. Would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!
