09-27-2022 01:01 PM
Hello - I was wondering whether anyone has had any experience fetching data through GAM's Reporting API via Databricks.
The Reporting API requires installing the "googleads" library, as well as a googleads.yaml file. From the documentation I found online, you can either load the yaml file from storage by specifying its location, or load it from a yaml string.
ad_manager:
  application_name: INSERT_APPLICATION_NAME_HERE
  network_code: INSERT_NETWORK_CODE_HERE
  path_to_private_key_file: INSERT_PATH_TO_FILE_HERE
The yaml file (or string) also asks for a path to the JSON private key file, but I'm not sure how to specify that location relative to Databricks. The private key file also contains credentials that I want to limit user access to, so a DBFS upload may not work here, since DBFS is shared across the organization.
I was wondering if anyone has insight into connecting to Google Ad Manager Reporting from Databricks via a service account/yaml, and how I can successfully authenticate the ad manager client.
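For reference, the authentication flow I'm trying to reproduce follows the standard googleads pattern, roughly like this (all paths and values below are placeholders):

# Minimal sketch of the two googleads loading options mentioned above;
# every path and value here is a placeholder, not a working location.
from googleads import ad_manager

# Option 1: load the config from a yaml file on disk.
client = ad_manager.AdManagerClient.LoadFromStorage('/path/to/googleads.yaml')

# Option 2: load the config from a yaml string instead of a file.
yaml_string = """
ad_manager:
  application_name: gam-reporting
  network_code: 12345678
  path_to_private_key_file: /path/to/key.json
"""
client = ad_manager.AdManagerClient.LoadFromString(yaml_string)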
09-28-2022 09:39 AM
Hi, when you mention GAM's Reporting API via Databricks, could you please confirm how Databricks is involved (a rough idea of the workflow would be helpful)?
10-03-2022 07:49 AM
Hi @Kaniz Fatma - I haven't gotten an answer to the above issue, but I provided some info below. Would you be able to look into it for me?
09-29-2022 07:26 AM
Hi @Debayan Mukherjee - I am using Python code, and the overall workflow is: install the googleads library, authenticate an AdManagerClient using a googleads.yaml configuration like the one below, and then fetch reports through the client.
ad_manager:
  application_name: INSERT_APPLICATION_NAME_HERE
  network_code: INSERT_NETWORK_CODE_HERE
  path_to_private_key_file: INSERT_PATH_TO_FILE_HERE
The trouble I am having here is twofold: first, I'm not sure how to specify the path to the JSON private key file so that it resolves correctly from Databricks; second, the key file contains credentials I need to restrict access to, so a plain DBFS upload shared across the organization won't work.
I'm looking for advice on how to work around this and use Databricks to make API calls and fetch reports using the ad manager client.
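For reference, the report fetch itself would follow the standard googleads pattern, roughly like this (the API version and report query are just illustrative):

# Rough sketch of fetching a report once the client is authenticated;
# the API version and report query here are illustrative only.
import tempfile
from googleads import ad_manager

client = ad_manager.AdManagerClient.LoadFromStorage('/path/to/googleads.yaml')
report_downloader = client.GetDataDownloader(version='v202208')

report_job = {
    'reportQuery': {
        'dimensions': ['DATE', 'AD_UNIT_NAME'],
        'columns': ['AD_SERVER_IMPRESSIONS'],
        'dateRangeType': 'LAST_WEEK',
    }
}

# Run the report, wait for it to complete, then download it as gzipped CSV.
report_job_id = report_downloader.WaitForReport(report_job)
with tempfile.NamedTemporaryFile(suffix='.csv.gz', delete=False) as report_file:
    report_downloader.DownloadReportToFile(report_job_id, 'CSV_DUMP', report_file)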
09-30-2022 09:24 AM
Hi @Debayan Mukherjee - an update on the YAML string: I tested this out locally, and you can pass in a YAML string and use LoadFromString() to authenticate. However, I would still like guidance on how to pass the secret key file JSON object through Databricks.
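What I tested locally looked roughly like this (the network code and key path are placeholders, not real values):

from googleads import ad_manager

# Roughly what I tested locally; network code and key path are placeholders.
yaml_string = """
ad_manager:
  application_name: gam-reporting-test
  network_code: 12345678
  path_to_private_key_file: /local/path/key.json
"""
client = ad_manager.AdManagerClient.LoadFromString(yaml_string)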
10-05-2022 11:33 PM
Hi @Ellaine Ho, you can pass the secret JSON file through the Secrets API in Databricks: https://docs.databricks.com/dev-tools/api/latest/secrets.html
Also, you can use dbutils.secrets in a notebook or job to read a secret. Reference: https://docs.databricks.com/security/secrets/secrets.html#read-a-secret
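A minimal sketch of reading it in a notebook, assuming a scope named "gam" and a key named "sa-key" (both names are hypothetical):

# Minimal sketch: read the service account JSON from a secret in a notebook.
# The scope "gam" and key "sa-key" are hypothetical names; create them first
# via the Secrets API or CLI (e.g. databricks secrets put --scope gam --key sa-key).
key_json = dbutils.secrets.get(scope="gam", key="sa-key")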
10-06-2022 06:26 AM
Thanks @Debayan Mukherjee - will give that a try and follow up if I have more questions.
10-07-2022 11:13 AM
Hi @Debayan Mukherjee - I reviewed the docs, and I'm not sure those resources fit my needs. It seems the Secrets API and secret scopes let you load the JSON object as key-value pairs, but access is based on using dbutils.secrets with a scope and key.
For the purposes of this API, I need the result to be a PATH that I can reference in the YAML string, for example "path_to_file.json", and pass into the authentication process. Any insights on the above? I feel like the FileStore addresses my concern about the format, but is there another way to upload JSON files without making them publicly accessible?
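One workaround I'm considering, in case it helps frame the question: reading the secret and writing it back out to a driver-local file at runtime, so the YAML has a real path to point at (scope/key names below are hypothetical):

# Hypothetical workaround: materialize the secret as a driver-local file
# so path_to_private_key_file has something to point at.
# The scope/key names are made up.
key_json = dbutils.secrets.get(scope="gam", key="sa-key")

key_path = "/tmp/gam_key.json"  # driver-local, not shared DBFS
with open(key_path, "w") as f:
    f.write(key_json)

yaml_string = f"""
ad_manager:
  application_name: gam-reporting
  network_code: 12345678
  path_to_private_key_file: {key_path}
"""

Is that a reasonable approach, or is there something more idiomatic?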
10-14-2022 07:19 AM
Hi @Debayan Mukherjee @Kaniz Fatma - any insight on the above? Thanks for both of your help!
10-17-2022 06:08 AM
Hi @Ellaine Ho, sorry for the delay! The JSON key file can be uploaded to your DBFS, and that path can then be referenced as path_to_private_key_file. (https://docs.databricks.com/administration-guide/workspace/dbfs-ui-upload.html)
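For example, if the key was uploaded through the file upload UI, the yaml could reference it via the local /dbfs mount, roughly like this (the exact path depends on where you upload the file):

# Example only: files uploaded via the UI typically land under /FileStore;
# the exact path depends on where you upload the file. The googleads library
# reads it through the local /dbfs FUSE mount as an ordinary file.
yaml_string = """
ad_manager:
  application_name: gam-reporting
  network_code: 12345678
  path_to_private_key_file: /dbfs/FileStore/tables/key.json
"""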
Please let us know if you need more clarification.
10-17-2022 07:24 AM
Hi @Debayan Mukherjee - is the file publicly accessible if you upload it through FileStore? Based on my requirements, the private key file must not be publicly accessible.
10-18-2022 05:36 AM
Hi @Ellaine Ho,
In cases like yours, the key file can be encoded to Base64. There are multiple ways to work with Base64-encoded keys in Databricks.
If you are uploading it to DBFS, you can encode it with an online Base64 encoder (https://www.base64encode.org/), upload the result to DBFS, and reference that path.
The best practice is to use secrets. You can use the above tool to Base64-encode the contents of your JSON key file, create a secret in a Databricks-backed scope, and then copy and paste the Base64-encoded text into the secret value. After that, you can reference your secret in your cluster's Spark config:
spark.conf.set("credentials", base64_string)
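A rough sketch of reading that Base64 value back from a secret and turning it into a key file the googleads library can read (the scope/key names are hypothetical):

import base64

# Rough sketch: decode the Base64-encoded secret back into a key file.
# The scope/key names here are hypothetical.
encoded = dbutils.secrets.get(scope="gam", key="sa-key-b64")

with open("/tmp/gam_key.json", "w") as f:
    f.write(base64.b64decode(encoded).decode("utf-8"))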
If it is for a configuration on the cluster:
You can encode the private key file with an online Base64 encoder (https://www.base64encode.org/) and add the resulting value to the Spark config. (For Base64 encoding, you have to encode the entire file; an example of such a file is below.)
{
  "type": "service_account",
  "project_id": "anything",
  "private_key_id": "***",
  "private_key": "-----BEGIN PRIVATE KEY-----\nngkzh\nVWx+z/ISmhN5x9xLINbU5IA+anbH0/lbw5s=\n-----END PRIVATE KEY-----\n",
  "client_email": "***@xyz.com",
  "client_id": "19208260",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/cloud-sql-proxy%40fe-dev-sandbox.iam.gserviceaccount.com"
}
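Then, in a notebook, the value can be read back from the Spark config and decoded, roughly like this (assuming the config key "credentials" as above):

import base64
import json

# Rough sketch: read the Base64 value set in the cluster's Spark config
# and decode it back into the service account JSON.
encoded = spark.conf.get("credentials")
service_account = json.loads(base64.b64decode(encoded))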
Please let us know if you need further clarification. We are more than happy to assist you.
11-20-2022 07:46 PM
Hi @Ellaine Ho
Hope all is well!
Just wanted to check in to see whether you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!