Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

Job compute fails due to BQ permissions

ruia-dojo
New Contributor

Hello,

My Databricks workspace is associated with the GCP project analytics.

However, my team and I mostly work in the GCP project data-science, which contains the only BQ dataset we have write access to.

I'm trying to automate a pipeline to run on job compute, and it fails when reading a table from project data-science. Reading implies writing to a temporary table, which we have configured to live in the project and dataset we have rights to (materialisation_project=data-science, dataset=dst). When I run the notebook myself, or on our development cluster, it works as intended. When the notebook runs in a pipeline on job compute, it fails with the error:
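For context, here is a rough sketch of the kind of read that fails (the table name is illustrative, and I'm assuming the Spark BigQuery connector's materialization options are the ones we set):

```python
# Sketch of the failing read (table name is a placeholder).
# The Spark BigQuery connector materializes query results into a
# temporary table before reading them back, so the identity running
# the job needs bigquery.jobs.create in whichever project the query
# job is created under.
df = (
    spark.read.format("bigquery")
    .option("materializationProject", "data-science")
    .option("materializationDataset", "dst")
    .option("query", "SELECT * FROM `data-science.dst.some_table`")
    .load()
)
```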

Access Denied: Project analytics: User does not have bigquery.jobs.create permission in project analytics

What could be the issue here? Is it the job compute service account, and how do I edit it? It's trying to write to GCP project analytics, which it shouldn't, given the materialisation parameters passed as arguments.

Thanks,

Rui

1 REPLY

MoJaMa
Databricks Employee

What identity is the job running as?

Do you have any settings on the all-purpose cluster that you are not setting on the job-cluster?

Maybe you need to grant roles/bigquery.jobUser on project analytics to the job compute's service account?
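If that turns out to be the fix, a sketch of the grant would look like this (the service-account email is a placeholder; use the identity the job actually runs as):

```shell
# Grant the job compute service account permission to create
# BigQuery jobs in project "analytics" (email below is hypothetical).
gcloud projects add-iam-policy-binding analytics \
  --member="serviceAccount:job-compute-sa@analytics.iam.gserviceaccount.com" \
  --role="roles/bigquery.jobUser"
```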