Administration & Architecture

Forum Posts

SirCrayon
by New Contributor
  • 60 Views
  • 0 replies
  • 0 kudos

Do shared clusters have multiple drivers?

Hi, I know that with single-user clusters there's a single driver node, i.e. one driver per cluster. With shared clusters, multiple jobs can run concurrently. Does this still run on a single driver container, or do multiple driver containers run per application?...

Mathias
by New Contributor II
  • 1272 Views
  • 3 replies
  • 0 kudos

Different settings per target with Asset bundles

When generating the standard setup with databricks bundle init, we get a databricks.yml that references resources/*. The targets are set in databricks.yml, and the resources (pipelines and jobs) are defined in separate files. I have dlt pipelines th...

Latest Reply
137292
New Contributor II
  • 0 kudos

-f is an unknown shorthand flag for databricks bundle deploy. Is there any workaround for deploying different jobs with different targets?

2 More Replies
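For the target question in this thread: the bundle CLI selects a target with --target (shorthand -t), not -f, and per-target overrides live under the targets: block of databricks.yml. Below is a minimal sketch of deploying the same bundle to several targets from Python; it assumes the Databricks CLI is installed and authenticated, and the target names are hypothetical placeholders.

```python
# Minimal sketch: deploy one asset bundle to several targets in turn.
# Assumes the Databricks CLI is on PATH and already authenticated, and
# that databricks.yml defines targets named "dev" and "prod"
# (hypothetical names; substitute your own).
import subprocess

TARGETS = ["dev", "prod"]  # hypothetical target names from databricks.yml

for target in TARGETS:
    # --target (shorthand -t) selects which targets: block in databricks.yml
    # to deploy; per-target overrides in that block give each target its own
    # job/pipeline settings.
    subprocess.run(
        ["databricks", "bundle", "deploy", "--target", target],
        check=True,  # fail fast if a deploy errors
    )
```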
Mumrel
by New Contributor III
  • 117 Views
  • 1 reply
  • 0 kudos

Service Principal can be deleted but permissions not managed

On Azure I added a service principal X to my Databricks workspace. I therefore had the Service Principal Manager role on that service principal X. I accidentally downgraded my rights to Service Principal User and now can't get my Manager role back. ...

Mumrel_0-1708705677371.png
Latest Reply
jamessmith3
New Contributor II
  • 0 kudos

Do you have federated identity enabled on your workspace?

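If another admin still has rights on the service principal, the manager role can in principle be granted back through the account access-control rule-set endpoint. The sketch below is assumption-heavy: the endpoint path, the roles/servicePrincipal.manager role name, and every placeholder value are taken from the public docs as best recalled and should be verified before use.

```python
# Sketch: an admin restores the "Service Principal Manager" role for a
# locked-out user. All hosts, tokens, and IDs are hypothetical placeholders.
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
TOKEN = "<admin-pat>"           # token of a user who still has admin rights
ACCOUNT_ID = "<account-id>"
SP_APP_ID = "<application-id-of-service-principal-X>"
RULE_SET = f"accounts/{ACCOUNT_ID}/servicePrincipals/{SP_APP_ID}/ruleSets/default"
headers = {"Authorization": f"Bearer {TOKEN}"}

# Read the current rule set; the returned etag is needed for the update.
resp = requests.get(
    f"{HOST}/api/2.0/preview/accounts/access-control/rule-sets",
    headers=headers,
    params={"name": RULE_SET, "etag": ""},
)
rule_set = resp.json()

# Re-add the manager role for the locked-out user.
rule_set.setdefault("grant_rules", []).append(
    {"principals": ["users/me@example.com"],
     "role": "roles/servicePrincipal.manager"}
)
requests.put(
    f"{HOST}/api/2.0/preview/accounts/access-control/rule-sets",
    headers=headers,
    json={"name": RULE_SET, "rule_set": rule_set},
)
```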
ossinova
by Contributor
  • 67 Views
  • 2 replies
  • 0 kudos

Override default Personal Compute policy using terraform / disable Personal Compute policy

I want to programmatically make some adjustments to the default Personal Compute resource, or preferably create my own custom one based on the same configuration or policy family (to which all users can gain access) when deploying a new workspace usi...

Latest Reply
feiyun0112
New Contributor III
  • 0 kudos

Use the API to create the new cluster and set the autotermination_minutes parameter: https://docs.databricks.com/api/workspace/clusters/create#autotermination_minutes

1 More Reply
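A sketch of the reply's suggestion using the Python SDK rather than raw REST, assuming databricks-sdk is installed and authenticated via environment variables; the cluster name, Spark version, and node type are placeholders.

```python
# Sketch: create a cluster through the API with an explicit
# auto-termination window, per the reply above.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up DATABRICKS_HOST / DATABRICKS_TOKEN

cluster = w.clusters.create(
    cluster_name="personal-compute-equivalent",  # placeholder name
    spark_version="14.3.x-scala2.12",  # placeholder; pick a supported version
    node_type_id="Standard_DS3_v2",    # placeholder; cloud-specific
    num_workers=1,
    autotermination_minutes=30,        # the parameter the reply points to
).result()                             # block until the cluster is running
print(cluster.cluster_id)
```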
100804
by New Contributor II
  • 104 Views
  • 2 replies
  • 1 kudos

Instance Profile Access Controls

I manage instance profiles assigned to specific user groups. For example, instance profile A provides access solely to group A. Currently, any user within group A has the ability to update the permissions of a cluster using instance profile A, which ...

Latest Reply
100804
New Contributor II
  • 1 kudos

Hi @Kaniz, thank you for your guidance. I am following the strategies outlined in steps 1 and 2, and I remain concerned about a specific scenario. Consider instance profile A, which is designed to grant access exclusively to group A. If user A, a memb...

1 More Reply
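One hedged way to narrow this down (not confirmed as this thread's final resolution) is to pin the instance profile in a cluster policy and grant CAN_USE on that policy only to group A, so cluster creators cannot repoint the profile. A sketch with the Python SDK; the ARN and group name are hypothetical placeholders.

```python
# Sketch: fix the instance profile via a cluster policy, then restrict
# who may use the policy. Assumes databricks-sdk with ambient auth.
import json
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.iam import AccessControlRequest, PermissionLevel

w = WorkspaceClient()

policy = w.cluster_policies.create(
    name="group-a-instance-profile",
    definition=json.dumps({
        # "fixed" means users cannot change this attribute at create time.
        "aws_attributes.instance_profile_arn": {
            "type": "fixed",
            "value": "arn:aws:iam::123456789012:instance-profile/profile-a",
        }
    }),
)

# Only group A may use the policy (CAN_USE on the cluster-policies object).
w.permissions.set(
    request_object_type="cluster-policies",
    request_object_id=policy.policy_id,
    access_control_list=[
        AccessControlRequest(group_name="group-a",
                             permission_level=PermissionLevel.CAN_USE)
    ],
)
```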
ossinova
by Contributor
  • 64 Views
  • 2 replies
  • 0 kudos

Defaulting or overriding the cluster policy list order

I have numerous cluster policies that vary in size (Job - xsmall, Job - small, Job - medium...). However, when I create a new job and a new job cluster, the default policy selected from the drop-down menu is on the bigger side. Is th...

Job cluster.png
Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

I checked the sorting on our side, and there does not seem to be any logic to it. They are not sorted by ID or description, for sure.

1 More Reply
rfreitas
by New Contributor II
  • 196 Views
  • 1 reply
  • 1 kudos

Notebook and folder owner

Hi all, we can use this API https://docs.databricks.com/api/workspace/dbsqlpermissions/transferownership to transfer the ownership of a query. Is there anything similar for notebooks and folders?

Latest Reply
feiyun0112
New Contributor III
  • 1 kudos

Workspace object permissions let you manage which users can read, run, edit, or manage directories, files, and notebooks: https://docs.databricks.com/api/workspace/workspace/setpermissions

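A sketch of the suggested setpermissions call for a notebook (use workspace_object_type "directories" for folders). There is no direct ownership-transfer endpoint for notebooks, but replacing the ACL with CAN_MANAGE for the new user is the closest documented lever; host, token, and object id are placeholders.

```python
# Sketch: replace the access control list on a notebook via the
# workspace permissions API the reply links to.
import requests

HOST = "https://<workspace-host>"   # placeholder
TOKEN = "<pat>"                     # placeholder
NOTEBOOK_ID = "1234567890123456"    # numeric object id from the workspace API

requests.put(
    f"{HOST}/api/2.0/permissions/notebooks/{NOTEBOOK_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "access_control_list": [
            {"user_name": "new.owner@example.com",
             "permission_level": "CAN_MANAGE"}
        ]
    },
)
```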
RaulPino
by New Contributor III
  • 249 Views
  • 4 replies
  • 2 kudos

Resolved! Networking cost reduction for NAT Gateway and Shared Catalog

Use case and context: We have a Databricks workspace in a specific region, reading and writing files from/to the same region. We also read from a Shared Catalog in a different company, a data provider, which points to multi-region S3 buckets. The r...

Administration & Architecture
natgateway
networking
S3
shared catalog
VPC
Latest Reply
RaulPino
New Contributor III
  • 2 kudos

Thanks @Kaniz for all the suggestions. After some days of monitoring the NAT cost, I realized that the S3 Gateway Endpoint implementation was actually working; the problem was that I thought this change would be reflected right away in term...

3 More Replies
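For reference, the change that worked here, creating an S3 gateway VPC endpoint so same-region S3 traffic bypasses the NAT gateway, looks roughly like this with boto3. The region, VPC, and route table ids are placeholders, and the provider's multi-region buckets would still traverse the NAT.

```python
# Sketch: add a gateway VPC endpoint for S3 in the Databricks VPC.
# Assumes boto3 with AWS credentials configured.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # placeholder region

ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0123456789abcdef0",                  # placeholder
    ServiceName="com.amazonaws.us-east-1.s3",
    # Attach the endpoint to the route tables used by the Databricks
    # subnets so same-region S3 traffic no longer hits the NAT gateway.
    RouteTableIds=["rtb-0123456789abcdef0"],        # placeholder
)
```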
6502
by New Contributor III
  • 121 Views
  • 2 replies
  • 1 kudos

Resolved! Error: default auth: cannot configure default credentials, please check...

Hola all, I'm experiencing a quite strange error, and it happens inside a GitLab pipeline: $ databricks current-user me Error: default auth: cannot configure default credentials, please check https://docs.databricks.com/en/dev-tools/au...

Latest Reply
6502
New Contributor III
  • 1 kudos

Hola Kaniz, the problem is not in the Databricks CLI but is due to some interactions happening inside the GitLab pipeline. According to the documentation reported here: Databricks personal access token authentication | Databricks on AWS (at the bottom o...

1 More Reply
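In CI the CLI's default auth chain has nothing interactive to fall back on, so the usual fix is exporting DATABRICKS_HOST and DATABRICKS_TOKEN before invoking it. A sketch of the failing step wrapped in Python; CI_DATABRICKS_TOKEN is a hypothetical GitLab CI/CD variable name.

```python
# Sketch: give the CLI explicit credentials inside a CI job.
# DATABRICKS_HOST / DATABRICKS_TOKEN are the environment variables the
# CLI's default auth chain reads.
import os
import subprocess

os.environ["DATABRICKS_HOST"] = "https://<workspace-host>"      # placeholder
os.environ["DATABRICKS_TOKEN"] = os.environ["CI_DATABRICKS_TOKEN"]

# The same command that failed in the pipeline, now with explicit auth.
subprocess.run(["databricks", "current-user", "me"], check=True)
```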
Carsten03
by New Contributor III
  • 108 Views
  • 2 replies
  • 0 kudos

Run workflow using git integration with service principal

Hi, I want to run a dbt workflow task and would like to use the Git integration for that. Using my personal user I am able to do so, but I am running my workflows using a service principal. I added Git credentials and the repository using Terraform. I a...

Latest Reply
Carsten03
New Contributor III
  • 0 kudos

Hi @Simranarora, thank you for your answer. I am not sure if I have expressed myself poorly, but this does not fix the issue I actually have. I have already made the connection via Git credentials using a technical user on the Git provider side (I us...

1 More Reply
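Git credentials are per-identity, so the workflow's service principal needs its own credential, created while authenticated as that principal. A sketch using the Python SDK's OAuth machine-to-machine support; every value is a placeholder, and bitbucketCloud is one of the documented provider ids.

```python
# Sketch: create Git credentials under the service principal's identity
# so jobs running as that principal can check out the repo.
from databricks.sdk import WorkspaceClient

# Authenticate as the service principal (OAuth M2M), not as a user.
w = WorkspaceClient(
    host="https://<workspace-host>",                 # placeholder
    client_id="<service-principal-application-id>",  # placeholder
    client_secret="<service-principal-oauth-secret>",
)

w.git_credentials.create(
    git_provider="bitbucketCloud",
    git_username="<technical-user>",            # Git-provider-side identity
    personal_access_token="<repo-access-token>",
)
```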
Debi-Moha
by New Contributor
  • 104 Views
  • 2 replies
  • 3 kudos

External locations being shared across workspaces

Currently, we have 3 Unity Catalog-enabled workspaces sharing the same metastore. Now, when we create an external location or storage credential in any of the workspaces, it gets reflected across all workspaces. We are looking for some best practices...

Latest Reply
AlliaKhosla
New Contributor III
  • 3 kudos

Hi @Debi-Moha Currently we do not have a mechanism to isolate external locations and storage credentials by workspace, since the metastore is shared across the workspaces. Please check the document below for recommendations on securing extern...

1 More Reply
Carsten03
by New Contributor III
  • 2003 Views
  • 3 replies
  • 1 kudos

Resolved! Bitbucket Cloud Repo Integration with Token

Hey, I am using Bitbucket Cloud and I want to connect my repository to Databricks. I am able to connect with my personal app password, but what I am looking for is authentication as a technical user. I need the integration to point to my dbt repo, wh...

Latest Reply
Carsten03
New Contributor III
  • 1 kudos

Hi @Kaniz, thank you for your response! With the link you provided, I was able to authenticate with Bitbucket Cloud. The solution was to use x-token-auth as the username. I had tried the generated email address before, which didn't work. Thank you...

2 More Replies
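The resolved detail, expressed as code: with a Bitbucket Cloud repository access token, the Git username must be the literal string x-token-auth, not an email address. A sketch assuming databricks-sdk with ambient auth; the token is a placeholder.

```python
# Sketch of this thread's resolution for Bitbucket Cloud tokens.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # ambient auth via environment variables
w.git_credentials.create(
    git_provider="bitbucketCloud",
    git_username="x-token-auth",  # the key detail from this thread
    personal_access_token="<bitbucket-repository-access-token>",
)
```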
smehta_0908
by New Contributor II
  • 84 Views
  • 1 reply
  • 0 kudos

Monitor and Alert Databricks Resource Utilization and Cost Consumption

We want to build a monitoring and alerting solution for Azure Databricks that should capture resource utilization details (like aggregated CPU %, memory %, etc.) and cost consumption at the account level. We have Unity Catalog enabled and there are multipl...

Latest Reply
AlliaKhosla
New Contributor III
  • 0 kudos

@smehta_0908 Greetings! You can use Datadog to monitor cluster CPU and memory: https://docs.datadoghq.com/integrations/databricks/?tab=driveronly For cost consumption at the account level, you can make use of billable usage logs using the Acc...

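Since the workspaces are Unity Catalog enabled, the billing system table (if enabled for the account) is another option for the cost half of this question. A notebook-cell sketch (spark and display are notebook builtins) summing DBUs per workspace for the current month, using the documented system.billing.usage columns.

```python
# Sketch: month-to-date DBU usage per workspace and SKU from the
# Unity Catalog billing system table.
monthly_usage = spark.sql("""
    SELECT workspace_id,
           sku_name,
           SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE usage_date >= date_trunc('month', current_date())
    GROUP BY workspace_id, sku_name
    ORDER BY dbus DESC
""")
display(monthly_usage)
```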
Sikalokym
by New Contributor II
  • 129 Views
  • 3 replies
  • 0 kudos

Databricks job of type "Python wheel" does not work if "Package name" contains a dash

Hello, I created a Databricks job of type "Python wheel". In the "Package name" field I assigned a Python package which contains a dash in its name (see attachment). The job run failed, saying it could not import the Python package due to the dash in t...

test_job.PNG
Latest Reply
feiyun0112
New Contributor III
  • 0 kudos

The job uses the package name as the module name, and a dash is not allowed in Python module names.

2 More Replies
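A small runnable illustration of the reply: dashes are legal in a wheel's distribution name (my-package), but not in an importable module name, so the job's "Package name" field needs the underscore form. The module names below are hypothetical.

```python
# Sketch: a dashed name can never be imported as a Python module.
import importlib

for name in ("my_package", "my-package"):  # hypothetical module names
    try:
        importlib.import_module(name)
        print(f"imported {name}")
    except ModuleNotFoundError as exc:
        print(f"could not import {name}: {exc}")

# "my-package" always fails: "-" is not legal in Python identifiers, so a
# wheel built from distribution "my-package" typically ships a module
# named "my_package", and that is what the job field must contain.
```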