Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

Carson
by New Contributor II
  • 5328 Views
  • 1 reply
  • 1 kudos

How to monitor serverless compute usage in real time

Hello, I'm using Databricks Connect to connect a Dash app to my Databricks account. My use case is similar to this example: https://github.com/databricks-demos/dbconnect-examples/tree/main/python/Plotly. I've been able to get everything configured and ...

Latest Reply
mark_ott
Databricks Employee
  • 1 kudos

There is currently no direct, real-time equivalent in the Databricks UI’s “Compute” tab for monitoring serverless (SQL serverless or Data Engineering serverless) compute usage in the same way as classic clusters, where you see live memory, DBU/hr, an...
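The closest substitute today is the billing system tables, which lag real time slightly but can be polled for near-live serverless DBU consumption. A minimal sketch, assuming Unity Catalog system tables are enabled and the `system.billing.usage` schema described in the Databricks billing docs:

```python
# Sketch: approximate serverless usage from the system.billing.usage system
# table (usage_quantity is in DBUs). Assumes UC system tables are enabled;
# check column names against the billing system table docs.

def serverless_usage_query(days: int = 7) -> str:
    """Build a SQL query summarising serverless DBU usage per day and SKU."""
    return f"""
        SELECT usage_date,
               sku_name,
               SUM(usage_quantity) AS dbus
        FROM system.billing.usage
        WHERE sku_name ILIKE '%SERVERLESS%'
          AND usage_date >= current_date() - INTERVAL {days} DAYS
        GROUP BY usage_date, sku_name
        ORDER BY usage_date DESC"""

# Run with spark.sql(serverless_usage_query(7)) in a notebook, or from an
# external Dash app through the databricks-sql-connector on a SQL warehouse.
print(serverless_usage_query(7))
```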

JanJaros
by New Contributor
  • 1614 Views
  • 1 reply
  • 1 kudos

Databricks OAUTH(OIDC) with ORY Network

Hi, we are trying to set up OIDC auth for Databricks with our Ory Network account. So far we have been using it without any issues with all of our apps, and now we wanted to set it up for Databricks as well. Unfortunately, after many attempts with different...

Latest Reply
mark_ott
Databricks Employee
  • 1 kudos

To debug OIDC authentication issues (“oidc_generic_token_failure”) with Databricks using Ory Network as your identity provider, there are several steps and data sources you can leverage for deeper insights. Where to Find Detailed Error Information Da...
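One quick check is to decode the token Ory issues and compare its `iss` and `aud` claims against what Databricks expects, since a mismatch there is a common cause of `oidc_generic_token_failure`. A stdlib-only sketch (no signature verification, debugging only; the claim values below are illustrative):

```python
# Sketch: inspect the (unverified) claims of a JWT to debug OIDC failures.
# Pure stdlib; never use this in place of real signature verification.
import base64
import json

def decode_jwt_claims(token: str) -> dict:
    """Return the unverified payload claims of a JWT (header.payload.signature)."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Example with a hand-built token; real tokens come from Ory's token endpoint.
claims = {"iss": "https://example.projects.oryapis.com", "aud": "databricks"}
fake = ".".join([
    base64.urlsafe_b64encode(json.dumps({"alg": "RS256"}).encode()).decode().rstrip("="),
    base64.urlsafe_b64encode(json.dumps(claims).encode()).decode().rstrip("="),
    "sig",
])
print(decode_jwt_claims(fake)["iss"])  # compare against the issuer Databricks expects
```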

funsjanssen
by New Contributor III
  • 7517 Views
  • 2 replies
  • 8 kudos

Azure Databricks Multi Tenant Solution

Hello everyone, for the past few months, we’ve been extensively exploring the use of Databricks as the core of our data warehousing product. We provide analytics dashboards to other organizations and are particularly interested in the Column-Level Sec...

Latest Reply
mark_ott
Databricks Employee
  • 8 kudos

Implementing robust Row-Level Security (RLS) and Column-Level Security (CLS) in Azure Databricks for multi-tenant analytics—especially with seamless SSO from Power BI and custom apps—is a common concern for B2B SaaS providers scaling to large user ba...
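As one building block for this, Unity Catalog row filters and column masks can scope tenant data to account groups. A sketch using hypothetical table and column names (`sales`, `tenant_id`, `email`); the syntax should be checked against the Unity Catalog row filter / column mask documentation:

```python
# Sketch of Unity Catalog RLS/CLS DDL for per-tenant isolation. Table,
# column, and group names are placeholders; run each statement with
# spark.sql(stmt) in a notebook.
ddl = [
    # Row filter: TRUE only for rows whose tenant matches an account group
    # the querying user belongs to.
    """CREATE OR REPLACE FUNCTION tenant_filter(tenant STRING)
       RETURN is_account_group_member(tenant)""",
    "ALTER TABLE sales SET ROW FILTER tenant_filter ON (tenant_id)",
    # Column mask: hide the email column from non-admins.
    """CREATE OR REPLACE FUNCTION mask_email(email STRING)
       RETURN CASE WHEN is_account_group_member('admins') THEN email
                   ELSE '***' END""",
    "ALTER TABLE sales ALTER COLUMN email SET MASK mask_email",
]
for stmt in ddl:
    print(stmt)
```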

1 More Replies
TSK
by New Contributor
  • 4681 Views
  • 1 reply
  • 0 kudos

GitLab on DCS, Databricks Container Services

I would like to set up GitLab and Grafana servers using Databricks Container Services (DCS). The reason is that our development team is small, and the management costs of using EKS are not justifiable. We want to make GitLab and Grafana accessible in...

Administration & Architecture
AWS
Container
DevOps
EKS
Kubernetes
Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

Yes, it is possible to set up GitLab and Grafana servers using Databricks Container Services (DCS) for internal accessibility. DCS supports custom Docker containers and allows you to deploy server applications such as GitLab and Grafana, making it a ...
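A cluster backed by a custom image is declared through the Clusters API's `docker_image` field. A sketch of such a payload, with placeholder image URL, registry credentials, and node type (DCS must be enabled on the workspace and the image built from a Databricks container base):

```python
# Sketch: a Clusters API payload for a DCS-backed cluster running a custom
# image. All names/URLs below are placeholders; registry credentials are
# shown as secret references (plain values also work).
cluster_spec = {
    "cluster_name": "gitlab-host",              # hypothetical name
    "spark_version": "15.4.x-scala2.12",
    "node_type_id": "i3.xlarge",                # any supported AWS node type
    "num_workers": 1,
    "docker_image": {
        "url": "my-registry.example.com/gitlab-on-dbx:latest",
        "basic_auth": {
            "username": "{{secrets/registry/user}}",
            "password": "{{secrets/registry/token}}",
        },
    },
}
# POST this to /api/2.1/clusters/create (e.g. via the databricks-sdk).
```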

Cstan
by Databricks Partner
  • 4753 Views
  • 1 reply
  • 0 kudos

VPAT Form

How do I find a Voluntary Product Accessibility Template (VPAT) from Databricks?

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

To obtain a Voluntary Product Accessibility Template (VPAT) from Databricks, you must request it directly from Databricks support or your designated account manager. Databricks prepares and provides the VPAT upon request, detailing how their platform...

Isi
by Honored Contributor III
  • 8665 Views
  • 7 replies
  • 4 kudos

Unable to access Databricks Volume from job triggered via API (Container Services)

Hi everyone, we’re facing a strange issue when trying to access a Databricks Volume from a job that is triggered via the Databricks REST API (not via Workflows). These jobs are executed using container services, which may be relevant, perhaps due to i...

Latest Reply
mark_ott
Databricks Employee
  • 4 kudos

Databricks Volumes (especially Unity Catalog (UC) volumes) often have strict execution context requirements and typically expect the workload to run in Databricks-managed clusters or notebooks where the specialized file system and security context ar...
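A cheap first diagnostic is to check from inside the job whether the volume's FUSE mount is even present in that execution context. A sketch with placeholder catalog/schema/volume names:

```python
# Sketch: verify the UC volume path is visible from the job's execution
# context before reading from it. The volume name is a placeholder; UC
# volumes appear under /Volumes only on UC-enabled clusters with a
# supported access mode.
import os

VOLUME_PATH = "/Volumes/main/default/my_volume"  # hypothetical volume

def volume_visible(path: str) -> bool:
    """True if the FUSE mount for the volume exists in this context."""
    return os.path.isdir(path)

if not volume_visible(VOLUME_PATH):
    # In contexts without the FUSE mount (e.g. some container/API-triggered
    # runs), fall back to Spark paths such as "dbfs:" + VOLUME_PATH --
    # availability still depends on the cluster's access mode.
    print(f"{VOLUME_PATH} is not mounted in this execution context")
```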

6 More Replies
ThePussCat
by Databricks Partner
  • 5529 Views
  • 8 replies
  • 3 kudos

Disable local user creation when using SCIM Provisioning

We have implemented SCIM provisioning from Azure AD (MS Entra) to Azure Databricks. All is good. Except, we would like to know if it is possible to disable the ability to create users within Azure Databricks, so that none can be "accidentally" created...

Latest Reply
ThePussCat
Databricks Partner
  • 3 kudos

Thank you! That's really clear now, and hopefully helpful to others. Ours is set to (default) OFF - we do not want JIT provisioning enabled.

7 More Replies
slloyd
by New Contributor
  • 5212 Views
  • 1 reply
  • 0 kudos

client.openSession() : TypeError: Cannot read properties of undefined (reading '0')

I am using the Databricks SQL Driver for Node.js to create an endpoint that queries a Databricks database following the guide here Databricks SQL Driver for Node.js | Databricks on AWS . This code was working previously but now I am getting a TypeErr...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

Your TypeError: Cannot read properties of undefined (reading '0') at session = await client.openSession() typically indicates an unexpected change or regression inside the Databricks SQL Node.js driver or the environment, even if your environment var...

rjurnitos
by New Contributor II
  • 5184 Views
  • 2 replies
  • 0 kudos

GCP Cluster will not boot correctly with Libraries preconfigured - notebooks never attach

I am running Databricks 15.4 LTS on a single-node `n1-highmem-32` for a PySpark / GraphFrames app (not using builtin `graphframes` on ML image because we don't need a GPU) and I can start the cluster fine so long as libraries are not attached. I can ...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

It sounds like you are encountering a cluster “hang”/notebook attach timeout after restarting a Databricks 15.4 LTS single-node cluster with custom libraries (including GraphFrames via Maven and additional .whl and requirements.txt dependencies). You...
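One way to narrow this down is to declare the libraries through the Libraries API instead of the UI, so install failures surface in the cluster event log rather than as a silent attach hang. A sketch of the payload, with an illustrative GraphFrames coordinate and placeholder volume paths (match the coordinate to your Spark/Scala version):

```python
# Sketch: a Libraries API install payload. The Maven coordinate and file
# paths below are illustrative, not prescriptive; requirements.txt library
# entries need a recent DBR/API version.
libraries = [
    {"maven": {"coordinates": "graphframes:graphframes:0.8.3-spark3.5-s_2.12"}},
    {"whl": "/Volumes/main/default/libs/myapp-0.1-py3-none-any.whl"},
    {"requirements": "/Volumes/main/default/libs/requirements.txt"},
]
# POST to /api/2.0/libraries/install with
# {"cluster_id": "<cluster-id>", "libraries": libraries}, then watch the
# cluster's event log for install errors.
```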

1 More Replies
jonas_braun
by New Contributor II
  • 4791 Views
  • 2 replies
  • 0 kudos

Asset Bundle: inject job start_time parameter

Hey! I'm deploying a job with Databricks Asset Bundles. When the PySpark task is started on a job cluster, I want the Python code to read the job start_time and select the right data sources based on that parameter. Ideally, I would read the parameter f...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

You cannot directly access a dynamic value like ${job.start_time.iso_datetime} in a Databricks Asset Bundle YAML for job parameters—Databricks jobs do not inject special variables (like the job run’s start time) automatically into job parameters at r...
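A common workaround is to pass the start time as an ordinary task parameter using a runtime dynamic value reference (e.g. `{{job.start_time.iso_datetime}}`; confirm the exact key against the jobs dynamic value reference docs) and parse it inside the task. A sketch of the task-side parsing:

```python
# Sketch: read the run's start time from a task parameter that Databricks
# substitutes at run time. The parameter name and the dynamic value
# reference key are assumptions to verify against the docs.
import argparse
from datetime import datetime

def parse_start_time(argv=None) -> datetime:
    parser = argparse.ArgumentParser()
    parser.add_argument("--start-time", required=True,
                        help="pass as {{job.start_time.iso_datetime}} in the task parameters")
    args = parser.parse_args(argv)
    return datetime.fromisoformat(args.start_time)

# Example with the kind of value Databricks would substitute at run time:
start = parse_start_time(["--start-time", "2024-06-01T08:30:00"])
print(start.date())  # select data sources based on this
```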

1 More Replies
Adam_Borlase
by New Contributor III
  • 1870 Views
  • 4 replies
  • 4 kudos

Resolved! Connect to a SQL Server Database with Windows Authentication

Good day all, I am in the process of trying to connect to one of our SQL Servers. It is attached to our Entra for authentication. When trying to create an external connection to the server in Unity, I am getting a failure due to the user and password ...

Latest Reply
nayan_wylde
Esteemed Contributor II
  • 4 kudos

@Adam_Borlase Can you try these steps to rule out a network issue? Use SQL authentication: create a SQL Server login (not Entra ID) with a username and password. Grant it access to the required database. Use this credential in Unity Catalog's external...
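The connection itself would then be created with SQL-auth credentials. A sketch of the DDL, with placeholder host and secret scope/key names; the exact option names should be checked against the `CREATE CONNECTION` documentation:

```python
# Sketch of the Unity Catalog connection DDL using SQL authentication.
# Host, port, and secret scope/key are placeholders; run with
# spark.sql(connection_ddl), then create a foreign catalog against it.
connection_ddl = """
CREATE CONNECTION sqlserver_conn TYPE sqlserver
OPTIONS (
  host 'myserver.example.com',
  port '1433',
  user secret('sqlserver-scope', 'sql-user'),
  password secret('sqlserver-scope', 'sql-password')
)
"""
print(connection_ddl.strip().splitlines()[0])
```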

3 More Replies
Daan_Fostier
by New Contributor
  • 5795 Views
  • 1 reply
  • 0 kudos

Adding service principal with Microsoft Entra ID fails

Hi, I am trying to add a service principal using Microsoft Entra ID, but I encounter an issue as described in the following documentation: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-m2m. I followed the instructions step by ...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

The error message you encountered—“Successfully created new service principal but failed to add the new service principal to this workspace. Error fetching user”—along with the service principal's absence in “Users,” typically points to a synchroniza...

enr0c
by New Contributor
  • 4666 Views
  • 2 replies
  • 0 kudos

Budget Policy - Service Principals don't seem to be allowed to use budget policies

Objective: transfer an existing DLT pipeline to a new owner (a service principal), with budget policies enabled. Steps to reproduce: created a service principal; assigned it membership of a group that is allowed to use a budget policy; ensured it has access to the ...

Administration & Architecture
budget-policy
service-principal
Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

The error message "Pipeline 'Run As' identity does not have access to selected budget policy" typically indicates that, while your service principal is properly configured for general pipeline ownership, it’s missing explicit permission on the budget...

1 More Replies
Newbienewbster
by New Contributor II
  • 4437 Views
  • 1 reply
  • 1 kudos

Change AWS S3 storage class for subset of schema

I have a schema that has grown very large. There are mainly two types of tables in it. One of those types accounts for roughly 80% of the storage. Is there a way to somehow set a policy for those tables only to transfer them to a different storage cl...

Latest Reply
mark_ott
Databricks Employee
  • 1 kudos

Yes, it's possible to manage storage costs in Databricks and Unity Catalog by targeting specific tables for different storage classes, but Unity Catalog does add complexity since it abstracts the direct S3 (or ADLS/GCS) object paths from you. Here’s ...
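If you can resolve the heavy tables' storage prefixes (e.g. via `DESCRIBE EXTENDED` on each table), an S3 lifecycle rule scoped to those prefixes performs the transition. A sketch of such a rule, with placeholder bucket and prefix:

```python
# Sketch: an S3 lifecycle rule that transitions only objects under the
# heavy tables' prefix to a cheaper class. The prefix is a placeholder --
# resolve each table's real location first, since Unity Catalog manages
# the paths for you.
lifecycle_rule = {
    "ID": "archive-large-tables",
    "Status": "Enabled",
    "Filter": {"Prefix": "metastore/tables/large/"},  # hypothetical prefix
    "Transitions": [
        {"Days": 90, "StorageClass": "INTELLIGENT_TIERING"},
    ],
}
# Apply with boto3:
#   s3 = boto3.client("s3")
#   s3.put_bucket_lifecycle_configuration(
#       Bucket="my-metastore-bucket",
#       LifecycleConfiguration={"Rules": [lifecycle_rule]})
```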

dofrey
by New Contributor II
  • 6180 Views
  • 2 replies
  • 3 kudos

Resolved! Create account group with terraform without account admin permissions

I’m trying to create an account-level group in Databricks using Terraform. When creating a group via the UI, it automatically becomes an account-level group that can be reused across workspaces. However, I’m struggling to achieve the same using Terra...

Latest Reply
mark_ott
Databricks Employee
  • 3 kudos

You cannot create account-level groups in Databricks with Terraform unless your authentication mechanism has account admin privileges. This is a design limitation of both the Databricks API and Terraform provider, which require admin-level permission...
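For reference, the call Terraform issues under the hood is the account-level SCIM Groups endpoint, which is why a workspace-scoped token is not enough. A sketch of that request, with a placeholder account ID and group name:

```python
# Sketch: the account-level SCIM request behind databricks_group creation.
# It must go to the *account* endpoint with account-admin credentials;
# workspace tokens cannot create account groups. account_id and the group
# name are placeholders.
account_id = "<account-id>"
url = (f"https://accounts.cloud.databricks.com"
       f"/api/2.0/accounts/{account_id}/scim/v2/Groups")
payload = {
    "schemas": ["urn:ietf:params:scim:schemas:core:2.0:Group"],
    "displayName": "data-engineers",   # hypothetical group name
}
# Send with e.g. requests.post(url, json=payload, headers=auth_headers)
```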

1 More Replies