Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

tom_1
by New Contributor III
  • 3909 Views
  • 1 reply
  • 0 kudos

Customer Managed VPC: Databricks IP Address Ranges

Hello, how often does Databricks change its public IP addresses (the ones that must be whitelisted in a customer-managed VPC), and where can I find them? I found this list, but it seems to be incomplete. We moved from a managed VPC to a customer-managed ...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Greetings @tom_1, you're right to cross-check the published list; here's how the IPs and ports fit together and where to get the authoritative values.

Where to find the current Databricks IPs
The official source is the Databricks "IP addresses and d...
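The advice above can be turned into a quick self-check. The CIDR blocks below are placeholders (TEST-NET ranges), not the real Databricks control-plane ranges; substitute the current values from the official "IP addresses and domains" page for your cloud and region:

```python
# Check whether an IP you plan to whitelist falls inside a published CIDR
# range. The ranges here are placeholders only -- pull the current list from
# the Databricks "IP addresses and domains" documentation for your region.
import ipaddress

def ip_in_ranges(ip: str, cidr_ranges: list) -> bool:
    """Return True if `ip` belongs to any of the given CIDR blocks."""
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(cidr) for cidr in cidr_ranges)

# Placeholder ranges for illustration.
example_ranges = ["203.0.113.0/28", "198.51.100.0/25"]
print(ip_in_ranges("203.0.113.5", example_ranges))  # True
```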

EdsonDEV
by New Contributor
  • 4174 Views
  • 1 reply
  • 0 kudos

Error on github association

Hello, I'm having an error when trying to link a GitHub account to store some scripts. It looks like my profile keeps loading forever, too. Does anyone know how I can fix that?

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Hello @EdsonDEV, thanks for the screenshot: your Linked accounts page is showing "Error fetching credentials," which blocks linking GitHub and can make the settings view spin indefinitely.

What typically causes this
A broken or stale linked Git cr...

trailblazer
by New Contributor III
  • 759 Views
  • 2 replies
  • 3 kudos

Resolved! Azure Databricks Cluster Pricing

Hi, I am trying to work out a rough total price for an Azure Databricks cluster using the following assumptions. I want to spin up a cluster of D13 v2 VMs with 9 executors, so 1 + 9 = 10 nodes in total. I want to use the cluster for 10 hours a day, 30 hours a...

[screenshot attachment: trailblazer_0-1762440739961.png]
Latest Reply
nayan_wylde
Esteemed Contributor
  • 3 kudos

Here is the simple calculation I use, in dollars and assuming the infra is in EUS.

Cost Components
1. Azure VM Cost (D13 v2)
   On-demand price: $0.741/hour per VM
   Monthly VM cost: 10 VMs × 300 hours × $0.741 = $2,223
   Yearly VM cost: 10 × 3,600 × $0.741 = $26,676
2. Dat...
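The VM-side arithmetic in the reply can be reproduced in a few lines (the hourly rate is the assumed East US on-demand list price from the reply; check the Azure pricing page and your Databricks rate card for current numbers):

```python
# Reproduce the VM-cost arithmetic from the reply. The hourly rate is an
# assumed East US on-demand list price -- verify against the Azure pricing page.
VM_RATE_USD = 0.741     # $/hour per D13 v2 VM
NODES = 10              # 1 driver + 9 executors
HOURS_PER_MONTH = 300   # 10 hours/day x 30 days

monthly_vm_cost = NODES * HOURS_PER_MONTH * VM_RATE_USD
yearly_vm_cost = monthly_vm_cost * 12

print(f"Monthly VM cost: ${monthly_vm_cost:,.0f}")  # Monthly VM cost: $2,223
print(f"Yearly VM cost: ${yearly_vm_cost:,.0f}")    # Yearly VM cost: $26,676
```

Note this covers only the VM side; DBU charges for the chosen Databricks SKU come on top, as the reply goes on to show.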

1 More Replies
Carson
by New Contributor II
  • 4745 Views
  • 1 reply
  • 1 kudos

How to monitor serverless compute usage in real time

Hello, I'm using Databricks Connect to connect a Dash app to my Databricks account. My use case is similar to this example: https://github.com/databricks-demos/dbconnect-examples/tree/main/python/Plotly. I've been able to get everything configured and ...

Latest Reply
mark_ott
Databricks Employee
  • 1 kudos

There is currently no direct, real-time equivalent in the Databricks UI’s “Compute” tab for monitoring serverless (SQL serverless or Data Engineering serverless) compute usage in the same way as classic clusters, where you see live memory, DBU/hr, an...
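One common workaround (a sketch under assumptions, not an official real-time view) is to poll the Unity Catalog billing system table. The helper below only builds the SQL string; you would run it via spark.sql in a notebook or against a SQL warehouse, and the serverless filter on sku_name is a heuristic you should adjust to the SKUs in your account:

```python
# Build a query against the system.billing.usage system table (requires Unity
# Catalog and SELECT access on the `system` catalog). Billing data lags real
# time, so treat this as "near real time" at best.
def serverless_usage_query(days: int = 1) -> str:
    return f"""
        SELECT usage_date, sku_name, SUM(usage_quantity) AS dbus
        FROM system.billing.usage
        WHERE sku_name ILIKE '%serverless%'
          AND usage_date >= current_date() - INTERVAL {days} DAY
        GROUP BY usage_date, sku_name
        ORDER BY usage_date DESC
    """

# e.g. spark.sql(serverless_usage_query(days=7)).display()
```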

JanJaros
by New Contributor
  • 1393 Views
  • 1 reply
  • 1 kudos

Databricks OAUTH(OIDC) with ORY Network

Hi, we are trying to set up OIDC auth for Databricks with our Ory Network account. So far we have been using it without any issues with all of our apps, and now we wanted to set it up for Databricks as well. Unfortunately, after many attempts with different...

Latest Reply
mark_ott
Databricks Employee
  • 1 kudos

To debug OIDC authentication issues ("oidc_generic_token_failure") with Databricks using Ory Network as your identity provider, there are several steps and data sources you can leverage for deeper insight.

Where to Find Detailed Error Information
Da...
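When chasing a generic token failure like this, it often helps to inspect the raw token the IdP issues. The helper below is a generic sketch (plain stdlib, not a Databricks or Ory API) that decodes a JWT payload without verifying the signature, so you can eyeball claims such as iss, aud, and email against what Databricks expects:

```python
# Decode the payload segment of a JWT for inspection only -- the signature is
# deliberately NOT verified, so never use this for authentication decisions.
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    """Return the unverified payload claims of a JWT."""
    payload_b64 = token.split(".")[1]
    # JWT segments strip base64url padding; restore it before decoding.
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))
```

Compare the decoded iss and aud values character for character with the issuer URL and client ID configured on the Databricks side; a trailing slash in the issuer is a common mismatch.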

funsjanssen
by New Contributor III
  • 6880 Views
  • 2 replies
  • 8 kudos

Azure Databricks Multi Tenant Solution

Hello everyone, for the past few months we've been extensively exploring the use of Databricks as the core of our data warehousing product. We provide analytics dashboards to other organizations and are particularly interested in the Column-Level Sec...

Latest Reply
mark_ott
Databricks Employee
  • 8 kudos

Implementing robust Row-Level Security (RLS) and Column-Level Security (CLS) in Azure Databricks for multi-tenant analytics—especially with seamless SSO from Power BI and custom apps—is a common concern for B2B SaaS providers scaling to large user ba...

1 More Replies
TSK
by New Contributor
  • 4584 Views
  • 1 reply
  • 0 kudos

GitLab on DCS, Databricks Container Services

I would like to set up GitLab and Grafana servers using Databricks Container Services (DCS). The reason is that our development team is small, and the management costs of using EKS are not justifiable. We want to make GitLab and Grafana accessible in...

Administration & Architecture
AWS
Container
DevOps
EKS
Kubernetes
Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

Yes, it is possible to set up GitLab and Grafana servers using Databricks Container Services (DCS) for internal accessibility. DCS supports custom Docker containers and allows you to deploy server applications such as GitLab and Grafana, making it a ...

Cstan
by New Contributor
  • 4431 Views
  • 1 reply
  • 0 kudos

VPAT Form

How do I find a Voluntary Product Accessibility Template (VPAT) from Databricks?

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

To obtain a Voluntary Product Accessibility Template (VPAT) from Databricks, you must request it directly from Databricks support or your designated account manager. Databricks prepares and provides the VPAT upon request, detailing how their platform...

Isi
by Honored Contributor III
  • 7922 Views
  • 7 replies
  • 4 kudos

Unable to access Databricks Volume from job triggered via API (Container Services)

Hi everyone,We’re facing a strange issue when trying to access a Databricks Volume from a job that is triggered via the Databricks REST API (not via Workflows). These jobs are executed using container services, which may be relevant, perhaps due to i...

Latest Reply
mark_ott
Databricks Employee
  • 4 kudos

Databricks Volumes (especially Unity Catalog (UC) volumes) often have strict execution context requirements and typically expect the workload to run in Databricks-managed clusters or notebooks where the specialized file system and security context ar...
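A small defensive pattern can make this failure mode easier to diagnose (the function name and message are illustrative, not an official Databricks diagnostic):

```python
# Fail fast with a useful hint when a UC volume path is not visible in the
# job's execution context (common with API-triggered jobs on clusters whose
# access mode does not support Unity Catalog volumes).
import os

def require_path(path: str) -> str:
    """Return `path` if it exists, else raise with a diagnostic hint."""
    if not os.path.exists(path):
        raise FileNotFoundError(
            f"{path} is not visible in this execution context. Check that the "
            "cluster uses a UC-capable access mode and that the run-as "
            "principal has READ VOLUME on the volume."
        )
    return path

# e.g. require_path("/Volumes/main/default/my_volume/input.csv")
```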

6 More Replies
ThePussCat
by New Contributor III
  • 4826 Views
  • 8 replies
  • 3 kudos

Disable local user creation when using SCIM Provisioning

We have implemented SCIM provisioning from Azure AD (MS Entra) to Azure Databricks. All is good. Except, we would like to know whether it is possible to disable the ability to create users within Azure Databricks, so that none can be "accidentally" created...

Latest Reply
ThePussCat
New Contributor III
  • 3 kudos

Thank you! That's really clear now, and hopefully helpful to others. Ours is set to the default (OFF) - we do not want JIT provisioning enabled.

7 More Replies
slloyd
by New Contributor
  • 4830 Views
  • 1 reply
  • 0 kudos

client.openSession() : TypeError: Cannot read properties of undefined (reading '0')

I am using the Databricks SQL Driver for Node.js to create an endpoint that queries a Databricks database, following the guide here: Databricks SQL Driver for Node.js | Databricks on AWS. This code was working previously, but now I am getting a TypeErr...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

Your TypeError: Cannot read properties of undefined (reading '0') at session = await client.openSession() typically indicates an unexpected change or regression inside the Databricks SQL Node.js driver or the environment, even if your environment var...

rjurnitos
by New Contributor II
  • 4918 Views
  • 2 replies
  • 0 kudos

GCP Cluster will not boot correctly with Libraries preconfigured - notebooks never attach

I am running Databricks 15.4 LTS on a single-node `n1-highmem-32` for a PySpark / GraphFrames app (not using builtin `graphframes` on ML image because we don't need a GPU) and I can start the cluster fine so long as libraries are not attached. I can ...

[screenshot attachment: rjurnitos_0-1739831664728.png]
Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

It sounds like you are encountering a cluster “hang”/notebook attach timeout after restarting a Databricks 15.4 LTS single-node cluster with custom libraries (including GraphFrames via Maven and additional .whl and requirements.txt dependencies). You...

1 More Replies
jonas_braun
by New Contributor II
  • 4209 Views
  • 2 replies
  • 0 kudos

Asset Bundle: inject job start_time parameter

Hey! I'm deploying a job with Databricks Asset Bundles. When the PySpark task is started on a job cluster, I want the Python code to read the job start_time and select the right data sources based on that parameter. Ideally, I would read the parameter f...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

You cannot directly access a dynamic value like ${job.start_time.iso_datetime} in a Databricks Asset Bundle YAML for job parameters—Databricks jobs do not inject special variables (like the job run’s start time) automatically into job parameters at r...
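One workaround, sketched here under the assumption that you define a `--start-time` parameter on the task in your bundle YAML yourself: pass the timestamp in explicitly, and fall back to the current UTC time when it is absent so the task also runs standalone.

```python
# Resolve the logical start time of the task: prefer an explicitly passed
# `--start-time` ISO-8601 parameter (an assumed name you would define in the
# bundle YAML), falling back to the current UTC time.
import argparse
from datetime import datetime, timezone

def resolve_start_time(argv=None) -> datetime:
    parser = argparse.ArgumentParser()
    parser.add_argument("--start-time", default=None)
    args, _ = parser.parse_known_args(argv)
    if args.start_time:
        return datetime.fromisoformat(args.start_time)
    return datetime.now(timezone.utc)

# The task code can then pick its data sources based on resolve_start_time().
```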

1 More Replies
Adam_Borlase
by New Contributor III
  • 951 Views
  • 4 replies
  • 4 kudos

Resolved! Connect to a SQL Server Database with Windows Authentication

Good day all, I am in the process of trying to connect to one of our SQL Servers. It is attached to our Entra for authentication. When trying to create an external connection to the server in Unity, I am getting a failure due to the user and password ...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 4 kudos

@Adam_Borlase Can you try these steps to rule out a network issue?
Use SQL Authentication:
1. Create a SQL Server login (not Entra ID) with a username and password.
2. Grant it access to the required database.
3. Use this credential in Unity Catalog's external...

3 More Replies
Daan_Fostier
by New Contributor
  • 5001 Views
  • 1 reply
  • 0 kudos

Adding service principal with Microsoft Entra ID fails

Hi, I am trying to add a service principal using Microsoft Entra ID, but I encounter an issue as described in the following documentation: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-m2m. I followed the instructions step by ...

[screenshot attachments: Daan_Fostier_0-1725548408289.png, Daan_Fostier_1-1725548706489.png]
Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

The error message you encountered—“Successfully created new service principal but failed to add the new service principal to this workspace. Error fetching user”—along with the service principal's absence in “Users,” typically points to a synchroniza...
