Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

by AlexMc, New Contributor III
  • 1675 Views
  • 3 replies
  • 1 kudos

Library installation failed for library due to user error for pypi

Hi! I get the below error when a cluster job starts up and tries to install a Python .whl file (which is hosted on an Azure Artifacts feed, though this seems more like a problem of reading from disk/network storage). The failure is seemingly ...

Latest Reply
AlexMc
New Contributor III
  • 1 kudos

Thanks both - I think the problem is that this library installation is triggered when creating a new Job & Task via the REST endpoint, where the libraries are specified in the .json file. So, short version: I don't think I can 'get at' the pip install call...

2 More Replies
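As the thread notes, when a job is created through the Jobs REST API the pip install is driven entirely by the `libraries` array on each task; for a private index such as an Azure Artifacts feed, a `pypi` library entry can carry a `repo` field pointing at that index. A minimal sketch of building such a payload (the job name, notebook path, package, and feed URL below are all hypothetical):

```python
import json

def job_payload_with_private_pypi(job_name, notebook_path, package, index_url):
    """Build a Jobs API 2.1 create-job payload whose library is installed
    from a private PyPI index instead of pypi.org."""
    return {
        "name": job_name,
        "tasks": [
            {
                "task_key": "main",
                "notebook_task": {"notebook_path": notebook_path},
                "libraries": [
                    # 'repo' tells the cluster's pip which index to resolve against
                    {"pypi": {"package": package, "repo": index_url}}
                ],
            }
        ],
    }

payload = job_payload_with_private_pypi(
    "nightly-etl",
    "/Workspace/Users/someone@example.com/etl",
    "my-internal-lib==1.2.0",
    "https://pkgs.dev.azure.com/myorg/_packaging/myfeed/pypi/simple/",
)
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to `/api/2.2/jobs/create`; authenticating pip against the feed itself (e.g. via an index URL with credentials or a cluster init script) is a separate concern not covered here.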
by SBL, Contributor
  • 2083 Views
  • 2 replies
  • 1 kudos

Resolved! Why am I seeing NAT Gateway in the cost? Serverless Compute.

I have an Azure Databricks Premium subscription. I have been running interactive Python notebooks in the Databricks Workspace using serverless compute for the last few days. Today I received an alert in my email saying the monthly billing has already crossed...

[Attachment: NAT Gateway.png]
Latest Reply
SBL
Contributor
  • 1 kudos

Thanks @szymon_dybczak. I deleted the workspace and the NAT Gateway service was deleted from the VNet. I created a simple single-node cluster to run my code.

1 More Reply
by philmac750, New Contributor III
  • 1422 Views
  • 1 reply
  • 0 kudos

Resolved! Can I add another user to Free edition

Is it possible to add another user to the Free edition? I want to test what they can see when they connect as a restricted user, i.e. only granted Browse on one catalog. Thanks

Latest Reply
philmac750
New Contributor III
  • 0 kudos

Apologies - yes, you can. I was in the wrong console.

by Charansai, New Contributor III
  • 3792 Views
  • 7 replies
  • 0 kudos

Failing Power BI connection with Databricks via SQL Warehouse

I'm encountering an 'Invalid credentials' error (Session ID: 4601-b5a6-0daf792752a2, Region: us) when connecting Power BI to an Azure Databricks SQL Warehouse using an SPN. The SPN has CAN MANAGE access on the SQL Warehouse, admin rights at the account and...

Latest Reply
Khaja_Zaffer
Contributor III
  • 0 kudos

Hello Sai, sorry for the experience. I am usually available at my desk during the EMEA time zone; apologies for the delay. Can you please try this? Go to "Advanced settings" of the data source within the gateway configuration and set the "Connection Encryption se...

6 More Replies
by xaveri, New Contributor
  • 2463 Views
  • 1 reply
  • 0 kudos

Looking for insights on enabling Databricks Automatic Provisioning

We currently have a SCIM provisioning connector set up to synchronize identities from Entra ID to Unity Catalog. We’re now considering enabling Databricks Automatic Provisioning but want to fully understand the potential impact on our environment befo...

Latest Reply
Khaja_Zaffer
Contributor III
  • 0 kudos

Hello Xaveri, good day! Here are a few links related to Databricks provisioning: https://docs.databricks.com/aws/en/admin/users-groups/scim/aad and https://www.databricks.com/blog/announcing-automatic-identity-management-azure-databricks. But do let me k...

by HariSelvarajan, Databricks Employee
  • 2687 Views
  • 4 replies
  • 4 kudos

Privileged Identity Management for Databricks with Microsoft Entra ID

Privileged Identity Management (PIM) can be used to secure access to critical Databricks roles with Just-in-Time (JIT) access. This approach helps organizations enforce time-bound permissions, approval workflows, and centralized auditing for sensitiv...

Latest Reply
AnitPatelADB
New Contributor II
  • 4 kudos

Is this possible without SCIM?

3 More Replies
by matthiasn, New Contributor III
  • 1526 Views
  • 2 replies
  • 0 kudos

Delete unassigned catalogs

Hi everybody, due to some not-so-optimal Infrastructure-as-Code experiments with Terraform, I ended up with a lot (triple digits) of catalogs in a metastore that are not assigned to any workspace and that I want to delete. Unfortunately, there is no way to ev...

Latest Reply
matthiasn
New Contributor III
  • 0 kudos

Yeah, I see those catalogs and I know that I could reattach and delete them. As I have around 100 of those catalogs, it would be nice to iterate through them by getting a list, e.g. using the CLI or the REST API, and then force-delete them, as described...

1 More Reply
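The iteration described in the reply can be sketched against the Unity Catalog REST API: list catalogs via `GET /api/2.1/unity-catalog/catalogs`, then `DELETE` each one with `force=true`. This is only a sketch; it assumes the workspace it runs against is attached to the metastore (so the catalogs are visible), and the `tf_experiment_` prefix is a hypothetical Terraform-leftover naming pattern.

```python
import json
import os
import urllib.request

# Hypothetical workspace; in practice set DATABRICKS_HOST / DATABRICKS_TOKEN
HOST = os.environ.get("DATABRICKS_HOST", "https://adb-0000000000000000.0.azuredatabricks.net")
TOKEN = os.environ.get("DATABRICKS_TOKEN", "")

def delete_catalog_url(host, catalog_name):
    # force=true drops the catalog even if it still contains schemas/tables
    return f"{host}/api/2.1/unity-catalog/catalogs/{catalog_name}?force=true"

def _call(url, method="GET"):
    req = urllib.request.Request(
        url, method=method, headers={"Authorization": f"Bearer {TOKEN}"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read() or b"{}")

def purge_catalogs(prefix):
    """Force-delete every catalog whose name starts with `prefix`."""
    listing = _call(f"{HOST}/api/2.1/unity-catalog/catalogs")
    for cat in listing.get("catalogs", []):
        if cat["name"].startswith(prefix):
            _call(delete_catalog_url(HOST, cat["name"]), method="DELETE")

if __name__ == "__main__":
    purge_catalogs("tf_experiment_")  # hypothetical prefix; review before running
```

Given that force-delete is irreversible, printing the matched names for review before issuing the DELETE calls would be a sensible extra step.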
by Sisi, New Contributor
  • 1203 Views
  • 1 reply
  • 1 kudos

VS Code - ipynb vs py execution - spark issue

Databricks Connect works inside a VS Code notebook, but the same code fails in a standalone script with ValueError: default auth: cannot configure default credentials. I’m developing locally with **Databricks Connect 16.1.6** and VS Code. Inside a Jupyter n...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @Sisi, I think what's happening here is that when you debug with the option "Debug current file with Databricks Connect", VS Code uses the Databricks extension, which automatically handles authentication and sets up the proper configuration. The regular Pyt...

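Following that explanation, the usual fix for the standalone-script case is to make the authentication explicit instead of relying on what the extension injects, for example by naming a profile from `~/.databrickscfg`. A minimal sketch, assuming databricks-connect is installed and a profile with host, token, and cluster/serverless settings exists (the profile name here is an assumption):

```python
def build_session(profile="my-workspace"):
    """Create a Databricks Connect session from an explicit ~/.databrickscfg
    profile, so plain `python script.py` works without the VS Code extension.
    The import is kept inside the function so this sketch stays inert when
    databricks-connect is not installed."""
    from databricks.connect import DatabricksSession
    return DatabricksSession.builder.profile(profile).getOrCreate()

# In the standalone script:
# spark = build_session()
# spark.sql("SELECT 1").show()
```

Setting `DATABRICKS_CONFIG_PROFILE` in the environment is an alternative that avoids touching the code at all.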
by Sharanya13, Contributor III
  • 2946 Views
  • 1 reply
  • 2 kudos

Resolved! Lakebase use cases

1. What are the use cases for Lakebase? When should I use Lakebase Postgres over Delta tables?
2. What are the differences between open-source Postgres and Lakebase?
3. Should I utilize Lakebase for all OLTP requirements?

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 2 kudos

Hi @Sharanya13, 1. Use Lakebase whenever you have an application workload (OLTP) and you require low latency. For analytical workloads, use the Lakehouse. Here you have a couple of example use cases from the documentation: serving data and/or features from the lake...

by carlosjuribe, New Contributor III
  • 2421 Views
  • 6 replies
  • 2 kudos

Out of memory error when installing environment dependencies of UC Python UDF

Hi, I've created a small UC Python UDF to test whether it works with custom dependencies (new PP feature), and every time I'm getting OOM errors with this message: [UDF_ENVIRONMENT_USER_ERROR.OUT_OF_MEMORY] Failed to install UDF dependencies for <cata...

Latest Reply
Khaja_Zaffer
Contributor III
  • 2 kudos

I tried with a cluster and spent a couple of hours trying to load some libraries, but was unable to. Maybe someone else can help you with this.

5 More Replies
by mbanxp, New Contributor III
  • 2207 Views
  • 5 replies
  • 4 kudos

Resolved! Metastore deletion issues

Good afternoon, I have an issue with my metastore in North Europe. All my workspaces got detached. If I go to the Databricks console, I can see the metastore in North Europe that I created. However, when I select the metastore in North Europe, I get the followin...

Latest Reply
mbanxp
New Contributor III
  • 4 kudos

I solved the issue by deleting all the assignments before deleting the metastore.
1. Access the Databricks CLI and authenticate
2. List metastores: >> databricks account metastores list
3. List workspaces and check assignments: >> databricks account worksp...

4 More Replies
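The CLI steps above can also be scripted with the `databricks-sdk` `AccountClient`, using the account-level Unity Catalog APIs: list the metastore's workspace assignments, delete each assignment, then force-delete the metastore. This is a sketch under assumptions: the `ACCOUNT` profile name and metastore ID are placeholders, it requires account-admin credentials, and method signatures should be checked against the installed SDK version.

```python
def delete_metastore_with_assignments(metastore_id, profile="ACCOUNT"):
    """Unassign a metastore from every workspace, then force-delete it.
    The import is kept inside the function so this sketch stays inert
    when databricks-sdk is not installed."""
    from databricks.sdk import AccountClient

    acct = AccountClient(profile=profile)
    # Each assignment ties the metastore to one workspace; remove them all first
    for workspace_id in acct.metastore_assignments.list(metastore_id=metastore_id):
        acct.metastore_assignments.delete(
            workspace_id=workspace_id, metastore_id=metastore_id
        )
    acct.metastores.delete(metastore_id=metastore_id, force=True)

# Usage (hypothetical ID):
# delete_metastore_with_assignments("11111111-2222-3333-4444-555555555555")
```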
by EjB, New Contributor
  • 1362 Views
  • 1 reply
  • 1 kudos

Drop schema or catalog using cascade function

Hello, in Databricks (non-Unity Catalog), I have two schemas (schema_a and schema_b) that both use the same root location in DBFS or external storage like ADLS. Example: abfss://container@storage_account.dfs.core.windows.net/data/project/schema_a, abfss:/...

Latest Reply
ilir_nuredini
Honored Contributor
  • 1 kudos

Hello @EjB, for the given example here is the response. Will DROP SCHEMA schema_a CASCADE remove or affect tables in schema_b? No, unless:
1. The tables in schema_a are managed tables, AND
2. Tables in schema_b store their data physically inside /schema_...

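The distinction in that reply comes down to whether schema_b's files physically sit under schema_a's root, since CASCADE only deletes managed data beneath the dropped schema's own location. A small, purely illustrative helper (not a Databricks API) for checking that, using the sibling layout from the question:

```python
from pathlib import PurePosixPath

def is_under(child, parent):
    """True if `child` lies inside `parent`. Paths are compared
    component-wise, so 'schema_ab' is not mistaken for being
    under 'schema_a' the way a raw prefix check would."""
    child_parts = PurePosixPath(child).parts
    parent_parts = PurePosixPath(parent).parts
    return child_parts[:len(parent_parts)] == parent_parts

# Sibling layout from the question: two schema roots under one project path
root = "data/project"
schema_a = f"{root}/schema_a"
schema_b = f"{root}/schema_b"

print(is_under(schema_b, schema_a))            # False: siblings, not nested
print(is_under(f"{schema_a}/tbl1", schema_a))  # True: managed table inside schema_a
```

With the layout shown, dropping schema_a with CASCADE only touches files under its own root, so schema_b's data survives; the danger case is when one schema's location is nested inside the other's.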
by Kutbuddin, New Contributor III
  • 2103 Views
  • 3 replies
  • 0 kudos

[INTERNAL_ERROR] Query could not be scheduled: HTTP Response code: 503. Please try again later

We have a Databricks job configured to run a dbt project. The dbt CLI compute cluster being used is serverless, with a serverless SQL warehouse. We encountered this error during a run (SQLSTATE: XX000). Any idea why this occurred?

Latest Reply
Amine8089
New Contributor II
  • 0 kudos

Hi, we are experiencing the same recurring HTTP errors throughout the day when executing queries on Databricks. The specific error message we receive is: "[INTERNAL_ERROR] Query could not be scheduled: HTTP Response code: 503. Please try again later. SQLS...

2 More Replies
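Since the 503 points to transient scheduling pressure on the warehouse rather than a bug in the query, a common client-side mitigation is to retry with exponential backoff. A minimal sketch (the error-matching strings and delay values are assumptions, not a Databricks-documented contract):

```python
import random
import time

def run_with_retry(execute, max_attempts=5, base_delay=2.0):
    """Call `execute()` and retry on transient 'could not be scheduled' /
    503 errors, backing off exponentially with jitter between attempts."""
    for attempt in range(1, max_attempts + 1):
        try:
            return execute()
        except Exception as exc:
            transient = ("503" in str(exc)
                         or "Query could not be scheduled" in str(exc))
            if not transient or attempt == max_attempts:
                raise
            # 2s, 4s, 8s, ... plus up to 100% jitter to avoid thundering herds
            time.sleep(base_delay * 2 ** (attempt - 1) * (1 + random.random()))

# Usage in a notebook or a wrapper around the dbt invocation:
# rows = run_with_retry(lambda: spark.sql("SELECT 1").collect())
```

For dbt specifically, raising the warehouse's concurrency/size or reducing parallel model threads attacks the same symptom from the server side.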
by antonionuzzo, New Contributor III
  • 2044 Views
  • 2 replies
  • 3 kudos

Resolved! System tables performance optimization

Hi all, are there any Databricks Labs projects or GitHub repositories that leverage system tables to provide dashboards or code for monitoring and, more importantly, for optimizing workflows and clusters based on usage?

Latest Reply
Sharanya13
Contributor III
  • 3 kudos

+1 to @szymon_dybczak. I would also add the dashboards for DB SQL Warehouse monitoring.

1 More Reply
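Beyond the dashboards mentioned above, a quick start is to query the billing system table directly. A sketch, assuming access to the `system` catalog has been granted; column names follow `system.billing.usage`:

```python
# DBU consumption per SKU over the last 30 days, from the billing system table
USAGE_BY_SKU = """
SELECT usage_date,
       sku_name,
       SUM(usage_quantity) AS dbus
FROM system.billing.usage
WHERE usage_date >= current_date() - INTERVAL 30 DAYS
GROUP BY usage_date, sku_name
ORDER BY usage_date, dbus DESC
"""

# In a notebook: display(spark.sql(USAGE_BY_SKU))
```

Joining against `system.billing.list_prices` converts the DBU totals into cost estimates, which is what most of the usage dashboards build on.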
by MBV3, Contributor
  • 3247 Views
  • 6 replies
  • 0 kudos

Unable to see sample data in Hive Metastore after moving to GCE

Hi, we have recently moved from GKE to GCE, and it is taking forever to load the sample data in the managed Delta tables. Even running simple SELECT SQL statements takes forever. Totally clueless here; any help will be appreciated. Thanks

Latest Reply
MBV3
Contributor
  • 0 kudos

Hi all, strangely, after struggling for 2 days we figured out that we can't run the cluster in scalable mode; after selecting single-node mode we are able to execute queries and jobs. It seems there is a bug in Databricks' GKE to GCE migration. Won...

5 More Replies