- 2825 Views
- 7 replies
- 0 kudos
Failing Power BI connection with Databricks via SQL warehouse
I'm encountering an 'Invalid credentials' error (Session ID: 4601-b5a6-0daf792752a2, Region: us) when connecting Power BI to an Azure Databricks SQL Warehouse using an SPN. The SPN has CAN MANAGE access on the SQL warehouse, admin rights at account and...
- 0 kudos
Hello Sai, sorry for the experience; I'm usually available at my desk during the EMEA time zone. Apologies for the delay. Can you please try this? In "Advanced settings" of the data source within the gateway configuration, set the "Connection Encryption se...
- 1277 Views
- 4 replies
- 0 kudos
Issue accessing databricks secrets from ADF
Hello - Seeing an issue where a notebook triggered from ADF is not able to access secret scopes, which was working earlier. Here are the steps I took: 1. Provided the ADF contributor role permission in the Databricks workspace - we tested this and were able to tri...
- 0 kudos
Probably you can assign the managed identity admin rights in the workspace. I believe the permissions are not set at the Key Vault. Use the code below: from databricks.sdk import WorkspaceClient; w = WorkspaceClient(); w.secrets.put_acl(scope={scope_name}, pe...
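For reference, the reply above can be fleshed out against the Secrets ACL REST endpoint (POST /api/2.0/secrets/acls/put), which takes exactly three fields; in the Python SDK the equivalent call lives under `WorkspaceClient().secrets` (plural). The scope and principal names below are placeholders, and this sketch only builds the request payload rather than calling a live workspace:

```python
def put_acl_payload(scope: str, principal: str, permission: str) -> dict:
    """Build the body for POST /api/2.0/secrets/acls/put.

    permission must be one of READ, WRITE, or MANAGE.
    """
    if permission not in {"READ", "WRITE", "MANAGE"}:
        raise ValueError(f"unknown permission: {permission}")
    return {"scope": scope, "principal": principal, "permission": permission}

# Placeholder scope and principal — substitute your secret scope name and
# the managed identity's application ID.
print(put_acl_payload("adf-secret-scope", "managed-identity-app-id", "READ"))
```

Sending this payload with any authenticated HTTP client (or passing the same three arguments to `w.secrets.put_acl(...)`) grants the principal READ on the scope.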
- 345 Views
- 0 replies
- 0 kudos
databricks bundle validate: Recommendation: permissions section should explicitly include the current deployment identity
Starting from 10/07/2025, the bundle validation step of my databricks bundle deploy fails with the following message: 2025-07-11T07:07:18.5175554Z Recommendation: permissions section should explicitly include the current deployment identity '***' or one of...
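Judging from the message text alone, the validator wants the identity that runs the deployment listed in the bundle's top-level `permissions` mapping. A minimal sketch (the service-principal placeholder is an assumption; substitute the actual deployment identity from the redacted '***'):

```yaml
# In databricks.yml — grant the deployment identity explicit rights on
# the bundle's resources so the validator's recommendation is satisfied.
permissions:
  - service_principal_name: "<deployment-identity-application-id>"
    level: CAN_MANAGE
```

Use `user_name` or `group_name` instead of `service_principal_name` if the deployment runs under a user or group identity.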
- 4767 Views
- 6 replies
- 3 kudos
Unable to access Databricks Volume from job triggered via API (Container Services)
Hi everyone,We’re facing a strange issue when trying to access a Databricks Volume from a job that is triggered via the Databricks REST API (not via Workflows). These jobs are executed using container services, which may be relevant, perhaps due to i...
- 3 kudos
Check also whether the cluster used to run the job has the right access to the specific UC Volume.
- 1160 Views
- 2 replies
- 1 kudos
Unity Catalog system tables (table_lineage, column_lineage) not populated
Hi community,We have enabled Unity Catalog system schemas (including `access`) more than 24 hours ago in sandbox. The schemas are showing ENABLE_COMPLETED, and other system tables (like query) are working fine.However, both `system.access.table_linea...
- 1 kudos
I'm also running into a similar issue where our system.access.column_lineage and system.access.table_lineage tables are no longer visible.
- 1109 Views
- 1 reply
- 0 kudos
Looking for insights on enabling Databricks Automatic Provisioning
We currently have a SCIM provisioning connector set up to synchronize identities from Entra ID to Unity Catalog.We’re now considering enabling Databricks Automatic Provisioning but want to fully understand the potential impact on our environment befo...
- 0 kudos
Hello Xaveri, good day! Here are a few links related to Databricks provisioning: https://docs.databricks.com/aws/en/admin/users-groups/scim/aad https://www.databricks.com/blog/announcing-automatic-identity-management-azure-databricks But do let me k...
- 1805 Views
- 4 replies
- 4 kudos
Privileged Identity Management for Databricks with Microsoft Entra ID
Privileged Identity Management (PIM) can be used to secure access to critical Databricks roles with Just-in-Time (JIT) access. This approach helps organizations enforce time-bound permissions, approval workflows, and centralized auditing for sensitiv...
- 1238 Views
- 2 replies
- 0 kudos
Delete unassigned catalogs
Hi everybody, due to some not-so-optimal Infrastructure as Code experiments with Terraform, I ended up with a lot (triple digits) of catalogs in a metastore that are not assigned to any workspace and that I want to delete. Unfortunately, there is no way to ev...
- 0 kudos
Yeah, I see those catalogs and I know that I could reattach and delete them. As I have around 100 of those catalogs, it would be nice to iterate through them by getting a list, e.g. using the CLI or the REST API, and then force delete them, as described...
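The iteration the reply asks for can be sketched with the CLI's JSON output. This is a sketch under assumptions: the throwaway-catalog prefix `tf_test_` is hypothetical, the listing is assumed to be a top-level JSON array, and the destructive delete is left commented out so nothing is dropped by accident:

```python
import json
import subprocess


def leftover_catalogs(listing_json: str, prefix: str) -> list:
    """Pick catalog names matching a throwaway prefix from a JSON listing.

    Assumes the listing is a JSON array of objects with a "name" field,
    as printed by `databricks catalogs list --output json`.
    """
    return [c["name"] for c in json.loads(listing_json) if c["name"].startswith(prefix)]


def drop_leftovers(prefix: str, dry_run: bool = True) -> None:
    """List catalogs via the CLI and (optionally) force-delete the matches."""
    out = subprocess.run(
        ["databricks", "catalogs", "list", "--output", "json"],
        capture_output=True, text=True, check=True,
    ).stdout
    for name in leftover_catalogs(out, prefix):
        if dry_run:
            print("would delete:", name)
        else:
            # --force also removes catalogs that still contain objects.
            subprocess.run(["databricks", "catalogs", "delete", name, "--force"], check=True)
```

Run `drop_leftovers("tf_test_")` first to review the dry-run list, then flip `dry_run=False` once the names look right.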
- 1015 Views
- 1 reply
- 1 kudos
VS Code - ipynb vs py execution - spark issue
Databricks Connect works inside a VS Code notebook, but the same code fails in a standalone script with ValueError: default auth: cannot configure default credentials. I'm developing locally with **Databricks Connect 16.1.6** and VS Code. Inside a Jupyter n...
- 1 kudos
Hi @Sisi, I think what's happening here is that when you debug with the option "Debug current file with Databricks Connect", VS Code is using the Databricks extension, which automatically handles authentication and sets up the proper configuration. The regular Pyt...
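In other words, "default auth" resolves from the environment, so a standalone script needs either a config profile or host-plus-token exported before the session is created. An illustrative check (not Databricks Connect itself; the env var names `DATABRICKS_CONFIG_PROFILE`, `DATABRICKS_HOST`, and `DATABRICKS_TOKEN` are the standard unified-auth variables):

```python
import os


def default_auth_available(env: dict) -> bool:
    """True if Databricks default auth has something to resolve from:
    either a named config profile, or an explicit host + token pair."""
    return bool(env.get("DATABRICKS_CONFIG_PROFILE")) or (
        bool(env.get("DATABRICKS_HOST")) and bool(env.get("DATABRICKS_TOKEN"))
    )


# Check the current shell before building a DatabricksSession in a script.
print(default_auth_available(dict(os.environ)))
```

If this prints False in the shell where the standalone script fails, exporting `DATABRICKS_CONFIG_PROFILE` (or host and token) before running should reproduce what the extension sets up for you.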
- 796 Views
- 1 reply
- 1 kudos
Resolved! Lakebase use cases
1. What are the use cases for Lakebase? When should I use Lakebase Postgres over Delta tables? 2. What are the differences between open-source Postgres and Lakebase? 3. Should I utilize Lakebase for all OLTP requirements?
- 1 kudos
Hi @Sharanya13, 1. Use Lakebase whenever you have an application workload (OLTP) and you require low latency. For analytical workloads use the Lakehouse. Here you have a couple of example use cases from the documentation: Serving data and/or features from the lake...
- 1991 Views
- 6 replies
- 2 kudos
Out of memory error when installing environment dependencies of UC Python UDF
Hi,I've created a small UC Python UDF to test whether it works with custom dependencies (new PP feature), and every time I'm getting OOM errors with this message: [UDF_ENVIRONMENT_USER_ERROR.OUT_OF_MEMORY] Failed to install UDF dependencies for <cata...
- 2 kudos
I tried with a cluster and spent a couple of hours trying to load some libraries but was unable to. Maybe someone else can help you with this.
- 987 Views
- 2 replies
- 3 kudos
Resolved! Markdown Cells Do Not Render Consistently
When I am creating a notebook in the UI editor on Databricks, markdown cells do not always render after I run them. They still appear in 'editing mode'. See the screenshot below; it should have rendered an H1. Again, this behavior is not consistent. So...
- 3 kudos
Hi @bdanielatl, Thank you for reporting the issue with markdown cells not rendering consistently. This appears to be a known issue that has been encountered by other users as well. I will report it internally.
- 1765 Views
- 5 replies
- 4 kudos
Resolved! Metastore deletion issues
Good afternoon, I have an issue with my metastore in North Europe. All my workspaces got detached. If I go to the Databricks console, I can see the metastore in North Europe I created. However, when I select the metastore in North Europe, I get the followin...
- 4 kudos
I solved the issue by deleting all the assignments before deleting the metastore. 1. Access the Databricks CLI and authenticate. 2. List metastores: databricks account metastores list. 3. List workspaces and check assignments: databricks account worksp...
- 1025 Views
- 1 reply
- 1 kudos
Drop schema or catalog using cascade function
Hello, in Databricks (non-Unity Catalog) I have two schemas (schema_a and schema_b) that both use the same root location in DBFS or external storage like ADLS. Example: abfss://container@storage_account.dfs.core.windows.net/data/project/schema_a and abfss:/...
- 1 kudos
Hello @EjB, for the given example, here is the response: Will DROP SCHEMA schema_a CASCADE remove or affect tables in schema_b? No, unless: 1. the tables in schema_a are managed tables, AND 2. tables in schema_b store their data physically inside /schema_...
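The risk the answer describes boils down to path containment: CASCADE on a managed schema deletes files under that schema's root, so another schema is only affected if its table paths sit inside that root. An illustrative helper (not a Databricks API) for checking this, using example paths modeled on the question:

```python
from pathlib import PurePosixPath


def is_under(table_path: str, schema_root: str) -> bool:
    """True if table_path equals schema_root or lies anywhere beneath it.

    Paths are compared purely lexically, so both arguments must use the
    same URI scheme and account/container spelling.
    """
    table = PurePosixPath(table_path)
    root = PurePosixPath(schema_root)
    return root == table or root in table.parents


schema_a = "abfss://container@storage_account.dfs.core.windows.net/data/project/schema_a"
schema_b_tbl = "abfss://container@storage_account.dfs.core.windows.net/data/project/schema_b/t1"

# schema_b's data lives beside, not inside, schema_a's root, so a CASCADE
# drop of schema_a would not touch it.
print(is_under(schema_b_tbl, schema_a))  # False
```

If the two schemas instead pointed at the *identical* root (not sibling folders), every table path in schema_b would be under schema_a's root and the CASCADE delete would reach it.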
- 392 Views
- 0 replies
- 0 kudos
Asset Bundle Include Glob paths not resolving recursive directories
Hello, when trying to include resource definitions in nested YAML files, the recursive paths I am specifying in the include section are not resolving as would be expected. With the include path resources/**/*.yml and a directory structure as ...
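The bundle CLI's glob resolution is its own implementation, but as a point of comparison, here is how Python's stdlib glob treats the same `resources/**/*.yml` pattern, where `**` (with `recursive=True`) matches zero or more intermediate directories, so both top-level and deeply nested files are picked up. The directory names are made up for the demo:

```python
import glob
import os
import tempfile


def expand(root: str, pattern: str = "resources/**/*.yml") -> list:
    """Resolve the glob relative to root; `**` matches zero or more dirs."""
    matches = glob.glob(os.path.join(root, pattern), recursive=True)
    return sorted(os.path.relpath(p, root) for p in matches)


with tempfile.TemporaryDirectory() as root:
    os.makedirs(os.path.join(root, "resources", "jobs", "etl"))
    for rel in ("resources/top.yml", "resources/jobs/etl/job.yml"):
        open(os.path.join(root, *rel.split("/")), "w").close()
    # Both the top-level and the nested file match the recursive pattern.
    print(expand(root))
```

If the bundle CLI resolves fewer files than this baseline for the same tree, that difference is worth including in the report.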
Labels:
- Access control (1)
- Apache spark (1)
- AWS (5)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta (4)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (38)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)