- 2366 Views
- 2 replies
- 1 kudos
Security Considerations for Using OAuth Secrets with a Service Principal to Authenticate to Databricks
What security considerations do we need to keep in mind when we want to use OAuth secrets with a service principal to access Azure Databricks, when identity federation is disabled and the workspace is not yet onboarded to Unity Catalog? Can we co...
Any updates on this? Also struggling with the OAuth security considerations, specifically with updating the OAuth secrets. Currently using a SP to access the Databricks workspace for DevOps purposes through the Databricks CLI. I have the SP set up to renew ...
- 1 kudos
- 1642 Views
- 4 replies
- 0 kudos
Database Error in model: Couldn't initialize file system for path abfss://
Recently the following error occurs when running dbt: Database Error in model un_unternehmen_sat (models/2_un/partner/sats/un_unternehmen_sat.sql) Couldn't initialize file system for path abfss://dp-ext-fab@stcssdpextfabprd.dfs.core.windows.net/__unitys...
Hi @Th0r, here is the explanation: Shallow clones in Databricks rely on references to the data files of the original table. If the original table is dropped, recreated, or altered in a way that changes its underlying files, the shallow clone’s references ...
- 0 kudos
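To make the mechanism above concrete, here is a minimal sketch in Databricks SQL (table names are hypothetical):

```sql
-- A shallow clone copies only metadata; it points at the source table's
-- data files rather than copying them. Table names are placeholders.
CREATE TABLE dev.sales_clone SHALLOW CLONE prod.sales;

-- If prod.sales is later dropped, recreated, or rewritten (overwrite,
-- VACUUM removing referenced files, etc.), queries against
-- dev.sales_clone can fail because the files it references are gone.
SELECT COUNT(*) FROM dev.sales_clone;
```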
- 725 Views
- 0 replies
- 1 kudos
Databricks Asset Bundles: a new way for amazing ETL
They say having the right tools at your disposal can make all the difference when navigating complex terrain. For organizations leveraging Databricks, simplifying deployment and scaling operations is often a key challenge. Over the years, I’ve explor...
- 3126 Views
- 2 replies
- 0 kudos
UserAgentEntry added to JDBC URL but not visible in Audit logs
Hi, as part of Databricks best practices, I have added 'UserAgentEntry' to the JDBC URL that is created when we execute SQL statements through the JDBC driver. Sample URL: jdbc:databricks://<host>:443;httpPath=<httpPath>;AuthMech=3;UID=token;...
Sorry, I was mistaken; please ignore the previous response. The correct one is jdbc:databricks://<host>:443;httpPath=<httpPath>;AuthMech=3;UID=token;PWD=<token>;UserAgentEntry=<ApplicationName/Year>;
- 0 kudos
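For readers assembling this URL programmatically, here is a minimal sketch; the host, HTTP path, token, and application name below are placeholders, not real values:

```python
# Build a Databricks JDBC URL that includes a UserAgentEntry so that
# requests can be attributed to a specific application.
# All concrete values below are placeholders.
def build_jdbc_url(host: str, http_path: str, token: str, user_agent: str) -> str:
    params = {
        "httpPath": http_path,
        "AuthMech": "3",               # token-based auth: UID=token, PWD=<PAT>
        "UID": "token",
        "PWD": token,
        "UserAgentEntry": user_agent,  # e.g. "MyApp/2024"
    }
    query = ";".join(f"{k}={v}" for k, v in params.items())
    return f"jdbc:databricks://{host}:443;{query}"

url = build_jdbc_url(
    "example.cloud.databricks.com",
    "/sql/1.0/warehouses/abc123",
    "dapiXXXX",
    "MyApp/2024",
)
print(url)
```

Note that this only shows where the parameter goes in the URL; whether the entry then surfaces in audit logs is the separate question discussed in the thread above.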
- 1864 Views
- 2 replies
- 1 kudos
Resolved! Exhausted Server when deploying a Databricks Assets Bundle (DAB)
Hello, a colleague and I are currently inspecting the code, and when trying to deploy the DAB it gets stuck: (.venv) my_user@my_pc my-dab-project % databricks bundle deploy -t=dev -p=my-dab-project-prod Building wheel... Uploading my-dab-project-...
You are using a venv; it contains too many files and does not need to be included. Try adding this to your databricks.yml: sync: exclude: - "venv". Hope it helps.
- 1 kudos
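Reconstructed as a databricks.yml fragment, the suggestion above looks like this (the directory names are assumptions; match them to your project's venv folder):

```yaml
# databricks.yml — exclude the local virtual environment from bundle sync,
# so `databricks bundle deploy` does not try to upload thousands of files.
# Directory names below are placeholders for your project layout.
sync:
  exclude:
    - ".venv"
    - "venv"
```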
- 1771 Views
- 3 replies
- 0 kudos
Delete the AWS Databricks account
I created an AWS Databricks account from the AWS Marketplace, but I cancelled the subscription after the 14-day free trial. However, I still see the account. How can I delete the Databricks account associated with my ema...
@dhruv1 As mentioned, it would be best to reach out to support for assistance: https://help.databricks.com/s/signuprequest
- 0 kudos
- 5921 Views
- 3 replies
- 1 kudos
Delta Live Table pipeline steps explanation
Does anyone have documentation on what is actually occurring in each of these steps? Creating update, Waiting for resources, Initializing, Setting up tables, Rendering graph. For example, what is the difference between initializing and setting up tables? I am ...
Yes, loading data (full refresh/refresh) into all streaming tables and refreshing materialized views are part of the "Setting up tables" step in a Delta Live Tables (DLT) pipeline when running in triggered mode. In triggered mode, materialized views are ...
- 1 kudos
- 1795 Views
- 1 replies
- 0 kudos
How to use Managed Identity within Azure Databricks to access a Blob Container?
Hi, my organization has asked that all blob storage accounts be accessed via managed identity. Several Databricks notebooks are affected, so I'm currently trying to see how to set up a managed identity. We've added the Databricks resource provider to t...
Have you followed the instructions available in the docs? https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/azure-managed-identities
- 0 kudos
- 4747 Views
- 1 replies
- 0 kudos
Resolved! Azure SCIM Provisioning Failures due to Resource Exhaustion
I had to make a significant change to the group membership of an Entra SCIM-provisioned group in Databricks, and the connector removed ALL users from the group (and synced); then, when I fixed it by adding a replacement dynamic group (with about 30% ...
Hi @JSilverberg, this occurs when the number of requests goes beyond the rate limits mentioned here: https://learn.microsoft.com/en-us/azure/databricks/resources/limits#:~:text=Identity,No. To prevent these issues in the future, it is best to do thes...
- 0 kudos
- 6865 Views
- 1 replies
- 0 kudos
Databricks Architect certification
Hello team, I am planning to pursue the Databricks Architect certification. Could you please let me know which certification I should opt for? If you have any study material or relevant links, kindly share them. Your support would be highly appreciated....
The Architect credential is an accreditation, not a certification. Accreditations are less rigorous and less expensive than certifications. You didn't say which platform you are on, so here are links to the learning plans (which include the exams) for...
- 0 kudos
- 2383 Views
- 1 replies
- 1 kudos
Pros and cons of putting all Databricks workspaces (dev, qa, prod) under one metastore
Hi there, if we have separate workspaces for each environment, how should we go about structuring the metastore? What are the pros and cons of putting all workspaces under one metastore instead of having a separate metastore for each? Tha...
Hello Fatima, many thanks for your question. Please first note that if all the workspaces belong to the same account ID and are in the same cloud region, they will all need to be associated with the same metastore, as you can only have one metastore per...
- 1 kudos
- 10039 Views
- 7 replies
- 9 kudos
Resolved! Databricks best practices for Azure storage account
Hello everyone, we are currently in the process of building Azure Databricks and have some doubts regarding best practices to follow for the Azure storage account we will use to store data. Can anyone help me find best practices to follow for sto...
Thanks for sharing @Rjdudley @szymon_dybczak @filipniziol
- 9 kudos
- 1220 Views
- 3 replies
- 1 kudos
Resolved! How to add existing recipient to existing delta share
I created a recipient in the Databricks console and also set up a Delta Share. Now, I’d like to link this existing recipient to the Delta Share. Is there a way to accomplish this using Terraform?
Hi @Naïm, thanks for your response. It seems your answer is helping me, but I'm facing another issue. The owner of my recipient is a group, not an individual user. I'm running this Terraform script using a service principal that is a member of that gr...
- 1 kudos
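A sketch of how linking an existing recipient to an existing share can be expressed with the Databricks Terraform provider, using `databricks_grants` to give the recipient SELECT on the share (the share and recipient names below are hypothetical):

```hcl
# Grant an existing recipient access to an existing Delta Share.
# "my_share" and "my_recipient" are placeholder names; use the names
# of the objects you already created in the Databricks console.
resource "databricks_grants" "share_to_recipient" {
  share = "my_share"

  grant {
    principal  = "my_recipient" # name of the existing recipient
    privileges = ["SELECT"]
  }
}
```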
- 2761 Views
- 3 replies
- 0 kudos
Resolved! Control plane set-up
Dear all, in this video from Databricks, Azure Databricks Security Best Practices (https://www.youtube.com/watch?v=R1X8ydIR_Bc&t=623s), during 13:25 - 14:35 the presenter talks about the benefits of private endpoints. He makes the ...
Hi @noorbasha534, "Does this control plane then contain management services for several customers?" - Yes, the control plane has management services that are used across customers in the region, which is why the presenter says traffic can be isolated fro...
- 0 kudos
- 4703 Views
- 5 replies
- 1 kudos
policy_id in Databricks Asset Bundle workflow
We are using Databricks Asset Bundles for code deployment, and the biggest issue I am facing is that the policy_id is different in each environment. I tried with environment variables in Azure DevOps and also with declaring the variables in databricks.yaml and ...
Solved by the lookup function: https://docs.databricks.com/en/dev-tools/bundles/variables.html#retrieve-an-objects-id-value
- 1 kudos
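For reference, the lookup approach from the linked docs resolves an object's ID by name at deploy time, so the policy name can stay constant while its ID differs per environment. A sketch as a databricks.yml fragment (the policy, variable, and job names are hypothetical):

```yaml
# databricks.yml — resolve a cluster policy ID by display name at deploy
# time, instead of hard-coding a per-environment ID.
# "my-cluster-policy" and the job/cluster names are placeholders.
variables:
  policy_id:
    description: Cluster policy ID looked up by name
    lookup:
      cluster_policy: "my-cluster-policy"

resources:
  jobs:
    my_job:
      job_clusters:
        - job_cluster_key: main
          new_cluster:
            policy_id: ${var.policy_id}
```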
Labels: Access control (1), Apache spark (1), Azure (7), Azure databricks (5), Billing (2), Cluster (1), Compliance (1), Data Ingestion & connectivity (5), Databricks Runtime (1), Databricks SQL (2), DBFS (1), Dbt (1), Delta Sharing (1), DLT Pipeline (1), GA (1), Gdpr (1), Github (1), Partner (57), Public Preview (1), Service Principals (1), Unity Catalog (1), Workspace (2)