Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

Redford
by New Contributor
  • 4818 Views
  • 3 replies
  • 1 kudos

Can I configure Notebook Result Downloads with the Databricks CLI, API, or Terraform provider?

I'm a Databricks admin and I'm looking for a solution to automate some workspace security settings. Those are: Notebook result download, SQL result download, Notebook table clipboard features. I can't find these options in the Databricks Terraform provider, Da...

Latest Reply
nkraj
Databricks Employee
  • 1 kudos

Hi @Redford, with the Databricks API you have the capability to toggle the following features: Notebook result download (key name: enableResultsDownloading), Notebook table clipboard features (key name: enableNotebo...

2 More Replies
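For anyone scripting this, a minimal sketch of toggling the setting through the workspace configuration REST API (PATCH /api/2.0/workspace-conf) follows; the key name enableResultsDownloading comes from the reply above, while the host and token values are placeholders.

# Minimal sketch: toggle a workspace security setting via the workspace-conf API.
# HOST and TOKEN are placeholders; values are passed as the strings "true"/"false".
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"   # placeholder
TOKEN = "<admin-personal-access-token>"                   # placeholder

def set_workspace_conf(settings: dict) -> None:
    """PATCH /api/2.0/workspace-conf with a map of setting keys to string values."""
    resp = requests.patch(
        f"{HOST}/api/2.0/workspace-conf",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json=settings,
    )
    resp.raise_for_status()

def get_workspace_conf(keys: list) -> dict:
    """GET /api/2.0/workspace-conf?keys=k1,k2 to read the current values back."""
    resp = requests.get(
        f"{HOST}/api/2.0/workspace-conf",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"keys": ",".join(keys)},
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    set_workspace_conf({"enableResultsDownloading": "false"})
    print(get_workspace_conf(["enableResultsDownloading"]))

The same pattern applies to the other keys mentioned in the thread, such as the notebook table clipboard flag.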
jakubk
by Contributor
  • 1493 Views
  • 7 replies
  • 0 kudos

Unknown geo-redundancy storage events (& costs) in the Azure Databricks resource group

Hi all, I'm after some guidance on how to identify massive (100000%) spikes in bandwidth usage (and related costs) in the Azure Databricks provisioned/managed resource group storage account and stop them. These blips are adding 30-50% to our monthly costs...

[Attachment: jakubk_0-1733707523442.png]
Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Thanks for opening a case with us, we will have a look at it.

6 More Replies
F_Goudarzi
by New Contributor III
  • 1075 Views
  • 6 replies
  • 0 kudos

Unity Catalog metastore is created within an undesired storage account

I came to know that our Unity Catalog metastore has been created in the default storage account of our Databricks workspace, and this storage account has some system-denied access policies; therefore we don't have access to see the data inside. I'm w...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

You will need to back up the current metastore, including the metadata, and then recreate the catalogs, schemas, and tables on the new metastore.

5 More Replies
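As a rough sketch of the recreation step the reply describes, assuming the old tables' Delta files have been backed up to a cloud path the new metastore can read; the catalog, schema, and table names and the backup path below are hypothetical.

# Minimal sketch, run from a workspace attached to the new metastore.
# All names and the backup path are hypothetical placeholders.
backup_root = "abfss://backup@<new-storage-account>.dfs.core.windows.net"

spark.sql("CREATE CATALOG IF NOT EXISTS sales")
spark.sql("CREATE SCHEMA IF NOT EXISTS sales.bronze")

for table in ["orders", "customers"]:
    # Re-create each table from its backed-up Delta files.
    spark.sql(
        f"CREATE TABLE IF NOT EXISTS sales.bronze.{table} "
        f"AS SELECT * FROM delta.`{backup_root}/bronze/{table}`"
    )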
santosh23
by New Contributor III
  • 990 Views
  • 10 replies
  • 0 kudos

Update existing Metastores in AWS databricks

Hello team, I am unable to update the existing metastore in my AWS Databricks. I have a new AWS account and I am trying to update my existing workspace; however, I am unable to update the S3 bucket details and network configuration (greyed out) in the...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Unfortunately, there is no way to move the state of the workspace manually, so the solution will be to recreate the workspace and migrate the data.

9 More Replies
Takuya-Omi
by Valued Contributor III
  • 1205 Views
  • 3 replies
  • 1 kudos

How Can a Workspace Admin Grant Workspace Admin Permissions to a Group?

I want to grant Workspace Admin permissions to a group instead of individual users, but I haven’t found a way to do this. I considered assigning permissions by adding the group to the Databricks-managed 'admins' group (establishing a parent-child rel...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

No problem! I will check internally if there is any feature request of this nature. You can use the "admins" group for adding admin users or SPs.

2 More Replies
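A minimal sketch of what adding an admin user or service principal to the admins group can look like via the workspace SCIM API; the host, token, and member id are placeholders and error handling is omitted.

# Minimal sketch using the workspace SCIM API; HOST, token, and member id are placeholders.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"   # placeholder
HEADERS = {"Authorization": "Bearer <admin-token>"}       # placeholder

# 1) Look up the workspace-local 'admins' group.
groups = requests.get(
    f"{HOST}/api/2.0/preview/scim/v2/Groups",
    headers=HEADERS,
    params={"filter": 'displayName eq "admins"'},
).json()["Resources"]
admins_id = groups[0]["id"]

# 2) Add a user or service principal (by its SCIM id) as a member of that group.
requests.patch(
    f"{HOST}/api/2.0/preview/scim/v2/Groups/{admins_id}",
    headers=HEADERS,
    json={
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [{"op": "add", "path": "members",
                        "value": [{"value": "<user-or-sp-scim-id>"}]}],
    },
).raise_for_status()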
JissMathew
by Valued Contributor
  • 904 Views
  • 2 replies
  • 1 kudos

Resolved! DLT TABLES schema mapping

How do we map the tables in a Delta Live Tables pipeline to bronze, silver, and gold schemas? Is it possible to store the DLT tables in different schemas?

Latest Reply
JissMathew
Valued Contributor
  • 1 kudos

@Walter_C  Thank you 

1 More Replies
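As a hedged illustration of one way DLT tables can land in different schemas, assuming a Unity Catalog pipeline that accepts fully qualified names in the table name argument; the schema, table, and source path names below are hypothetical.

# Minimal sketch; assumes a Unity Catalog DLT pipeline whose default catalog is set
# in the pipeline settings and that accepts qualified table names. Names and paths
# below are hypothetical.
import dlt
from pyspark.sql import functions as F

@dlt.table(name="bronze.raw_orders", comment="Raw data landed in the bronze schema")
def raw_orders():
    return spark.read.json("/Volumes/main/landing/orders")   # hypothetical source path

@dlt.table(name="silver.clean_orders", comment="Cleaned data in the silver schema")
def clean_orders():
    # Reference the upstream table by the name given above.
    return dlt.read("bronze.raw_orders").where(F.col("order_id").isNotNull())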
TatiMun
by New Contributor II
  • 648 Views
  • 2 replies
  • 0 kudos

AzureDevOps Repos Databricks update via pipeline not working

Hi all, I'm working with Azure DevOps and Databricks, using an app registration which has permission on Azure DevOps and, inside Databricks, is set up as manager, user, and in the admins group, so it has permission over the repos. I'm building a pipeline to update or c...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Hello @TatiMun, thanks for your question. Can we review the following: Verify remote URL: double-check that the remote Git repo URL associated with the REPO_ID in Databricks is correct and accessible. Check PAT permissions: ensure that the Personal Acc...

1 More Replies
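For context, a minimal sketch of the Repos update call such a pipeline typically makes (PATCH /api/2.0/repos/{repo_id}); the host, token, repo id, and branch name are placeholders.

# Minimal sketch of updating a Databricks repo to the latest commit of a branch.
# HOST, TOKEN, REPO_ID, and the branch are placeholders; the token's identity must
# have access to both the repo object in Databricks and the Git provider.
import requests

HOST = "https://adb-<workspace-id>.<region>.azuredatabricks.net"   # placeholder
TOKEN = "<token-for-the-app-registration>"                         # placeholder
REPO_ID = "<repo-id>"                                              # placeholder

resp = requests.patch(
    f"{HOST}/api/2.0/repos/{REPO_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"branch": "main"},   # pull the branch's latest commit into the repo
)
resp.raise_for_status()
print(resp.json())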
VJ3
by Contributor
  • 1870 Views
  • 2 replies
  • 1 kudos

Security considerations for OAuth secrets when using a service principal to authenticate with Databricks

What are the security considerations we need to keep in mind when we want to use OAuth secrets with a service principal to access Azure Databricks when identity federation is disabled and the workspace is not yet onboarded onto Unity Catalog? Can we co...

Latest Reply
Rob_Lemmens
New Contributor III
  • 1 kudos

Any updates on this? Also struggling with the OAuth security considerations, specifically with updating the OAuth secrets. Currently using an SP to access the Databricks workspace for DevOps purposes through the Databricks CLI. I have the SP set up to renew ...

1 More Replies
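Since the thread concerns OAuth secrets for a service principal, here is a minimal sketch of the client-credentials token exchange against the workspace's /oidc/v1/token endpoint; the host, client id, and secret are placeholders and should come from a secret store rather than source code.

# Minimal sketch of the OAuth machine-to-machine (client credentials) flow.
# HOST, CLIENT_ID, and CLIENT_SECRET are placeholders; rotate the secret regularly.
import requests

HOST = "https://<your-workspace>.azuredatabricks.net"   # placeholder
CLIENT_ID = "<service-principal-application-id>"        # placeholder
CLIENT_SECRET = "<oauth-secret-from-a-secret-store>"    # placeholder

resp = requests.post(
    f"{HOST}/oidc/v1/token",
    auth=(CLIENT_ID, CLIENT_SECRET),
    data={"grant_type": "client_credentials", "scope": "all-apis"},
)
resp.raise_for_status()
access_token = resp.json()["access_token"]   # short-lived; request a fresh one per run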
Th0r
by New Contributor II
  • 1048 Views
  • 4 replies
  • 0 kudos

Database Error in model Couldn't initialize file system for path abfss://

Recently the following error occurs when running dbt: Database Error in model un_unternehmen_sat (models/2_un/partner/sats/un_unternehmen_sat.sql): Couldn't initialize file system for path abfss://dp-ext-fab@stcssdpextfabprd.dfs.core.windows.net/__unitys...

Latest Reply
filipniziol
Esteemed Contributor
  • 0 kudos

Hi @Th0r, here is the explanation: shallow clones in Databricks rely on references to the data files of the original table. If the original table is dropped, recreated, or altered in a way that changes its underlying files, the shallow clone's references ...

3 More Replies
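A small illustration of the dependency the reply describes, with hypothetical table names: a shallow clone only records pointers to the source table's data files, while a deep clone copies them.

# Minimal sketch; table names are hypothetical.
# A shallow clone references the source table's current data files.
spark.sql("CREATE OR REPLACE TABLE dev.analytics.orders_shallow "
          "SHALLOW CLONE prod.analytics.orders")

# If the source is later dropped, rewritten, or vacuumed, reads of the shallow clone
# can fail with file-system errors like the one above. A deep clone materializes its
# own copy of the files and keeps working.
spark.sql("CREATE OR REPLACE TABLE dev.analytics.orders_deep "
          "DEEP CLONE prod.analytics.orders")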
Databricks24
by New Contributor
  • 2695 Views
  • 2 replies
  • 0 kudos

UserAgentEntry added to JDBC URL but not visible in Audit logs

Hi, as part of Databricks best practices, I have added 'UserAgentEntry' to the JDBC URL that is created when we are executing SQL statements through the JDBC driver. Sample URL: jdbc:databricks://<host>:443;httpPath=<httpPath>; AuthMech=3;UID=token;...

Latest Reply
satishdatagaps
New Contributor II
  • 0 kudos

Sorry, I was mistaken. Please ignore the previous response. The correct one is: jdbc:databricks://<host>:443;httpPath=<httpPath>; AuthMech=3;UID=token;PWD=<token>;UserAgentEntry=<ApplicationName/Year>;

1 More Replies
PabloCSD
by Valued Contributor II
  • 1227 Views
  • 2 replies
  • 1 kudos

Resolved! Exhausted Server when deploying a Databricks Assets Bundle (DAB)

Hello, I'm currently inspecting the code with a colleague, and when trying to deploy the DAB it gets stuck: (.venv) my_user@my_pc my-dab-project % databricks bundle deploy -t=dev -p=my-dab-project-prod Building wheel... Uploading my-dab-project-...

Latest Reply
TinSlim
New Contributor III
  • 1 kudos

You are using a venv; the venv has too many files and does not need to be included. Try adding this to your databricks.yml:

sync:
  exclude:
    - "venv"

Hope it helps.

1 More Replies
dhruv1
by New Contributor II
  • 1130 Views
  • 3 replies
  • 0 kudos

Delete the AWS Databricks account

I have created the AWS Databricks account from the AWS Marketplace, but I cancelled the subscription after the 14-day free trial from the Marketplace. However, I still see the account. How will I delete the Databricks account associated with my ema...

Latest Reply
Takuya-Omi
Valued Contributor III
  • 0 kudos

@dhruv1 As mentioned, it would be best to reach out to support for assistance: https://help.databricks.com/s/signuprequest

2 More Replies
ac0
by Contributor
  • 3585 Views
  • 3 replies
  • 1 kudos

Delta Live Table pipeline steps explanation

Does anyone have documentation on what is actually occurring in each of these steps? Creating update, Waiting for resources, Initializing, Setting up tables, Rendering graph. For example, what is the difference between Initializing and Setting up tables? I am ...

Latest Reply
Mounika_Tarigop
Databricks Employee
  • 1 kudos

Yes, loading data (full refresh/refresh) into all streaming tables and refreshing materialized views are part of the "Setting up tables" step in a Delta Live Tables (DLT) pipeline when running in triggered mode. In triggered mode, materialized views are ...

2 More Replies
Dex
by New Contributor
  • 841 Views
  • 1 reply
  • 0 kudos

How to use Managed Identity within Azure Databricks to access a Blob Container?

Hi, my organization has asked that all blob storage accounts be accessed via managed identity. Several Databricks notebooks are affected, so I'm currently trying to see how to set up a managed identity. We've added the Databricks resource provider to t...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Have you followed the instructions available in the docs? https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/azure-managed-identities
