Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

SumeshD
by New Contributor II
  • 857 Views
  • 2 replies
  • 1 kudos

Errors on Databricks and Terraform upgrades for DABS

Hi All, I've recently upgraded my Databricks CLI to 0.235 and the Terraform provider to 1.58 locally on my machine, and my DABs deployments have broken. They worked with previous versions, and now I can't even run terraform -v. The command Data...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

Hi @SumeshD, Can you run databricks bundle debug terraform to obtain more details on the failure? The error messages you are encountering, such as "ConflictsWith skipped for [task.0.for_each_task.0.task.0.new_cluster.0.aws_attributes task.0.for_each_...

1 More Replies
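The debug command suggested in the reply can also be run from a script. A minimal sketch, assuming only that the Databricks CLI is installed and on PATH:

```python
import shutil
import subprocess

# The exact command the reply suggests for more details on the failure.
cmd = ["databricks", "bundle", "debug", "terraform"]

if shutil.which("databricks"):
    # Capture output so the Terraform/provider details can be inspected or logged.
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout or result.stderr)
else:
    print("databricks CLI not found on PATH; command would be:", " ".join(cmd))
```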
pdiamond
by Contributor
  • 716 Views
  • 1 reply
  • 0 kudos

Resolved! Optional JDBC Parameters in external connection

Is there any way to specify optional JDBC parameters like batchSize through the External Connections created in Unity Catalog? (specifically I'm trying to speed up data retrieval from a SQL Server database)

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Based on the information I found internally, it seems this option is not currently supported.

Redford
by New Contributor
  • 5220 Views
  • 3 replies
  • 1 kudos

Can I configure Notebook Result Downloads with the Databricks CLI, API, or Terraform provider?

I'm a Databricks Admin looking for a way to automate some workspace security settings: notebook result download, SQL result download, and notebook table clipboard features. I can't find these options in the Databricks Terraform provider, Da...

Latest Reply
nkraj
Databricks Employee
  • 1 kudos

Hi @Redford, with the Databricks API you can toggle the following features:
  • Notebook result download (key name: enableResultsDownloading)
  • Notebook table clipboard features (key name: enableNotebo...

2 More Replies
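The key named in the reply (enableResultsDownloading) is set through the workspace-conf endpoint. Below is a hedged sketch that only builds the request pieces rather than sending them; the host and token are placeholders, not real values:

```python
import json

def build_workspace_conf_request(host: str, token: str, settings: dict):
    """Build URL, headers, and JSON body for a PATCH to /api/2.0/workspace-conf."""
    url = f"{host}/api/2.0/workspace-conf"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    # workspace-conf keys take string values, e.g. "true" / "false"
    body = json.dumps({k: str(v).lower() for k, v in settings.items()})
    return url, headers, body

url, headers, body = build_workspace_conf_request(
    "https://example.cloud.databricks.com",  # placeholder workspace URL
    "dapi-XXXX",                             # placeholder token
    {"enableResultsDownloading": False},
)
print(url)
print(body)
```

Sending this as an HTTP PATCH (with any HTTP client) would disable notebook result downloads workspace-wide.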
jakubk
by Contributor
  • 2345 Views
  • 7 replies
  • 0 kudos

Unknown geo-redundancy storage events (& costs) in Azure Databricks resource group

Hi All, I'm after some guidance on how to identify massive (100,000%) spikes in bandwidth usage (and related costs) in the Azure Databricks provisioned/managed resource group storage account, and stop them. These blips are adding 30-50% to our monthly costs...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Thanks for opening a case with us, we will have a look at it.

6 More Replies
F_Goudarzi
by New Contributor III
  • 1452 Views
  • 6 replies
  • 0 kudos

Unity Catalog metastore is created within undesired storage account

I came to know that our Unity Catalog metastore was created in the default storage account of our Databricks workspace, and this storage account has some system-denied access policies, so we don't have access to see the data inside. I'm w...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

You will need to back up the current metastore, including the metadata, and then recreate the catalogs, schemas, and tables in the new metastore.

5 More Replies
santosh23
by New Contributor III
  • 1409 Views
  • 10 replies
  • 0 kudos

Update existing metastores in AWS Databricks

Hello Team, I am unable to update the existing metastore in my AWS Databricks. I have a new AWS account and am trying to update my existing workspace; however, I am unable to update the S3 bucket details and network configuration (greyed out) in the...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Unfortunately, there is no way to move the state of the workspace manually, so the solution will be to recreate the workspace and migrate the data.

9 More Replies
Takuya-Omi
by Valued Contributor III
  • 1732 Views
  • 3 replies
  • 1 kudos

How Can a Workspace Admin Grant Workspace Admin Permissions to a Group?

I want to grant Workspace Admin permissions to a group instead of individual users, but I haven’t found a way to do this. I considered assigning permissions by adding the group to the Databricks-managed 'admins' group (establishing a parent-child rel...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

No problem! I will check internally if there is any feature request of this nature. You can use the "admins" group for adding admin users or SPs.

2 More Replies
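The workaround the reply describes (adding individual users or SPs to the built-in "admins" group) can be automated via the SCIM Groups API. A sketch that only assembles the SCIM PatchOp body; the member ID is a hypothetical placeholder, and the target would be PATCH {host}/api/2.0/preview/scim/v2/Groups/{group_id}:

```python
import json

def build_add_member_patch(member_id: str) -> str:
    """SCIM PatchOp body that adds one member (user or SP) to a group such as 'admins'."""
    return json.dumps({
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [
            {"op": "add", "value": {"members": [{"value": member_id}]}}
        ],
    })

# "1234567890" is a placeholder internal ID, not a real principal.
payload = build_add_member_patch("1234567890")
print(payload)
```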
JissMathew
by Valued Contributor
  • 1117 Views
  • 2 replies
  • 1 kudos

Resolved! DLT TABLES schema mapping

How do we map the tables in a Delta Live Tables pipeline to bronze, silver, and gold schemas? Is it possible to store the DLT tables in different schemas?

Latest Reply
JissMathew
Valued Contributor
  • 1 kudos

@Walter_C  Thank you 

1 More Replies
TatiMun
by New Contributor II
  • 847 Views
  • 2 replies
  • 0 kudos

Azure DevOps Repos Databricks update via pipeline not working

Hi all, I'm working with Azure DevOps and Databricks, using an app registration that has permissions on Azure DevOps and, inside Databricks, is a manager, a user, and a member of the admins group, so it has permission over the repos. I'm building a pipeline to update or c...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Hello @TatiMun, thanks for your question. Can we review the following:
  • Verify Remote URL: Double-check that the remote Git repo URL associated with the REPO_ID in Databricks is correct and accessible.
  • Check PAT Permissions: Ensure that the Personal Acc...

1 More Replies
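The pipeline's repo refresh boils down to one Repos API call: PATCH {host}/api/2.0/repos/{repo_id} with the branch to check out. A sketch that only builds the request pieces; the host, repo ID, and branch below are placeholders:

```python
import json

def build_repo_update(host: str, repo_id: str, branch: str):
    """Build the URL and JSON body for a PATCH to the Repos API."""
    url = f"{host}/api/2.0/repos/{repo_id}"
    body = json.dumps({"branch": branch})  # checks the repo out to this branch
    return url, body

url, body = build_repo_update(
    "https://adb-1234567890.12.azuredatabricks.net",  # placeholder workspace URL
    "987654321",                                       # placeholder REPO_ID
    "main",
)
print(url, body)
```

If the call fails, the remote-URL and PAT checks from the reply are the first things to verify.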
VJ3
by Contributor
  • 2181 Views
  • 2 replies
  • 1 kudos

Security considerations for OAuth secrets when using a Service Principal to authenticate with Databricks

What security considerations do we need to keep in mind when we want to use OAuth secrets with a Service Principal to access Azure Databricks, when identity federation is disabled and the workspace is not yet onboarded onto Unity Catalog? Can we co...

Latest Reply
Rob_Lemmens
New Contributor III
  • 1 kudos

Any updates on this? I'm also struggling with the OAuth security considerations, specifically with updating the OAuth secrets. I'm currently using an SP to access the Databricks workspace for DevOps purposes through the Databricks CLI. I have the SP set up to renew ...

1 More Replies
Th0r
by New Contributor II
  • 1424 Views
  • 4 replies
  • 0 kudos

Database Error in model Couldn't initialize file system for path abfss://

Recently, the following error occurs when running dbt: Database Error in model un_unternehmen_sat (models/2_un/partner/sats/un_unternehmen_sat.sql): Couldn't initialize file system for path abfss://dp-ext-fab@stcssdpextfabprd.dfs.core.windows.net/__unitys...

Latest Reply
filipniziol
Esteemed Contributor
  • 0 kudos

Hi @Th0r, here is the explanation: shallow clones in Databricks rely on references to the data files of the original table. If the original table is dropped, recreated, or altered in a way that changes its underlying files, the shallow clone's references ...

3 More Replies
Databricks24
by New Contributor
  • 2973 Views
  • 2 replies
  • 0 kudos

UserAgentEntry added to JDBC URL but not visible in Audit logs

Hi, as part of Databricks best practices, I have added 'UserAgentEntry' to the JDBC URL that is created when we execute SQL statements through the JDBC driver. Sample URL: jdbc:databricks://<host>:443;httpPath=<httpPath>;AuthMech=3;UID=token;...

Latest Reply
satishdatagaps
New Contributor II
  • 0 kudos

Sorry, I was mistaken; please ignore the previous response. The correct one is: jdbc:databricks://<host>:443;httpPath=<httpPath>;AuthMech=3;UID=token;PWD=<token>;UserAgentEntry=<ApplicationName/Year>;

1 More Replies
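The corrected URL from the reply can be assembled programmatically so the UserAgentEntry is never dropped. A sketch; the host, httpPath, token, and application name are placeholders, and the parameter layout follows the reply:

```python
def build_jdbc_url(host: str, http_path: str, token: str, user_agent: str) -> str:
    """Assemble a Databricks JDBC URL with a UserAgentEntry parameter."""
    params = {
        "httpPath": http_path,
        "AuthMech": "3",           # token-based auth, per the thread
        "UID": "token",
        "PWD": token,
        "UserAgentEntry": user_agent,  # the client name expected in audit logs
    }
    param_str = ";".join(f"{k}={v}" for k, v in params.items())
    return f"jdbc:databricks://{host}:443;{param_str};"

url = build_jdbc_url(
    "adb-example.azuredatabricks.net",  # placeholder host
    "/sql/1.0/warehouses/abc",          # placeholder httpPath
    "dapi-XXXX",                        # placeholder token
    "MyApp/2024",                       # placeholder ApplicationName/Year
)
print(url)
```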
PabloCSD
by Valued Contributor II
  • 1617 Views
  • 2 replies
  • 1 kudos

Resolved! Exhausted Server when deploying a Databricks Assets Bundle (DAB)

Hello, I'm currently inspecting the code with a colleague, and when trying to deploy the DAB it gets stuck: (.venv) my_user@my_pc my-dab-project % databricks bundle deploy -t=dev -p=my-dab-project-prod Building wheel... Uploading my-dab-project-...

Latest Reply
TinSlim
New Contributor III
  • 1 kudos

You are using a venv; it has too many files and does not need to be included. Try adding this to your databricks.yml: sync: exclude: - "venv". Hope it helps.

1 More Replies
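Laid out as YAML, the databricks.yml addition from the reply looks like this (the directory name "venv" is what the reply suggests; the deploy prompt in the question shows ".venv", so the entry may need to match the actual folder name):

```yaml
sync:
  exclude:
    - "venv"   # use ".venv" if that is the actual virtual-env folder name
```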
dhruv1
by New Contributor II
  • 1516 Views
  • 3 replies
  • 0 kudos

Delete the AWS Databricks account

I created the AWS Databricks account from the AWS Marketplace, but I cancelled the subscription after the 14-day free trial. I still see the account. How do I delete this Databricks account associated with my ema...

Latest Reply
Takuya-Omi
Valued Contributor III
  • 0 kudos

@dhruv1 As mentioned, it would be best to reach out to support for assistance: https://help.databricks.com/s/signuprequest

2 More Replies