Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

owly
by New Contributor
  • 674 Views
  • 1 reply
  • 0 kudos

remove s3 buckets

Hi, my Databricks deployment is based on AWS S3. I deleted my buckets, and now Databricks is not working. How do I delete my Databricks? Regards

Latest Reply
Advika
Community Manager
  • 0 kudos

Hello @owly! To delete Databricks after AWS S3 bucket deletion:
  • Terminate all clusters and instance pools.
  • Clean up associated resources, like IAM roles, S3 storage configurations, and VPCs.
  • Delete the workspace from the Databricks Account Consol...
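The last step above maps to the Account API's workspace-deletion endpoint. A minimal sketch, assuming the AWS Account API shape (`DELETE /api/2.0/accounts/{account_id}/workspaces/{workspace_id}`); the IDs below are placeholders, not real values:

```python
# Build the Account API endpoint used to delete an AWS workspace.
# This only constructs the URL; an actual call would be an authenticated
# DELETE request against it.

def workspace_delete_url(account_id: str, workspace_id: int) -> str:
    """Return the Account API URL for deleting the given workspace."""
    return (
        "https://accounts.cloud.databricks.com"
        f"/api/2.0/accounts/{account_id}/workspaces/{workspace_id}"
    )

print(workspace_delete_url("11111111-2222-3333-4444-555555555555", 1234567890))
```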

noorbasha534
by Valued Contributor II
  • 4367 Views
  • 1 reply
  • 0 kudos

Databricks delta sharing design

Dears, I wanted to have a mindshare around Delta Sharing: how do you decide how many shares to create and share with other departments if you are maintaining an enterprise-wide data warehouse/lakehouse using Azure Databricks? I see from the docum...

Latest Reply
Isi
Honored Contributor III
  • 0 kudos

Hi @noorbasha534, let me share a bit about our use case and how we’re handling Delta Sharing. Delta Sharing is indeed a simple and lightweight solution, and one of its main advantages is that it’s free to use. However, it still has several limitations...

noorbasha534
by Valued Contributor II
  • 1760 Views
  • 4 replies
  • 0 kudos

get permissions assignment done from the workspaces UI

Hi all, I am looking to capture events of permissions assigned on catalogs/schemas/tables/views from the workspace UI; for example, someone gave another user the USE CATALOG permission from the UI. Is it possible to capture all such events? Appreciate the minds...

Latest Reply
noorbasha534
Valued Contributor II
  • 0 kudos

@Advika could you kindly let me know the action name that I should filter upon...
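For what it's worth, if system tables are enabled, UI permission grants should surface in `system.access.audit`. A hedged sketch of the filter being asked about; the `unityCatalog` service name and the `updatePermissions` action name are assumptions worth verifying against your own audit logs:

```python
# Build an audit query for Unity Catalog permission-change events.
# The service/action names used here are assumptions, not confirmed values.

def permissions_audit_query(days: int = 7) -> str:
    """Return a SQL query over system.access.audit for recent grant events."""
    return f"""
        SELECT event_time, user_identity.email, request_params
        FROM system.access.audit
        WHERE service_name = 'unityCatalog'
          AND action_name = 'updatePermissions'
          AND event_time >= now() - INTERVAL {days} DAYS
        ORDER BY event_time DESC
    """

print(permissions_audit_query())
```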

3 More Replies
alonisser
by Contributor II
  • 3519 Views
  • 3 replies
  • 1 kudos

misbehavior of spots with fallback to on demand on job clusters

In the last few days, I've encountered in Azure (and before that also in AWS, but a bit different) this message about failing to start a cluster: "run failed with error message Cluster '0410-173007-1pjmdgi1' was terminated. Reason: INVALID_ARGUMENT (CL...

Latest Reply
alonisser
Contributor II
  • 1 kudos

I see "Fleet instances do not support GPU instances", so in this case it's a no-op.
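For reference, a sketch of the job-cluster setting this thread revolves around, assuming the Clusters API's `azure_attributes` block; the node type and worker count are illustrative only:

```json
{
  "num_workers": 2,
  "node_type_id": "Standard_NC6s_v3",
  "azure_attributes": {
    "availability": "SPOT_WITH_FALLBACK_AZURE",
    "first_on_demand": 1,
    "spot_bid_max_price": -1
  }
}
```

With `SPOT_WITH_FALLBACK_AZURE`, workers beyond `first_on_demand` are requested as spot instances and should fall back to on-demand when spot capacity is unavailable.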

2 More Replies
RicardoAntunes
by Databricks Partner
  • 1057 Views
  • 2 replies
  • 0 kudos

Access locked out with SSO

We were locked out of our account (expired secret for login via Azure Entra ID, and password-based login disabled). How can I add a new secret in Databricks if I'm only able to log in with SSO and this is broken?

Latest Reply
RicardoAntunes
Databricks Partner
  • 0 kudos

It’s a company account 

1 More Reply
DeepankarB
by New Contributor III
  • 2013 Views
  • 1 reply
  • 0 kudos

Implementing Governance on DLT pipelines using compute policy

I am implementing governance over compute creation in the workspaces by implementing custom compute policies for all-purpose, job, and DLT pipeline compute. I was successfully able to create compute policies for all-purpose and jobs, where I could restrict th...

Administration & Architecture
administration
Delta Live Table
Latest Reply
Renu_
Valued Contributor II
  • 0 kudos

Hi @DeepankarB, To enforce compute policies for DLT pipelines, make sure your policy JSON includes policy_family_id: dlt and set apply_policy_default_values: true in the pipeline cluster settings. This helps apply the instance restrictions correctly ...
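A sketch of how the second part of that advice might look in the pipeline's cluster settings, assuming the DLT pipeline JSON shape; the `policy_id` value is a placeholder for the ID of the DLT-family policy:

```json
{
  "clusters": [
    {
      "label": "default",
      "policy_id": "ABC123DEF456",
      "apply_policy_default_values": true
    }
  ]
}
```

The policy itself would be created with `policy_family_id` set to the DLT family, so its instance restrictions apply when the pipeline cluster is spun up.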

AnkurMittal008
by New Contributor III
  • 2854 Views
  • 4 replies
  • 0 kudos

Databricks Predictive optimization

If we want to enable Databricks Predictive Optimization, then is it also mandatory to enable serverless Job/Notebook Compute in our account? We already have Serverless SQL Warehouse available in our workspaces.

Administration & Architecture
predictive optimization
serverless
Latest Reply
htd350
New Contributor II
  • 0 kudos

The documentation states this: "Predictive optimization identifies tables that would benefit from ANALYZE, OPTIMIZE, and VACUUM operations and queues them to run using serverless compute for jobs." If I don't have serverless workloads enabled, how does pr...

3 More Replies
zMynxx
by New Contributor III
  • 4693 Views
  • 2 replies
  • 0 kudos

Resolved! Migrate to a new account

Hey Team, we're looking into migrating our current Databricks solution from one AWS account (us-east-1 region) to another (eu-central-1 region). I have no documentation left on/about how the current solution was provisioned, but I can see CloudFormation...

Latest Reply
zMynxx
New Contributor III
  • 0 kudos

I ended up using the terraform-databricks-provider tool to perform an export and import of the old workspace into the new one. All that was needed was a PAT in each: export from the old, sed the region, account, and PAT, and apply. This got me about 7...

1 More Reply
Skully
by New Contributor
  • 1033 Views
  • 1 reply
  • 0 kudos

Does using SDK API calls cost money?

When using the Databricks SDK to retrieve metadata, such as catalogs, schemas, or tables, through its built-in API endpoints, does this incur any cost similar to running SQL queries? Specifically, executing SQL queries via the API spins up a compute clu...

Latest Reply
ashraf1395
Honored Contributor
  • 0 kudos

Hi there @Skully, you are right: since you are just fetching metadata information about catalogs, tables, etc. instead of directly running any SQL queries, it doesn't cost the same as creating a compute. When we retrieve the metadata inform...
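A hedged sketch of what's happening under the hood: the SDK's metadata listings are plain REST GETs against the Unity Catalog API, served by the control plane, so no cluster or warehouse is spun up for them. The paths below follow the UC API's 2.1 shape:

```python
# Construct the Unity Catalog REST paths the SDK hits for metadata listings.
# These are GET endpoints; building them here just illustrates the calls.
BASE = "/api/2.1/unity-catalog"

def catalogs_path() -> str:
    return f"{BASE}/catalogs"

def schemas_path(catalog: str) -> str:
    return f"{BASE}/schemas?catalog_name={catalog}"

def tables_path(catalog: str, schema: str) -> str:
    return f"{BASE}/tables?catalog_name={catalog}&schema_name={schema}"

print(catalogs_path())
```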

FarBo
by New Contributor III
  • 2863 Views
  • 4 replies
  • 1 kudos

Resolved! Enable Databricks system error

Hi, we want to enable some system tables in our Databricks workspace using this command: curl -v -X PUT -H "Authorization: Bearer <PAT token>" "https://adb-0000000000.azuredatabricks.net/api/2.0/unity-catalog/metastores/<metastore-id>/systemsche...

Latest Reply
aranjan99
Contributor
  • 1 kudos

While disabling some system schemas we disabled the billing system schema, and now we cannot enable it again due to this error: "billing system schema can only be enabled by Databricks." How can I re-enable the billing schema?
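For schemas that users can self-enable, the original post's curl call generalizes as below (host and metastore ID are placeholders; per the error quoted above, the billing schema itself likely needs Databricks support rather than this call):

```python
# Build the systemschemas enable endpoint from the original post:
# PUT /api/2.0/unity-catalog/metastores/{metastore_id}/systemschemas/{schema}

def enable_system_schema_url(host: str, metastore_id: str, schema: str) -> str:
    """Return the URL targeted by the PUT that enables a system schema."""
    return (
        f"{host}/api/2.0/unity-catalog/metastores/"
        f"{metastore_id}/systemschemas/{schema}"
    )

print(enable_system_schema_url(
    "https://adb-0000000000.azuredatabricks.net", "<metastore-id>", "access"
))
```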

3 More Replies
MDV
by Databricks Partner
  • 1665 Views
  • 1 reply
  • 0 kudos

Collation problem with df.first() when different from UTF8_BINARY

I'm getting an error when I want to select first() from a DataFrame when using a collation different from UTF8_BINARY. This works: df_result = spark.sql(f"""SELECT 'en-us' AS ETLLanguageCode""") display(df_result) print(df_resu...

Latest Reply
MDV
Databricks Partner
  • 0 kudos

Another example:

jfid
by New Contributor II
  • 2125 Views
  • 1 reply
  • 0 kudos

Resolved! Can a SQL Warehouse Pro be shared across multiple workspaces

I'm currently using a SQL Warehouse Pro in one of my Databricks workspaces, and I’m trying to optimize costs. Since the Pro Warehouse can be quite expensive to run, I’d prefer not to spin up additional instances in each workspace. Is there any way to ...

Latest Reply
Stefan-Koch
Databricks Partner
  • 0 kudos

Hi @jfid, a SQL Warehouse Pro instance cannot be shared directly across multiple Databricks workspaces. Each workspace requires its own SQL Warehouse instance, even if the compute and data access needs are similar. This is because compute resources li...

ssadbexternal
by New Contributor II
  • 3986 Views
  • 2 replies
  • 0 kudos

Convert Account to Self-managed

I am in the process of setting up a new Databricks account for AWS commercial. I mistakenly set up the account with the email databricks-external-nonprod-account-owner@slingshotaerospace.com to not be self-managed, and I would like for this new accoun...

Latest Reply
ssadbexternal
New Contributor II
  • 0 kudos

Or better yet if we could delete it so I can re-create the account.

1 More Reply
Sangamswadik
by Databricks Partner
  • 2635 Views
  • 3 replies
  • 0 kudos

Resolved! Unable to submit new case on Databricks (AWS)

Hi, I wanted to submit a case, but when I try to submit one, I see this: "You do not have access to submit a case. You can just view your organization's cases. In case of any query, please contact your admin." I looked into settings for support but could...

Latest Reply
Advika
Community Manager
  • 0 kudos

Hello @Sangamswadik! Try raising a case by visiting the site (https://help.databricks.com/s/contact-us?ReqType=training) and filling out the form shown below:

2 More Replies
satniks_o
by New Contributor III
  • 6387 Views
  • 5 replies
  • 2 kudos

Resolved! How to get logged in user name/email in the databricks streamlit app?

I have created a Databricks App using Streamlit and am able to deploy and use it successfully. I need to get the user name/email address of the logged-in user and display it in the Streamlit app. Is this possible? If not possible at the moment, any roadmap f...

Latest Reply
Carl_B
New Contributor II
  • 2 kudos

I have also tried to deploy a Streamlit app; however, I was not able to deploy it.

4 More Replies