Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

bhanu_dp
by New Contributor III
  • 1441 Views
  • 4 replies
  • 2 kudos

How to delete or clean Legacy Hive Metastore after successful completion of UC migration

Say we have completed the migration of tables from Hive Metastore to UC. All the users, jobs, and clusters are switched to UC. There is no more activity on the legacy Hive Metastore. What is the best recommendation on deleting or cleaning the Hive Metastor...

Latest Reply
Rjdudley
Honored Contributor
  • 2 kudos

Sometime in the last couple of days, this setting was pushed to my account; it looks like what you want. To see if you've been added, go to your Account Console and look under Previews.

3 More Replies
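Before touching the legacy metastore, it can help to confirm it really is idle and inventory what is left in it. A minimal notebook sketch along those lines is below; it only lists objects and deletes nothing, and it assumes a UC-enabled workspace where the legacy metastore is still exposed as the hive_metastore catalog.

```python
# Minimal sketch: inventory what still exists in the legacy hive_metastore catalog
# before deciding how to clean it up. Run in a Databricks notebook; nothing is deleted.
databases = [row[0] for row in spark.sql("SHOW DATABASES IN hive_metastore").collect()]

for db in databases:
    tables = spark.sql(f"SHOW TABLES IN hive_metastore.{db}").collect()
    print(f"hive_metastore.{db}: {len(tables)} table(s) remaining")
```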
noorbasha534
by Valued Contributor II
  • 1671 Views
  • 4 replies
  • 2 kudos

Resolved! Disable ability to choose PHOTON

Dear all, as an administrator I want to restrict developers from choosing the 'photon' option in job clusters. I see this in the job definition when they choose it: "runtime_engine": "PHOTON". How can I pass this as input in the policy and restrict develop...

Latest Reply
mnorland
Valued Contributor
  • 2 kudos

You also need to make sure the policy permissions are set up properly. You can/should fix preexisting compute affected by the policy with the wizard in the policy edit screen.

3 More Replies
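One common way to express this is a cluster policy that pins runtime_engine to STANDARD. The sketch below uses the Databricks SDK for Python; the policy name is illustrative, and this is only one possible shape of the policy, not the thread's exact answer.

```python
# Hypothetical sketch: create a cluster policy that pins runtime_engine to STANDARD,
# so compute created under this policy cannot select Photon.
# Assumes the Databricks SDK for Python (pip install databricks-sdk) and admin rights.
import json
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

definition = {
    "runtime_engine": {
        "type": "fixed",
        "value": "STANDARD",
        "hidden": True,  # hide the Photon toggle in the UI for clusters under this policy
    }
}

policy = w.cluster_policies.create(
    name="jobs-no-photon",          # illustrative name
    definition=json.dumps(definition),
)
print(policy.policy_id)
```

As the reply notes, permissions still matter: developers would need CAN_USE on this policy (and no unrestricted cluster-creation entitlement) for the restriction to take effect.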
Behwar
by New Contributor III
  • 4402 Views
  • 5 replies
  • 1 kudos

Databricks App in Azure Databricks with private link cluster (no Public IP)

Hello, I've deployed Azure Databricks with a standard Private Link setup (no public IP). Everything works as expected: I can log in via the private/internal network, create clusters, and manage workloads without any issues. When I create a Databricks Ap...

Latest Reply
rugger-bricks
Databricks Employee
  • 1 kudos

@Behwar: you will need to create a specific private DNS zone for azure.databricksapps.com. If you do an nslookup on your apps URL, you will see that it points to your workspace. In Azure, using Azure (recursive) DNS, you can see an important behavio...

4 More Replies
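To check the DNS behaviour the reply describes from inside the VNet, a small resolution check like the one below can be useful. The hostname is a placeholder; substitute your actual *.azure.databricksapps.com URL.

```python
# Rough check of what the Databricks Apps hostname resolves to from inside the VNet.
# The hostname below is a placeholder.
import socket
import ipaddress

app_host = "my-app-1234567890.azure.databricksapps.com"  # placeholder

for family, _, _, _, sockaddr in socket.getaddrinfo(app_host, 443):
    ip = ipaddress.ip_address(sockaddr[0])
    print(f"{app_host} -> {ip} ({'private' if ip.is_private else 'public'})")
```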
owly
by New Contributor
  • 381 Views
  • 1 reply
  • 0 kudos

remove s3 buckets

Hi, my Databricks deployment is based on AWS S3. I deleted my buckets and now Databricks is not working. How do I delete my Databricks deployment? Regards

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello @owly! To delete Databricks after AWS S3 bucket deletion:
- Terminate all clusters and instance pools.
- Clean up associated resources, like IAM roles, S3 storage configurations, and VPCs.
- Delete the workspace from the Databricks Account Consol...

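The last step (deleting the workspace) can also be scripted against the Account API. A sketch with the Databricks SDK for Python is below; the account ID and workspace ID are placeholders, and clusters, pools, and cloud resources should already be cleaned up as described above.

```python
# Hypothetical sketch of workspace deletion via the Account API (AWS account console),
# using the Databricks SDK for Python. IDs are placeholders.
from databricks.sdk import AccountClient

a = AccountClient(
    host="https://accounts.cloud.databricks.com",
    account_id="<account-id>",
)

for ws in a.workspaces.list():
    print(ws.workspace_id, ws.workspace_name)

a.workspaces.delete(workspace_id=1234567890)  # placeholder workspace ID
```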
noorbasha534
by Valued Contributor II
  • 2992 Views
  • 1 reply
  • 0 kudos

Databricks delta sharing design

Dears, I wanted to have a mindshare around Delta Sharing: how do you decide how many shares to create and share with other departments if you are maintaining an enterprise-wide data warehouse/lakehouse using Azure Databricks? I see from the docum...

Latest Reply
Isi
Honored Contributor II
  • 0 kudos

Hi @noorbasha534, let me share a bit about our use case and how we're handling Delta Sharing. Delta Sharing is indeed a simple and lightweight solution, and one of its main advantages is that it's free to use. However, it still has several limitations...

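For readers weighing "how many shares", one pattern is a share per consuming department. The notebook sketch below is purely illustrative of that pattern; the catalog, table, and recipient names are made up and not from the thread.

```python
# Illustrative only: one share per consuming department, built with Databricks SQL
# from a notebook. Names are made up.
spark.sql("CREATE SHARE IF NOT EXISTS finance_share COMMENT 'Curated data for the finance department'")
spark.sql("ALTER SHARE finance_share ADD TABLE main.curated.revenue")
spark.sql("GRANT SELECT ON SHARE finance_share TO RECIPIENT finance_dept")
```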
noorbasha534
by Valued Contributor II
  • 1113 Views
  • 4 replies
  • 0 kudos

get permissions assignment done from the workspaces UI

Hi all, I am looking to capture events of permissions assigned on catalogs/schemas/tables/views from the workspace UI; for example, someone gave another user the USE CATALOG permission from the UI. Is it possible to capture all such events? Appreciate the minds...

Latest Reply
noorbasha534
Valued Contributor II
  • 0 kudos

@Advika can you please let me know the action name that I should filter on...

3 More Replies
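These events generally land in the audit log system table. The sketch below queries system.access.audit for Unity Catalog permission changes; the action_name pattern is an assumption worth verifying against your own logs rather than a confirmed answer to the question above.

```python
# Sketch: recent Unity Catalog permission changes from the audit system table.
# The action_name filter ('%permission%') is an assumption; check your own logs
# for the exact action names you care about.
grants = spark.sql("""
    SELECT event_time, user_identity.email AS actor, action_name, request_params
    FROM system.access.audit
    WHERE service_name = 'unityCatalog'
      AND action_name ILIKE '%permission%'
      AND event_date >= date_sub(current_date(), 7)
    ORDER BY event_time DESC
""")
display(grants)
```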
alonisser
by Contributor II
  • 1431 Views
  • 3 replies
  • 1 kudos

misbehavior of spots with fallback to on demand on job clusters

In the last few days, I've encountered in Azure (and before that also in AWS, but a bit different) this message about a cluster failing to start: "run failed with error message Cluster '0410-173007-1pjmdgi1' was terminated. Reason: INVALID_ARGUMENT (CL...

Latest Reply
alonisser
Contributor II
  • 1 kudos

I see "Fleet instances do not support GPU instances" so in this case it's a no-op 

2 More Replies
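For context, the spot-with-fallback behaviour discussed here is configured on the job cluster spec. The fragment below is illustrative (values are placeholders, not from the thread); as the reply notes, fleet-instance behaviour does not apply to GPU node types.

```python
# Fragment of a job cluster spec (as a Python dict) showing Azure spot instances with
# fallback to on-demand. All values are illustrative.
new_cluster = {
    "spark_version": "15.4.x-scala2.12",
    "node_type_id": "Standard_D8ds_v5",
    "num_workers": 4,
    "azure_attributes": {
        "availability": "SPOT_WITH_FALLBACK_AZURE",  # fall back to on-demand if spot capacity is unavailable
        "first_on_demand": 1,                        # keep the driver on on-demand capacity
        "spot_bid_max_price": -1,                    # pay up to the on-demand price
    },
}
```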
RicardoAntunes
by New Contributor II
  • 616 Views
  • 2 replies
  • 0 kudos

Access locked out with SSO

We were locked out of our account (expired secret for login via Azure Entra ID, and password-based login disabled). How can I add a new secret in Databricks if I'm only able to log in with SSO and this is broken?

Latest Reply
RicardoAntunes
New Contributor II
  • 0 kudos

It’s a company account 

1 More Replies
DeepankarB
by New Contributor III
  • 1339 Views
  • 1 reply
  • 0 kudos

Implementing Governance on DLT pipelines using compute policy

I am implementing governance over compute creation in the workspaces through custom compute policies for all-purpose, job, and DLT pipeline compute. I was able to create compute policies for all-purpose and jobs where I could restrict th...

Administration & Architecture
administration
Delta Live Table
Latest Reply
Renu_
Valued Contributor II
  • 0 kudos

Hi @DeepankarB, To enforce compute policies for DLT pipelines, make sure your policy JSON includes policy_family_id: dlt and set apply_policy_default_values: true in the pipeline cluster settings. This helps apply the instance restrictions correctly ...

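Following the reply's two ingredients, the pipeline side of this might look roughly like the fragment below: the pipeline's clusters block references the DLT-scoped policy and asks DLT to apply the policy's defaults. The policy_id and autoscale values are placeholders, and the field names follow the Pipelines API; treat this as a sketch, not the confirmed configuration from the thread.

```python
# Sketch of the `clusters` block of a DLT pipeline settings payload that references a
# compute policy and applies the policy's default values. policy_id is a placeholder.
pipeline_clusters = [
    {
        "label": "default",
        "policy_id": "ABC1234567890DEF",      # placeholder: ID of the DLT-scoped policy
        "apply_policy_default_values": True,  # pull instance restrictions from the policy
        "autoscale": {"min_workers": 1, "max_workers": 4},
    }
]
```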
AnkurMittal008
by New Contributor III
  • 1716 Views
  • 4 replies
  • 0 kudos

Databricks Predictive optimization

If we want to enable Databricks Predictive Optimization, is it also mandatory to enable serverless job/notebook compute in our account? We already have Serverless SQL Warehouse available in our workspaces.

Administration & Architecture
predictive optimization
serverless
Latest Reply
htd350
New Contributor II
  • 0 kudos

The documentation states this: "Predictive optimization identifies tables that would benefit from ANALYZE, OPTIMIZE, and VACUUM operations and queues them to run using serverless compute for jobs." If I don't have serverless workloads enabled, how does pr...

3 More Replies
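For reference, once predictive optimization is turned on for the account, it can be enabled or inherited per catalog or schema with SQL. A short notebook sketch follows; the object names are illustrative and this does not answer the serverless-prerequisite question above.

```python
# Sketch: enabling predictive optimization per catalog/schema from a notebook,
# assuming it has already been enabled at the account level. Object names are illustrative.
spark.sql("ALTER CATALOG main ENABLE PREDICTIVE OPTIMIZATION")
spark.sql("ALTER SCHEMA main.curated INHERIT PREDICTIVE OPTIMIZATION")
```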
Mendi
by New Contributor
  • 2589 Views
  • 0 replies
  • 0 kudos

Azure Databricks with VNET injection and SCC

Hi, Azure Databricks with VNET injection and SCC needs to communicate with Azure endpoints for the following: metastore, artifact Blob storage, system tables storage, log Blob storage, and Event Hubs endpoint IP addresses. https://learn.microsoft.com/en-us/a...

zMynxx
by New Contributor III
  • 3769 Views
  • 2 replies
  • 0 kudos

Resolved! Migrate to a new account

Hey team, we're looking into migrating our current Databricks solution from one AWS account (us-east-1 region) to another (eu-central-1 region). I have no documentation left about how the current solution was provisioned, but I can see CloudFormation...

Latest Reply
zMynxx
New Contributor III
  • 0 kudos

I ended up using the terraform-databricks-provider tool to perform an export and import of the old workspace into the new one. All that was needed was a PAT in each: export from the old, sed the region, account, and PAT, and apply. This got me about 7...

1 More Replies
Skully
by New Contributor
  • 622 Views
  • 1 reply
  • 0 kudos

Does using SDK API calls cost money?

When using the Databricks SDK to retrieve metadata, such as catalogs, schemas, or tables, through its built-in API endpoints, does this incur any cost similar to running SQL queries? Specifically, executing SQL queries via the API spins up a compute clu...

Latest Reply
ashraf1395
Honored Contributor
  • 0 kudos

Hi there @Skully, you are right: since you are just fetching metadata information from catalogs, tables, etc. instead of directly interacting with or running any SQL queries, it doesn't cost the same as creating a compute. When we retrieve the metadata inform...

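To make the distinction concrete, the kind of metadata listing discussed here goes through the Unity Catalog REST endpoints and does not start a cluster or SQL warehouse. A short sketch with the Databricks SDK for Python:

```python
# Sketch: listing Unity Catalog metadata with the Databricks SDK for Python.
# These calls hit REST endpoints directly; no cluster or SQL warehouse is started.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

for catalog in w.catalogs.list():
    for schema in w.schemas.list(catalog_name=catalog.name):
        for table in w.tables.list(catalog_name=catalog.name, schema_name=schema.name):
            print(table.full_name)
```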
FarBo
by New Contributor III
  • 1867 Views
  • 4 replies
  • 1 kudos

Resolved! Enable Databricks system error

Hi, we want to enable some system tables in our Databricks workspace using this command: curl -v -X PUT -H "Authorization: Bearer <PAT token>" "https://adb-0000000000.azuredatabricks.net/api/2.0/unity-catalog/metastores/<metastore-id>/systemsche...

Latest Reply
aranjan99
Contributor
  • 1 kudos

While disabling some system schemas, we disabled the billing system schema and now we cannot enable it again due to this error: "billing system schema can only be enabled by Databricks". How can I re-enable the billing schema?

3 More Replies
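For completeness, the curl call quoted in the question maps to the PUT shown below in Python. The workspace URL, metastore ID, schema name, and token are placeholders; per the latest reply, the billing schema is an exception that only Databricks can re-enable.

```python
# Python equivalent of the curl call quoted in the question, for enabling a system schema.
# All values below are placeholders.
import requests

host = "https://adb-0000000000.azuredatabricks.net"   # placeholder workspace URL
metastore_id = "<metastore-id>"                       # placeholder
schema_name = "access"                                # e.g. access, compute

resp = requests.put(
    f"{host}/api/2.0/unity-catalog/metastores/{metastore_id}/systemschemas/{schema_name}",
    headers={"Authorization": "Bearer <PAT token>"},
)
resp.raise_for_status()
```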
MDV
by New Contributor III
  • 1302 Views
  • 1 reply
  • 0 kudos

Collation problem with df.first() when different from UTF8_BINARY

I'm getting an error when I want to select first() from a DataFrame when using a collation different from UTF8_BINARY. This works: df_result = spark.sql(f"""SELECT 'en-us' AS ETLLanguageCode""") display(df_result) print(df_resu...

Latest Reply
MDV
New Contributor III
  • 0 kudos

Other example:

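A minimal repro sketch following the excerpt, assuming a DBR version with collation support: the first query (default UTF8_BINARY) is the case reported to work, and the second declares a non-default collation on the literal so the first() behaviour can be compared.

```python
# Minimal repro sketch based on the question's excerpt; nothing here is a fix.
df_default = spark.sql("SELECT 'en-us' AS ETLLanguageCode")
print(df_default.first()["ETLLanguageCode"])

df_collated = spark.sql("SELECT 'en-us' COLLATE UTF8_LCASE AS ETLLanguageCode")
print(df_collated.first()["ETLLanguageCode"])  # the call reported to fail with non-default collation
```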