Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

Ashok_AWS
by New Contributor
  • 101 Views
  • 1 replies
  • 0 kudos

Databricks SSO enabled with Azure AD, and the setup was deleted

Hi Team, SSO was enabled on the Databricks account with Azure AD; the environment is on the AWS platform. The enterprise application that was used has been deleted from Azure AD. No emergency-access user has been set up. The account is locked out. Is it possible to ...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @Ashok_AWS, could you file a case with us so we can help you? Send a mail to help@databricks.com, or see https://docs.databricks.com/aws/en/resources/support

erigaud
by Honored Contributor
  • 20903 Views
  • 11 replies
  • 10 kudos

Resolved! Installing libraries on job clusters

Simple question: what is the way to go to install libraries on job clusters? There does not seem to be a "Libraries" tab in the UI, as opposed to regular clusters. Does that mean the only option is to use init scripts?

Latest Reply
cleversuresh
New Contributor III
  • 10 kudos

I am not able to select the requirements.txt file from my workspace folder; I can see the file but cannot select it. How do I overcome this problem?

10 More Replies
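One way to read the resolved thread above: job clusters have no "Libraries" tab because the libraries are declared in the job definition itself. The sketch below builds a hedged, illustrative Jobs API 2.1 task payload with a `libraries` array; the notebook path, wheel path, and package pin are all placeholders, not values from the thread.

```python
import json

# Illustrative Jobs API 2.1 task spec: the "libraries" array attaches
# libraries to the job cluster at start-up, replacing the missing UI tab.
task = {
    "task_key": "etl_task",
    "notebook_task": {"notebook_path": "/Workspace/etl/main"},  # hypothetical path
    "new_cluster": {
        "spark_version": "15.4.x-scala2.12",
        "node_type_id": "i3.xlarge",
        "num_workers": 2,
    },
    # Each entry installs one library on the job cluster when it launches.
    "libraries": [
        {"pypi": {"package": "graphframes==0.8.3"}},
        {"whl": "/Workspace/libs/mylib-0.1-py3-none-any.whl"},  # hypothetical wheel
    ],
}

payload = {"name": "example-job", "tasks": [task]}
print(json.dumps(payload, indent=2))
```

In practice this payload would be sent to the job create/update endpoint (or expressed in a Terraform/asset-bundle job resource); init scripts are not required just to install libraries.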
Teo12333
by New Contributor
  • 452 Views
  • 0 replies
  • 0 kudos

How do I get rid of the GKE cluster?

Hi! In our organisation we use Databricks, but I do not understand why this GKE cluster keeps getting created. We deploy workspaces and compute clusters through Terraform and use the GCE tag "x-databricks-nextgen-cluster" = "true". From my understanding, ...

rjurnitos
by New Contributor II
  • 462 Views
  • 1 replies
  • 0 kudos

GCP Cluster will not boot correctly with Libraries preconfigured - notebooks never attach

I am running Databricks 15.4 LTS on a single-node `n1-highmem-32` for a PySpark / GraphFrames app (not using the built-in `graphframes` on the ML image because we don't need a GPU), and I can start the cluster fine as long as libraries are not attached. I can ...

Latest Reply
rjurnitos
New Contributor II
  • 0 kudos

Bump... anyone?

Inna_M
by New Contributor III
  • 360 Views
  • 3 replies
  • 2 kudos

Question about moving to Serverless compute

Hi, my organization is using Databricks in the Canada East region, and Serverless isn't available in our region yet (or at all?). I would like to know if it is worth the effort to change region to Canada Central, where Serverless compute is available. We do ...

Latest Reply
Inna_M
New Contributor III
  • 2 kudos

Hi Takuya Omi, thank you for responding. My question is: if we are to migrate our existing workspaces (3) and UC to Canada Central, is it doable? Is it worth it? What does it imply? What are the best practices to do so? Thank you.

2 More Replies
Bayees
by New Contributor II
  • 296 Views
  • 2 replies
  • 0 kudos

Removing storage account location from metastore fails

I am trying to remove the storage account location for our UC metastore, and I am getting an error. I have tried assigning my user and service principal permission to create external locations.

Latest Reply
Bayees
New Contributor II
  • 0 kudos

I accomplished it by creating a storage credential and external location manually. Then I was able to remove the metastore path. What happened then was that the path for the __databricks_internal catalog was set to the storage account that ...

1 More Replies
spd_dat
by New Contributor II
  • 355 Views
  • 2 replies
  • 0 kudos

Resolved! How does a non-admin user read a public s3 bucket on serverless?

As an admin, I can easily read a public S3 bucket from serverless: spark.read.parquet("s3://[public bucket]/[path]").display(). So can a non-admin user from classic compute. But why does a non-admin user on serverless (both environments 1 & 2) get t...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @spd_dat, is the S3 bucket in the same region as your workspace? It might require an IAM role / S3 bucket policy to allow access to the bucket even if it is public. Just as a test, can you try giving that user the following permission: GRANT SELECT ...

1 More Replies
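A plausible background for the thread above: on serverless, path-based access to s3:// URIs generally flows through Unity Catalog external locations rather than cluster instance profiles, so a non-admin needs a grant on a location covering the path. The sketch below only builds the GRANT statement an admin might review; the location name and group are hypothetical, not taken from the thread.

```python
# Hedged sketch: grant path access via a Unity Catalog external location.
# Both names below are placeholders for illustration only.
location_name = "public_data_loc"   # assumed external location covering the bucket
principal = "`data-readers`"        # assumed account-level group

grants = [
    f"GRANT READ FILES ON EXTERNAL LOCATION {location_name} TO {principal};",
]

# In a notebook each statement would be executed with spark.sql(...);
# here we just print them so they can be reviewed first.
for stmt in grants:
    print(stmt)
```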
Wayne
by New Contributor III
  • 212 Views
  • 1 replies
  • 0 kudos

AWS custom role for Databricks clusters - no instance profile ARN

I tried to follow the instructions to create a custom IAM role for EC2 instances in Databricks clusters, but I can't find the instance profile ARN on the role. If I create a regular IAM role on EC2, I can find both the role ARN and the instance profile ARN. https://d...

Latest Reply
Takuya-Omi
Valued Contributor II
  • 0 kudos

@Wayne I need to understand more about what you're trying to achieve, but if you're looking to grant permissions to the EC2 instances running behind a Databricks cluster using an instance profile, the following documentation provides a detailed explan...

Anonymous
by Not applicable
  • 9247 Views
  • 2 replies
  • 0 kudos
Latest Reply
Ryan_Chynoweth
Esteemed Contributor
  • 0 kudos

You can get the job details from the Jobs Get API, which takes the job ID as a parameter. This will give you all the information available about the job, including the job name. Please note that there is no field called "job description" in the...

1 More Replies
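The reply above refers to the Jobs Get endpoint. A minimal sketch of the request it describes, with the workspace host and job ID as placeholders; the request is only constructed here, not sent.

```python
from urllib.parse import urlencode

host = "https://<workspace-host>"   # placeholder workspace URL
job_id = 12345                      # placeholder job ID

# Jobs API 2.1 "get" endpoint, taking job_id as a query parameter.
endpoint = f"{host}/api/2.1/jobs/get?{urlencode({'job_id': job_id})}"
print(endpoint)

# A real call would add a bearer token, e.g. with the requests library:
#   requests.get(endpoint, headers={"Authorization": f"Bearer {token}"})
# The response JSON carries the job name under settings.name; as the reply
# notes, there is no "job description" field.
```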
Behwar
by New Contributor III
  • 1083 Views
  • 4 replies
  • 1 kudos

Databricks App in Azure Databricks with private link cluster (no Public IP)

Hello, I've deployed Azure Databricks with a standard Private Link setup (no public IP). Everything works as expected: I can log in via the private/internal network, create clusters, and manage workloads without any issues. When I create a Databricks Ap...

Latest Reply
MariuszK
Contributor III
  • 1 kudos

Do you have a private endpoint for databricks_ui_api? You need to establish a private endpoint for users to access the web app.

3 More Replies
PoojaD
by New Contributor II
  • 255 Views
  • 3 replies
  • 0 kudos

system schema permission

I have Databricks workspace admin permissions and want to run a few queries on the system.billing schema to get more info on Databricks billing. I am getting the below error: [INSUFFICIENT_PERMISSIONS] Insufficient privileges: User does not have USE SCHEMA on Schema 'sy...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @PoojaD, you should ask an admin to grant you access: GRANT USE SCHEMA ON SCHEMA system.billing TO [Your User]; GRANT SELECT ON TABLE system.billing.usage TO [Your User];

2 More Replies
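The grants in the reply above, plus a first query against the usage table, can be sketched as statements to run one by one with spark.sql(...) in a notebook. The user principal below is hypothetical (the reply's "[Your User]" placeholder filled with an example address), and the roll-up query is an illustrative starting point, not from the thread.

```python
user = "`someone@example.com`"  # hypothetical user; substitute your own principal

statements = [
    # The two grants from the reply, which an admin must run:
    f"GRANT USE SCHEMA ON SCHEMA system.billing TO {user}",
    f"GRANT SELECT ON TABLE system.billing.usage TO {user}",
    # Once granted, a simple per-SKU roll-up of consumed DBUs:
    """SELECT sku_name, SUM(usage_quantity) AS dbus
       FROM system.billing.usage
       GROUP BY sku_name
       ORDER BY dbus DESC""",
]

# Printed for review; in a notebook: for s in statements: spark.sql(s)
for s in statements:
    print(s)
```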
psgcbe
by New Contributor
  • 139 Views
  • 1 replies
  • 0 kudos

Removing the trial version as it is running cost

Hi, I have a trial version on my AWS account which keeps running and has been eating up a dollar per day for the last couple of days. How do I disable it and use it only when required, or completely remove it?

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @psgcbe, you can follow these steps. Terminate all compute resources: first, navigate to the AWS Management Console, go to the EC2 Dashboard, select Instances, and terminate any running instances related to your trial. Cancel your subscription: afte...

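For the "terminate any running instances" step above, a hedged sketch of how the trial's instances might be found programmatically. Databricks-launched EC2 instances typically carry a Vendor tag, but that tag is an assumption here: verify the instances in the EC2 console before terminating anything. Only the filter is built below; the actual boto3 calls are shown in comments.

```python
# EC2 DescribeInstances filters to locate running Databricks-launched
# instances. The "tag:Vendor" key/value is an assumption to verify first.
filters = [
    {"Name": "tag:Vendor", "Values": ["Databricks"]},     # assumed vendor tag
    {"Name": "instance-state-name", "Values": ["running"]},
]
print(filters)

# With AWS credentials configured, the calls would look like:
#   import boto3
#   ec2 = boto3.client("ec2")
#   resp = ec2.describe_instances(Filters=filters)
#   ids = [i["InstanceId"] for r in resp["Reservations"] for i in r["Instances"]]
#   ec2.terminate_instances(InstanceIds=ids)
```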
ErikApption
by New Contributor II
  • 260 Views
  • 1 replies
  • 1 kudos

Resolved! Timeout settings for Postgresql external catalog connection?

Is there any way to configure timeouts for external catalog connections? We are getting some timeouts with complex queries accessing a PostgreSQL database through the catalog. We tried configuring the connection and got this error: │ Error: cannot upda...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

Hello @ErikApption, there is no direct support for a connectTimeout option in the connection settings through Unity Catalog as of now. You might need to explore these alternative timeout configurations or consider adjusting your database handling to ...

MaximeGendre
by New Contributor III
  • 422 Views
  • 0 replies
  • 0 kudos

Dataiku connector limitation

Hello, I'm trying to read data from Unity Catalog and insert it into an Oracle database using an "on premise" Dataiku. It works well for a small dataset (~600 KB / ~150,000 rows). [14:51:20] [INFO] [dku.datasets.sql] - Read 2000 records from DB [14:51:20] [I...

Christian_j
by New Contributor
  • 955 Views
  • 3 replies
  • 0 kudos

Cannot create a workspace on GCP

Hi, I have been using Databricks for a couple of months and have been spinning up workspaces with Terraform. The other day we decided to end our POC and move on to an MVP. This meant cleaning up all workspaces and GCP. After the cleanup was done, I wanted to...

Latest Reply
MariuszK
Contributor III
  • 0 kudos

Did you try from the Marketplace? You may get a more detailed error there.

2 More Replies