Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

rtreves
by Contributor
  • 3729 Views
  • 15 replies
  • 0 kudos

Resolved! Permissions error on cluster requirements.txt installation

Hi Databricks Community, I'm looking to resolve the following error: Library installation attempted on the driver node of cluster {My cluster ID} and failed. Please refer to the following error message to fix the library or contact Databricks support. ...

Latest Reply
rtreves
Contributor
  • 0 kudos

Noting here for other users: I was able to resolve the issue on a shared cluster by cloning the cluster and using the clone.

14 More Replies
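For readers hitting the same library-installation error, the accepted fix was cloning the cluster. A minimal sketch of doing that programmatically: fetch the cluster spec with the Clusters API (`GET /api/2.1/clusters/get`), strip the server-assigned fields, and submit the rest to `POST /api/2.1/clusters/create`. The exact set of fields to strip below is an assumption and may need adjusting for your workspace.

```python
# Sketch: "clone" a cluster by sanitizing its spec for re-creation.
# Assumes `spec` is the dict returned by GET /api/2.1/clusters/get.
# SERVER_ASSIGNED_FIELDS is an assumption -- verify against the API docs.

SERVER_ASSIGNED_FIELDS = {
    "cluster_id", "state", "state_message", "start_time", "terminated_time",
    "last_state_loss_time", "creator_user_name", "driver", "executors",
    "spark_context_id", "jdbc_port", "cluster_source", "default_tags",
}

def clone_spec(spec: dict, new_name: str) -> dict:
    """Return a copy of a cluster spec suitable for POST /api/2.1/clusters/create."""
    clone = {k: v for k, v in spec.items() if k not in SERVER_ASSIGNED_FIELDS}
    clone["cluster_name"] = new_name
    return clone
```

The returned dict can then be posted to the clusters/create endpoint with your usual HTTP client or the Databricks SDK.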
ambigus9
by Contributor
  • 2895 Views
  • 8 replies
  • 3 kudos

PrivateLink Validation Error when trying to access the Workspace

We have a workspace deployed on an AWS customer architecture using the Terraform PrivateLink guide: https://registry.terraform.io/providers/databricks/databricks/latest/docs/guides/aws-private-link-workspace. The fact is, when we disable the Public Acc...

Latest Reply
Walter_C
Databricks Employee
  • 3 kudos

Can you share your workspace ID so I can run a validation?

7 More Replies
jjsnlee
by New Contributor II
  • 748 Views
  • 2 replies
  • 0 kudos

Can't create cluster in AWS with p3 instance type

Hi, I'm trying to create a `p3.2xlarge` in my workspace, but the cluster fails to instantiate, specifically getting this error message: `No zone supports both the driver instance type [p3.2xlarge] and the worker instance type [p3.2xlarge]` (though I ...

Latest Reply
jjsnlee
New Contributor II
  • 0 kudos

Yes sorry for the double post (I couldn't figure out how to delete this one)

1 More Replies
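For anyone else seeing the `No zone supports both the driver instance type ... and the worker instance type` error: one common workaround is to set the availability zone explicitly in the cluster's `aws_attributes`, or to use `"auto"` so Databricks picks a zone with capacity. A hedged sketch of a clusters/create payload follows; the GPU runtime label is hypothetical, and which zones actually offer p3 instances varies by region, so treat this as a starting point rather than a guaranteed fix.

```python
# Sketch: cluster-create payload pinning (or auto-selecting) the AWS zone.
# zone_id "auto" asks Databricks to choose a zone that can host the instance
# type; a specific zone like "us-east-1a" can be passed instead.

def p3_cluster_payload(name: str, zone_id: str = "auto") -> dict:
    return {
        "cluster_name": name,
        "spark_version": "15.4.x-gpu-ml-scala2.12",  # hypothetical GPU runtime label
        "node_type_id": "p3.2xlarge",
        "driver_node_type_id": "p3.2xlarge",
        "num_workers": 1,
        "aws_attributes": {"zone_id": zone_id},
    }
```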
John_OC
by New Contributor
  • 458 Views
  • 1 reply
  • 0 kudos

Querying on multi-node cluster on AWS does not complete

Querying in isolation mode is completely fine, but when running the same query on the multi-node cluster it does not complete or error out. Any assistance troubleshooting this issue? Oh, Happy New Year if you're reading this.

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Hello John, Happy New Year to you! Can you please confirm what error message you received? And when you say isolation mode, do you mean single node, or a single-user cluster while the other is in shared mode?

staskh
by New Contributor III
  • 2111 Views
  • 5 replies
  • 2 kudos

Resolved! S3 access credentials: Pandas vs Spark

Hi, I need to read Parquet files located in S3 into a Pandas dataframe. I configured an "external location" to access my S3 bucket and have df = spark.read.parquet(s3_parquet_file_path) working perfectly well. However, df = pd.read_parquet(s3_parquet_file_...

Latest Reply
Walter_C
Databricks Employee
  • 2 kudos

Yes, you understand correctly. The Spark library in Databricks uses the Unity Catalog credential model, which includes the use of "external locations" for managing data access. This model ensures that access control and permissions are centrally mana...

4 More Replies
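Since `pd.read_parquet` does not go through the Unity Catalog credential model, the simplest pattern consistent with the reply above is to read with Spark (which does use the external location) and convert the result. A minimal sketch, assuming `spark` is the session the Databricks runtime provides:

```python
# Sketch: read S3 Parquet through the UC external location with Spark, then
# hand a Pandas DataFrame to downstream code. pd.read_parquet would need its
# own S3 credentials, which is exactly what the external location avoids.

def read_parquet_as_pandas(spark, path: str):
    """Read via Spark (UC-governed credentials), return a Pandas DataFrame."""
    return spark.read.parquet(path).toPandas()
```

Note that `toPandas()` collects the whole dataset to the driver, so this only suits data that fits in driver memory.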
nanda_
by New Contributor
  • 1825 Views
  • 2 replies
  • 1 kudos

Assistance Required: Integrating Databricks ODBC Connector with Azure App Service

Hi, I have successfully established an ODBC connection with Databricks to retrieve data from the Unity Catalog in a local C# application using the Simba Spark ODBC Driver, and it is working as expected. I now need to integrate this functionality into a...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @nanda_, so basically what you need to do is install the Simba ODBC driver in your Azure App Service environment. Then your code should work the same way as on your local machine. One possibility is to use Windows or Linux Containers on Azure App...

1 More Replies
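For reference, a sketch of the connection string such an app would assemble once the driver is installed. The attribute names follow the Simba Spark ODBC driver documentation (`AuthMech=3` is token authentication with `UID=token`), but verify them against your installed driver version; the host and path values below are placeholders.

```python
# Sketch: assemble a Simba Spark ODBC connection string like the one the
# local C# app would use. Attribute names are taken from the Simba Spark
# ODBC docs -- confirm against your driver version before relying on them.

def simba_connection_string(host: str, http_path: str, token: str) -> str:
    attrs = {
        "Driver": "Simba Spark ODBC Driver",
        "Host": host,
        "Port": 443,
        "HTTPPath": http_path,
        "SSL": 1,
        "ThriftTransport": 2,   # HTTP transport
        "AuthMech": 3,          # user/password scheme, used for token auth
        "UID": "token",
        "PWD": token,           # personal access token
    }
    return ";".join(f"{k}={v}" for k, v in attrs.items())
```

The same attribute values map directly onto a C# `OdbcConnection` connection string.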
soumiknow
by Contributor II
  • 4216 Views
  • 1 reply
  • 0 kudos

Resolved! How to add 'additionallyAllowedTenants' in Databricks config or PySpark config?

I have a multi-tenant Azure app. I am using this app's credentials to read ADLS container files from a Databricks cluster using a PySpark dataframe. I need to set the 'additionallyAllowedTenants' flag to '*' or a specific tenant_id of the multi-ten...

Latest Reply
soumiknow
Contributor II
  • 0 kudos

Update: Currently, Spark does not have this feature.

SumeshD
by New Contributor II
  • 1011 Views
  • 2 replies
  • 1 kudos

Errors on Databricks and Terraform upgrades for DABS

Hi All, I've recently upgraded my Databricks CLI to 0.235 and the Terraform provider to 1.58 locally on my machine, and my DABs deployments have broken. They worked with previous versions, and now I can't even run terraform -v. The command Data...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

Hi @SumeshD, can you run `databricks bundle debug terraform` to obtain more details on the failure? The error messages you are encountering, such as "ConflictsWith skipped for [task.0.for_each_task.0.task.0.new_cluster.0.aws_attributes task.0.for_each_...

1 More Replies
pdiamond
by Contributor
  • 906 Views
  • 1 reply
  • 0 kudos

Resolved! Optional JDBC Parameters in external connection

Is there any way to specify optional JDBC parameters like batchSize through the External Connections created in Unity Catalog? (specifically I'm trying to speed up data retrieval from a SQL Server database)

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

As per information I found internally, it seems that this option is not currently supported.

Redford
by New Contributor
  • 5485 Views
  • 3 replies
  • 1 kudos

Can I configure Notebook Result Downloads with Databricks CLI , API or Terraform provider ?

I'm a Databricks Admin and I'm looking for a solution to automate some Security Workspace settings. Those are: Notebook result download, SQL result download, Notebook table clipboard features. I can't find these options in the Databricks Terraform provider, Da...

Latest Reply
nkraj
Databricks Employee
  • 1 kudos

Hi @Redford, with the Databricks API you have the capability to toggle the following features: Notebook result download (key name: enableResultsDownloading), Notebook table clipboard features (key name: enableNotebo...

2 More Replies
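The settings named in the reply live under the workspace configuration endpoint, so a small script can automate the toggles. A minimal sketch follows: it builds (but does not send) a `PATCH /api/2.0/workspace-conf` request. The two key names come from the reply above; the host and token values are placeholders, and any further keys should be confirmed against your workspace before use.

```python
# Sketch: toggle workspace security settings via PATCH /api/2.0/workspace-conf.
# The request is constructed here but not sent; send it from an admin identity.
import json
import urllib.request

def build_workspace_conf_patch(host: str, token: str, settings: dict) -> urllib.request.Request:
    """Build a PATCH request that updates workspace-conf keys to the given values."""
    return urllib.request.Request(
        url=f"https://{host}/api/2.0/workspace-conf",
        data=json.dumps(settings).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="PATCH",
    )

# Example: disable notebook result download and table clipboard features.
req = build_workspace_conf_patch(
    "adb-123.azuredatabricks.net",  # hypothetical workspace host
    "dapiXXX",                      # hypothetical admin token
    {"enableResultsDownloading": "false",
     "enableNotebookTableClipboard": "false"},
)
```

Sending is then just `urllib.request.urlopen(req)`; note the API expects the boolean values as strings.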
jakubk
by Contributor
  • 2669 Views
  • 7 replies
  • 0 kudos

unknown geo redundancy storage events (& costs) in azure databricks resource group

Hi All, I'm after some guidance on how to identify massive (100,000%) spikes in bandwidth usage (and related costs) in the Azure Databricks provisioned/managed resource group storage account, and stop them. These blips are adding 30-50% to our monthly costs...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Thanks for opening a case with us, we will have a look at it.

6 More Replies
F_Goudarzi
by New Contributor III
  • 1735 Views
  • 6 replies
  • 0 kudos

Unity Catalog metastore is created within undesired storage account

I came to know that our Unity Catalog metastore has been created in the default storage account of our Databricks workspace, and this storage account has some system-denied access policies, therefore we don't have access to see the data inside. I'm w...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

You will need to back up the current metastore, including the metadata, and then start recreating the catalogs, schemas, and tables on the new metastore.

5 More Replies
santosh23
by New Contributor III
  • 1729 Views
  • 10 replies
  • 0 kudos

Update existing Metastores in AWS databricks

Hello Team, I am unable to update the existing metastore in my AWS Databricks. I have a new AWS account and I am trying to update my existing workspace; however, I am unable to update the S3 bucket details and network configuration (greyed out) in the...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Unfortunately, there is no way to move the state of the workspace manually, so the solution will be to recreate the workspace and migrate the data.

9 More Replies
Takuya-Omi
by Valued Contributor III
  • 2057 Views
  • 3 replies
  • 1 kudos

How Can a Workspace Admin Grant Workspace Admin Permissions to a Group?

I want to grant Workspace Admin permissions to a group instead of individual users, but I haven’t found a way to do this. I considered assigning permissions by adding the group to the Databricks-managed 'admins' group (establishing a parent-child rel...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

No problem! I will check internally if there is any feature request of this nature. You can use the "admins" group for adding admin users or SPs.

2 More Replies
JissMathew
by Valued Contributor
  • 1267 Views
  • 2 replies
  • 1 kudos

Resolved! DLT TABLES schema mapping

How do we map Delta Live Tables tables to bronze, silver, and gold schemas? Is it possible to store the DLT tables in different schemas?

Latest Reply
JissMathew
Valued Contributor
  • 1 kudos

@Walter_C  Thank you 

1 More Replies