Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

radothede
by Valued Contributor II
  • 5190 Views
  • 1 reply
  • 0 kudos

Databricks DBU pre-purchase

Hello there, are pre-purchased DBUs still valid? Can we use them? https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/reservation-discount-databricks Can someone please explain how it works in practice, by example? What if I pre-puc...

Administration & Architecture
DBCU
optimize costs
Pre-purchase
reservation
Latest Reply
radothede
Valued Contributor II
  • 0 kudos

@Retired_mod could you please kindly look at this one? Thank you in advance.

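For context: per the linked Microsoft doc, pre-purchase means buying Databricks Commit Units (DBCU) up front at a discount, and subsequent DBU consumption is deducted from that pool automatically. A rough sketch of the drawdown arithmetic, with every rate and quantity a hypothetical placeholder:

    # Sketch of pre-purchased DBCU drawdown; all numbers are hypothetical
    # placeholders, not published rates.
    dbcu_pool = 25_000.0  # Databricks Commit Units bought up front

    def consume(pool: float, dbus: float, dbcu_per_dbu: float) -> float:
        """Deduct a workload's DBU usage at its product-specific rate."""
        return max(pool - dbus * dbcu_per_dbu, 0.0)

    # e.g. 10,000 jobs-compute DBUs, then 5,000 all-purpose DBUs, each
    # converting to DBCUs at a (hypothetical) product rate.
    dbcu_pool = consume(dbcu_pool, 10_000, 0.30)
    dbcu_pool = consume(dbcu_pool, 5_000, 0.55)
    print(f"Remaining DBCU: {dbcu_pool:,.0f}")  # Remaining DBCU: 19,250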
Snoonan
by Contributor
  • 6570 Views
  • 3 replies
  • 0 kudos

Resolved! Secret management

Hi all, I am trying to use secrets to connect to my Azure storage account. I want to be able to read the data from the storage account using a PySpark notebook. Does anyone have experience setting up such a connection, or good documentation for doing so? I ha...

Latest Reply
DonatienTessier
Databricks Partner
  • 0 kudos

Hi Sean, there are two ways to handle secret scopes:
  • Databricks-backed scopes: the scope is tied to a workspace. You will have to handle updating the secrets yourself.
  • Azure Key Vault-backed scopes: the scope is tied to a Key Vault. It means that you configur...

2 More Replies
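A minimal notebook sketch of the resolved approach, assuming a secret scope (Databricks- or Key Vault-backed) already exists; the scope, key, storage account, and container names below are placeholders:

    # Pull the storage account key from a secret scope and read over abfss.
    storage_account = "mystorageacct"  # placeholder
    key = dbutils.secrets.get(scope="kv-scope", key="storage-key")

    # Account-key auth shown for brevity; OAuth with a service principal
    # is the other common option.
    spark.conf.set(
        f"fs.azure.account.key.{storage_account}.dfs.core.windows.net", key
    )

    df = (spark.read.format("csv").option("header", "true")
          .load(f"abfss://mycontainer@{storage_account}.dfs.core.windows.net/data"))
    display(df)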
Aiden-Z
by New Contributor III
  • 4429 Views
  • 3 replies
  • 1 kudos

Resolved! Init script failure after workspace upload

We have a pipeline in Azure DevOps that deploys init scripts to the workspace folder on an Azure Databricks resource using the workspace API (/api/2.0/workspace/import); we use format "AUTO" and overwrite "true" to achieve this. After being uploaded ...

Latest Reply
Aiden-Z
New Contributor III
  • 1 kudos

If anyone else comes across this problem: the issue was that a deployment PowerShell script was changing LF to CRLF in the init script before upload. The solution was to upload with LF line endings in the pipeline.

2 More Replies
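For anyone scripting the same fix, a sketch of normalizing line endings before calling the same import endpoint; the host, token, and paths are placeholders:

    import base64, os, requests

    # Force LF-only content so the uploaded init script keeps Unix line
    # endings regardless of what the build agent did to the file.
    host = os.environ["DATABRICKS_HOST"]
    token = os.environ["DATABRICKS_TOKEN"]

    with open("init.sh", "rb") as f:
        content = f.read().replace(b"\r\n", b"\n")

    resp = requests.post(
        f"{host}/api/2.0/workspace/import",
        headers={"Authorization": f"Bearer {token}"},
        json={
            "path": "/Shared/init-scripts/init.sh",
            "format": "AUTO",
            "overwrite": True,
            "content": base64.b64encode(content).decode("ascii"),
        },
    )
    resp.raise_for_status()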
pawelmitrus
by Contributor
  • 4621 Views
  • 4 replies
  • 0 kudos

Azure Databricks account API can't auth

Hi, does anyone know of any existing issue with the Azure Databricks account API? I cannot do the following: 1. login with the CLI `databricks auth login --account-id <acc_id>`; this is what I get: https://adb-MY_ID.azuredatabricks.net/oidc/accounts/MY_ACC_ID/v1/auth...

Latest Reply
pawelmitrus
Contributor
  • 0 kudos

UPDATE: Solved. The very same solution started to work today when running a pipeline with Terraform: M2M auth with a service principal using federated auth. That's option 2 from my post above. When trying to follow these steps https://learn.microsoft.com/en-us/azure/...

3 More Replies
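A sketch of the working option (M2M auth as an Azure service principal at the account level), shown with the Databricks Python SDK rather than Terraform; all IDs are placeholders:

    from databricks.sdk import AccountClient

    # Account-level client authenticated as an Azure service principal.
    # A plain client secret is shown; the federated-credential flow the
    # poster used replaces the secret with a workload identity token.
    a = AccountClient(
        host="https://accounts.azuredatabricks.net",
        account_id="<acc_id>",
        azure_tenant_id="<tenant_id>",
        azure_client_id="<sp_application_id>",
        azure_client_secret="<sp_secret>",
    )

    for g in a.groups.list():
        print(g.display_name)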
NadithK
by Contributor
  • 3087 Views
  • 1 reply
  • 0 kudos

Public exposure for clusters in SCC enabled workspaces

Hi, we are facing a requirement where we need to somehow expose one of our Databricks clusters to an external service. Our organization's cyber team is running a security audit of all of the resources we use, and they have some tools which they use to r...

Latest Reply
NadithK
Contributor
  • 0 kudos

Hi @Retired_mod, thank you very much for the reply, but I don't think this actually resolves our concern. All these solutions talk about utilizing the Databricks cluster to access/read data in Databricks. They focus on getting to the Databricks data t...

leungi
by Contributor
  • 6185 Views
  • 2 replies
  • 1 kudos

Resolved! Install system libraries on the cluster

The `Library` option in cluster config allows installation of language-specific libraries - e.g., PyPI for Python, CRAN for R. Some of these libraries - e.g., `sf` - require system libraries - e.g., `libudunits2-dev`, `libgdal-dev`. How may one install...

Latest Reply
feiyun0112
Honored Contributor
  • 1 kudos

You can install them in an init script: https://docs.databricks.com/en/init-scripts/index.html

1 More Reply
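A sketch of that suggestion for the `sf` case above: write a cluster-scoped init script that apt-installs the system libraries, then reference its path under the cluster's Advanced options > Init Scripts (the DBFS path is a placeholder):

    # Write an init script that installs the system dependencies of `sf`.
    dbutils.fs.put(
        "dbfs:/init-scripts/install-geo-deps.sh",
        """#!/bin/bash
    set -e
    apt-get update
    apt-get install -y libudunits2-dev libgdal-dev
    """,
        True,  # overwrite
    )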
otydos
by New Contributor II
  • 5382 Views
  • 2 replies
  • 0 kudos

Authenticate with Terraform to Databricks account level using Azure MSI (system-assigned)

Hello, I want to authenticate with Terraform to the Databricks account level with the Azure Managed Identity (system-assigned) of my Azure VM, to perform operations like creating a group. I followed different tutorials and the documentation on Azure and Databricks...

Latest Reply
DonatienTessier
Databricks Partner
  • 0 kudos

Hello,
On my side, I always have to add the provider in each resource block. You can try that:

    resource "databricks_group" "xxxxx" {
      provider     = databricks.accounts
      display_name = "xxxxx"
    }

About authentication, you can also try to add: auth_type ...

1 More Reply
Flask
by New Contributor II
  • 4397 Views
  • 2 replies
  • 0 kudos

Problem loading catalog data from multi-node cluster after changing VNet IP range in Azure Databricks

We've changed the address range for the VNet and subnet that the Azure Databricks workspace (standard SKU) was using; after that, when we try to access the catalog data, we're getting a socket closed error. This error occurs only with a multi-node cluster; for single ...

Latest Reply
Flask
New Contributor II
  • 0 kudos

Yes, it is mentioned that we cannot change the VNet. I've changed the range within the same VNet, not the VNet itself. Is there any troubleshooting I can do to find the issue? The problem is, I don't want to recreate the workspace. It is a worst case s...

1 More Reply
Snoonan
by Contributor
  • 2920 Views
  • 1 reply
  • 0 kudos

Terraform for Databricks

Hi all, I can't find guidance on how to create a Databricks access connector for connecting catalogs to external data locations using Terraform. Also, I want to create my catalogs, set up external locations, etc. using Terraform. Has anyone got a good r...

Ryan512
by New Contributor III
  • 3029 Views
  • 1 reply
  • 1 kudos

keyrings.google-artifactregistry-auth fails to install backend on runtimes > 10.4

We run Databricks on GCP. We store our private Python packages in the Google Artifact Registry. When we need to install the private packages, we use a global init script to install `keyring` and `keyrings.google-artifactregistry-auth`. Then we `pip inst...

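For reference, a sketch of the global init script pattern the post describes (region, project, repo, and package names are placeholders); this illustrates the baseline setup, not a fix for the backend failure on the newer runtimes:

    # Global init script body: install the keyring backend first so pip
    # can authenticate to Google Artifact Registry, then install the
    # private package from the repo's simple index.
    script = """#!/bin/bash
    pip install keyring keyrings.google-artifactregistry-auth
    pip install --index-url https://us-central1-python.pkg.dev/my-project/my-repo/simple/ my-private-package
    """
    dbutils.fs.put("dbfs:/init-scripts/artifact-registry.sh", script, True)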
adurand-accure
by Databricks Partner
  • 1780 Views
  • 1 reply
  • 0 kudos

SQL Warehouse tag list from system tables?

Hello, is there a way to get the tags of SQL Warehouse clusters from system tables, like you can with system.compute.clusters? Thanks,

Latest Reply
adurand-accure
Databricks Partner
  • 0 kudos

Answering my own question: system.billing.usage.custom_tags['cluster-owner']. @databricks: I don't really understand the logic here.

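Building on that answer, a sketch of pulling warehouse tags out of system.billing.usage from a notebook; the 'cluster-owner' tag key mirrors the example above and should be whatever you actually set on the warehouse:

    # SQL Warehouse tags surface in the billing usage table, keyed by
    # usage_metadata.warehouse_id (there is no warehouse analogue of
    # system.compute.clusters).
    df = spark.sql("""
        SELECT DISTINCT
            usage_metadata.warehouse_id,
            custom_tags['cluster-owner'] AS owner
        FROM system.billing.usage
        WHERE usage_metadata.warehouse_id IS NOT NULL
    """)
    display(df)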
Daniela_Boamba
by New Contributor III
  • 7406 Views
  • 2 replies
  • 0 kudos

Resolved! Databricks SSO Azure AD

Hello, I'm trying to test SSO with Azure AD. The SSO test is passing on Databricks and I can connect to Databricks using SSO. When I try to test with Postman to obtain a token, I get the following error message: {"error_description":"OAuth application with ...

Administration & Architecture
AWS
Azure AD
Databricks
Latest Reply
Daniela_Boamba
New Contributor III
  • 0 kudos

Hello, the issue was with Postman. In Postman you don't have to give the client ID from your IdP but the client ID from Databricks "App connections". It is working well now. Thank you.

1 More Reply
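A sketch matching that resolution, using the machine-to-machine variant of the token request for brevity; the client ID and secret come from a Databricks "App connections" OAuth app, and the workspace URL is a placeholder:

    import requests

    # Client-credentials grant against the workspace OIDC token endpoint.
    # A Postman user flow would use the authorization-code grant against
    # the same endpoint.
    workspace = "https://adb-1234567890123456.7.azuredatabricks.net"
    resp = requests.post(
        f"{workspace}/oidc/v1/token",
        auth=("<app-connection-client-id>", "<client-secret>"),
        data={"grant_type": "client_credentials", "scope": "all-apis"},
    )
    resp.raise_for_status()
    print(resp.json()["access_token"][:16], "...")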
corp
by New Contributor II
  • 1802 Views
  • 2 replies
  • 1 kudos

Interconnected notebooks

How do I use the interconnected notebooks feature available in Databricks?

Latest Reply
mhiltner
Databricks Employee
  • 1 kudos

Do you mean running one notebook from another and using variables and functions defined in the other one? If that's what you're seeking, try using the magic command %run + notebook path.  You can find some documentation about it here: https://docs.da...

1 More Reply
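A minimal illustration of the %run approach from the reply; the helper notebook path is a placeholder:

    # --- notebook ./helpers ---------------------------------------------
    shared_value = 42

    def greet(name: str) -> str:
        return f"hello, {name}"

    # --- calling notebook, cell 1 (the magic must sit alone in a cell) --
    # %run ./helpers

    # --- calling notebook, cell 2: names from ./helpers are in scope ----
    print(shared_value)        # 42
    print(greet("databricks"))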
Hubert-Dudek
by Databricks MVP
  • 2531 Views
  • 2 replies
  • 0 kudos

Asset Bundles -> creation of Azure DevOps pipeline

If you choose the mlops-stacks template in Asset Bundles, it will create many nice things for you out of the box, including a pipeline to deploy to dev/stage/prod. #databricks

[Attachment: cicd.png]
Latest Reply
jose_gonzalez
Databricks Employee
  • 0 kudos

Thank you for sharing this @Hubert-Dudek 

1 More Reply