- 57 Views
- 0 replies
- 0 kudos
Hello,

With reference to the docs at https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/usage-detail-tags, cluster tags are not propagated to the VMs when a cluster is created within a pool. Is there any workaround for monitoring VM costs when using cluster pools (j...
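Per the linked docs, custom tags set on the pool itself (rather than on the cluster) do reach the underlying VMs, so one workaround is to tag the instance pool. A minimal sketch of the request payload for the Instance Pools API; the pool name, node type, and tag values are hypothetical examples:

```python
# Sketch: tag the pool itself, since custom tags on an instance pool are
# applied to its VMs (cluster tags are not, for pooled clusters).
# All names and values below are illustrative placeholders.
import json


def pool_create_payload(pool_name, node_type, tags):
    """Build a payload for POST /api/2.0/instance-pools/create."""
    return {
        "instance_pool_name": pool_name,
        "node_type_id": node_type,
        # these tags surface on the pool's VMs in the Azure cost reports
        "custom_tags": dict(tags),
    }


payload = pool_create_payload(
    "analytics-pool", "Standard_DS3_v2",
    {"cost_center": "analytics", "team": "data-eng"},
)
print(json.dumps(payload, indent=2))
```

You can then filter the Azure cost analysis view by those tag keys to attribute pool VM spend to a team.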
- 128 Views
- 0 replies
- 0 kudos
Hello there,

Are pre-purchased DBUs still valid? Can we use them? https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/reservation-discount-databricks

Can someone please explain how it works in practice, by example? What if I pre-puc...
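For a worked example of how a pre-purchase plan draws down in practice: usage is deducted from the pre-purchased balance at the plan's discounted rate until the balance is exhausted. The prices and discount below are made-up illustration values, not real Azure pricing:

```python
# Hypothetical draw-down sketch for an Azure Databricks pre-purchase plan.
# The list price and discount are illustrative, not actual Azure rates.

def draw_down(balance_usd, dbu_used, list_price_per_dbu, discount):
    """Deduct discounted usage from the pre-purchased balance.

    Returns (remaining_balance, amount_deducted).
    """
    cost = dbu_used * list_price_per_dbu * (1 - discount)
    return balance_usd - cost, cost


# 1,000 DBUs at a $0.40 list price with a 17% plan discount
remaining, deducted = draw_down(10_000.0, 1_000, 0.40, 0.17)
```

Under these made-up numbers, roughly $332 is deducted and about $9,668 of the balance remains; the actual discount tier depends on how many commit units were pre-purchased.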
- 247 Views
- 4 replies
- 0 kudos
Hi,

Does anyone know about any existing issue with the Azure Databricks account API? I cannot do the following:

1. Login with the CLI: `databricks auth login --account-id <acc_id>`; this is what I get: https://adb-MY_ID.azuredatabricks.net/oidc/accounts/MY_ACC_ID/v1/auth...
Latest Reply
UPDATE: Solved. The very same solution started to work today, from running a pipeline with tf - M2M auth. with a service principal with fed auth. That's the 2 from my post above. When trying to follow these steps https://learn.microsoft.com/en-us/azure/...
3 More Replies
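Since the fix above came down to service principal M2M auth at the account level, a sketch of the environment variables that configuration typically relies on may help others hitting the same wall. The variable names are the documented Databricks unified-auth ones; the values shown are placeholders:

```python
# Sketch: environment variables typically needed for account-level
# machine-to-machine (service principal) auth against Azure Databricks.
# Values here are placeholders.

REQUIRED = (
    "DATABRICKS_HOST",          # https://accounts.azuredatabricks.net for account APIs
    "DATABRICKS_ACCOUNT_ID",
    "DATABRICKS_CLIENT_ID",     # the service principal's application ID
    "DATABRICKS_CLIENT_SECRET",
)


def missing_auth_vars(env):
    """Return the account-auth variables absent from the given environment."""
    return [name for name in REQUIRED if not env.get(name)]


example_env = {
    "DATABRICKS_HOST": "https://accounts.azuredatabricks.net",
    "DATABRICKS_ACCOUNT_ID": "<acc_id>",
}
# DATABRICKS_CLIENT_ID / DATABRICKS_CLIENT_SECRET are still missing here:
print(missing_auth_vars(example_env))
```

Running a check like this in the pipeline before invoking the CLI or Terraform makes auth misconfiguration fail fast with a readable message.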
- 714 Views
- 2 replies
- 0 kudos
Hi,

We are facing a requirement where we need to somehow expose one of our Databricks clusters to an external service. Our organization's cyber team is running a security audit of all of the resources we use, and they have some tools which they use to r...
Latest Reply
Hi @Kaniz,

Thank you very much for the reply, but I don't think this actually resolves our concern. All these solutions talk about utilizing the Databricks cluster to access/read data in Databricks. They focus on getting to the Databricks data through...
1 More Replies
- 582 Views
- 7 replies
- 0 kudos
Tried to use M2M OAuth connectivity on Databricks SQL Warehouse in Python:

from databricks.sdk.core import Config, oauth_service_principal
from databricks import sql
....
config = Config(host=f"https://{host}",
                client_...
Latest Reply
I am facing the same issue with the same error logs as @harripy. Can you please help @Yeshwanth @Dani ?
6 More Replies
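For anyone debugging this, it can help to see what `oauth_service_principal` does under the hood: Databricks M2M OAuth is a standard client-credentials grant against the workspace's `/oidc/v1/token` endpoint, so the token request can be reproduced with the standard library to isolate whether the failure is in the credentials or in the connector. The host, client ID, and secret below are placeholders; this builds the request without sending it:

```python
# Sketch of the token request behind M2M OAuth: a client-credentials grant
# against the workspace /oidc/v1/token endpoint, built with stdlib only.
# Host and credentials are placeholders.
import base64
from urllib.parse import urlencode


def token_request(host, client_id, client_secret):
    """Build (url, headers, body) for the client-credentials token call."""
    url = f"https://{host}/oidc/v1/token"
    basic = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    headers = {
        "Authorization": f"Basic {basic}",
        "Content-Type": "application/x-www-form-urlencoded",
    }
    body = urlencode({"grant_type": "client_credentials", "scope": "all-apis"})
    return url, headers, body


url, headers, body = token_request(
    "adb-1234567890123456.7.azuredatabricks.net",
    "my-sp-application-id", "my-sp-secret",
)
```

Posting this URL with `curl` or `urllib.request` should return a JSON access token if the service principal is valid; if it does, the problem is likely in the warehouse `http_path` or connector configuration rather than in the OAuth setup.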
- 375 Views
- 3 replies
- 0 kudos
We have a pipeline in Azure DevOps that deploys init scripts to the workspace folder on an Azure Databricks resource using the Workspace API (/api/2.0/workspace/import); we use format "AUTO" and overwrite "true" to achieve this. After being uploaded ...
Latest Reply
The init script only generates logs in the scenario where it actually runs, not when it fails to launch.
2 More Replies
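One plausible cause of "content changed after upload" symptoms with a pipeline on a Windows build agent is CRLF line endings or a UTF-8 BOM sneaking into the script, which bash then rejects. A hedged sketch of normalizing the file before calling the import API, matching the thread's format "AUTO" and overwrite "true" settings (the workspace path is a hypothetical example):

```python
# Sketch: normalize line endings and strip any BOM before uploading an init
# script via /api/2.0/workspace/import. The path below is a placeholder.
import base64


def normalize_init_script(raw: bytes) -> bytes:
    """Strip a UTF-8 BOM and convert CRLF/CR line endings to LF."""
    if raw.startswith(b"\xef\xbb\xbf"):
        raw = raw[3:]
    return raw.replace(b"\r\n", b"\n").replace(b"\r", b"\n")


def import_payload(workspace_path: str, raw: bytes) -> dict:
    """Payload for the Workspace import API (content must be base64)."""
    return {
        "path": workspace_path,
        "format": "AUTO",
        "overwrite": True,
        "content": base64.b64encode(normalize_init_script(raw)).decode(),
    }


# Simulated file as a Windows agent might produce it: BOM plus CRLF endings.
payload = import_payload(
    "/Shared/init/install.sh",
    b"\xef\xbb\xbf#!/bin/bash\r\necho ok\r\n",
)
```

Comparing a byte-for-byte download of the uploaded file against the original is also a quick way to confirm whether the API or the agent altered the content.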
by leungi • New Contributor II
- 1122 Views
- 2 replies
- 1 kudos
The `Library` option in the cluster config allows installation of language-specific libraries - e.g., PyPI for Python, CRAN for R. Some of these libraries - e.g., `sf` - require system libraries - e.g., `libudunits2-dev`, `libgdal-dev`. How may one install...
Latest Reply
You can install them in an init script: https://docs.databricks.com/en/init-scripts/index.html
1 More Replies
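To make the init-script suggestion concrete, here is a sketch that renders such a script, using the system packages named in the question (adjust the package list for your cluster image; this assumes a Debian/Ubuntu-based runtime, which is the norm for Databricks nodes):

```python
# Sketch: generate a cluster init script that installs the apt packages the
# R `sf` library needs. Package list is taken from the question above.

def apt_init_script(packages):
    """Render an init script that installs apt packages non-interactively."""
    lines = [
        "#!/bin/bash",
        "set -euxo pipefail",
        "apt-get update",
        "DEBIAN_FRONTEND=noninteractive apt-get install -y " + " ".join(packages),
    ]
    return "\n".join(lines) + "\n"


script = apt_init_script(["libudunits2-dev", "libgdal-dev"])
print(script)
```

The rendered script is what you would upload and attach to the cluster as a cluster-scoped init script, so the system libraries are present before `install.packages("sf")` runs.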
by otydos • New Contributor II
- 989 Views
- 2 replies
- 0 kudos
Hello, I want to authenticate with Terraform to the Databricks account level using the Azure managed identity (system-assigned) of my Azure VM, to perform operations like creating a group. I followed different tutorials and the documentation on Azure and Databricks...
Latest Reply
Hello,

On my side, I always have to add the provider in each resource block. You can try that:

resource "databricks_group" "xxxxx" {
  provider     = databricks.accounts
  display_name = "xxxxx"
}

About authentication, you can also try to add: auth_type ...
1 More Replies
- 497 Views
- 2 replies
- 0 kudos
Hi all, I am trying to use secrets to connect to my Azure storage account. I want to be able to read the data from the storage account using a PySpark notebook. Has anyone experience setting up such a connection, or good documentation on how to do so? I ha...
Latest Reply
Hi Sean,

There are two ways to handle secret scopes:

- Databricks-backed scopes: the scope is related to a workspace. You will have to handle the update of the secrets yourself.
- Azure Key Vault-backed scopes: the scope is related to a Key Vault. It means that you configur...
1 More Replies
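Once a scope exists, the usual pattern is to pull the service principal's secret with `dbutils.secrets.get` and set the ABFS OAuth properties on the Spark session. A sketch of those configuration keys (these are the standard Hadoop ABFS OAuth properties; the storage account, scope, and credential names are hypothetical):

```python
# Sketch: Spark conf entries for reading abfss:// paths with a service
# principal. Account/tenant/credential values are placeholders.

def abfss_oauth_conf(storage_account, client_id, tenant_id, client_secret):
    """Return the spark.conf entries for OAuth access to one storage account."""
    suffix = f"{storage_account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{suffix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{suffix}": client_id,
        f"fs.azure.account.oauth2.client.secret.{suffix}": client_secret,
        f"fs.azure.account.oauth2.client.endpoint.{suffix}":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }


# In a notebook you would pass dbutils.secrets.get("my-scope", "sp-secret")
# as client_secret, then apply each pair with spark.conf.set(key, value).
conf = abfss_oauth_conf("mystorageacct", "<app-id>", "<tenant-id>", "<secret>")
```

After applying the conf, `spark.read.parquet("abfss://container@mystorageacct.dfs.core.windows.net/path")` should work without the secret ever appearing in the notebook source.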
by Flask • New Contributor II
- 164 Views
- 2 replies
- 0 kudos
We've changed the address range for the VNet and subnet that the Azure Databricks workspace (Standard SKU) was using; after that, when we try to access the catalog data, we're getting a socket closed error. This error occurs only with multi-node clusters; for single...
Latest Reply
Yes, it is mentioned that we cannot change the VNet. I've changed the range in the same VNet, but not the VNet itself. Is there any troubleshooting that I can do to find this issue? The problem is, I don't want to recreate the workspace. It is a worst-case s...
1 More Replies
by 807326 • New Contributor II
- 2566 Views
- 3 replies
- 1 kudos
Hello! We tried to update our integration scripts to use SQL warehouses instead of general compute clusters to fetch and update data, but we faced a problem. We use automatic schema evolution when we merge tables, but with a SQL warehouse, when we try...
Latest Reply
Why can we not enable autoMerge in a SQL warehouse when my tables are Delta tables?
2 More Replies
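The likely reason is that SQL warehouses do not allow setting arbitrary Spark session configurations, which is what `spark.databricks.delta.schema.autoMerge.enabled` is. One alternative that recent Databricks runtimes offer is the `MERGE WITH SCHEMA EVOLUTION` clause, which scopes the evolution to a single statement; verify it is available on your warehouse version before relying on it. A sketch rendering such a statement (table and key names are placeholders):

```python
# Hedged sketch: render a MERGE statement using the per-statement
# WITH SCHEMA EVOLUTION clause instead of the session-level autoMerge conf.
# Availability depends on the warehouse/runtime version; names are placeholders.

def merge_with_schema_evolution(target, source, key):
    """Render a MERGE statement that evolves the target schema for this run."""
    return (
        f"MERGE WITH SCHEMA EVOLUTION INTO {target} t\n"
        f"USING {source} s ON t.{key} = s.{key}\n"
        "WHEN MATCHED THEN UPDATE SET *\n"
        "WHEN NOT MATCHED THEN INSERT *"
    )


stmt = merge_with_schema_evolution("main.sales.orders", "staging.orders_delta", "order_id")
print(stmt)
```

If the clause is not available, the fallback is to run the schema-evolving merge on a jobs cluster where the session conf can be set.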
- 4709 Views
- 5 replies
- 1 kudos
I'm trying to deploy using Databricks Asset Bundles via an Azure DevOps pipeline. I keep getting this error when trying to use OAuth: Error: default auth: oauth-m2m: oidc: databricks OAuth is not supported for this host. Config: host=https://<workspac...
Latest Reply
Hi @bradleyjamrozik, thank you for posting your question. You will need to use ARM_ variables to make it work. Specifically:

- ARM_CLIENT_ID
- ARM_TENANT_ID
- ARM_CLIENT_SECRET

https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth#environment-3 f...
4 More Replies
- 447 Views
- 2 replies
- 0 kudos
Hi all,

I can't find guidance on how to create a Databricks access connector for connecting catalogs to external data locations using Terraform. Also, I want to create my catalogs, set up external locations, etc. using Terraform. Has anyone got a good r...
Latest Reply
Hi @Snoonan, Creating a Databricks access connector for connecting catalogs to external data locations using Terraform is a great way to manage your Databricks workspaces and associated cloud infrastructure.
Let’s break it down:
Databricks Access...
1 More Replies
- 323 Views
- 1 replies
- 0 kudos
Hello everyone,

I was the lead on a data platform modernization project. This was my first time administrating Databricks, and I got myself into quite the situation. Essentially, I made the mistake of linking our enterprise-wide Unity Catalog to our DEV Azu...
Latest Reply
Hi @Daalip808, Managing the Unity Catalog in Azure Databricks is crucial for data governance and organization.
Let’s explore some best practices and potential options for backing up and restoring your Unity Catalog in your current situation.
...
- 263 Views
- 1 replies
- 0 kudos
I'm curious if there's a SQL query method to retrieve counts from Delta table metadata without individually performing count(*) on each table. I'm wondering if this information is stored in any of the INFORMATION_SCHEMA tables. I have a scenario where...
Latest Reply
Hi @Gkumar43, To estimate the row count for an entire Delta Lake table without individually performing COUNT(*) on each table, you can use the following SQL query:
SELECT SUM(numRecords) AS estimated_row_count
FROM delta.`/path/to/my_table`
Repla...