Administration & Architecture
Forum Posts

pawelmitrus
by New Contributor III
  • 53 Views
  • 0 replies
  • 0 kudos

Azure Databricks account api can't auth

Hi, does anyone know about any existing issue with the Azure Databricks account API? I cannot do the following: 1. login with the CLI `databricks auth login --account-id <acc_id>`; this is what I get: https://adb-MY_ID.azuredatabricks.net/oidc/accounts/MY_ACC_ID/v1/auth...
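For reference, account-level CLI auth is normally configured against the account console host rather than a workspace URL. A minimal `.databrickscfg` profile for Azure might look like the following sketch (the profile name and placeholders are illustrative, not from the post):

```ini
; ~/.databrickscfg -- hypothetical account-level profile
[ACCOUNT_PROFILE]
host       = https://accounts.azuredatabricks.net
account_id = <acc_id>
```

followed by `databricks auth login --profile ACCOUNT_PROFILE`.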

NadithK
by Contributor
  • 492 Views
  • 2 replies
  • 0 kudos

Public exposure for clusters in SCC enabled workspaces

Hi, we are facing a requirement where we need to somehow expose one of our Databricks clusters to an external service. Our organization's cyber team is running a security audit of all of the resources we use, and they have some tools which they use to r...

Latest Reply
NadithK
Contributor
  • 0 kudos

Hi @Kaniz, thank you very much for the reply, but I don't think this actually resolves our concern. All these solutions talk about utilizing the Databricks cluster to access/read data in Databricks. They focus on getting to the Databricks data through...

1 More Replies
harripy
by New Contributor III
  • 391 Views
  • 7 replies
  • 0 kudos

Databricks SQL connectivity in Python with Service Principals

Tried to use M2M OAuth connectivity on Databricks SQL Warehouse in Python:from databricks.sdk.core import Config, oauth_service_principal from databricks import sql .... config = Config(host=f"https://{host}", client_...
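The truncated snippet above follows the documented M2M OAuth pattern for the Databricks SQL connector. As an illustrative sketch only (hostnames, IDs, and secrets are placeholders, and this runs only where `databricks-sql-connector` and `databricks-sdk` are installed):

```python
from databricks.sdk.core import Config, oauth_service_principal
from databricks import sql

host = "adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace host

config = Config(
    host=f"https://{host}",
    client_id="<service-principal-application-id>",
    client_secret="<oauth-secret>",
)

def credential_provider():
    # OAuth M2M credentials provider built from the service principal config
    return oauth_service_principal(config)

with sql.connect(
    server_hostname=host,
    http_path="/sql/1.0/warehouses/<warehouse-id>",
    credentials_provider=credential_provider,
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT 1")
        print(cursor.fetchone())
```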

Latest Reply
kavyakavuri
New Contributor
  • 0 kudos

I am facing the same issue with the same error logs as @harripy. Can you please help @Yeshwanth @Dani ? 

6 More Replies
Aiden-Z
by New Contributor
  • 315 Views
  • 3 replies
  • 0 kudos

Init script failure after workspace upload

We have a pipeline in Azure DevOps that deploys init scripts to the workspace folder on an Azure Databricks resource using the workspace API (/api/2.0/workspace/import); we use format "AUTO" and overwrite "true" to achieve this. After being uploaded ...
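For context, the import endpoint named above takes a base64-encoded file body. A minimal sketch of building that request payload with only the standard library (the workspace path and script content are made up):

```python
import base64
import json

def build_import_payload(workspace_path: str, script_text: str) -> str:
    """Build the JSON body for POST /api/2.0/workspace/import,
    mirroring the format=AUTO / overwrite=true call described above."""
    return json.dumps({
        "path": workspace_path,
        "format": "AUTO",
        "overwrite": True,
        # The Workspace API expects base64-encoded file content
        "content": base64.b64encode(script_text.encode("utf-8")).decode("ascii"),
    })

payload = build_import_payload("/Shared/init/install.sh", "#!/bin/bash\necho hello\n")
print(payload)
```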

Latest Reply
Aiden-Z
New Contributor
  • 0 kudos

The init script only generates logs in the scenario that it runs, not when it fails.

2 More Replies
leungi
by New Contributor II
  • 1072 Views
  • 2 replies
  • 1 kudos

Resolved! Install system libraries on the cluster

The `Library` option in cluster config allows installation of language-specific libraries - e.g., PyPI for Python, CRAN for R. Some of these libraries - e.g., `sf` - require system libraries - e.g., `libudunits2-dev`, `libgdal-dev`. How may one install...

Latest Reply
feiyun0112
Contributor III
  • 1 kudos

You can install them in an init script: https://docs.databricks.com/en/init-scripts/index.html
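The init-script route the reply points to could look like the following sketch (the package names are the ones from the question; the script itself is illustrative):

```bash
#!/bin/bash
# Cluster-scoped init script: install the system libraries required by
# R packages such as sf (libudunits2-dev and libgdal-dev per the question)
set -e
apt-get update
apt-get install -y libudunits2-dev libgdal-dev
```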

1 More Replies
otydos
by New Contributor II
  • 709 Views
  • 2 replies
  • 0 kudos

Authenticate with Terraform to Databricks Account level using Azure MSI(System assigned)

Hello, I want to authenticate with Terraform at the Databricks account level using the Azure Managed Identity (system-assigned) of my Azure VM, to perform operations like creating a group. I followed different tutorials and the documentation on Azure and Databricks...

Latest Reply
DonatienTessier
New Contributor III
  • 0 kudos

Hello, on my side, I always have to add the provider in each resource block. You can try that:

resource "databricks_group" "xxxxx" {
  provider     = databricks.accounts
  display_name = "xxxxx"
}

About authentication, you can also try to add: auth_type ...
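A fuller sketch of the aliased account-level provider the reply assumes (the account ID and group name are placeholders; `auth_type = "azure-msi"` matches the system-assigned managed identity scenario in the question):

```terraform
provider "databricks" {
  alias      = "accounts"
  host       = "https://accounts.azuredatabricks.net"
  account_id = "<databricks-account-id>"
  auth_type  = "azure-msi" # use the VM's system-assigned managed identity
}

resource "databricks_group" "example" {
  provider     = databricks.accounts
  display_name = "example-group"
}
```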

1 More Replies
Snoonan
by New Contributor III
  • 451 Views
  • 2 replies
  • 0 kudos

Secret management

Hi all, I am trying to use secrets to connect to my Azure storage account. I want to be able to read the data from the storage account using a PySpark notebook. Has anyone experience setting up such a connection, or good documentation to do so? I ha...

Latest Reply
DonatienTessier
New Contributor III
  • 0 kudos

Hi Sean, there are two ways to handle secret scopes:
  • Databricks-backed scopes: the scope is tied to a workspace; you have to handle updating the secrets yourself.
  • Azure Key Vault-backed scopes: the scope is tied to a Key Vault. It means that you configur...
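As an illustrative sketch of reading such a secret from a notebook (the scope name, secret key, and storage account are all hypothetical, and this only runs inside a Databricks notebook where `dbutils` and `spark` exist):

```python
# Hypothetical names: 'my-scope', 'storage-key', 'mystorageaccount'
storage_account = "mystorageaccount"
key = dbutils.secrets.get(scope="my-scope", key="storage-key")

# Configure Spark to authenticate to ADLS with the retrieved account key
spark.conf.set(f"fs.azure.account.key.{storage_account}.dfs.core.windows.net", key)

df = spark.read.format("parquet").load(
    f"abfss://mycontainer@{storage_account}.dfs.core.windows.net/path/to/data"
)
```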

1 More Replies
Flask
by New Contributor II
  • 128 Views
  • 2 replies
  • 0 kudos

Problem loading catalog data from a multi-node cluster after changing the VNet IP range in Azure Databricks

We've changed the address range for the VNet and subnet that the Azure Databricks workspace (standard SKU) was using; after that, when we try to access the catalog data, we get a socket closed error. This error occurs only with a multi-node cluster; for single ...

Latest Reply
Flask
New Contributor II
  • 0 kudos

Yes, it is mentioned that we cannot change the VNet. I've changed the range in the same VNet but not the VNet itself. Is there any troubleshooting I can do to find the issue? The problem is, I don't want to recreate the workspace. It is a worst-case s...

1 More Replies
807326
by New Contributor II
  • 2492 Views
  • 3 replies
  • 1 kudos

Resolved! Enable automatic schema evolution for Delta Lake merge for an SQL warehouse

Hello! We tried to update our integration scripts and use SQL warehouses instead of general compute clusters to fetch and update data, but we faced a problem. We use automatic schema evolution when we merge tables, but with SQL warehouse, when we try...

Latest Reply
TheKnightCoder
New Contributor II
  • 1 kudos

Why can we not enable autoMerge in a SQL warehouse when my tables are Delta tables?
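For background, on interactive clusters schema evolution for MERGE is switched on per session, along these lines (table names are hypothetical; SQL warehouses restrict which session configurations can be set, which is consistent with the problem described above):

```sql
SET spark.databricks.delta.schema.autoMerge.enabled = true;

MERGE INTO target t
USING source s
ON t.id = s.id
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *;
```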

2 More Replies
bradleyjamrozik
by New Contributor III
  • 4608 Views
  • 5 replies
  • 1 kudos

Resolved! databricks OAuth is not supported for this host

I'm trying to deploy using Databricks Asset Bundles via an Azure DevOps pipeline. I keep getting this error when trying to use oauth:Error: default auth: oauth-m2m: oidc: databricks OAuth is not supported for this host. Config: host=https://<workspac...

Latest Reply
saadansari-db
New Contributor III
  • 1 kudos

Hi @bradleyjamrozik, thank you for posting your question. You will need to use ARM_ variables to make it work; specifically ARM_CLIENT_ID, ARM_TENANT_ID, and ARM_CLIENT_SECRET. https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth#environment-3 f...
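In an Azure DevOps pipeline, wiring those variables up might look roughly like this sketch (the step and the variable names on the right are placeholders for your own pipeline secrets):

```yaml
# azure-pipelines.yml (sketch)
steps:
  - script: databricks bundle deploy
    env:
      ARM_CLIENT_ID: $(servicePrincipalClientId)
      ARM_TENANT_ID: $(tenantId)
      ARM_CLIENT_SECRET: $(servicePrincipalSecret)
```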

4 More Replies
Snoonan
by New Contributor III
  • 408 Views
  • 2 replies
  • 0 kudos

Terraform for Databricks

Hi all, I can't find guidance on how to create a Databricks access connector for connecting catalogs to external data locations using Terraform. Also, I want to create my catalogs, set up external locations, etc. using Terraform. Has anyone got a good r...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Snoonan, Creating a Databricks access connector for connecting catalogs to external data locations using Terraform is a great way to manage your Databricks workspaces and associated cloud infrastructure. Let’s break it down: Databricks Access...
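A sketch of the pieces involved (all names, groups, and URLs are placeholders; the access connector itself is an Azure resource, created here with the `azurerm` provider, while the credential and external location use the `databricks` provider):

```terraform
resource "azurerm_databricks_access_connector" "this" {
  name                = "uc-access-connector"
  resource_group_name = "<resource-group>"
  location            = "<region>"

  identity {
    type = "SystemAssigned"
  }
}

resource "databricks_storage_credential" "this" {
  name = "uc-credential"
  azure_managed_identity {
    access_connector_id = azurerm_databricks_access_connector.this.id
  }
}

resource "databricks_external_location" "this" {
  name            = "uc-external-location"
  url             = "abfss://<container>@<storage-account>.dfs.core.windows.net/"
  credential_name = databricks_storage_credential.this.name
}
```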

1 More Replies
Daalip808
by New Contributor
  • 287 Views
  • 1 reply
  • 0 kudos

Unity Catalog - Created UC and linked it to my DEV storage account for the entire org

Hello everyone, I was the lead in a data platform modernization project. This was my first time administrating Databricks and I got myself into quite the situation. Essentially I made the mistake of linking our enterprise-wide Unity Catalog to our DEV Azu...

Labels: Administration & Architecture · Backup & Restore · governance · Unity Catalog
Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Daalip808, managing the Unity Catalog in Azure Databricks is crucial for data governance and organization. Let’s explore some best practices and potential options for backing up and restoring your Unity Catalog in your current situation. ...

Gkumar43
by New Contributor
  • 231 Views
  • 1 reply
  • 0 kudos

Retrieve the row count of a Delta Lake table from its metadata without using count(*).

I'm curious if there's a SQL query method to retrieve counts from Delta table metadata without individually performing count(*) on each table. I'm wondering if this information is stored in any of the INFORMATION_SCHEMA tables. I have a scenario where...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Gkumar43, To estimate the row count for an entire Delta Lake table without individually performing COUNT(*) on each table, you can use the following SQL query: SELECT SUM(numRecords) AS estimated_row_count FROM delta.`/path/to/my_table` Repla...
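The statistic the reply relies on lives in the Delta transaction log. A simplified, standalone sketch that sums `numRecords` from `add` actions in the JSON commit files (it ignores checkpoints and `remove` actions, so treat it as an upper-bound estimate, not an exact count):

```python
import json
from pathlib import Path

def estimated_row_count(delta_log_dir: str) -> int:
    """Sum numRecords across 'add' actions in a Delta _delta_log directory.

    Simplified sketch: skips checkpoint parquet files and 'remove' actions,
    so the result over-counts if files were ever deleted or compacted.
    """
    total = 0
    for commit in sorted(Path(delta_log_dir).glob("*.json")):
        for line in commit.read_text().splitlines():
            action = json.loads(line)
            add = action.get("add")
            if add and add.get("stats"):
                # 'stats' is itself a JSON-encoded string inside the log entry
                total += json.loads(add["stats"]).get("numRecords", 0)
    return total
```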

Ryan512
by New Contributor III
  • 484 Views
  • 2 replies
  • 1 kudos

keyrings.google-artifactregistry-auth fails to install backend on runtimes > 10.4

We run Databricks on GCP.  We store our private Python packages in the Google Artifact Registry.  When we need to install the private packages, we use a global init script to install `keyring` and `keyrings.google-artifactregistry-auth`.  Then we `pip inst...
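For context, the flow the post describes typically looks like this sketch (the two keyring packages are named in the post; the repository URL and package name are placeholders):

```bash
#!/bin/bash
# Global init script (sketch): install the keyring backend, then pull
# private packages from Google Artifact Registry.
pip install keyring keyrings.google-artifactregistry-auth
pip install --extra-index-url \
  https://<region>-python.pkg.dev/<project>/<repo>/simple/ <private-package>
```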

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @Ryan512, It seems you’re encountering an issue with keyrings.google-artifactregistry-auth not setting up the necessary backend with keyring on Databricks runtimes greater than 10.4. Check Compatibility: First, let’s verify if keyrings.google-art...

1 More Replies
camilo_s
by New Contributor II
  • 332 Views
  • 2 replies
  • 1 kudos

Hard reset programmatically

Is it possible to trigger a git reset --hard programmatically? I'm running a platform service where, as part of CI/CD, repos get deployed into the Databricks workspace. Normally, our developers work with upstream repos both from their local IDEs and fr...

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @camilo_s, When dealing with Git repositories programmatically, you can indeed trigger a git reset --hard to revert to a specific commit. Let’s break down the process: Understanding git reset --hard: The git reset --hard command discards all c...
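In the Databricks Repos context specifically, the closest programmatic equivalent is the Repos API update call, which checks the workspace repo out to a given branch (repo ID, host, and branch are placeholders; behavior with uncommitted local changes may differ from a true git reset --hard):

```bash
# Sketch: point a workspace repo at a branch via the Repos API
curl -X PATCH \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  https://<workspace-host>/api/2.0/repos/<repo-id> \
  -d '{"branch": "main"}'
```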

1 More Replies