- 2017 Views
- 4 replies
- 0 kudos
Azure Databricks account api can't auth
Hi, does anyone know about any existing issue with the Azure Databricks account API? I cannot do the below: 1. log in with the CLI `databricks auth login --account-id <acc_id>`; this is what I get: https://adb-MY_ID.azuredatabricks.net/oidc/accounts/MY_ACC_ID/v1/auth...
UPDATE: Solved. The very same solution started to work today when running a pipeline with Terraform: M2M auth with a service principal using federated auth. That's option 2 from my post above. When trying to follow these steps: https://learn.microsoft.com/en-us/azure/...
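In case it helps others, a minimal sketch of account-level M2M auth via the Python SDK, assuming a service principal with an OAuth client secret (the federated-credential variant mentioned above differs); the account ID and credential values are placeholders:

```python
# Hypothetical sketch: account-level M2M auth with the Databricks Python SDK,
# using a service principal's OAuth client ID/secret. All IDs are placeholders.
from databricks.sdk import AccountClient

a = AccountClient(
    host="https://accounts.azuredatabricks.net",
    account_id="<acc_id>",
    client_id="<sp_application_id>",
    client_secret="<sp_oauth_secret>",
)

# Smoke test: list account-level groups.
for group in a.groups.list():
    print(group.display_name)
```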
- 1873 Views
- 2 replies
- 0 kudos
Public exposure for clusters in SCC enabled workspaces
Hi, we are facing a requirement where we need to somehow expose one of our Databricks clusters to an external service. Our organization's cyber team is running a security audit of all of the resources we use, and they have some tools which they use to r...
Hi @Kaniz_Fatma, thank you very much for the reply, but I don't think this actually resolves our concern. All these solutions talk about utilizing the Databricks cluster to access/read data in Databricks. They focus on getting to the Databricks data t...
- 3164 Views
- 7 replies
- 0 kudos
Databricks SQL connectivity in Python with Service Principals
Tried to use M2M OAuth connectivity on a Databricks SQL warehouse in Python: from databricks.sdk.core import Config, oauth_service_principal from databricks import sql .... config = Config(host=f"https://{host}", client_...
I am facing the same issue with the same error logs as @harripy. Can you please help, @Yeshwanth @Dani?
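For reference, the documented M2M pattern for databricks-sql-connector looks roughly like this; `server_hostname`, the warehouse `http_path`, and the service principal credentials are placeholders:

```python
# Sketch of the documented M2M OAuth pattern for databricks-sql-connector;
# hostname, warehouse path, and credentials are placeholders.
from databricks.sdk.core import Config, oauth_service_principal
from databricks import sql

server_hostname = "<workspace>.azuredatabricks.net"

config = Config(
    host=f"https://{server_hostname}",
    client_id="<sp_application_id>",
    client_secret="<sp_oauth_secret>",
)

def credential_provider():
    return oauth_service_principal(config)

with sql.connect(
    server_hostname=server_hostname,
    http_path="/sql/1.0/warehouses/<warehouse_id>",
    credentials_provider=credential_provider,
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT 1")
        print(cursor.fetchall())
```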
- 3531 Views
- 2 replies
- 1 kudos
Resolved! Install system libraries on the cluster
The `Library` option in cluster config allows installation of language-specific libraries - e.g., PyPI for Python, CRAN for R. Some of these libraries - e.g., `sf` - require system libraries - e.g., `libudunits2-dev`, `libgdal-dev`. How may one install...
You can install them in an init script: https://docs.databricks.com/en/init-scripts/index.html
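As a sketch of that approach (the workspace path, script name, and packages are just examples), the init script is a shell script that apt-installs the system libraries, and one way to stage it is through the workspace import API:

```python
# Hypothetical sketch: stage a shell init script that apt-installs the system
# libraries `sf` needs; the workspace path is arbitrary. Attach the script to
# the cluster afterwards (Advanced options -> Init scripts).
import io
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.workspace import ImportFormat

INIT_SCRIPT = """#!/bin/bash
set -e
apt-get update
apt-get install -y libudunits2-dev libgdal-dev
"""

w = WorkspaceClient()  # picks up credentials from the environment
w.workspace.upload(
    "/Shared/init-scripts/install-geo-deps.sh",
    io.BytesIO(INIT_SCRIPT.encode()),
    format=ImportFormat.AUTO,
    overwrite=True,
)
```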
- 2254 Views
- 2 replies
- 0 kudos
Authenticate with Terraform to the Databricks account level using Azure MSI (system-assigned)
Hello, I want to authenticate with Terraform at the Databricks account level using the Azure managed identity (system-assigned) of my Azure VM, to perform operations like creating a group. I followed different tutorials and the documentation on Azure and Databricks...
Hello, on my side I always have to add the provider in each resource block. You can try that:
resource "databricks_group" "xxxxx" {
  provider     = databricks.accounts
  display_name = "xxxxx"
}
About authentication, you can also try to add: auth_type ...
- 710 Views
- 2 replies
- 0 kudos
Problem loading catalog data from multi-node cluster after changing VNet IP range in Azure Databricks
We've changed the address range for the VNet and subnet that the Azure Databricks workspace (Standard SKU) was using. After that, when we try to access the catalog data, we get a socket closed error. This error occurs only with a multi-node cluster; for single ...
Yes, it is mentioned that we cannot change the VNet. I've changed the range within the same VNet, not the VNet itself. Is there any troubleshooting I can do to find this issue? The problem is, I don't want to recreate the workspace. It is a worst-case s...
- 4426 Views
- 3 replies
- 1 kudos
Resolved! Enable automatic schema evolution for Delta Lake merge for an SQL warehouse
Hello! We tried to update our integration scripts and use SQL warehouses instead of general compute clusters to fetch and update data, but we faced a problem. We use automatic schema evolution when we merge tables, but with a SQL warehouse, when we try...
Why can we not enable autoMerge in a SQL warehouse when my tables are Delta tables?
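For anyone landing here: the session-level autoMerge conf doesn't appear to be settable on SQL warehouses, but newer Databricks SQL releases accept schema evolution declared on the MERGE statement itself. A hedged sketch, with placeholder connection details and table names:

```python
# Sketch: per-statement schema evolution in MERGE, which newer Databricks SQL
# releases accept on a SQL warehouse; connection details and table names are
# placeholders.
from databricks import sql

with sql.connect(
    server_hostname="<workspace>.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/<warehouse_id>",
    access_token="<token>",
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("""
            MERGE WITH SCHEMA EVOLUTION INTO target AS t
            USING source AS s
            ON t.id = s.id
            WHEN MATCHED THEN UPDATE SET *
            WHEN NOT MATCHED THEN INSERT *
        """)
```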
- 8738 Views
- 5 replies
- 1 kudos
Resolved! databricks OAuth is not supported for this host
I'm trying to deploy using Databricks Asset Bundles via an Azure DevOps pipeline. I keep getting this error when trying to use OAuth: Error: default auth: oauth-m2m: oidc: databricks OAuth is not supported for this host. Config: host=https://<workspac...
Hi @bradleyjamrozik, thank you for posting your question. You will need to use ARM_ variables to make it work, specifically ARM_CLIENT_ID, ARM_TENANT_ID, and ARM_CLIENT_SECRET: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth#environment-3 f...
- 1343 Views
- 2 replies
- 0 kudos
Terraform for Databricks
Hi all, I can't find guidance on how to create a Databricks access connector for connecting catalogs to external data locations using Terraform. Also, I want to create my catalogs, set up external locations, etc., using Terraform. Has anyone got a good r...
Hi @Snoonan, creating a Databricks access connector for connecting catalogs to external data locations using Terraform is a great way to manage your Databricks workspaces and associated cloud infrastructure. Let's break it down: Databricks Access...
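As a stopgap while the Terraform config is sorted out, the same Unity Catalog objects can also be created with the Python SDK (a different route than the question asks for); the names, the connector resource ID, the abfss URL, and the request class below are assumptions based on recent databricks-sdk releases:

```python
# Hypothetical sketch: register an Azure access connector as a Unity Catalog
# storage credential and create an external location on it. Names, the
# connector resource ID, and the abfss URL are placeholders; the request
# class name follows recent databricks-sdk releases.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.catalog import AzureManagedIdentityRequest

w = WorkspaceClient()

cred = w.storage_credentials.create(
    name="ext-data-cred",
    azure_managed_identity=AzureManagedIdentityRequest(
        access_connector_id=(
            "/subscriptions/<sub>/resourceGroups/<rg>/providers/"
            "Microsoft.Databricks/accessConnectors/<name>"
        )
    ),
)

w.external_locations.create(
    name="ext-data",
    url="abfss://<container>@<storage_account>.dfs.core.windows.net/",
    credential_name=cred.name,
)
```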
- 1390 Views
- 1 reply
- 0 kudos
Unity Catalog - Created UC and linked it to my DEV storage account for the entire org
Hello everyone, I was the lead in a data platform modernization project. This was my first time administering Databricks, and I got myself into quite the situation. Essentially, I made the mistake of linking our enterprise-wide Unity Catalog to our DEV Azu...
Hi @Daalip808, managing the Unity Catalog in Azure Databricks is crucial for data governance and organization. Let's explore some best practices and potential options for backing up and restoring your Unity Catalog in your current situation. ...
- 1131 Views
- 2 replies
- 1 kudos
keyrings.google-artifactregistry-auth fails to install backend on runtimes > 10.4
We run Databricks on GCP. We store our private Python packages in Google Artifact Registry. When we need to install the private packages, we use a global init script to install `keyring` and `keyrings.google-artifactregistry-auth`. Then we `pip inst...
Hi @Ryan512, it seems you're encountering an issue with keyrings.google-artifactregistry-auth not setting up the necessary backend with keyring on Databricks runtimes greater than 10.4. Check compatibility: first, let's verify if keyrings.google-art...
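One quick diagnostic, assuming both packages installed cleanly, is to ask keyring from a notebook which backend it actually resolved:

```python
# Quick diagnostic to run in a notebook after the init script: ask keyring
# which backend it resolved. Seeing keyring's fail.Keyring (or a chainer
# without the Google backend) means the plugin never registered.
import keyring

print(keyring.get_keyring())
```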
- 1412 Views
- 2 replies
- 1 kudos
Hard reset programmatically
Is it possible to trigger a git reset --hard programmatically? I'm running a platform service where, as part of CI/CD, repos get deployed into the Databricks workspace. Normally, our developers work with upstream repos both from their local IDEs and fr...
Hi @camilo_s, when dealing with Git repositories programmatically, you can indeed trigger a git reset --hard to revert to a specific commit. Let's break down the process: Understanding git reset --hard: the git reset --hard command discards all c...
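Concretely, there is no dedicated reset endpoint, but a sketch of the closest supported operation is updating the Git folder to a branch through the Repos API, which re-checks out the remote head (the repo path and branch below are placeholders; verify how it treats uncommitted changes before relying on it):

```python
# Hypothetical sketch: re-sync a workspace Git folder to its remote branch
# through the Repos API; the repo path and branch are placeholders.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

repo = next(r for r in w.repos.list() if r.path == "/Repos/ci/my-project")
w.repos.update(repo_id=repo.id, branch="main")
```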
- 827 Views
- 1 reply
- 0 kudos
Unity Catalog Enabled Clusters using PrivateNIC
Hello, when reviewing the VM settings for Databricks worker VMs, we can see that there are two (2) NICs: a primary (PublicNIC (primary)) and a secondary (PrivateNIC (primary)). The worker VM is always assigned the PublicNIC, and this is reachable from w...
Hi @newuser12445, it seems like you're dealing with some networking configuration issues related to Databricks worker VMs and their network interfaces. Let's break down the situation: NICs (Network Interface Cards): You mentioned that each Databr...
- 697 Views
- 1 reply
- 0 kudos
SQL Warehouse tag list from system tables?
Hello, is there a way to get the tags of SQL warehouse clusters from system tables, like you do with system.compute.clusters? Thanks,
Answering my own question: system.billing.usage.custom_tags['cluster-owner']. @Databricks: I don't really understand the logic here.
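A hedged sketch of turning that into a query, assuming SELECT access on system.billing.usage and that warehouse usage carries usage_metadata.warehouse_id; connection details are placeholders:

```python
# Sketch: read warehouse tags back from the billing system table; assumes
# SELECT on system.billing.usage, and connection details are placeholders.
from databricks import sql

QUERY = """
    SELECT DISTINCT usage_metadata.warehouse_id, custom_tags
    FROM system.billing.usage
    WHERE usage_metadata.warehouse_id IS NOT NULL
"""

with sql.connect(
    server_hostname="<workspace>.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/<warehouse_id>",
    access_token="<token>",
) as connection:
    with connection.cursor() as cursor:
        cursor.execute(QUERY)
        for warehouse_id, tags in cursor.fetchall():
            print(warehouse_id, tags)
```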
- 4024 Views
- 2 replies
- 0 kudos
Resolved! Databricks SSO Azure AD
Hello, I'm trying to test SSO with Azure AD. The SSO test passes on Databricks, and I can connect to Databricks using SSO. When I try to test with Postman to obtain a token, I get the following error message: {"error_description":"OAuth application with ...
Hello, the issue was with Postman. In Postman you don't have to give the client ID from your IdP but the client ID from Databricks "App connections". It is working well now. Thank you.
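Relatedly, for the machine-to-machine case (a service principal rather than interactive SSO), the token can be requested directly from the workspace token endpoint; a sketch with placeholder credentials:

```python
# Hypothetical sketch (M2M variant): request a token for a service principal
# directly from the workspace token endpoint; credentials are placeholders.
import requests

host = "<workspace>.azuredatabricks.net"

resp = requests.post(
    f"https://{host}/oidc/v1/token",
    auth=("<sp_application_id>", "<sp_oauth_secret>"),
    data={"grant_type": "client_credentials", "scope": "all-apis"},
)
resp.raise_for_status()
print(resp.json()["access_token"])
```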