- 3313 Views
- 2 replies
- 0 kudos
Authenticate with Terraform to the Databricks account level using Azure MSI (system-assigned)
Hello, I want to authenticate with Terraform at the Databricks account level using the Azure Managed Identity (system-assigned) of my Azure VM to perform operations like creating a group. I followed different tutorials and the documentation on Azure and Databricks...
Hello, on my side I always have to add the provider in each resource block. You can try this: `resource "databricks_group" "xxxxx" { provider = databricks.accounts display_name = "xxxxx" }` About authentication, you can also try to add: auth_type ...
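For reference, a minimal sketch of how those two hints fit together, assuming an account-level provider authenticated with the VM's system-assigned managed identity; the account ID and group name below are placeholders:

```hcl
terraform {
  required_providers {
    databricks = {
      source = "databricks/databricks"
    }
  }
}

# Account-level provider using the Azure VM's system-assigned managed identity.
# The account_id is a placeholder; use your own Databricks account ID.
provider "databricks" {
  alias      = "accounts"
  host       = "https://accounts.azuredatabricks.net"
  account_id = "00000000-0000-0000-0000-000000000000"
  auth_type  = "azure-msi"
}

# Account-level resources must reference the aliased provider explicitly.
resource "databricks_group" "example" {
  provider     = databricks.accounts
  display_name = "example-group"
}
```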
- 0 kudos
- 1225 Views
- 2 replies
- 0 kudos
Problem loading catalog data from a multi-node cluster after changing the VNet IP range in Azure Databricks
We've changed the address range for the VNet and subnet that the Azure Databricks workspace (Standard SKU) was using. After that, when we try to access the catalog data, we get a socket closed error. This error occurs only with a multi-node cluster; for single ...
Yes, it is mentioned that we cannot change the VNet. I've changed the address range within the same VNet, not the VNet itself. Is there any troubleshooting I can do to find this issue? The problem is, I don't want to recreate the workspace. It is a worst-case s...
- 0 kudos
- 5651 Views
- 2 replies
- 0 kudos
Enable automatic schema evolution for Delta Lake merge for an SQL warehouse
Hello! We tried to update our integration scripts and use SQL warehouses instead of general compute clusters to fetch and update data, but we faced a problem. We use automatic schema evolution when we merge tables, but with a SQL warehouse, when we try...
Why can we not enable autoMerge in a SQL warehouse when my tables are Delta tables?
- 0 kudos
- 15408 Views
- 4 replies
- 1 kudos
Resolved! databricks OAuth is not supported for this host
I'm trying to deploy using Databricks Asset Bundles via an Azure DevOps pipeline. I keep getting this error when trying to use OAuth: Error: default auth: oauth-m2m: oidc: databricks OAuth is not supported for this host. Config: host=https://<workspac...
Hi @bradleyjamrozik, thank you for posting your question. You will need to use the ARM_ environment variables to make it work, specifically ARM_CLIENT_ID, ARM_TENANT_ID, and ARM_CLIENT_SECRET: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth#environment-3 f...
- 1 kudos
- 2102 Views
- 1 reply
- 0 kudos
Terraform for Databricks
Hi all, I can't find guidance on how to create a Databricks access connector for connecting catalogs to external data locations using Terraform. Also, I want to create my catalogs, set up external locations, etc. using Terraform. Has anyone got a good r...
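A rough sketch of the usual chain (access connector → storage credential → external location → catalog), assuming the azurerm and databricks providers are already configured; every name, the resource group, region, and the abfss:// URL below are placeholders:

```hcl
# Managed identity that Unity Catalog uses to reach the storage account.
resource "azurerm_databricks_access_connector" "this" {
  name                = "example-access-connector"
  resource_group_name = "example-rg"
  location            = "westeurope"

  identity {
    type = "SystemAssigned"
  }
}

# Register the connector's identity as a Unity Catalog storage credential.
resource "databricks_storage_credential" "this" {
  name = "example-credential"
  azure_managed_identity {
    access_connector_id = azurerm_databricks_access_connector.this.id
  }
}

# External location pointing at an ADLS Gen2 container.
resource "databricks_external_location" "this" {
  name            = "example-location"
  url             = "abfss://container@examplestorage.dfs.core.windows.net/"
  credential_name = databricks_storage_credential.this.name
}

# Catalog whose managed storage lives in that external location.
resource "databricks_catalog" "this" {
  name         = "example_catalog"
  storage_root = databricks_external_location.this.url
  comment      = "Managed by Terraform"
}
```

Note that the connector's managed identity still needs a data-plane role (typically Storage Blob Data Contributor) on the storage account, which can also be granted with Terraform.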
- 1687 Views
- 1 reply
- 1 kudos
keyrings.google-artifactregistry-auth fails to install backend on runtimes > 10.4
We run Databricks on GCP. We store our private Python packages in Google Artifact Registry. When we need to install the private packages, we use a global init script to install `keyring` and `keyrings.google-artifactregistry-auth`. Then we `pip inst...
- 1023 Views
- 1 reply
- 0 kudos
SQL Warehouse tag list from system tables?
Hello, is there a way to get the tags of SQL Warehouse clusters from system tables, like you can with system.compute.clusters? Thanks,
Answering my own question: system.billing.usage.custom_tags['cluster-owner']. @databricks: I don't really understand the logic here.
- 0 kudos
- 5101 Views
- 2 replies
- 0 kudos
Resolved! Databricks SSO Azure AD
Hello, I'm trying to test SSO with Azure AD. The SSO test passes on Databricks and I can connect to Databricks using SSO. When I try to test with Postman to obtain a token, I get the following error message: {"error_description":"OAuth application with ...
Hello, the issue was with Postman. In Postman you don't have to give the client ID from your IdP but the client ID from the Databricks "App connections". It is working well now. Thank you.
- 0 kudos
- 3810 Views
- 1 reply
- 0 kudos
Databricks on-premises (GDCE)
Hello, any plans for supporting Databricks on GDCE or another private cloud-native stack/HW on premises? Regards, Patrick
- 1186 Views
- 2 replies
- 1 kudos
Interconnected notebooks
How do I use interconnected notebooks, available in Databricks?
Do you mean running one notebook from another and using variables and functions defined in the other one? If that's what you're seeking, try using the magic command %run + notebook path. You can find some documentation about it here: https://docs.da...
- 1 kudos
- 1912 Views
- 2 replies
- 0 kudos
Asset Bundles -> creation of Azure DevOps pipeline
If you choose mlops-stacks in Asset Bundles, it will create many nice things for you out of the box, including a pipeline to deploy to dev/stage/prod. #databricks
Thank you for sharing this @Hubert-Dudek
- 0 kudos
- 1465 Views
- 1 reply
- 0 kudos
Databricks on Azure JDBC
Hello Databricks team, I have one question regarding Databricks on Azure configuration using JDBC [Simba][SparkJDBCDriver](700100). I am getting the below error message: java.sql.SQLException: [Simba][SparkJDBCDriver](700100) Connection timeout expired. De...
Check your network connection. Try `%sh nc -zv {hostname} {port}`
- 0 kudos
- 1894 Views
- 0 replies
- 0 kudos
Unity Catalog - Created UC and linked it to my DEV storage account for the entire org
Hello everyone, I was the lead in a data platform modernization project. This was my first time administering Databricks and I got myself into quite a situation. Essentially I made the mistake of linking our enterprise-wide Unity Catalog to our DEV Azu...
- 2629 Views
- 1 reply
- 0 kudos
Resolved! Creating Databricks workspace
Hi all, I am creating a Databricks workspace that has its own virtual network. When I create it I get this error: 'The workspace 'xxxxxx' is in a failed state and cannot be launched. Please review error details in the activity log tab and retry your ope...
Hi all, I resolved the issue. My subnets did not have the correct delegations. Thanks, Sean
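For anyone hitting the same failure, a sketch of the delegation a VNet-injected workspace's subnets need, assuming the azurerm provider; the subnet, VNet, resource group, and address prefix are placeholders, and both the host (public) and container (private) subnets need the same block:

```hcl
resource "azurerm_subnet" "databricks_host" {
  name                 = "databricks-host-subnet"
  resource_group_name  = "example-rg"
  virtual_network_name = "example-vnet"
  address_prefixes     = ["10.0.1.0/24"]

  # Databricks VNet injection requires the subnet to be delegated to the service.
  delegation {
    name = "databricks-delegation"
    service_delegation {
      name = "Microsoft.Databricks/workspaces"
      actions = [
        "Microsoft.Network/virtualNetworks/subnets/join/action",
        "Microsoft.Network/virtualNetworks/subnets/prepareNetworkPolicies/action",
        "Microsoft.Network/virtualNetworks/subnets/unprepareNetworkPolicies/action",
      ]
    }
  }
}
```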
- 0 kudos
- 6477 Views
- 3 replies
- 0 kudos
New admin question: How do you enable R on an existing cluster?
Hello Community. I have a user trying to use R who receives the error message illustrated in the attachment. I can't seem to find correct documentation on enabling R on an existing cluster. Would anyone be able to point me in the right direction? Than...
Label | Count
---|---
Access control | 1
Apache spark | 1
AWS | 5
Azure | 7
Azure databricks | 5
Billing | 2
Cluster | 1
Compliance | 1
Data Ingestion & connectivity | 5
Databricks Runtime | 1
Databricks SQL | 2
DBFS | 1
DBR | 3
Dbt | 1
Delta | 4
Delta Sharing | 1
DLT Pipeline | 1
GA | 1
Gdpr | 1
LTS | 2
Model Serving | 1
Partner | 85
Public Preview | 1
Rest API | 1
Service Principals | 1
Unity Catalog | 1
Workspace | 2