- 3542 Views
- 5 replies
- 0 kudos
Resolved! Databricks Serverless Job : sudden random failure
Hi, I've been running a job on Azure Databricks serverless, which just does some batch data processing every 4 hours. This job, deployed with bundles, has been running fine for weeks, and all of a sudden, yesterday, it started failing with an error th...
Reply: Hey @thibault, glad to hear it is working again. I don't see any specific mention of a bug internally that would be related to this, but it is likely that it was due to a change in the underlying runtime for serverless compute. This may be one of th...
- 5296 Views
- 2 replies
- 1 kudos
azure databricks automatic user provisioning via terraform
Hi community, Azure Databricks recently announced a new user management feature (now in public preview) called automatic identity management, which allows Azure Databricks to access Azure Entra ID directly and grant users and groups permissions and ...
Reply: Hi, I think the automatic identity management feature provisions Azure Entra ID users and groups directly into Databricks. However, Terraform's databricks_group and databricks_group_member resources are designed for managing groups and memberships w...
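One hedged pattern for this situation, assuming the groups are already provisioned by automatic identity management: reference them with a Terraform data source instead of creating them with a resource, so Terraform grants permissions without fighting the sync. The group and cluster names below are hypothetical.

```hcl
# Look up a group that automatic identity management already synced
# from Entra ID ("analysts" is a hypothetical group name).
data "databricks_group" "analysts" {
  display_name = "analysts"
}

# Grant permissions to the synced group without managing its membership.
resource "databricks_permissions" "analysts_cluster_use" {
  cluster_id = databricks_cluster.shared.id # hypothetical cluster resource

  access_control {
    group_name       = data.databricks_group.analysts.display_name
    permission_level = "CAN_ATTACH_TO"
  }
}
```

This keeps membership in Entra ID's hands while Terraform owns only the grants.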
- 3058 Views
- 1 reply
- 0 kudos
Workflow not picking up correct host value (while working with the MLflow model registry URI)
Exception: mlflow.exceptions.MlflowException: An API request to https://canada.cloud.databricks.com/api/2.0/mlflow/model-versions/list-artifacts failed due to a timeout. The error message was: HTTPSConnectionPool(host='canada.cloud.databricks.com', p...
Reply: Hello @Dharma25! It looks like this post duplicates one you shared earlier. A response has already been provided in the original thread. I recommend continuing the discussion there to keep the conversation focused and organized.
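When a run reaches out to a stale host such as canada.cloud.databricks.com, the value usually comes from environment configuration on the job compute. As a rough illustrative sketch (the variable names and precedence below are assumptions for illustration, not the documented resolution logic of any client), you can check which host-related values the environment carries before MLflow does:

```python
import os

def resolve_host(env: dict) -> str:
    """Illustrative host resolution: prefer an explicit MLflow URI,
    then the Databricks host, then the ambient workspace context.
    Precedence and variable names are assumptions, not documented
    client behaviour."""
    for var in ("MLFLOW_TRACKING_URI", "DATABRICKS_HOST"):
        value = env.get(var)
        if value:
            return value
    return "databricks"  # fall back to the attached workspace

# Inspect the real job environment for stale values:
for var in ("MLFLOW_TRACKING_URI", "DATABRICKS_HOST"):
    print(var, "=", os.environ.get(var))
```

If either variable points at the wrong workspace, unsetting or correcting it in the job configuration is the first thing to try.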
- 2841 Views
- 4 replies
- 2 kudos
Azure databricks workspace Power BI connector type
In Power BI, there's an "Azure Databricks workspace" connector which, unlike the "Azure Databricks" connector, allows you to connect using a service principal defined in Azure Entra ID (rather than within Databricks). While I can create this connector...
Reply: The "Azure Databricks Workspace" connector in Power BI allows authentication using a service principal from Azure Entra ID, providing more secure and scalable access management compared to the traditional personal token-based "Azure Databricks" conne...
- 8725 Views
- 4 replies
- 2 kudos
Internal error. Attach your notebook to a different compute or restart the current compute. java.lan
Internal error. Attach your notebook to a different compute or restart the current compute.java.lang.RuntimeException: abort: DriverClient destroyed at com.databricks.backend.daemon.driver.DriverClient.$anonfun$poll$3(DriverClient.scala:577) at scala...
Reply: The error is caused by an overlap of connectors or instances. If you see multiple clusters with the same name, it is caused by running notebook_1 under a cluster attached to it and re-running notebook_2 wit...
- 3571 Views
- 3 replies
- 3 kudos
Resolved! Why catalog API does not include the catalog ID in the response?
Hi! I'm using Terraform (TF) to manage the Databricks resources. I would like to rename the Unity catalog using TF, but I could not. (Similar issues have been reported: https://github.com/databricks/terraform-provider-databricks/issues?q=is%3A...
Reply: I will pass your request along; however, there is nothing I can do to escalate the issue. I can only make the request. Cheers, Lou.
- 1223 Views
- 2 replies
- 0 kudos
Network Connectivity Configurations - assign to workspace
Hi, following these API calls, Databricks has not actually applied the NCC to the workspace, despite returning a success status. All values are correct (NCC ID, workspace ID). What could be the issue? # 1. Get list of NCCs to confirm ID and region - th...
Reply: Thanks for the reply. The region is the same and the workspace is running.
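For reference, a minimal sketch of the account-level call that binds an NCC to a workspace. The endpoint shape and payload field below are assumptions based on the Azure Databricks account API, and the IDs are placeholders; the helper only assembles the request, so each piece can be inspected before anything is sent:

```python
import json

def build_ncc_binding_request(account_id: str, workspace_id: str, ncc_id: str):
    """Assemble (but do not send) the PATCH that attaches an NCC to a
    workspace. Endpoint path and payload key are assumptions to verify
    against the account API docs."""
    url = (
        "https://accounts.azuredatabricks.net/api/2.0/accounts/"
        f"{account_id}/workspaces/{workspace_id}"
    )
    payload = {"network_connectivity_config_id": ncc_id}
    return "PATCH", url, json.dumps(payload)

# Placeholder IDs; substitute your real account, workspace, and NCC IDs.
method, url, body = build_ncc_binding_request(
    "00000000-0000-0000-0000-000000000000", "1234567890", "ncc-example-id"
)
print(method, url, body)
```

Comparing the assembled URL and body against what the failing script actually sent is a quick way to rule out a malformed request.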
- 5446 Views
- 6 replies
- 0 kudos
System.billing.usage table - cannot match job_id from databricks api/UI
Hello, I have multiple continuous jobs that have been running for many days (Kafka streams); however, querying the system.billing.usage table by job_id (from the UI or the Databricks Jobs API) does not return any results for those jobs. 1. What is the reason behind that? 2. If I ...
Reply: What is the update on this? I am also unable to see my continuous jobs' usage in the system.billing.usage table, although the run and task information is available in system.lakeflow.job_run_timeline and system.lakeflow.job_task_run_timeline. Pl...
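Where billing rows do carry a job tag, they can be correlated with the run timeline. The sketch below assumes the usage_metadata.job_id field on system.billing.usage and the column names on system.lakeflow.job_run_timeline; treat it as a starting point to adapt, not a verified query:

```sql
-- Sketch: correlate billing usage with job runs via the job_id tag.
SELECT
  u.usage_date,
  u.usage_metadata.job_id,
  SUM(u.usage_quantity)    AS dbus,
  COUNT(DISTINCT r.run_id) AS runs
FROM system.billing.usage AS u
JOIN system.lakeflow.job_run_timeline AS r
  ON u.usage_metadata.job_id = r.job_id
WHERE u.usage_metadata.job_id IS NOT NULL
GROUP BY u.usage_date, u.usage_metadata.job_id;
```

If continuous jobs still return no rows, the gap is in the usage_metadata tagging of the billing records themselves, not in the join.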
- 2863 Views
- 4 replies
- 0 kudos
Unable to Change Pipeline Ownership to Service Principal Despite Admin Permissions
Hello Databricks Community, I am facing an issue with changing the ownership of an ETL pipeline to a service principal. Despite being the Databricks administrator with workspace and metastore admin permissions, I receive the following error: "Only adm...
Reply: Are you saying that you can't create a PAT token in the Databricks workspace, or that you can create a PAT token but don't have a secure way of using it with the API request? Please explain a bit more. Thanks, Lou.
- 944 Views
- 0 replies
- 2 kudos
Databricks Serverless: Real World Implementations and Experiences
How have you leveraged Databricks Serverless to streamline your data workflows? We invite you to share your real-world experiences: which workloads have benefited most, what challenges you've encountered, and any best practices or tips you have for ma...
- 5082 Views
- 2 replies
- 0 kudos
Resolved! Tableau removal from catalog explorer
Is it possible to remove Tableau as an option from the Catalog Explorer (BI tools) when you want to open or explore the data?
Reply: It is currently not possible to remove the Tableau option from the "BI tools" integration menu in the Databricks Catalog Explorer. Hope this helps. Cheers, Lou.
- 6258 Views
- 3 replies
- 0 kudos
Can we turn off Playground & Marketplace for some users?
Hi everyone, hope all is well with you! I'm reaching out to the community for some advice on customizing our workspace settings. Specifically, I have two things I'm trying to figure out. Disabling the "Playground" option: is there a way to turn off the ...
Reply: How can we turn on the Playground option? I cannot find it.
- 785 Views
- 1 reply
- 1 kudos
Task level Definition
Hello, according to the databricks_job resource Terraform documentation, the description argument is supported at the task level. However, when I update the Terraform configuration to include a task description, the terraform plan reflects the change...
Reply: Hi @ChinuLee, the task-level description appears to be hidden in the UI, both during manual creation and when viewing tasks after creation. However, it is visible when you choose the "View as Code" option. So it is supported, just hidd...
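A minimal sketch of the pattern under discussion (job name and notebook path are hypothetical): the task-level description is accepted and applied, even though the jobs UI does not surface it.

```hcl
resource "databricks_job" "example" {
  name = "example-job"

  task {
    task_key    = "main"
    description = "Loads the daily batch" # accepted at the task level; hidden in the UI

    notebook_task {
      notebook_path = "/Workspace/jobs/main" # hypothetical path
    }
  }
}
```

Verifying via "View as Code" (or the Jobs API) rather than the task editor avoids the false impression that the plan change was dropped.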
- 7948 Views
- 8 replies
- 0 kudos
Resolved! Cannot use Terraform to create Databricks Storage Credential
Hi all, when I use Terraform in an Azure DevOps pipeline to create a Databricks Storage Credential, I get the following error. Has anybody met the same error before? Or is there any idea how to debug it? Error: cannot create storage credential: failed d...
Reply: How exactly do you need to configure auth_type in this case? I tried different options, but nothing seems to work. I would also like to use the Service Connection from an Azure DevOps pipeline to deploy Databricks via TerraformTaskV4@4.
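One hedged configuration sketch for service-principal authentication from a pipeline: pin auth_type so the provider does not auto-detect a method, and supply the Entra ID credentials through the standard ARM_* environment variables. The host value is a placeholder, and whether TerraformTaskV4@4's service connection exports those variables depends on the task's configuration.

```hcl
provider "databricks" {
  host      = "https://adb-1234567890123456.7.azuredatabricks.net" # placeholder workspace URL
  auth_type = "azure-client-secret" # force one auth method instead of auto-detection

  # Credentials are read from the environment:
  #   ARM_TENANT_ID, ARM_CLIENT_ID, ARM_CLIENT_SECRET
}
```

Pinning auth_type also makes the error messages more specific, since the provider reports why that one method failed instead of silently trying the next.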
- 5746 Views
- 1 reply
- 3 kudos
Creating Group in Terraform using external_id
The documentation here doesn't give much information about how to use `external_id` when creating a new group. If I reference the object_id for an Azure AD Group, the databricks group gets created but the members from the AD group are not added, nor ...
Reply: Greetings from the future! Now it is clear that external_id, which IS Azure's ObjectID, comes from the internal sync mechanism that can be enabled in your account under previews. I was able to reference my security group in Terraform and create that ...
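That reply boils down to a small pattern (the group name below is hypothetical): feed the Entra ID object ID into external_id and let the account-level sync handle membership, rather than adding members with Terraform.

```hcl
# Look up the Entra ID security group ("data-engineers" is hypothetical).
data "azuread_group" "engineers" {
  display_name = "data-engineers"
}

# Create the Databricks group linked to it; membership is populated by
# the account-level sync, not by Terraform.
resource "databricks_group" "engineers" {
  display_name = data.azuread_group.engineers.display_name
  external_id  = data.azuread_group.engineers.object_id
}
```

Without the sync mechanism enabled, setting external_id alone does not pull members in, which matches the behaviour the original question describes.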
Labels:
- Access control (1)
- Apache spark (1)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (75)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)