- 2232 Views
- 3 replies
- 3 kudos
Resolved! Why catalog API does not include the catalog ID in the response?
Hi! I'm using Terraform (TF) to manage the Databricks resources. I would like to rename the Unity Catalog using TF, but I could not. (Similar issues have been reported for this: https://github.com/databricks/terraform-provider-databricks/issues?q=is%3A...
I will pass your request along; however, there is nothing I can do to escalate the issue. I can only make the request. Cheers, Lou.
- 3 kudos
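The thread itself does not include a working rename, so here is a minimal, hypothetical sketch of attempting the rename directly through the Unity Catalog REST API instead of Terraform. It assumes DATABRICKS_HOST and DATABRICKS_TOKEN environment variables are set, and the catalog names are placeholders.

```python
# Hypothetical sketch: rename a Unity Catalog catalog via the REST API,
# since a name change on the Terraform resource may be treated as a replace.
# DATABRICKS_HOST / DATABRICKS_TOKEN and the catalog names are assumptions.
import os
import requests

host = os.environ["DATABRICKS_HOST"].rstrip("/")
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

old_name, new_name = "old_catalog", "new_catalog"  # placeholders

# PATCH the catalog with new_name; the response is the updated catalog object.
resp = requests.patch(
    f"{host}/api/2.1/unity-catalog/catalogs/{old_name}",
    headers=headers,
    json={"new_name": new_name},
)
resp.raise_for_status()
print(resp.json().get("name"))  # should print the new catalog name
```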
- 720 Views
- 2 replies
- 0 kudos
Network Connectivity Configurations - assign to workspace
Hi, following these API calls, Databricks has not actually applied the NCC to the workspace, despite returning a success status. All values are correct (NCC ID, workspace ID). What could be the issue? # 1. Get list of NCCs to confirm ID and region - th...
Thanks for the reply. The region is the same and the workspace is running.
- 0 kudos
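For reference, a minimal sketch of the binding and verification flow being discussed, using the account-level workspaces API. The account host, account ID, workspace ID, and NCC ID are placeholders, and it assumes account-admin credentials in environment variables.

```python
# Hypothetical sketch: attach an NCC to a workspace through the account API,
# then read the workspace back to confirm the binding actually stuck.
# Host, IDs, and credential env vars are assumptions for illustration.
import os
import requests

account_host = os.environ.get("DATABRICKS_ACCOUNT_HOST", "https://accounts.azuredatabricks.net")
account_id = os.environ["DATABRICKS_ACCOUNT_ID"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_ACCOUNT_TOKEN']}"}

workspace_id = "1234567890123456"                       # placeholder
ncc_id = "11111111-2222-3333-4444-555555555555"          # placeholder NCC ID

base = f"{account_host}/api/2.0/accounts/{account_id}/workspaces/{workspace_id}"

# 1. Request the binding.
requests.patch(base, headers=headers,
               json={"network_connectivity_config_id": ncc_id}).raise_for_status()

# 2. Read the workspace back; if the field is missing here, the binding was not applied.
ws = requests.get(base, headers=headers).json()
print(ws.get("network_connectivity_config_id"))
```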
- 4597 Views
- 6 replies
- 0 kudos
System.billing.usage table - cannot match job_id from databricks api/UI
Hello, I have multiple continuous jobs that have been running for many days (Kafka streams); however, querying the system.billing.usage table by job_id from the UI or the Databricks Jobs API does not return any results for those jobs. 1. What is the reason behind that? 2. If I ...
What is the update on this? I am also unable to see my continuous jobs' usage in the system.billing.usage table, although the run and task information is available in system.lakeflow.job_run_timeline and system.lakeflow.job_task_run_timeline. Pl...
- 0 kudos
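A minimal sketch of the lookup being described: filter system.billing.usage on the job_id nested inside the usage_metadata struct. It assumes it runs in a Databricks notebook where `spark` is already defined; the job_id value is a placeholder.

```python
# Minimal sketch: query billing usage for one job via usage_metadata.job_id.
# Assumes a Databricks notebook context (`spark` predefined); job_id is a placeholder.
job_id = "123456789"

usage = spark.sql(f"""
    SELECT usage_date,
           usage_metadata.job_id     AS job_id,
           usage_metadata.job_run_id AS job_run_id,
           sku_name,
           usage_quantity
    FROM system.billing.usage
    WHERE usage_metadata.job_id = '{job_id}'
    ORDER BY usage_date DESC
""")
usage.show(truncate=False)
```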
- 511 Views
- 0 replies
- 0 kudos
Issues when configuring keystore spark config for pyspark to mongo atlas X.509 connectivity
Steps followed - Step 1: Add an init script that copies the keystore file to the tmp location. Step 2: Add the Spark config in the cluster's advanced options - spark.driver.extraJavaOptions -Djavax.net.ssl.keyStore=/tmp/keystore.jks -Djavax.net.ssl.keyStorePa...
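Since the thread has no replies, here is a hedged debugging sketch for the setup described above: from a notebook attached to the configured cluster, confirm that the init script actually copied the keystore and that the driver picked up the JVM options. The path and config key are the ones mentioned in the post; running in a notebook (where `spark` exists) is an assumption.

```python
# Hypothetical verification of the init-script + spark-config setup above.
import os

# Did the init script copy the keystore to the expected location on the driver?
print(os.path.exists("/tmp/keystore.jks"))

# Were the -Djavax.net.ssl.* flags applied to the driver JVM options?
print(spark.conf.get("spark.driver.extraJavaOptions", "<unset>"))
```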
- 1622 Views
- 4 replies
- 0 kudos
Unable to Change Pipeline Ownership to Service Principal Despite Admin Permissions
Hello Databricks Community, I am facing an issue with changing the ownership of an ETL pipeline to a service principal. Despite being the Databricks administrator with workspace and metastore admin permissions, I receive the following error: "Only adm...
Are you saying that you can't create a PAT token in the Databricks Workspace, or you can create a PAT token but don't have a secure way of using it with the API request? Please explain a bit more. Thanks, Lou.
- 0 kudos
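For context, a hypothetical sketch of the ownership transfer being attempted, done through the pipeline permissions API by granting IS_OWNER to the service principal. The pipeline ID and the service principal's application ID are placeholders, and it assumes DATABRICKS_HOST and DATABRICKS_TOKEN belong to a user allowed to change the owner; the error body in the response usually names the missing permission.

```python
# Hypothetical sketch: set IS_OWNER for a service principal on a pipeline.
# Pipeline ID, SP application ID, and env vars are assumptions for illustration.
import os
import requests

host = os.environ["DATABRICKS_HOST"].rstrip("/")
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

pipeline_id = "00000000-0000-0000-0000-000000000000"        # placeholder
sp_application_id = "11111111-1111-1111-1111-111111111111"  # placeholder

resp = requests.patch(
    f"{host}/api/2.0/permissions/pipelines/{pipeline_id}",
    headers=headers,
    json={"access_control_list": [
        {"service_principal_name": sp_application_id, "permission_level": "IS_OWNER"}
    ]},
)
print(resp.status_code, resp.text)  # a 4xx body here usually explains which permission is missing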
- 505 Views
- 0 replies
- 0 kudos
Databricks Serverless: Real World Implementations and Experiences
How have you leveraged Databricks Serverless to streamline your data workflows? We invite you to share your real-world experiences: which workloads have benefited most, what challenges you’ve encountered, and any best practices or tips you have for ma...
- 4870 Views
- 2 replies
- 0 kudos
Resolved! Tableau removal from catalog explorer
Is it possible to remove Tableau as an option from the Catalog Explorer (BI tools) when you want to open or explore the data?
It is currently not possible to remove the Tableau option from the "BI tools" integration menu in the Databricks Catalog Explorer. Hope this helps. Cheers, Lou.
- 0 kudos
- 5427 Views
- 3 replies
- 0 kudos
Can we turn off Playground & Marketplace for some users?
Hi everyone, hope all is well with you! I'm reaching out to the community for some advice on customizing our workspace settings. Specifically, I have two things I'm trying to figure out: Disabling the "Playground" option: Is there a way to turn off the ...
How can we turn on the Playground option? I cannot find it.
- 0 kudos
- 561 Views
- 1 reply
- 1 kudos
Task level Definition
Hello, according to the Terraform documentation for the databricks_job resource, the description argument is supported at the task level. However, when I update the Terraform configuration to include a task description, the terraform plan reflects the change...
Hi @ChinuLee, the task-level description appears to be hidden in the UI, both during manual creation and when viewing tasks after creation. However, it is visible when you choose the "View as Code" option. So it is supported, just hidd...
- 1 kudos
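A small follow-up sketch to the reply above: fetch the job spec through the Jobs API and print each task's description, to check that the value Terraform set is actually stored even though the UI hides it. The job ID is a placeholder, and DATABRICKS_HOST/DATABRICKS_TOKEN are assumed to be set.

```python
# Hypothetical sketch: confirm task-level descriptions exist in the stored job spec.
import os
import requests

host = os.environ["DATABRICKS_HOST"].rstrip("/")
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

job = requests.get(f"{host}/api/2.1/jobs/get",
                   headers=headers, params={"job_id": 123456}).json()  # placeholder job_id

for task in job.get("settings", {}).get("tasks", []):
    print(task["task_key"], "->", task.get("description", "<no description>"))
```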
- 4906 Views
- 8 replies
- 0 kudos
Resolved! Cannot use Terraform to create Databricks Storage Credential
Hi all, when I use Terraform in an Azure DevOps pipeline to create a Databricks storage credential, I get the following error. Has anybody seen the same error before? Or does anyone have an idea how to debug it? Error: cannot create storage credential: failed d...
How exactly do you need to configure auth_type in this case? I tried different options, but nothing seems to work. I would also like to use the service connection from the Azure DevOps pipeline to deploy Databricks via TerraformTaskV4@4.
- 0 kudos
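One way to narrow this down before Terraform even runs is a pre-flight step in the pipeline that reports which auth-related environment variables are present, since auth_type "azure-client-secret" generally relies on the ARM_* trio. The variable names below reflect common provider/azurerm conventions and should be checked against the provider docs for your version.

```python
# Hypothetical pre-flight check for the DevOps pipeline: report which auth env vars
# the Databricks Terraform provider could pick up. Never print the secret values.
import os

candidates = [
    "ARM_CLIENT_ID", "ARM_CLIENT_SECRET", "ARM_TENANT_ID", "ARM_SUBSCRIPTION_ID",
    "DATABRICKS_HOST", "DATABRICKS_AZURE_RESOURCE_ID", "DATABRICKS_TOKEN",
]
for name in candidates:
    print(f"{name}: {'set' if os.environ.get(name) else 'MISSING'}")
```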
- 4771 Views
- 1 reply
- 3 kudos
Creating Group in Terraform using external_id
The documentation here doesn't give much information about how to use `external_id` when creating a new group. If I reference the object_id for an Azure AD group, the Databricks group gets created, but the members from the AD group are not added, nor ...
Greetings from the future! Now it is clear that external_id, which is Azure's ObjectID, comes from the internal sync mechanism that can be enabled in your account under Previews. I was able to reference my security group in Terraform and create that ...
- 3 kudos
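To illustrate the reply above, here is a hedged sketch that lists workspace groups through the SCIM API and prints displayName next to externalId (the Azure AD object ID populated by the account-level sync). DATABRICKS_HOST and DATABRICKS_TOKEN are assumed to be set; externalId may simply be absent for groups that were never synced.

```python
# Hypothetical sketch: inspect which groups carry an externalId from the sync.
import os
import requests

host = os.environ["DATABRICKS_HOST"].rstrip("/")
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

resp = requests.get(f"{host}/api/2.0/preview/scim/v2/Groups", headers=headers)
resp.raise_for_status()

for group in resp.json().get("Resources", []):
    print(group.get("displayName"), "->", group.get("externalId", "<no externalId>"))
```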
- 967 Views
- 2 replies
- 1 kudos
How to Optimize Delta Table Performance in Databricks?
I'm working with large Delta tables in Databricks and noticing slower performance during read operations. I've already enabled Z-ordering and auto-optimize, but it still feels sluggish at scale. Are there best practices or settings I should adjust fo...
Hi @gardenmap, could you share more details if possible? For example, here is what I've done in my case: for tables above 1 TB that can be segregated by date, we decided to partition by the date column; independent of whether it's partitioned or not, we decided to ma...
- 1 kudos
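A small sketch of the kind of housekeeping the reply describes, using placeholder table and column names and assuming a Databricks notebook where `spark` is defined: compact and Z-order the table, then inspect its file layout to see whether small files are the bottleneck.

```python
# Hypothetical maintenance sketch for a large Delta table; names are placeholders.
table = "main.analytics.big_events"  # placeholder three-level name

# Compact small files and co-locate rows on the columns you filter by most.
spark.sql(f"OPTIMIZE {table} ZORDER BY (event_date, customer_id)")  # placeholder columns

# Check the resulting file count and size to gauge whether compaction helped.
spark.sql(f"DESCRIBE DETAIL {table}").select("numFiles", "sizeInBytes").show()
```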
- 2398 Views
- 3 replies
- 6 kudos
Refresh permission on Lakeview Dashboard
Hi folks! I'm sure I'm not the only one, but our users have the tendency to click the big Refresh button on all dashboards every time they open them. Using resources efficiently is something I value deeply, so our team came up with a schedule policy - ...
Hey @leo-machado, yes, I did request it formally and shared this post with the team - heads up, once something has been prioritised it can still take between 6 and 18 months to build.
- 6 kudos
- 497 Views
- 1 reply
- 1 kudos
restrict workspace admin from creating service principal
Hello, I would like to restrict workspace admins from creating service principals and leave this privilege only to the account admin. Is this possible? I am aware of the RestrictWorkspaceAdmins command, but it does not meet my needs. Additionally, I h...
Hello @antonionuzzo! Based on the documentation and my understanding, there isn’t a built-in way to restrict the creation of service principals exclusively to account admins. And as you mentioned, the RestrictWorkspaceAdmins setting doesn’t cover thi...
- 1 kudos
- 931 Views
- 1 reply
- 0 kudos
How to enable / setup OAuth for DBT with Databricks
I tried to configure / set up dbt to authenticate with OAuth to Databricks, following the tutorials https://community.databricks.com/t5/technical-blog/using-dbt-core-with-oauth-on-azure-databricks/ba-p/46605 and https://docs.databricks.com/aws/en/partner...
Here are some things to consider when debugging and setting up OAuth authentication with dbt and Databricks. Steps to analyze and set up authentication: 1. Verify credentials and configuration: double-check that auth_type: oauth is correctly specified, as...
- 0 kudos
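Related to the verification steps above, a hedged sanity check: confirm the service principal's OAuth client credentials can obtain a workspace token at all before blaming the dbt profile. The host, client ID, and secret are placeholders read from environment variables; /oidc/v1/token is the workspace OAuth token endpoint used for machine-to-machine auth.

```python
# Hypothetical OAuth M2M sanity check; env var names and values are assumptions.
import os
import requests

host = os.environ["DATABRICKS_HOST"].rstrip("/")
client_id = os.environ["DATABRICKS_CLIENT_ID"]
client_secret = os.environ["DATABRICKS_CLIENT_SECRET"]

resp = requests.post(
    f"{host}/oidc/v1/token",
    auth=(client_id, client_secret),
    data={"grant_type": "client_credentials", "scope": "all-apis"},
)
# A 200 with an access token means the credentials work; otherwise the body explains why.
print(resp.status_code, resp.json().get("expires_in") if resp.ok else resp.text)
```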