- 4977 Views
- 6 replies
- 0 kudos
System.billing.usage table - cannot match job_id from databricks api/UI
Hello, I have multiple continuous jobs that have been running for many days (Kafka streams), however querying the System.billing.usage table by job_id from the UI or the Databricks Jobs API does not return any results for those jobs. 1. What is the reason behind that? 2. If I ...
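For reference, a minimal sketch of the kind of lookup being described, assuming the documented usage_metadata.job_id field in system.billing.usage; the job ID below is a placeholder, not one from the original post:

```python
# Minimal sketch: look up billing usage for a single job by its numeric job ID.
# `spark` is the ambient SparkSession in a Databricks notebook or job;
# "123456789" is a placeholder job_id taken from the Jobs UI / Jobs API.
from pyspark.sql import functions as F

usage = (
    spark.table("system.billing.usage")
    .where(F.col("usage_metadata.job_id") == "123456789")
    .groupBy("usage_date", "sku_name")
    .agg(F.sum("usage_quantity").alias("dbus"))
    .orderBy("usage_date")
)
usage.show(truncate=False)
```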
What is the update on this? I am also unable to see my continuous jobs' usage in the system.billing.usage table, although the run and task information is available in system.lakeflow.job_run_timeline and system.lakeflow.job_task_run_timeline. Pl...
- 0 kudos
- 654 Views
- 0 replies
- 0 kudos
Issues when configuring keystore spark config for pyspark to mongo atlas X.509 connectivity
Steps followed - Step 1: Add an init script that copies the keystore file to the /tmp location. Step 2: Add the Spark config in the cluster's advanced options - spark.driver.extraJavaOptions -Djavax.net.ssl.keyStore=/tmp/keystore.jks -Djavax.net.ssl.keyStorePa...
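Before digging into the MongoDB Atlas side, it can help to confirm that the driver JVM actually received those javax.net.ssl.* options. A minimal debugging sketch, assuming a single-user (non-shared) cluster where the internal py4j gateway is reachable, using the paths from the steps above:

```python
# Debugging sketch: check that the init script copied the keystore and that the
# driver JVM picked up the -Djavax.net.ssl.* flags from spark.driver.extraJavaOptions.
# Assumes a single-user / no-isolation cluster: spark._jvm is an internal handle
# that is not available on shared-access or serverless compute.
import os

print("keystore present on driver:", os.path.exists("/tmp/keystore.jks"))

jvm_system = spark._jvm.java.lang.System
print("javax.net.ssl.keyStore =", jvm_system.getProperty("javax.net.ssl.keyStore"))
print("keyStorePassword set:", jvm_system.getProperty("javax.net.ssl.keyStorePassword") is not None)
```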
- 2198 Views
- 4 replies
- 0 kudos
Unable to Change Pipeline Ownership to Service Principal Despite Admin Permissions
Hello Databricks Community, I am facing an issue with changing the ownership of an ETL pipeline to a service principal. Despite being the Databricks administrator with workspace and metastore admin permissions, I receive the following error: "Only adm...
Are you saying that you can't create a PAT token in the Databricks Workspace, or you can create a PAT token but don't have a secure way of using it with the API request? Please explain a bit more. Thanks, Lou.
- 0 kudos
- 711 Views
- 0 replies
- 0 kudos
Databricks Serverless: Real World Implementations and Experiences
How have you leveraged Databricks Serverless to streamline your data workflows? We invite you to share your real-world experiences: which workloads have benefited most, what challenges you’ve encountered, and any best practices or tips you have for ma...
- 4978 Views
- 2 replies
- 0 kudos
Resolved! Tableau removal from catalog explorer
Is it possible to remove Tableau as an option from the Catalog Explorer (BI tools) when you want to open or explore the data?
It is currently not possible to remove the Tableau option from the "BI tools" integration menu in the Databricks Catalog Explorer. Hope this helps. Cheers, Lou.
- 0 kudos
- 5783 Views
- 3 replies
- 0 kudos
Can we turn off Playground & Marketplace for some users?
Hi Everyone, Hope all is well with you! I'm reaching out to the community for some advice on customizing our workspace settings. Specifically, I have two things I'm trying to figure out: Disabling the "Playground" Option: Is there a way to turn off the ...
How can we turn on the Playground option? I cannot find it.
- 0 kudos
- 653 Views
- 1 reply
- 1 kudos
Task level Definition
Hello, According to the databricks_job resource Terraform documentation, the description argument is supported at the task level. However, when I update the Terraform configuration to include a task description, the terraform plan reflects the change...
Hi @ChinuLee, the task-level description appears to be hidden in the UI, both during manual creation and when viewing tasks after creation. However, it is visible when you choose the "View as Code" option, as shown below: So it is supported, just hidd...
- 1 kudos
- 7126 Views
- 8 replies
- 0 kudos
Resolved! Cannot use Terraform to create Databricks Storage Credential
Hi all, When I use Terraform in an Azure DevOps pipeline to create a Databricks Storage Credential, I got the following error. Has anybody met the same error before? Or is there any idea how to debug it? Error: cannot create storage credential: failed d...
How exactly do you need to configure auth_type in this case? I tried different options but nothing seems to work. I also would like to use the Service Connection from Azure DevOps Pipeline to deploy Databricks via TerraformTaskV4@4.
- 0 kudos
- 5205 Views
- 1 reply
- 3 kudos
Creating Group in Terraform using external_id
The documentation here doesn't give much information about how to use `external_id` when creating a new group. If I reference the object_id for an Azure AD Group, the databricks group gets created but the members from the AD group are not added, nor ...
Greetings from the future! Now it is clear that external_id, which IS Azure's ObjectID, comes from the internal sync mechanism, which can be enabled in your account under Previews: I was able to reference my security group in Terraform and create that ...
- 3 kudos
- 1280 Views
- 2 replies
- 1 kudos
How to Optimize Delta Table Performance in Databricks?
I'm working with large Delta tables in Databricks and noticing slower performance during read operations. I've already enabled Z-ordering and auto-optimize, but it still feels sluggish at scale. Are there best practices or settings I should adjust fo...
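For context, a routine maintenance pass on a large Delta table looks something like the sketch below; the table name and Z-order column are placeholders rather than names from the original post, and DESCRIBE DETAIL is just a quick way to check whether small files or skewed partitions explain the slow reads:

```python
# Maintenance sketch for a large Delta table. "main.sales.events" and
# "customer_id" are placeholder names, not taken from the original post.
spark.sql("OPTIMIZE main.sales.events ZORDER BY (customer_id)")

# Inspect the physical layout: a very high file count or uneven partitions
# usually explains sluggish reads better than missing table settings.
(
    spark.sql("DESCRIBE DETAIL main.sales.events")
    .select("numFiles", "sizeInBytes", "partitionColumns")
    .show(truncate=False)
)
```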
Hi @gardenmap, if possible can you share more detail? For example, here is what I've done in my case: for tables above 1 TB that can be segregated by date, we decided to partition by the date column; independent of whether it's partitioned or not, we decided to ma...
- 1 kudos
- 2894 Views
- 3 replies
- 6 kudos
Refresh permission on Lakeview Dashboard
Hi folks! I'm sure I'm not the only one, but our users have the tendency to click the big Refresh button on all dashboards every time they open them. Using resources efficiently is something I value deeply, so our team came up with a schedule policy - ...
Hey @leo-machado, yes I did request it formally and shared this post with the team - heads up, once something has been prioritised it can still take between 6 and 18 months to build.
- 6 kudos
- 592 Views
- 1 reply
- 1 kudos
restrict workspace admin from creating service principal
Hello, I would like to restrict workspace admins from creating service principals and leave this privilege only to the account admin. Is this possible? I am aware of the RestrictWorkspaceAdmins command, but it does not meet my needs. Additionally, I h...
Hello @antonionuzzo! Based on the documentation and my understanding, there isn’t a built-in way to restrict the creation of service principals exclusively to account admins. And as you mentioned, the RestrictWorkspaceAdmins setting doesn’t cover thi...
- 1 kudos
- 1153 Views
- 1 reply
- 0 kudos
How to enable / setup OAuth for DBT with Databricks
I tried to configure / set up dbt to authenticate to Databricks with OAuth, following the tutorials https://community.databricks.com/t5/technical-blog/using-dbt-core-with-oauth-on-azure-databricks/ba-p/46605 and https://docs.databricks.com/aws/en/partner...
Here are some things to consider: Debugging and Setting Up OAuth Authentication with dbt and Databricks. Steps to analyze and set up authentication: 1. Verify credentials and configuration: double-check that auth_type: oauth is correctly specified, as...
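One quick way to rule out the credentials themselves, outside dbt, is to call the workspace with the same service principal's OAuth (M2M) secret via the Databricks Python SDK. A minimal sketch, assuming the databricks-sdk package is installed; the host, client_id, and client_secret values are placeholders:

```python
# Sanity-check sketch: confirm the service principal's OAuth (M2M) credentials
# work outside dbt. Requires `pip install databricks-sdk`; the host, client_id
# and client_secret values are placeholders for your workspace and principal.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient(
    host="https://adb-1234567890123456.7.azuredatabricks.net",  # placeholder workspace URL
    client_id="<service-principal-application-id>",
    client_secret="<oauth-secret>",
)

# If this prints the service principal's name, the OAuth credentials are fine
# and the problem is more likely in the dbt profiles.yml configuration.
print(w.current_user.me().user_name)
```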
- 0 kudos
- 4035 Views
- 1 reply
- 1 kudos
Cannot remove users group "CAN_MANAGE" from /Shared
I have a Unity Catalog-enabled workspace and I have full privileges, including Account Admin. I would like to be able to remove the "CAN_MANAGE" privilege from the "users" group. According to the documentation, this should be possible. According to...
Hi, can you please share the entire stack trace for the "Cannot modify permissions of directory" error? Thanks!
- 1 kudos
- 6343 Views
- 11 replies
- 3 kudos
Resolved! Deploy Workflow only to specific target (Databricks Asset Bundles)
I am using Databricks Asset Bundles to deploy Databricks workflows to all of my target environments (dev, staging, prod). However, I have one specific workflow that is supposed to be deployed only to the dev target environment. How can I implement tha...
Hi, also curious about this update; this would be a really helpful feature for us since we have humongous job specifications that are specific to the dev and test environments. Adding these to the top-level databricks.yml file is really cluttering up ou...
- 3 kudos
Labels:
- Access control: 1
- Apache spark: 1
- Azure: 7
- Azure databricks: 5
- Billing: 2
- Cluster: 1
- Compliance: 1
- Data Ingestion & connectivity: 5
- Databricks Runtime: 1
- Databricks SQL: 2
- DBFS: 1
- Dbt: 1
- Delta Sharing: 1
- DLT Pipeline: 1
- GA: 1
- Gdpr: 1
- Github: 1
- Partner: 60
- Public Preview: 1
- Service Principals: 1
- Unity Catalog: 1
- Workspace: 2