- 5555 Views
- 1 replies
- 3 kudos
Creating Group in Terraform using external_id
The documentation here doesn't give much information about how to use `external_id` when creating a new group. If I reference the object_id for an Azure AD Group, the databricks group gets created but the members from the AD group are not added, nor ...
Greetings from the future! It is now clear that external_id, which is indeed Azure's ObjectID, comes from the internal sync mechanism that can be enabled in your account under Previews. I was able to reference my security group in Terraform and create that ...
- 1618 Views
- 2 replies
- 1 kudos
How to Optimize Delta Table Performance in Databricks?
I'm working with large Delta tables in Databricks and noticing slower performance during read operations. I've already enabled Z-ordering and auto-optimize, but it still feels sluggish at scale. Are there best practices or settings I should adjust fo...
Hi @gardenmap, can you give a bit more detail? For example, here's what I've done in my case: for tables above 1 TB that can be segregated by date, we decided to enable partitioning by the date column; whether partitioned or not, we decided to ma...
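The reply's approach (partition very large tables by date, Z-order the rest, then compact) can be sketched as a small helper that emits the matching Delta maintenance SQL. Table and column names here are hypothetical and the 1 TB threshold is illustrative, not a Databricks recommendation.

```python
# Hedged sketch: build the Delta maintenance SQL described in the reply —
# tables above ~1 TB are assumed to already be date-partitioned, so they
# get plain file compaction; smaller tables get OPTIMIZE ... ZORDER BY.
ONE_TB = 1024 ** 4

def maintenance_sql(table: str, size_bytes: int, zorder_cols: list[str]) -> list[str]:
    """Return OPTIMIZE/VACUUM statements for a Delta table."""
    stmts = []
    if size_bytes > ONE_TB:
        # Large, date-partitioned table: compact files without Z-ordering.
        stmts.append(f"OPTIMIZE {table}")
    elif zorder_cols:
        cols = ", ".join(zorder_cols)
        stmts.append(f"OPTIMIZE {table} ZORDER BY ({cols})")
    # Default 7-day retention VACUUM to drop stale files.
    stmts.append(f"VACUUM {table}")
    return stmts
```

For example, `maintenance_sql("sales.events", 2 * ONE_TB, ["user_id"])` skips Z-ordering and returns only the compaction and vacuum statements.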
- 3120 Views
- 3 replies
- 6 kudos
Refresh permission on Lakeview Dashboard
Hi folks! I'm sure I'm not the only one, but our users have the tendency to click the big Refresh button on all dashboards every time they open them. Using resources efficiently is something I value deeply, so our team came up with a schedule policy - ...
Hey @leo-machado, yes, I did request it formally and shared this post with the team - heads up, even once something has been prioritised it can still take 6-18 months to build.
- 650 Views
- 1 replies
- 1 kudos
restrict workspace admin from creating service principal
Hello, I would like to restrict workspace admins from creating service principals and leave this privilege only to the account admin. Is this possible? I am aware of the RestrictWorkspaceAdmins command, but it does not meet my needs. Additionally, I h...
Hello @antonionuzzo! Based on the documentation and my understanding, there isn’t a built-in way to restrict the creation of service principals exclusively to account admins. And as you mentioned, the RestrictWorkspaceAdmins setting doesn’t cover thi...
- 1512 Views
- 1 replies
- 0 kudos
How to enable / setup OAuth for DBT with Databricks
I tried to configure dbt to authenticate with OAuth to Databricks, following the tutorials https://community.databricks.com/t5/technical-blog/using-dbt-core-with-oauth-on-azure-databricks/ba-p/46605 and https://docs.databricks.com/aws/en/partner...
Here are some things to consider when debugging and setting up OAuth authentication with dbt and Databricks. 1. Verify credentials and configuration: double-check that auth_type: oauth is correctly specified, as...
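The reply's first checklist item (verify that `auth_type: oauth` is set) is easiest to see against the shape of a dbt-databricks profile. A minimal sketch, expressed as a Python dict so the required keys can be checked programmatically; host and http_path values are placeholders, and the exact field set should be confirmed against the dbt-databricks setup docs.

```python
# Hedged sketch of a profiles.yml output block for OAuth, as a dict.
# Values are placeholders; field names follow my reading of dbt-databricks docs.
profile = {
    "my_project": {
        "target": "dev",
        "outputs": {
            "dev": {
                "type": "databricks",
                "catalog": "main",
                "schema": "analytics",
                "host": "adb-1234567890123456.7.azuredatabricks.net",
                "http_path": "/sql/1.0/warehouses/abc123",
                "auth_type": "oauth",  # the setting the reply says to double-check
            }
        },
    }
}

def missing_oauth_keys(output: dict) -> list[str]:
    """Return any keys an OAuth-based dbt-databricks output still lacks."""
    required = ["type", "host", "http_path", "auth_type"]
    return [k for k in required if k not in output]
```

Running `missing_oauth_keys` over each output in the profile makes step 1 of the checklist mechanical rather than eyeball-driven.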
- 4179 Views
- 1 replies
- 1 kudos
Cannot remove users group "CAN_MANAGE" from /Shared
I have a Unity Catalog enabled workspace and I have full privileges including Account Admin. I would like to be able to remove the "CAN_MANAGE" privilege from the "users" group. According to the documentation, this should be possible. According to...
Hi, can you please share the entire stack trace behind the "Cannot modify permissions of directory" error? Thanks!
- 2354 Views
- 4 replies
- 0 kudos
Serverless python version
In the documentation about serverless compute release notes, it is stated that "Serverless compute for notebooks and jobs uses environment versions", and that "Serverless compute always runs using the most recently released version listed here.". At ...
It seems the configuration aligns with the doc https://docs.databricks.com/aws/en/release-notes/serverless/environment-version-two, which specifies that the Python version is 3.11.10.
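A quick way to confirm the reply from inside a serverless notebook is to compare the interpreter's version against the documented pair. A minimal sketch; the expected `(3, 11)` pair is taken from the environment-version-two release notes and should be adjusted as those docs change.

```python
import sys

def matches_expected(version_info, expected=(3, 11)) -> bool:
    """True when the interpreter's major.minor equals the documented pair."""
    return tuple(version_info[:2]) == expected

# In a serverless notebook you would run:
# matches_expected(sys.version_info)
```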
- 1073 Views
- 1 replies
- 1 kudos
Resolved! service principal control plane access management
hi, our account admin has created a service principal to automate job execution. however, our security team is concerned that, by design, anyone with the service principal credentials might access the control plane, where the service principal is def...
On the docs it states: "Service principals give automated tools and scripts API-only access to Databricks resources, providing greater security than using user accounts." https://docs.databricks.com/gcp/en/admin/users-groups/service-principals#what-is-...
- 858 Views
- 1 replies
- 0 kudos
Issue with Verification Code Input on Login
Hello, I hope you're well. I'd like to report a bug encountered when entering the verification code to log into the platform. When I type the code without Caps Lock enabled, the input field displays the characters in uppercase, but the code isn't acc...
Hello @lucasbergamo! Thank you for bringing this to our attention. I'll share this with the relevant team for further investigation. In the meantime, as a workaround you can continue using Caps Lock while entering the verification code to log in.
- 1913 Views
- 1 replies
- 0 kudos
Is it possible to expand/extend the subnet CIDR of an existing Azure Databricks workspace?
Is it possible to expand/extend the subnet CIDR of an existing Azure Databricks workspace without having to create a new one? Our workspace is currently maxed out.
Yes, it is possible to expand or extend the subnet CIDR of an existing Azure Databricks workspace without creating a new one, but this capability is specifically applicable if the workspace is deployed with VNet injection. For workspaces that use V...
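Before resizing the subnets, it is worth checking that the proposed CIDR is a strict supernet of the current one, so every existing address stays inside the new range. A minimal sketch using the standard library's `ipaddress` module; the CIDRs are examples.

```python
import ipaddress

def is_valid_expansion(current_cidr: str, proposed_cidr: str) -> bool:
    """True when the proposed range fully contains and enlarges the current one."""
    current = ipaddress.ip_network(current_cidr)
    proposed = ipaddress.ip_network(proposed_cidr)
    # subnet_of() is True when every address of `current` is inside `proposed`.
    return current.subnet_of(proposed) and proposed.num_addresses > current.num_addresses

print(is_valid_expansion("10.0.0.0/24", "10.0.0.0/22"))  # True
print(is_valid_expansion("10.0.0.0/24", "10.1.0.0/22"))  # False: disjoint range
```

This only validates the addressing; the actual resize still has to be done through the Azure-side procedure for VNet-injected workspaces.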
- 6229 Views
- 3 replies
- 1 kudos
Azure Databricks Status
Dear all, I wanted to check if anyone has implemented a solution for capturing information from the Databricks status page in real time, 24x7, and loading it into a log or table... https://learn.microsoft.com/en-us/azure/databricks/resources/status What is the be...
It seems that the webhook is the way! There is nothing about system status in the Databricks REST API, and nothing about system status in the System Tables schema.
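The webhook route boils down to flattening each incident notification into a row for a log table. A minimal sketch of that handler; the payload shape below is an assumption for illustration only, and the real schema should be taken from the status page's webhook documentation.

```python
import json
from datetime import datetime, timezone

def to_log_row(payload: str) -> dict:
    """Flatten an assumed incident-webhook payload into a loggable row."""
    event = json.loads(payload)
    incident = event.get("incident", {})  # assumed top-level key
    return {
        "received_at": datetime.now(timezone.utc).isoformat(),
        "incident_name": incident.get("name"),
        "status": incident.get("status"),
    }

row = to_log_row('{"incident": {"name": "SQL warehouse delays", "status": "investigating"}}')
```

In practice this function would sit behind a small HTTP endpoint registered as the webhook target, appending each row to the log table.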
- 902 Views
- 1 replies
- 2 kudos
for_each_task with pool clusters
I am trying to run a `for_each_task` across different inputs of length `N` and `concurrency` `M` where N >> M. To mitigate cluster setup time I want to use pool clusters. Now, when I set everything up, I notice that instead of `M` concurrent clusters...
Hi @david_btmpl When you set up a Databricks workflow using for_each_task with a cluster pool (instance_pool_id), Databricks will, by default, reuse the same cluster for all concurrent tasks in that job. So even if you’ve set a higher concurrency (li...
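The setup being discussed can be sketched as a Jobs API payload: a `for_each_task` fanning out over N inputs with concurrency M, whose inner task's `new_cluster` draws nodes from an instance pool. Field names follow my reading of the Jobs API and may need checking against the current reference; all IDs and paths are placeholders.

```python
# Hedged sketch of a for_each_task job definition backed by an instance pool.
job = {
    "name": "fan-out-with-pool",
    "tasks": [{
        "task_key": "for_each",
        "for_each_task": {
            "inputs": "[1, 2, 3, 4, 5, 6, 7, 8]",  # N inputs, JSON-encoded
            "concurrency": 2,                       # M concurrent iterations
            "task": {
                "task_key": "worker",
                "notebook_task": {"notebook_path": "/Workspace/jobs/worker"},
                "new_cluster": {
                    # Placeholder pool ID; nodes come from the warm pool.
                    "instance_pool_id": "pool-0123456789abcdef",
                    "num_workers": 2,
                    "spark_version": "15.4.x-scala2.12",
                },
            },
        },
    }],
}
```

Per the reply, even with this shape Databricks may reuse one cluster across concurrent iterations rather than spinning up M pool-backed clusters, which is the behavior the question is about.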
- 3771 Views
- 2 replies
- 3 kudos
Query has been timed out due to inactivity.
Hi, we're experiencing an issue with SQL Serverless Warehouse when running queries through the dbx-sql-connector in Python. The error we get is: "Query has been timed out due to inactivity." This happens intermittently, even for queries that should com...
Getting the same error while trying to run Tableau flow on Databricks. Is there a solution for this issue?
- 1651 Views
- 1 replies
- 2 kudos
Service Principal Authentication / Terraform
Hello Databricks Community, I'm encountering an issue when trying to apply my Terraform configuration to create a Databricks MWS network on GCP. The terraform apply command fails with the following error: Error: cannot create mws networks: failed duri...
Databricks account-level APIs can only be called by account owners and account admins, and can only be authenticated using Google-issued OIDC tokens. In Terraform 0.13 and later, data resources have the same dependency resolution behavior as defined fo...
- 5527 Views
- 7 replies
- 2 kudos
Exact cost for job execution calculation
Hi everybody, I want to calculate the exact cost of a single job execution. All examples I can find on the internet use the tables system.billing.usage and system.billing.list_prices. It makes sense to calculate the sum of DBUs consumed and multi...
And what about the costs for the disks of the VMs of the cluster?
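The point the thread converges on is that DBUs times list price covers only the Databricks charge; the VM and managed-disk costs of the cluster come from the cloud bill and must be added separately. A minimal sketch of that arithmetic; all rates below are made-up placeholders, not real prices.

```python
def job_run_cost(dbus: float, dbu_price: float,
                 vm_hours: float, vm_rate: float,
                 disk_gb_hours: float, disk_rate_per_gb_hour: float) -> float:
    """Total cost of one job run: Databricks DBU charge plus cloud VM/disk charge."""
    databricks_cost = dbus * dbu_price                      # from system.billing tables
    cloud_cost = vm_hours * vm_rate + disk_gb_hours * disk_rate_per_gb_hour
    return round(databricks_cost + cloud_cost, 2)

# e.g. 12 DBUs at $0.15, 3 VM-hours at $0.40, 3 hours of a 128 GB disk:
total = job_run_cost(12, 0.15, 3, 0.40, 3 * 128, 0.0002)
```

Joining the DBU side from system.billing.usage with the VM/disk side from the cloud provider's cost export is what gets the "exact" figure the question asks for.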
- Access control (1)
- Apache spark (1)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (70)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)