- 1283 Views
- 1 reply
- 0 kudos
How to enable / setup OAuth for DBT with Databricks
I tried to configure / set up dbt to authenticate with OAuth to Databricks, following the tutorials https://community.databricks.com/t5/technical-blog/using-dbt-core-with-oauth-on-azure-databricks/ba-p/46605 and https://docs.databricks.com/aws/en/partner...
Here are some things to consider when debugging and setting up OAuth authentication with dbt and Databricks. Steps to analyze and set up authentication: 1. Verify credentials and configuration: double-check that `auth_type: oauth` is correctly specified, as...
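A quick way to isolate credential problems before touching dbt is to test the OAuth machine-to-machine flow directly with the Databricks Python SDK. This is only a sketch, not from the thread; it assumes the `databricks-sdk` package and environment variables you set yourself:

```python
# Sanity check (a sketch, not from the thread): verify the service principal's OAuth
# M2M credentials work against the workspace before wiring them into dbt's profiles.yml.
# Host, client ID, and client secret come from environment variables you define yourself.
import os

from databricks.sdk import WorkspaceClient

w = WorkspaceClient(
    host=os.environ["DATABRICKS_HOST"],              # e.g. https://adb-....azuredatabricks.net
    client_id=os.environ["DATABRICKS_CLIENT_ID"],    # service principal application/client ID
    client_secret=os.environ["DATABRICKS_CLIENT_SECRET"],
)

# If OAuth succeeds, this prints the service principal's identity;
# if it fails here, dbt's `auth_type: oauth` profile will fail for the same reason.
print(w.current_user.me().user_name)
```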
- 4090 Views
- 1 reply
- 1 kudos
Cannot remove users group "CAN_MANAGE" from /Shared
I have a Unity Catalog enabled workspace and I have full privileges including Account Admin. I would like to be able to remove the "CAN_MANAGE" privilege from the "users" group. According to the documentation, this should be possible. According to...
Hi, can you please share the entire stack trace for the cause of the "Cannot modify permissions of directory" error? Thanks!
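While waiting on the stack trace, one way to narrow this down is to dump the directory's ACL through the Permissions API and see exactly what Databricks reports for the `users` group. A rough sketch, assuming the `databricks-sdk` package and default authentication; field access follows my reading of the SDK models:

```python
# Sketch (not from the thread): inspect the ACL on /Shared with the Databricks Python SDK
# before attempting to drop the "users" grant.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # assumes ambient auth, e.g. DATABRICKS_HOST / DATABRICKS_TOKEN

shared = w.workspace.get_status(path="/Shared")          # resolve the directory's object_id
perms = w.permissions.get(
    request_object_type="directories",
    request_object_id=str(shared.object_id),
)

# Print every principal and its permission levels on /Shared.
for acl in perms.access_control_list or []:
    principal = acl.group_name or acl.user_name or acl.service_principal_name
    levels = [p.permission_level for p in (acl.all_permissions or [])]
    print(principal, levels)
```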
- 6659 Views
- 11 replies
- 3 kudos
Resolved! Deploy Workflow only to specific target (Databricks Asset Bundles)
I am using Databricks Asset Bundles to deploy Databricks workflows to all of my target environments (dev, staging, prod). However, I have one specific workflow that is supposed to be deployed only to the dev target environment.How can I implement tha...
Hi, also curious about this update; this would be a really helpful feature for us, since we have humongous job specifications that are specific to dev and test environments. Adding these to the top-level databricks.yml file is really cluttering up ou...
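For what it's worth, asset bundles do allow a target mapping to carry its own `resources` block, which is one way to keep a dev-only job out of staging and prod. A rough sketch of that layout (bundle and job names invented), rendered via PyYAML rather than copied from a real config:

```python
# Sketch (names are made up): a dev-only job declared under the dev target's own
# `resources` block, instead of the bundle's top-level `resources`, so the other
# targets never deploy it.
import yaml  # pip install pyyaml

bundle = {
    "bundle": {"name": "my_bundle"},
    "resources": {
        "jobs": {
            "shared_job": {"name": "shared_job"}  # deployed to every target
        }
    },
    "targets": {
        "dev": {
            "resources": {
                "jobs": {
                    "dev_only_job": {"name": "dev_only_job"}  # only exists in dev
                }
            }
        },
        "staging": {},
        "prod": {},
    },
}

print(yaml.safe_dump(bundle, sort_keys=False))  # roughly what databricks.yml would contain
```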
- 2164 Views
- 4 replies
- 0 kudos
Serverless python version
In the documentation about serverless compute release notes, it is stated that "Serverless compute for notebooks and jobs uses environment versions", and that "Serverless compute always runs using the most recently released version listed here.". At ...
It seems the configuration aligns with the doc https://docs.databricks.com/aws/en/release-notes/serverless/environment-version-two, which specifies that the Python version is 3.11.10.
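If you want to confirm what the environment actually provides, a quick check from a serverless notebook or job task is enough (just a sketch, not from the thread):

```python
# Run in a serverless notebook or serverless job task to confirm the interpreter version.
import sys

print(sys.version)        # expected to start with "3.11" on environment version 2
print(sys.version_info)   # e.g. sys.version_info(major=3, minor=11, micro=10, ...)
```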
- 1016 Views
- 1 reply
- 1 kudos
Resolved! service principal control plane access management
Hi, our account admin has created a service principal to automate job execution. However, our security team is concerned that, by design, anyone with the service principal credentials might access the control plane, where the service principal is def...
The docs state: "Service principals give automated tools and scripts API-only access to Databricks resources, providing greater security than using user accounts." https://docs.databricks.com/gcp/en/admin/users-groups/service-principals#what-is-...
- 810 Views
- 1 reply
- 0 kudos
Issue with Verification Code Input on Login
Hello, I hope you're well. I'd like to report a bug encountered when entering the verification code to log into the platform. When I type the code without Caps Lock enabled, the input field displays the characters in uppercase, but the code isn't acc...
Hello @lucasbergamo! Thank you for bringing this to our attention. I'll share this with the relevant team for further investigation. In the meantime, as a workaround you can continue using Caps Lock while entering the verification code to log in.
- 1633 Views
- 1 reply
- 0 kudos
Is it possible to expand/extend the subnet CIDR of an existing Azure Databricks workspace?
Is it possible to expand/extend the subnet CIDR of an existing Azure Databricks workspace? Currently our workspace is maxed out. Is it possible to do this without having to create a new workspace?
Yes, it is possible to expand or extend the subnet CIDR of an existing Azure Databricks workspace without creating a new one, but this is only applicable if the workspace is deployed with VNet injection. For workspaces that use V...
- 5324 Views
- 3 replies
- 1 kudos
Azure Databricks Status
Dear all, I wanted to check if anyone has implemented a solution for capturing information from the Databricks status page in real time, 24x7, and loading that into a log or table... https://learn.microsoft.com/en-us/azure/databricks/resources/status What is the be...
It seems that the webhook is the way! There is nothing about system status in the Databricks REST API, and nothing about system status in the System Tables schema.
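If you go the webhook route, a minimal receiver is enough to start collecting notifications. This is only a sketch using the standard library; the payload shape is an assumption (status-page webhooks generally POST JSON), and the port, path, and output file are placeholders:

```python
# Minimal sketch of a status-page webhook receiver (standard library only).
# Check the real payload schema when you subscribe; this just logs whatever arrives.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class StatusWebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        try:
            event = json.loads(body or b"{}")
        except json.JSONDecodeError:
            event = {"raw": body.decode("utf-8", errors="replace")}

        # Append each notification to a local log; swap this for a write to a
        # Delta table or a log-analytics sink in a real setup.
        with open("databricks_status_events.jsonl", "a", encoding="utf-8") as f:
            f.write(json.dumps(event) + "\n")

        self.send_response(200)
        self.end_headers()


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), StatusWebhookHandler).serve_forever()
```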
- 796 Views
- 1 reply
- 2 kudos
for_each_task with pool clusters
I am trying to run a `for_each_task` across different inputs of length `N` and `concurrency` `M`, where N >> M. To mitigate cluster setup time I want to use pool clusters. Now, when I set everything up, I notice that instead of `M` concurrent clusters...
Hi @david_btmpl When you set up a Databricks workflow using for_each_task with a cluster pool (instance_pool_id), Databricks will, by default, reuse the same cluster for all concurrent tasks in that job. So even if you’ve set a higher concurrency (li...
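For reference, here is a sketch of a Jobs API 2.1 payload where the inner task of a `for_each_task` runs on a job cluster backed by an instance pool. Field names follow my reading of the Jobs API; the pool ID, notebook path, and host/token are placeholders. Because every iteration references the same `job_cluster_key`, this is exactly the "one shared cluster" behavior described above:

```python
# Sketch of a Jobs API 2.1 create payload: a for_each_task fanning out over inputs,
# with the iteration task pinned to a pool-backed job cluster. IDs are placeholders.
import os
import requests

job_spec = {
    "name": "fan_out_over_inputs",
    "job_clusters": [
        {
            "job_cluster_key": "pooled",
            "new_cluster": {
                "spark_version": "15.4.x-scala2.12",
                "instance_pool_id": "<your-pool-id>",
                "num_workers": 1,
            },
        }
    ],
    "tasks": [
        {
            "task_key": "for_each",
            "for_each_task": {
                "inputs": '["a", "b", "c", "d"]',   # JSON array of iteration inputs (N)
                "concurrency": 2,                    # M concurrent iterations
                "task": {
                    "task_key": "for_each_iteration",
                    "job_cluster_key": "pooled",     # shared key => iterations reuse one cluster
                    "notebook_task": {
                        "notebook_path": "/Workspace/Users/me@example.com/process_input",
                        "base_parameters": {"value": "{{input}}"},
                    },
                },
            },
        }
    ],
}

resp = requests.post(
    f"{os.environ['DATABRICKS_HOST']}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
    json=job_spec,
)
resp.raise_for_status()
print(resp.json())  # {"job_id": ...}
```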
- 3442 Views
- 2 replies
- 3 kudos
Query has been timed out due to inactivity.
Hi, we're experiencing an issue with SQL Serverless Warehouse when running queries through the dbx-sql-connector in Python. The error we get is: "Query has been timed out due to inactivity." This happens intermittently, even for queries that should com...
Getting the same error while trying to run Tableau flow on Databricks. Is there a solution for this issue?
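Not a fix from the thread, but while the root cause is investigated, one common mitigation for intermittent failures on idempotent reads is a simple retry wrapper around the databricks-sql-connector call. A sketch, with connection details taken from environment variables you define yourself:

```python
# Sketch of retrying an idempotent read with the databricks-sql-connector while the
# underlying "timed out due to inactivity" error is investigated.
import os
import time

from databricks import sql  # pip install databricks-sql-connector


def run_with_retry(query: str, attempts: int = 3, backoff_s: float = 5.0):
    last_err = None
    for attempt in range(1, attempts + 1):
        try:
            with sql.connect(
                server_hostname=os.environ["DATABRICKS_SERVER_HOSTNAME"],
                http_path=os.environ["DATABRICKS_HTTP_PATH"],
                access_token=os.environ["DATABRICKS_TOKEN"],
            ) as conn:
                with conn.cursor() as cursor:
                    cursor.execute(query)
                    return cursor.fetchall()
        except Exception as err:  # narrow to the connector's error types in real code
            last_err = err
            time.sleep(backoff_s * attempt)
    raise last_err


print(run_with_retry("SELECT current_timestamp()"))
```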
- 1388 Views
- 1 reply
- 2 kudos
Service Principal Authentication / Terraform
Hello Databricks Community, I'm encountering an issue when trying to apply my Terraform configuration to create a Databricks MWS network on GCP. The terraform apply command fails with the following error: Error: cannot create mws networks: failed duri...
Databricks account-level APIs can only be called by account owners and account admins, and can only be authenticated using Google-issued OIDC tokens. In Terraform 0.13 and later, data resources have the same dependency resolution behavior as defined fo...
- 5026 Views
- 7 replies
- 2 kudos
Exact cost for job execution calculation
Hi everybody, I want to calculate the exact cost of a single job execution. All the examples I can find on the internet use the tables system.billing.usage and system.billing.list_prices. It makes sense to calculate the sum of DBUs consumed and multi...
And what about the costs for the disks of the VMs of the cluster?
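For the DBU portion of the question, here is a sketch of the usage-times-list-price join for one job. Column names follow my understanding of the billing system tables, the job ID is a placeholder, and note it deliberately does not answer the VM/disk question above: those charges come from the cloud provider's bill, not system.billing.usage:

```python
# Sketch: DBU cost for a single job from the billing system tables (list prices only).
# Cloud VM and managed-disk charges are billed by the cloud provider and are NOT included.
dbu_cost_sql = """
SELECT
  u.usage_metadata.job_id                     AS job_id,
  u.sku_name,
  SUM(u.usage_quantity)                       AS dbus,
  SUM(u.usage_quantity * lp.pricing.default)  AS list_price_usd
FROM system.billing.usage AS u
JOIN system.billing.list_prices AS lp
  ON  u.sku_name = lp.sku_name
  AND u.cloud    = lp.cloud
  AND u.usage_start_time >= lp.price_start_time
  AND (lp.price_end_time IS NULL OR u.usage_start_time < lp.price_end_time)
WHERE u.usage_metadata.job_id = '<job-id>'
GROUP BY ALL
"""

# Run from a Databricks notebook (where `spark` is available), or via the SQL connector.
spark.sql(dbu_cost_sql).show(truncate=False)
```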
- 1506 Views
- 2 replies
- 2 kudos
Impossible to access Terraform created external location?!
Hi all, There seems to be an external location that nobody within the organization can actually see or manage, because it has been created with a Google service account in Terraform. Here is the problem: DESCRIBE EXTERNAL LOCATION `gcsbucketname...
I would agree that the metastore admin(s) should be able to see the external location. This issue can happen with terraform scripts if the script doesn't grant additional rights on the external location.
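If the creating principal (here, the Google service account) can still authenticate, it can hand out privileges or transfer ownership after the fact. A sketch per my reading of the databricks-sdk grants API; the location name and group are placeholders:

```python
# Sketch (placeholder names): grant privileges on, or transfer ownership of, an
# external location that only the creating service account can currently see.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import catalog

# Must authenticate as a principal that already has rights, e.g. the Terraform
# service account that created the location.
w = WorkspaceClient()

# Grant broad privileges to a human-managed group.
w.grants.update(
    securable_type=catalog.SecurableType.EXTERNAL_LOCATION,
    full_name="gcsbucketname_location",
    changes=[
        catalog.PermissionsChange(
            principal="data-platform-admins",
            add=[catalog.Privilege.ALL_PRIVILEGES],
        )
    ],
)

# Or transfer ownership away from the Terraform service account entirely.
w.external_locations.update(
    name="gcsbucketname_location",
    owner="data-platform-admins",
)
```

The cleaner long-term fix, as noted above, is to add the equivalent grant directly in the Terraform script that creates the location.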
- 1015 Views
- 1 reply
- 0 kudos
Unexpected Behavior with Azure Databricks and Entra ID SCIM Integration
Hi everyone, I'm currently running some tests for a company that uses Entra ID as the backbone of its authentication system. Every employee with a corporate email address is mapped within the organization's Entra ID. Our company's Azure Databricks is c...
Hello @antonionuzzo, This behavior is occurring because Azure Databricks allows workspace administrators to invite users from their organization's Entra ID directory into the Databricks workspace. This capability functions independently of whether th...
- 1618 Views
- 3 replies
- 1 kudos
Monitor workspace admin activities
Hello everyone, I am conducting tests on Databricks AWS and have noticed that in an organization with multiple workspaces, each with different workspace admins, a workspace admin can invite a user who is not mapped within their workspace but is alread...
You do have some control over what workspace admins can do. Databricks allows account admins to restrict workspace admin permissions by enabling the RestrictWorkspaceAdmins setting. Have a look here: https://docs.databricks.com/aws/en/admin/workspace...
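On the "monitor" part of the question, the audit system table can be queried to watch admin-driven user additions across workspaces. A sketch: columns follow my understanding of system.access.audit, and the action-name filter is an assumption to adjust after inspecting your own audit events:

```python
# Sketch: surface recent admin-driven user-management events from the audit system table.
admin_activity_sql = """
SELECT
  event_time,
  workspace_id,
  user_identity.email AS actor,
  service_name,
  action_name,
  request_params
FROM system.access.audit
WHERE event_date >= date_sub(current_date(), 7)
  AND service_name = 'accounts'
  AND action_name IN ('add', 'addPrincipalToGroup')  -- assumed names for user-invite events
ORDER BY event_time DESC
"""

# Run from a notebook on a Unity Catalog-enabled workspace with access to system tables.
spark.sql(admin_activity_sql).show(truncate=False)
```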
- Labels: Access control (1), Apache spark (1), Azure (7), Azure databricks (5), Billing (2), Cluster (1), Compliance (1), Data Ingestion & connectivity (5), Databricks Runtime (1), Databricks SQL (2), DBFS (1), Dbt (1), Delta Sharing (1), DLT Pipeline (1), GA (1), Gdpr (1), Github (1), Partner (63), Public Preview (1), Service Principals (1), Unity Catalog (1), Workspace (2)