- 1525 Views
- 2 replies
- 0 kudos
Okta SSO Unified login in GCP
Hi, there are versions of this question posted already, but they seem to refer to legacy features. Our organisation uses Google Workspace IdP provisioned via Okta as the first landing point, and all apps are secured behind this. We have purchased Databri...
Hello @dtb_usr, it is possible to use the Okta IdP to log into Databricks on GCP; please refer to: https://docs.gcp.databricks.com/en/admin/users-groups/scim/okta.html
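Once SCIM provisioning is set up per that guide, you can verify that Okta-provisioned users actually landed in the workspace via the SCIM API. A minimal sketch, assuming the host and a personal access token live in the hypothetical env vars DATABRICKS_HOST and DATABRICKS_TOKEN:

```python
# A sketch: list SCIM-provisioned users to confirm the Okta sync worked.
# DATABRICKS_HOST / DATABRICKS_TOKEN are assumed placeholders, not values
# from this thread.
import os
import requests

resp = requests.get(
    f"{os.environ['DATABRICKS_HOST']}/api/2.0/preview/scim/v2/Users",
    headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
)
resp.raise_for_status()
for user in resp.json().get("Resources", []):
    print(user["userName"], user.get("active"))
```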
- 1992 Views
- 1 replies
- 0 kudos
Datadog, OpenTelemetry, and Databricks container service
We have successfully gotten Datadog agents installed and running on Databricks clusters via an init script; this part seems to be working fine. We are working on instrumenting our jobs using the OpenTelemetry endpoint feature of the Datadog agent, wh...
An init-script installation places the agents inside the Spark containers (all user workloads and Spark processes run in the container). Users don't have direct access to the host machine and can't install agents there. You may nee...
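As an illustration of the job-side half, here is a minimal sketch that exports spans to the Datadog agent's OTLP receiver, assuming the init script enabled OTLP on the agent's default gRPC port 4317 on each node; the service name and span name are placeholders, and the packages are the standard opentelemetry-sdk / opentelemetry-exporter-otlp distributions:

```python
# A sketch: send OpenTelemetry spans from a job to the local Datadog agent,
# assuming the agent's OTLP gRPC receiver is listening on localhost:4317.
# Requires: pip install opentelemetry-sdk opentelemetry-exporter-otlp
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

provider = TracerProvider(
    resource=Resource.create({"service.name": "my-databricks-job"})  # placeholder name
)
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:4317", insecure=True))
)
trace.set_tracer_provider(provider)

tracer = trace.get_tracer(__name__)
with tracer.start_as_current_span("etl-step"):  # placeholder span
    pass  # job logic goes here
```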
- 703 Views
- 1 replies
- 0 kudos
Community edition login
Hi, I am not able to log in to Community Edition; it says I am not a member. Can someone please help?
I'll ask the dumb question first: did you sign up for it? Although the Databricks Community forum and Databricks Community Edition have similar names and are both run by Databricks, they do not share a login. You need to register separately for each.
- 1638 Views
- 2 replies
- 1 kudos
Resolved! Enable Predictive optimization
To use predictive optimization, should we first enable it at the account level? If so, will every catalog/schema/table in the account then start using predictive optimization by default? Should we first disable t...
Thanks a lot @SparkJun. In the documentation I am not able to find the answer to one scenario: let's say we have explicitly disabled predictive optimization for a catalog named "CatalogXYZ" and then enabled it at the account level. Later a us...
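For reference, the catalog-level switch can be inspected and changed in SQL. A sketch using the catalog name from the thread; the ALTER CATALOG ... {ENABLE | DISABLE | INHERIT} PREDICTIVE OPTIMIZATION syntax is documented, and my assumption here is that an explicit DISABLE keeps overriding the account default until the catalog is set back to INHERIT:

```python
# A sketch (notebook cell): an explicit DISABLE on a catalog overrides the
# account-level default until the catalog is switched back to INHERIT.
spark.sql("ALTER CATALOG CatalogXYZ DISABLE PREDICTIVE OPTIMIZATION")

# Later, to let the account-level setting apply again:
spark.sql("ALTER CATALOG CatalogXYZ INHERIT PREDICTIVE OPTIMIZATION")

# Check the effective setting in the extended catalog description:
display(spark.sql("DESCRIBE CATALOG EXTENDED CatalogXYZ"))
```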
- 1896 Views
- 3 replies
- 1 kudos
Terraform "Failed to get oauth access token. Please retry after logout and login again." with GCP
Hi, I'm having trouble creating a databricks_mws_vpc_endpoint with Terraform. I already created two Private Service Connect (PSC) endpoints and I'm trying to create the VPC endpoint for Databricks, but I'm getting this error: BAD_REQUEST: Failed to get oauth access ...
Thank you @NelsonE! This helped me as well. I tried messing around with all kinds of authentication methods, but this was what worked. For the record, I am also using service account impersonation to register VPC endpoints with Terraform / GCP for Databri...
- 902 Views
- 3 replies
- 1 kudos
Cannot downgrade workspace object permissions using API
Hi! I'd like to restrict some users' permissions using the REST API and ran into an issue while trying to update a permission on 'directories'. I'm trying to set a user's permission on their default username folder in the workspace to 'can edit' so that they ca...
Hi @takak, greetings from Databricks! Which REST API are you making the call to? This might not be supported programmatically, but I will try to test it internally. It appears that the CAN_MANAGE permission is a higher-level permission...
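For reference, the call in question goes through the workspace Permissions API. A sketch of setting CAN_EDIT on a directory, assuming a numeric directory object ID, hypothetical host/token env vars, and a placeholder user email; note that PATCH merges into the existing ACL while PUT replaces it, and, as noted above, a CAN_MANAGE permission the user already holds may win out:

```python
# A sketch: set a user's permission on a workspace directory to CAN_EDIT.
# directory_id, host, token, and the email are all placeholders.
import os
import requests

directory_id = "1234567890"  # hypothetical numeric object ID
resp = requests.patch(
    f"{os.environ['DATABRICKS_HOST']}/api/2.0/permissions/directories/{directory_id}",
    headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
    json={
        "access_control_list": [
            {"user_name": "some.user@example.com", "permission_level": "CAN_EDIT"}
        ]
    },
)
resp.raise_for_status()
print(resp.json())
```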
- 976 Views
- 2 replies
- 0 kudos
Global init script fails on Databricks 16.0
#!/bin/bash
pip install package1 --index-url https://link-to-index
pip install package2 --index-url https://link-to-index

This init script fails with:

error: externally-managed-environment
× This environment is externally managed
╰─> To install Python packa...
Hi @k1t3k, are you installing a custom package? Could you please share the package name you are installing so we can validate? The error you are encountering, "externally-managed-environment," when running your global init script with Databricks Runtime 16....
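For what it's worth, the usual workaround on recent runtimes is to target the Databricks Python environment rather than the system interpreter: in an init script that means calling /databricks/python/bin/pip instead of bare pip, and inside a notebook the %pip magic does the equivalent. A sketch of the notebook-scoped variant, reusing the package names from the question:

```python
# A sketch (notebook cell): %pip installs into the notebook's Python
# environment, sidestepping the externally managed system Python on DBR 16.
%pip install package1 --index-url https://link-to-index
%pip install package2 --index-url https://link-to-index
```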
- 3985 Views
- 2 replies
- 1 kudos
Resolved! Compute configuration : single user with service principal of azure data datafactory ?
Is it possible to have the service principal (ID) of an Azure Data Factory as the single-user access on a Databricks cluster? The reason I'm asking is that we are starting to use Unity Catalog, but we would still need to execute stored procedu...
Yes, this is possible. First, create a new service principal in Azure or use an existing one. This could be either a managed identity from Azure Data Factory or a manually created service principal in Microsoft Entra ID (formerly Azure AD). Next, in ...
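As a concrete illustration of the cluster side, here is a sketch of a Clusters API payload that assigns a service principal as the single user; the application ID is a placeholder, and the node type and runtime are arbitrary choices, not values from the thread:

```python
# A sketch: create a single-user cluster assigned to a service principal.
# single_user_name takes the SP's application (client) ID.
import os
import requests

payload = {
    "cluster_name": "adf-sp-cluster",            # placeholder name
    "spark_version": "15.4.x-scala2.12",         # arbitrary runtime
    "node_type_id": "Standard_DS3_v2",           # arbitrary Azure node type
    "num_workers": 1,
    "data_security_mode": "SINGLE_USER",
    "single_user_name": "00000000-0000-0000-0000-000000000000",  # SP app ID (placeholder)
}
resp = requests.post(
    f"{os.environ['DATABRICKS_HOST']}/api/2.1/clusters/create",
    headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
    json=payload,
)
resp.raise_for_status()
print(resp.json()["cluster_id"])
```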
- 892 Views
- 2 replies
- 0 kudos
How do I identify who triggered my Databricks job?
How can I identify who triggered my Databricks job? The Databricks job is running via a service principal. One of my runs initially failed, but a repair occurred 30 minutes later, causing the job to enter a successful state. I would like to determine...
Thanks for sharing the info. Can you please share the audit logs query so I can pass it on?
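A sketch of such a query against the system audit table, assuming system tables are enabled and you have SELECT on system.access.audit; the action names 'runNow' and 'repairRun' are my assumption for the relevant jobs-service events and may need adjusting to your audit schema:

```python
# A sketch (notebook cell): find who triggered or repaired job runs recently.
df = spark.sql("""
    SELECT event_time,
           user_identity.email AS triggered_by,
           action_name,
           request_params
    FROM system.access.audit
    WHERE service_name = 'jobs'
      AND action_name IN ('runNow', 'repairRun')   -- assumed action names
      AND event_date >= date_sub(current_date(), 7)
    ORDER BY event_time DESC
""")
display(df)
```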
- 1269 Views
- 3 replies
- 1 kudos
Resolved! How to add existing recipient to existing delta share
I am facing an issue while adding a recipient to a Delta Share using Terraform. The owner of my recipient is a group, not an individual user. I'm running this Terraform script using a service principal that is a member of that group. However, I'm encountering th...
I was able to fix the issue. The problem was that the service principal I was using didn't have the correct permissions assigned.
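For anyone hitting the same error, a sketch of the grants typically involved; the share, recipient, and group names are hypothetical, and the exact privileges needed depend on who owns the share and recipient:

```python
# A sketch: metastore-level privileges a service principal's group may need
# before Terraform can attach a recipient to a share. Names are hypothetical.
spark.sql("GRANT USE SHARE ON METASTORE TO `data-sharing-admins`")
spark.sql("GRANT USE RECIPIENT ON METASTORE TO `data-sharing-admins`")
spark.sql("GRANT SET SHARE PERMISSION ON METASTORE TO `data-sharing-admins`")

# The operation Terraform ultimately performs:
spark.sql("GRANT SELECT ON SHARE my_share TO RECIPIENT my_recipient")
```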
- 2882 Views
- 10 replies
- 0 kudos
Resolved! Python function "go to definition" and "peek definition" do not work
When using Python notebooks in Databricks, I would really like to easily see the definition of the functions I am using within the editor, which the "Go to definition (F12)" and "Peek definition" options when right-clicking on a function will h...
This has now been resolved and is working as expected. I don't know why or how, but something changed that made it work.
- 3903 Views
- 5 replies
- 2 kudos
Ad hoc workflows - managing resource usage on shared clusters
We run a shared cluster that is used for general-purpose ad hoc analytics, which I assume is a relatively common use case for keeping costs down. However, the technical experience of this cluster's users varies a lot, so we run into situations whe...
Hi @JameDavi_51481, were you able to figure something out? I'm planning a Databricks migration and realized we might need something similar too.
- 648 Views
- 2 replies
- 0 kudos
Sporadic HTTP failure with SQL Serverless (bug?)
Our SQL Serverless deployment has sporadic failures reaching our blob container in Azure. The blob container is locked down to a VNet, and we are using a private endpoint to enable serverless access. It will work fine for several hours and then show...
I've confirmed all of that. This seems like an AI-generated response. It seems more likely that Databricks rolled out a feature a week ago that is causing instability in the serverless warehouses. Any other specific things to check would be apprec...
- 1024 Views
- 1 replies
- 0 kudos
I want to create custom tag in cluster policy so that clusters created using that policy get those
"I want to create custom tags in a cluster policy so that clusters created using this policy will automatically include those tags for billing purposes. Consider the following example:"cluster_type": {"type": "fixed","value": "all-purpose"},"custom_t...
Are you having any issues when using this definition in the policy?
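In case it helps, here is a sketch of a complete policy created via the Cluster Policies API: custom tags are pinned with dotted custom_tags.<name> attribute paths and "fixed" rules, so clusters created from the policy inherit them automatically. The tag names/values, policy name, and host/token env vars are placeholders:

```python
# A sketch: create a cluster policy that pins custom tags for billing.
import json
import os
import requests

definition = {
    "cluster_type": {"type": "fixed", "value": "all-purpose"},
    # Dotted paths pin individual tags on every cluster created from
    # this policy, so they flow into billing/usage records.
    "custom_tags.cost_center": {"type": "fixed", "value": "finance-123"},  # placeholder
    "custom_tags.team": {"type": "fixed", "value": "data-platform"},       # placeholder
}

resp = requests.post(
    f"{os.environ['DATABRICKS_HOST']}/api/2.0/policies/clusters/create",
    headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
    json={"name": "billing-tags-policy", "definition": json.dumps(definition)},
)
resp.raise_for_status()
print(resp.json()["policy_id"])
```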
- 1121 Views
- 1 replies
- 0 kudos
Databricks Serverless best practices
Hi all, we are configuring Databricks serverless compute that adjusts according to the workload type, like choosing different cluster sizes such as extra small, small, large, etc., and the auto-scale option. We're also looking at the average time it takes to compl...
You can refer to our serverless compute best practices: https://docs.databricks.com/en/compute/serverless/best-practices.html. If you mean serverless SQL warehouses, see https://docs.databricks.com/en/compute/sql-warehouse/warehouse-beh...
Labels:
- Access control (1)
- Apache spark (1)
- AWS (5)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta (4)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (40)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)