- 799 Views
- 1 replies
- 0 kudos
Python User Input During Run-time
I'm new to Advana and was putting together a Python script that requires user interactions during run-time. However, the program pauses at an 'input()' command without a console cell to accept responses. Am I missing something about this instance of ...
This is a common issue because Jupyter Notebooks are not designed to handle interactive input in the same way as a standard Python script run in a terminal. In Jupyter Notebooks, the input() function does not work as expected because the notebook int...
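Since `input()` has no console to read from in a notebook, one workaround is to refactor the script so values arrive as parameters instead of prompts; in Databricks specifically, `dbutils.widgets` can supply those values (the function and names below are illustrative):

```python
# Minimal sketch: replace interactive input() calls with parameters.
# In a Databricks notebook these values could come from dbutils.widgets
# (e.g. dbutils.widgets.text("threshold", "1.0")); here we just pass them in.

def run_job(name=None, threshold=None):
    """Run the job using supplied values instead of prompting via input()."""
    name = name or "default-job"            # fallback instead of input("Name: ")
    threshold = float(threshold or 1.0)     # fallback instead of input("Threshold: ")
    return f"{name} running with threshold {threshold}"

print(run_job(name="demo", threshold="2.5"))
```

The same function then works unchanged in a terminal, a notebook, or a scheduled job, because nothing blocks waiting for a console.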
- 819 Views
- 1 replies
- 3 kudos
Disable Catalog for predictive optimization
Let's say we disabled predictive optimization for a specific catalog named "CatalogXYZ" and afterwards enabled predictive optimization at the account level. Can the schema owner for the schema "CatalogXYZ.TestSchema" then enable the predictive optim...
If predictive optimization is disabled for the catalog "CatalogXYZ" and then enabled at the account level, the schema owner for "CatalogXYZ.TestSchema" cannot enable predictive optimization for this schema. This is because the predictive optimization...
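If that hierarchy holds, the explicit catalog-level DISABLE has to be cleared before the account-level (or a schema-level) setting can take effect. A sketch of the statements involved, assuming the documented `ALTER ... PREDICTIVE OPTIMIZATION` syntax (catalog and schema names come from the question):

```sql
-- Clear the explicit catalog-level override so the catalog inherits
-- the account-level setting again, then (optionally) enable at schema level.
ALTER CATALOG CatalogXYZ INHERIT PREDICTIVE OPTIMIZATION;
ALTER SCHEMA CatalogXYZ.TestSchema ENABLE PREDICTIVE OPTIMIZATION;
```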
- 612 Views
- 1 replies
- 0 kudos
Downstream usage control on Serverless
Hi All, We've noticed a significant increase in our Databricks Serverless usage due to downstream system activity. We would like to reduce overall serverless consumption. Please suggest possible ways and best practices we can implement to...
Hello @Phani1, You might want to review this document: https://docs.databricks.com/en/compute/serverless/best-practices.html Let me know if you have any further questions.
- 695 Views
- 1 replies
- 0 kudos
How to only allow one git branch, one folder?
Users are able to switch branches in the Git UI. How can we restrict this, or allow only one branch? Also, with sparse checkout, how can we view only one folder (without files from the root)?
Hello @amberleong, To restrict users from switching branches in the Git UI and allow only one branch, you can implement branch protection rules in your Git repository, directly in your source-control tool.
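Branch restriction has to be enforced server-side (Git itself cannot stop local branch switching), but the sparse-checkout half of the question can be demonstrated locally. A sketch using the low-level `core.sparseCheckout` mechanism, which works in non-cone mode and can therefore exclude files at the repository root (all paths are throwaway examples; newer Git versions offer `git sparse-checkout set --no-cone` for the same effect):

```shell
# Build a throwaway "remote" repo with a root file and two folders,
# then check out only one folder. Assumes Git >= 2.28 (for `init -b`).
set -e
src=$(mktemp -d)
git -C "$src" init -q -b main
mkdir -p "$src/keep" "$src/other"
echo a > "$src/keep/a.txt"; echo b > "$src/other/b.txt"; echo r > "$src/root.txt"
git -C "$src" add .
git -C "$src" -c user.email=demo@example.com -c user.name=demo commit -qm files

work=$(mktemp -d)/work
git clone -q --no-checkout "$src" "$work"
cd "$work"
# Non-cone patterns let us exclude files at the repository root as well.
git config core.sparseCheckout true
printf '/keep/\n' > .git/info/sparse-checkout
git checkout -q main
ls   # only keep/ is materialized; root.txt and other/ are not
```

After the checkout, only `keep/` exists in the working tree, while the full history remains available in `.git`.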
- 1831 Views
- 2 replies
- 0 kudos
Okta SSO Unified login in GCP
Hi, there are versions of this question posted already, but they seem to refer to legacy features. Our organisation uses Google Workspace, provisioned via Okta as the IdP and first landing point, and all apps are secured behind this. We have purchased Databri...
Hello @dtb_usr, It is possible to use the Okta IdP to log in to Databricks on GCP; please refer to: https://docs.gcp.databricks.com/en/admin/users-groups/scim/okta.html
- 2102 Views
- 1 replies
- 0 kudos
Datadog, OpenTelemetry, and Databricks container service
We have successfully gotten Datadog agent(s) installed and running on databricks clusters via init script - this part seems to be working fine. We are working on instrumenting our jobs using the OpenTelemetry endpoint feature of the Datadog agent, wh...
The agent installations via the init script would install the agents in the Spark containers (All user workloads + spark processes run in the container). The users don't have direct access to the host machine and can't install any agents. You may nee...
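For the OTLP piece specifically, the receiver has to be enabled in the agent's configuration inside that container. A hedged sketch of the relevant `datadog.yaml` fragment (the ports are the conventional OTLP defaults; verify against your agent version):

```yaml
# Enables the Datadog agent's OpenTelemetry (OTLP) receiver so instrumented
# jobs can export traces/metrics to the agent running in the Spark container.
otlp_config:
  receiver:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318
```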
- 752 Views
- 1 replies
- 0 kudos
Community edition login
Hi, I am not able to log in to Community Edition; it says I am not a member. Can someone please help?
I'll ask the dumb question first: did you sign up for it? Although Databricks Community and Databricks Community Cloud Edition have similar names and are both run by Databricks, they do not share a login. You need to register separately for each.
- 1760 Views
- 2 replies
- 1 kudos
Resolved! Enable Predictive optimization
To use predictive optimization, should we first enable it at the account level? If so, will every catalog/schema/table in the account then use predictive optimization by default? Should we first disable t...
Thanks a lot @SparkJun. In the documentation I am not able to find the answer for one scenario. Let's say we have explicitly disabled predictive optimization for a catalog named "CatalogXYZ" and then enabled it at the account level. Later a us...
- 2003 Views
- 3 replies
- 1 kudos
Terraform Failed to get oauth access token. Please retry after logout and login again. with GCP
Hi, I'm having trouble creating a databricks_mws_vpc_endpoint with Terraform. I already created 2 Private Service Connect (PSC) endpoints and I'm trying to create the VPC endpoint for Databricks, but I'm getting this error: BAD_REQUEST: Failed to get oauth access ...
Thank you @NelsonE! This helped me as well. I tried messing around with all kinds of authentication methods, but this was what worked. For the record, I am also using service account impersonation to register VPC endpoints on Terraform / GCP for Databri...
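For reference, a minimal sketch of a provider configuration using service account impersonation, assuming the Databricks Terraform provider's `google_service_account` argument (the host, account ID variable, and service-account e-mail are placeholders):

```hcl
# Account-level provider authenticated as an impersonated GCP service account.
provider "databricks" {
  alias                  = "accounts"
  host                   = "https://accounts.gcp.databricks.com"
  account_id             = var.databricks_account_id
  google_service_account = "deploy-sa@my-project.iam.gserviceaccount.com"
}
```

The identity running Terraform needs the Service Account Token Creator role on that service account for impersonation to succeed.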
- 988 Views
- 3 replies
- 1 kudos
Cannot downgrade workspace object permissions using API
Hi! I'd like to restrict some users' permissions using the REST API and ran into an issue while trying to update a permission on 'directories'. I'm trying to set a user's permission on their default username folder in the workspace to 'can edit' so that they ca...
Hi @takak, Greetings from Databricks! Which REST API are you making the call to? This might not be supported programmatically, but I will try to test it internally. It appears that the CAN_MANAGE permission is a higher-level permission...
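For context while that is tested internally, here is a hedged sketch of the request shape being attempted. The endpoint path follows the REST permissions docs for directories, but the directory ID, user, and whether a downgrade from CAN_MANAGE is accepted are all unverified assumptions:

```python
import json

# Hypothetical sketch of a Permissions API call that sets a user's access
# on a workspace directory. Workspace URL, token, and directory_id are
# placeholders; this only builds the request, it does not send it.
def build_permission_request(directory_id: int, user: str, level: str) -> dict:
    return {
        "method": "PATCH",  # PATCH updates the ACL; PUT would replace it wholesale
        "path": f"/api/2.0/permissions/directories/{directory_id}",
        "body": json.dumps({
            "access_control_list": [
                {"user_name": user, "permission_level": level}
            ]
        }),
    }

req = build_permission_request(123456, "someone@example.com", "CAN_EDIT")
print(req["path"])
```

Sending this with an HTTP client against the workspace URL would then show whether the downgrade is honored or rejected.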
- 1043 Views
- 2 replies
- 0 kudos
Global init script fails on Databricks 16.0
```shell
#!/bin/bash
pip install package1 --index-url https://link-to-index
pip install package2 --index-url https://link-to-index
```
This init script fails with:
```
error: externally-managed-environment
× This environment is externally managed
╰─> To install Python packa...
```
Hi @k1t3k, Are you installing a custom package? Could you please share the package name you are installing so we can validate? The error you are encountering, "externally-managed-environment," when running your global init script with Databricks Runtime 16....
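One commonly suggested workaround (an assumption to verify for your environment) is to invoke the pip inside the Databricks Python virtualenv rather than the system interpreter, which newer runtimes mark as externally managed per PEP 668:

```shell
#!/bin/bash
# Sketch: target the cluster's Python environment instead of the system one.
# /databricks/python/bin/pip is the conventional Databricks venv path
# (verify for your runtime); the index URLs are placeholders from the question.
/databricks/python/bin/pip install package1 --index-url https://link-to-index
/databricks/python/bin/pip install package2 --index-url https://link-to-index
```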
- 4233 Views
- 2 replies
- 1 kudos
Resolved! Compute configuration : single user with service principal of azure data datafactory ?
Is it possible to have the service principal (ID) of an Azure Data Factory as the single-user access on a Databricks cluster? The reason I'm asking is that we are starting to use Unity Catalog, but would still need to execute stored procedu...
Yes, this is possible. First, create a new service principal in Azure or use an existing one. This could be either a managed identity from Azure Data Factory or a manually created service principal in Microsoft Entra ID (formerly Azure AD). Next, in ...
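As a sketch of what the resulting cluster spec might look like via the Clusters API: `data_security_mode` and `single_user_name` are the documented fields, while every value below is a placeholder (the single user is the service principal's application ID):

```json
{
  "cluster_name": "adf-sp-cluster",
  "spark_version": "15.4.x-scala2.12",
  "node_type_id": "Standard_DS3_v2",
  "num_workers": 1,
  "data_security_mode": "SINGLE_USER",
  "single_user_name": "00000000-0000-0000-0000-000000000000"
}
```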
- 970 Views
- 2 replies
- 0 kudos
How do I identify who triggered my Databricks job?
How can I identify who triggered my Databricks job? The Databricks job is running via a service principal. One of my runs initially failed, but a repair occurred 30 minutes later, causing the job to enter a successful state. I would like to determine...
Thanks for sharing the info. Can you please share the audit logs query so I can pass it on?
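As a starting point, assuming system tables are enabled in the workspace, here is a hedged sketch of an audit-log query over `system.access.audit` (the job ID is a placeholder, and the action/column names should be checked against your schema version):

```sql
-- Who started or repaired runs of a given job, newest first.
SELECT event_time,
       user_identity.email,
       action_name,
       request_params['jobId'] AS job_id,
       request_params['runId'] AS run_id
FROM system.access.audit
WHERE service_name = 'jobs'
  AND action_name IN ('runNow', 'repairRun')
  AND request_params['jobId'] = '123456789'
ORDER BY event_time DESC;
```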
- 1365 Views
- 3 replies
- 1 kudos
Resolved! How to add existing recipient to existing delta share
I am facing an issue while adding a recipient to a Delta Share using Terraform. The owner of my recipient is a group, not an individual user. I'm running this Terraform script using a service principal that is a member of that group. However, I'm encountering th...
I was able to fix the issue. The problem was that the service principal I was using didn't have the correct permissions assigned.
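For anyone hitting the same thing, a hedged Terraform sketch of granting a recipient access to a share via `databricks_grants` (share and recipient names are placeholders; as noted above, the principal running Terraform needs adequate privileges on the share):

```hcl
# Grant an existing recipient SELECT on an existing Delta Share.
resource "databricks_grants" "share_grant" {
  share = "my_share"
  grant {
    principal  = "my_recipient"
    privileges = ["SELECT"]
  }
}
```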
- 3082 Views
- 10 replies
- 0 kudos
Resolved! Python function "go to definition" and "peek definition" do not work
When using Python notebooks in Databricks I would really like to easily see the definition of the functions I am using within the editor, which the "Go to definition (F12)" and "Peek definition" options when right-clicking on a function should h...
This has now been resolved and is working as expected. I don't know why or how, but something has changed that made it work.