- 2653 Views
- 2 replies
- 2 kudos
Resolved! system schemas permission
Hi, I'm an account admin on Databricks, and when I try to set SELECT permission for system schemas I get "PERMISSION_DENIED: User is not an owner of Schema 'system.compute'." When I try to set permission for the system catalog, I get "Requires ownership o...
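A hedged sketch of the usual resolution for this error, assuming the common cause: system schemas such as system.compute must first be enabled per metastore via the system schemas API, and grants on them must be issued by a metastore admin (being an account admin alone is not sufficient). Host, token, and metastore ID below are placeholders.

```python
# Sketch only, not the thread's accepted answer: enable the system.compute
# schema, then grant SELECT as a metastore admin.
import requests

resp = requests.put(
    "https://<workspace-host>/api/2.0/unity-catalog/metastores/<metastore-id>"
    "/systemschemas/compute",
    headers={"Authorization": "Bearer <token>"},
)
resp.raise_for_status()

# Once enabled, a metastore admin can grant access in SQL:
#   GRANT SELECT ON SCHEMA system.compute TO `user@example.com`;
```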
- 2504 Views
- 3 replies
- 2 kudos
Networking configuration of Azure Databricks managed storage account
Hi all, I created an Azure Databricks Workspace, and the workspace creates an Azure Databricks managed storage account. The networking configuration of the storage account is "Enabled from all networks". Shall I change it to "Enabled from selected virtu...
- 2 kudos
You don't need View permission on the subnets themselves. As for disabling key access, you could use any of the other authentication methods listed here: https://learn.microsoft.com/en-us/azure/databricks/connect/storage/azure-storage#connect-to-azure-data-lak...
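For reference, a minimal sketch of one key-less option that reply points to: OAuth with a Microsoft Entra service principal, set as Spark configuration in a Databricks notebook. The storage account, client ID, tenant ID, and secret scope/key names are placeholders; spark and dbutils are notebook-provided globals.

```python
# Sketch: ABFS OAuth configuration instead of storage-account key access.
acct = "<storage-account>"  # placeholder
base = "fs.azure.account"
spark.conf.set(f"{base}.auth.type.{acct}.dfs.core.windows.net", "OAuth")
spark.conf.set(
    f"{base}.oauth.provider.type.{acct}.dfs.core.windows.net",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
)
spark.conf.set(f"{base}.oauth2.client.id.{acct}.dfs.core.windows.net", "<app-client-id>")
spark.conf.set(
    f"{base}.oauth2.client.secret.{acct}.dfs.core.windows.net",
    dbutils.secrets.get("my-scope", "sp-client-secret"),  # hypothetical secret scope/key
)
spark.conf.set(
    f"{base}.oauth2.client.endpoint.{acct}.dfs.core.windows.net",
    "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
)
```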
- 1911 Views
- 2 replies
- 0 kudos
Create a Databricks managed service principal programmatically?
For the current Databricks service principal API or the Databricks SDK, an ID is required. However, when dealing with Databricks-managed service principals, you typically only have the name. For registering with cloud providers, like Microsoft Entra ...
- 0 kudos
Have you found a solution on how to programmatically create a Databricks managed service principal?
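No solution is recorded in the thread. For what it's worth, a sketch with the Python databricks-sdk: creation takes only a display name, and the IDs come back in the response (the display name here is illustrative, and auth is assumed to be configured in the environment).

```python
# Sketch: create a Databricks-managed service principal by display name.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # reads auth from env vars or ~/.databrickscfg
sp = w.service_principals.create(display_name="my-managed-sp")  # illustrative name
print(sp.id, sp.application_id)  # numeric id and application id are returned
```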
- 1583 Views
- 2 replies
- 2 kudos
Resolved! Default schema in SQL Editor is not 'default' when unity catalog is set as default catalog
In workspace settings (Workspace admin > Advanced > Other), "Default catalog for the workspace" is set to something other than hive_metastore; it is set to a Unity Catalog catalog. The expected behaviour is copied here from the related more-info panel: "Se...
- 2 kudos
Hi @Alberto_Umana, thank you for the explanation. I marked your comment as the accepted solution, as it contains the current implementation logic and the workaround. Good to know that the more-info panel is a bit misleading as of now, because the SQL Ed...
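The workaround referenced above is session-scoped. A minimal sketch, assuming a hypothetical catalog name (the equivalent USE statements can be run directly at the top of a SQL Editor query):

```python
# Sketch: pin the session's catalog and schema explicitly.
spark.sql("USE CATALOG my_uc_catalog")  # hypothetical catalog name
spark.sql("USE SCHEMA default")
print(spark.sql("SELECT current_catalog(), current_schema()").first())
```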
- 3507 Views
- 10 replies
- 3 kudos
Error "Integrating Apache Spark with Databricks Unity Catalog Assets via Open APIs" on Azure
Great blog post: https://community.databricks.com/t5/technical-blog/integrating-apache-spark-with-databricks-unity-catalog-assets/ba-p/97533
I have attempted to reproduce this with Azure Databricks, and ADLS gen2 as the storage backend. Although I'm ab...
- 3 kudos
Thanks @dkushari. I looked at the GitHub issue you posted, but it has to do specifically with DELTA_UNSUPPORTED_SCHEMA_DURING_READ when streaming *from* a Delta table. The specific error I'm seeing is a key error for the Azure storage account hosting t...
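A sketch of how the missing-key symptom is commonly addressed when an external (non-Databricks) Spark session reads ADLS gen2 directly: give that session its own ABFS credentials. This assumes the hadoop-azure ABFS driver is on the classpath; the account name and key are placeholders, and service-principal OAuth is the safer alternative to a raw key.

```python
# Sketch: external Spark session with ADLS gen2 account-key credentials.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("uc-open-api-azure-test")
    .config(
        "fs.azure.account.key.<storageaccount>.dfs.core.windows.net",
        "<account-key>",  # placeholder; prefer OAuth in practice
    )
    .getOrCreate()
)
```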
- 836 Views
- 1 replies
- 0 kudos
Python User Input During Run-time
I'm new to Advana and was putting together a Python script that requires user interaction at run time. However, the program pauses at an input() command without a console cell to accept responses. Am I missing something about this instance of ...
- 0 kudos
This is a common issue because Jupyter Notebooks are not designed to handle interactive input in the same way as a standard Python script run in a terminal. In Jupyter Notebooks, the input() function does not work as expected because the notebook int...
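On Databricks specifically, the usual substitute for input() is a notebook widget. A minimal sketch (the widget name, default, and label are illustrative; dbutils is a notebook-provided global):

```python
# Sketch: collect run-time input via a widget instead of input().
dbutils.widgets.text("user_name", "", "Enter your name")  # name, default, label
user_name = dbutils.widgets.get("user_name")
print(f"Hello, {user_name}")
```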
- 856 Views
- 1 replies
- 3 kudos
Disable Catalog for predictive optimization
Suppose we disabled predictive optimization for a specific catalog named "CatalogXYZ" and afterwards enabled predictive optimization at the account level. Can the schema owner for the schema "CatalogXYZ.TestSchema" then enable the predictive optim...
- 3 kudos
If predictive optimization is disabled for the catalog "CatalogXYZ" and then enabled at the account level, the schema owner for "CatalogXYZ.TestSchema" cannot enable predictive optimization for this schema. This is because the predictive optimization...
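For reference, the per-securable setting the reply describes is expressed in SQL with ENABLE / DISABLE / INHERIT clauses. A sketch using the thread's names, run via spark.sql here for illustration:

```python
# Sketch of the SQL involved.
spark.sql("ALTER CATALOG CatalogXYZ DISABLE PREDICTIVE OPTIMIZATION")
# Per the reply above, a schema-level ENABLE is not honored while the
# enclosing catalog is explicitly disabled:
spark.sql("ALTER SCHEMA CatalogXYZ.TestSchema ENABLE PREDICTIVE OPTIMIZATION")
```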
- 621 Views
- 1 replies
- 0 kudos
Downstream usage control on Serverless
Hi all, we've noticed a significant increase in our Databricks Serverless usage due to downstream system activity, and we would like to reduce overall serverless consumption. Please suggest possible ways and best practices we can implement to...
- 0 kudos
Hello @Phani1, you might want to review this document: https://docs.databricks.com/en/compute/serverless/best-practices.html Let me know if you have any further questions.
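Beyond that document, one concrete starting point is measuring where serverless DBUs actually go before throttling anything. A sketch against the system billing table, assuming the system.billing schema is enabled and the product_features.is_serverless column exists in your schema version:

```python
# Sketch: daily serverless DBU consumption by product.
df = spark.sql("""
    SELECT usage_date, billing_origin_product, SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE product_features.is_serverless
    GROUP BY usage_date, billing_origin_product
    ORDER BY usage_date DESC
""")
df.show()
```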
- 737 Views
- 1 replies
- 0 kudos
How to only allow one git branch, one folder?
Users are able to switch branches in the Git UI. How can we restrict this, or allow only one branch? Also, for sparse checkout, how can we view only one folder (without files from the root)?
- 0 kudos
Hello @amberleong, to restrict users from switching branches in the Git UI and allow only one branch, you can implement branch protection rules in your Git repository, directly from your source-control Git tool.
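For the sparse-checkout half of the question, the Repos API accepts checkout patterns at repo creation time. A hedged sketch (host, token, repo URL, path, and pattern are placeholders):

```python
# Sketch: create a repo limited to one folder via sparse checkout.
import requests

resp = requests.post(
    "https://<workspace-host>/api/2.0/repos",
    headers={"Authorization": "Bearer <token>"},
    json={
        "url": "https://github.com/org/repo.git",   # placeholder
        "provider": "gitHub",
        "path": "/Repos/team/repo",                 # placeholder
        "sparse_checkout": {"patterns": ["only/this/folder"]},
    },
)
resp.raise_for_status()
print(resp.json())
```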
- 1951 Views
- 2 replies
- 0 kudos
Okta SSO Unified login in GCP
Hi, there are versions of this question posted already, but they seem to refer to legacy features. Our organisation uses a Google Workspace IdP provisioned via Okta as the first landing point, and all apps are secured behind this. We have purchased Databri...
- 0 kudos
Hello @dtb_usr, it is possible to use the Okta IdP to log in to Databricks on GCP; please refer to: https://docs.gcp.databricks.com/en/admin/users-groups/scim/okta.html
- 2166 Views
- 1 replies
- 0 kudos
Datadog, OpenTelemetry, and Databricks container service
We have successfully gotten Datadog agent(s) installed and running on Databricks clusters via an init script - this part seems to be working fine. We are working on instrumenting our jobs using the OpenTelemetry endpoint feature of the Datadog agent, wh...
- 0 kudos
Agent installation via the init script installs the agents in the Spark containers (all user workloads and Spark processes run in the container). Users don't have direct access to the host machine and can't install agents there. You may nee...
- 798 Views
- 1 replies
- 0 kudos
Community edition login
Hi, I am not able to log in to Community Edition; it says I am not a member. Can someone please help?
- 0 kudos
I'll ask the dumb question first--did you sign up for it? Although both Databricks Community and Databricks Community Cloud Edition have similar names and are run by Databricks, they do not share a login. You need to register separately for each.
- 1821 Views
- 2 replies
- 1 kudos
Resolved! Enable Predictive optimization
To use predictive optimization, should we first enable it at the account level? If so, will each catalog/schema/table in the account start using predictive optimization by default? Should we first disable t...
- 1 kudos
Thanks a lot @SparkJun. In the documentation I am not able to find an answer to one scenario. Let's say we have explicitly disabled predictive optimization for a catalog named "CatalogXYZ" and then after that enabled it at the account level. Later a us...
- 2068 Views
- 3 replies
- 1 kudos
Terraform: "Failed to get oauth access token. Please retry after logout and login again." with GCP
Hi, I'm having trouble creating a databricks_mws_vpc_endpoint with Terraform. I already created 2 Private Service Connect (PSC) endpoints and I'm trying to create the VPC endpoint for Databricks, but I'm getting this error: BAD_REQUEST: Failed to get oauth access ...
- 1 kudos
Thank you @NelsonE! This helped me as well. I tried messing around with all kinds of authentication methods, but this was what worked. For the record, I am also using service account impersonation to register VPC endpoints on Terraform / GCP for Databri...
- 1027 Views
- 3 replies
- 1 kudos
Cannot downgrade workspace object permissions using API
Hi! I'd like to restrict some users' permissions using the REST API and ran into an issue while trying to update a permission on 'directories'. I'm trying to set a user's permission on their default username folder in the workspace to 'can edit' so that they ca...
- 1 kudos
Hi @takak, greetings from Databricks! Which REST API are you making the call to? It looks like this might not be supported programmatically, but I will try to test it internally. It appears that the CAN_MANAGE permission is a higher-level permission...
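For context, a sketch of the Permissions API call under discussion, with a hypothetical directory object ID and user. One nuance that may explain the behavior: PATCH only adds or updates ACL entries, so downgrading or removing an existing grant generally means replacing the full ACL with PUT instead.

```python
# Sketch: set a user's directory permission to CAN_EDIT.
import requests

directory_id = "1234567890"  # hypothetical id from the workspace list API
resp = requests.patch(  # PATCH adds/updates entries; PUT replaces the whole ACL
    f"https://<workspace-host>/api/2.0/permissions/directories/{directory_id}",
    headers={"Authorization": "Bearer <token>"},
    json={
        "access_control_list": [
            {"user_name": "someone@example.com", "permission_level": "CAN_EDIT"}
        ]
    },
)
print(resp.status_code, resp.json())
```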
| Label | Count |
|---|---|
| Access control | 1 |
| Apache spark | 1 |
| Azure | 7 |
| Azure databricks | 5 |
| Billing | 2 |
| Cluster | 1 |
| Compliance | 1 |
| Data Ingestion & connectivity | 5 |
| Databricks Runtime | 1 |
| Databricks SQL | 2 |
| DBFS | 1 |
| Dbt | 1 |
| Delta Sharing | 1 |
| DLT Pipeline | 1 |
| GA | 1 |
| Gdpr | 1 |
| Github | 1 |
| Partner | 47 |
| Public Preview | 1 |
| Service Principals | 1 |
| Unity Catalog | 1 |
| Workspace | 2 |