- 870 Views
- 1 reply
- 0 kudos
Unable to add a databricks permission to existing policy
Hi, we're using the Databricks provider v1.49.1 to manage our Azure Databricks cluster and other resources. Having an issue setting permissions with the Terraform resource "databricks_permissions", where the error indicates that the clust...
Is this cluster policy a custom policy? If you try, for testing purposes, to modify it in the UI, does it allow you to
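For anyone hitting the same error, the underlying REST call is easy to test outside Terraform. A minimal sketch, assuming a placeholder workspace URL, token, policy ID, and group name:

```python
# Minimal sketch: granting CAN_USE on a cluster policy via the Permissions
# REST API, to test outside Terraform. Host, token, policy ID, and group
# name are placeholders.
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # hypothetical
TOKEN = "<databricks-token>"   # a PAT or AAD token with admin rights
POLICY_ID = "ABC123"           # the cluster policy ID from the error

resp = requests.patch(
    f"{HOST}/api/2.0/permissions/cluster-policies/{POLICY_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    # PATCH adds to the existing ACL; PUT would replace it entirely,
    # which is closer to what databricks_permissions does.
    json={"access_control_list": [
        {"group_name": "data-engineers", "permission_level": "CAN_USE"}
    ]},
)
resp.raise_for_status()
print(resp.json())
```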
- 2397 Views
- 1 reply
- 0 kudos
Compute terminated. Reason: Control Plane Request Failure
Hi, I have started to get this error: "Failed to get instance bootstrap steps from the Databricks Control Plane. Please check that instances have connectivity to the Databricks Control Plane." I suspect it has to do with networking. I am just a...
Are you still facing issues? I would suggest checking the security groups and making sure they match: https://docs.databricks.com/en/security/network/classic/customer-managed-vpc.html#security-groups Additionally, check if the inbound and outbound addr...
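One quick way to confirm the networking angle is to test that the instances can actually reach the workspace host on port 443. A minimal sketch with a placeholder hostname, to be run from the cluster environment itself:

```python
# Minimal sketch: verifying outbound HTTPS connectivity from a cluster node
# to the Databricks control plane. The workspace host is a placeholder.
import socket

HOST = "adb-1234567890123456.7.azuredatabricks.net"  # hypothetical workspace host

# Run this on the instances themselves (e.g. in a notebook cell or an init
# script): bootstrap fails if they cannot reach port 443.
try:
    with socket.create_connection((HOST, 443), timeout=5):
        print(f"TCP 443 to {HOST}: OK")
except OSError as e:
    print(f"TCP 443 to {HOST}: FAILED ({e}) - check security groups/NACLs/routes")
```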
- 2126 Views
- 1 reply
- 0 kudos
Updating databricks git repo from github action - how to
Hi, my company is migrating from Azure DevOps to GitHub, and we have a pipeline in Azure DevOps which updates/syncs Databricks repos whenever a pull request is made to the development branch. The Azure DevOps pipeline (which works) looks like this: trigge...
It seems like the issue you're encountering is related to missing Git provider credentials when trying to update the Databricks repo via GitHub Actions. Based on the context provided, here are a few steps you can take to resolve this issue: Verify...
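A minimal sketch of the two REST calls such a GitHub Actions step would make, assuming placeholder host, tokens, and repo ID; the first registers Git credentials for the pipeline identity, the second pulls the repo to the development branch:

```python
# Minimal sketch: registering Git credentials for the caller and then
# updating a Databricks repo to a branch via the REST API, as a GitHub
# Actions step might. Host, tokens, and repo ID are placeholders.
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # hypothetical
HEADERS = {"Authorization": "Bearer <databricks-token>"}

# 1) Make sure the identity running the pipeline has Git provider credentials.
requests.post(
    f"{HOST}/api/2.0/git-credentials",
    headers=HEADERS,
    json={
        "git_provider": "gitHub",
        "git_username": "ci-bot",              # hypothetical
        "personal_access_token": "<github-pat>",
    },
).raise_for_status()

# 2) Pull the repo checkout to the development branch.
REPO_ID = "123456"  # discoverable via GET /api/2.0/repos
requests.patch(
    f"{HOST}/api/2.0/repos/{REPO_ID}",
    headers=HEADERS,
    json={"branch": "development"},
).raise_for_status()
```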
- 788 Views
- 1 reply
- 0 kudos
Customer Facing Integration
Is Databricks intended to be used in customer-facing application architectures? I have heard that Databricks is primarily intended to be internally facing. Is this true? If you are using it for customer-facing ML applications, what tool stack are yo...
Hi @hucklebarryrees, Databricks is indeed primarily designed as an analytical platform rather than a transactional system. It's optimized for data processing, machine learning, and analytics rather than handling high-frequency, parallel transactional...
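For the customer-facing ML case the question asks about, the usual pattern is to put Databricks Model Serving behind your own application tier. A minimal sketch, with a hypothetical endpoint name and payload:

```python
# Minimal sketch: querying a Databricks Model Serving endpoint from an
# application backend. Host, endpoint name, token, and the feature payload
# are placeholders.
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # hypothetical
ENDPOINT = "churn-model"                                     # hypothetical

resp = requests.post(
    f"{HOST}/serving-endpoints/{ENDPOINT}/invocations",
    headers={"Authorization": "Bearer <token>"},
    json={"dataframe_records": [{"feature_a": 1.0, "feature_b": 0.3}]},
    timeout=5,  # keep user-facing latency bounded
)
resp.raise_for_status()
print(resp.json())
```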
- 1428 Views
- 4 replies
- 0 kudos
Unity Group management, Group: Manager role
We would like to have the ability to assign an individual and/or group to the "Group: Manager" role, providing them with the ability to add/remove users without the need to be an account or workspace administrator. Ideally this would be an option fo...
Thanks @NandiniN, we have looked through that documentation and still have not been able to get anything to work without the user also being an account or workspace admin. The way I'm interpreting the documentation (screenshot) is that the API currently...
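For reference, the member add/remove operation a delegated manager would run is a SCIM PATCH; whether a non-admin can be authorized to call it is exactly the open question here. A minimal sketch with placeholder group and user IDs:

```python
# Minimal sketch: adding a user to a group via the workspace SCIM API --
# the operation a delegated "Group: Manager" would perform. Host and IDs
# are placeholders.
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # hypothetical
GROUP_ID = "123456789"   # SCIM group ID
USER_ID = "987654321"    # SCIM user ID

resp = requests.patch(
    f"{HOST}/api/2.0/preview/scim/v2/Groups/{GROUP_ID}",
    headers={"Authorization": "Bearer <token>"},
    json={
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [
            {"op": "add", "value": {"members": [{"value": USER_ID}]}}
        ],
    },
)
resp.raise_for_status()
```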
- 2489 Views
- 2 replies
- 0 kudos
List files in Databricks Workspace with Databricks CLI
I want to list all files in my Workspace with the CLI. There's a command for it: databricks fs ls dbfs:/ When I run this, I get this result: I can then list the content of databricks-datasets, but no other directory. How can I list the content of the Wo...
I know it's possible with the Databricks SDK, but I want to solve it with the CLI in the terminal.
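Worth noting the distinction here: databricks fs ls talks to DBFS, while notebooks and folders live in the Workspace tree, which the newer CLI exposes as databricks workspace list /. A minimal sketch of the REST call that command wraps, with placeholder host and token:

```python
# Minimal sketch: listing the Workspace tree (not DBFS) via the REST API
# behind `databricks workspace list /`. Host and token are placeholders.
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # hypothetical

resp = requests.get(
    f"{HOST}/api/2.0/workspace/list",
    headers={"Authorization": "Bearer <token>"},
    params={"path": "/"},
)
resp.raise_for_status()
for obj in resp.json().get("objects", []):
    print(obj["object_type"], obj["path"])
```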
- 686 Views
- 1 reply
- 0 kudos
Enabling Object Lock for the S3 bucket that is delivering audit logs
Hello Community, I am trying to enable Object Lock on the S3 bucket to which the audit logs are delivered, but the following error occurs if Object Lock is enabled when the delivery settings are enabled: > {"error_code":"PERMISSION_DENIED","message":"Fai...
Hi @hiro12, enabling Object Lock on an S3 bucket after configuring the delivery settings should not affect the ongoing delivery of audit logs. But I would say it is better to understand the root cause of the error. The error you encountered when ena...
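Before changing anything, it may help to inspect what is actually configured on the bucket when the error fires. A minimal sketch using boto3, with a placeholder bucket name (note that get_bucket_policy raises if no policy is attached):

```python
# Minimal sketch: inspecting the bucket's Object Lock configuration and
# bucket policy to see what the PERMISSION_DENIED is reacting to. Bucket
# name is a placeholder; requires boto3 and AWS credentials.
import boto3

s3 = boto3.client("s3")
BUCKET = "my-audit-log-bucket"  # hypothetical

lock = s3.get_object_lock_configuration(Bucket=BUCKET)
print(lock.get("ObjectLockConfiguration"))

policy = s3.get_bucket_policy(Bucket=BUCKET)
print(policy["Policy"])  # confirm the Databricks delivery principal still has s3:PutObject
```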
- 3960 Views
- 8 replies
- 3 kudos
Open Delta Sharing and Deletion Vectors
Hi, just experimenting with open Delta Sharing and running into a few technical traps, mainly that if deletion vectors are enabled on a Delta table (which they are by default now) we get errors when trying to query a table (specifically with Power BI)...
@NandiniN we are talking about a Power BI connection, so you cannot set that option. @F_Goudarzi I have just tried with Power BI Desktop version 2.132.1053.0 and it is running (I did not disable deletion vectors on my table). I also tried with the last versio...
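A minimal sketch of the commonly cited workaround on the provider side: disable deletion vectors on the shared table and purge the ones already written. The table name is a placeholder; run in a Databricks notebook where spark is available, and note this trades away the write-performance benefit of deletion vectors:

```python
# Minimal sketch: disabling deletion vectors on a shared table and purging
# existing ones, so older open Delta Sharing clients can read it again.
# Table name is a placeholder.
spark.sql("""
    ALTER TABLE catalog.schema.my_shared_table
    SET TBLPROPERTIES ('delta.enableDeletionVectors' = 'false')
""")

# Rewrite files so already-written deletion vectors are materialized away.
spark.sql("REORG TABLE catalog.schema.my_shared_table APPLY (PURGE)")
```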
- 3433 Views
- 3 replies
- 0 kudos
Proxy (Zscaler) & Databricks/Spark Connect "Cannot check peer: missing selected ALPN property"
Summary: We use Zscaler and are trying to use Databricks Connect to develop PySpark code locally. At first, we received SSL HTTP errors, which we resolved by ensuring Python's requests library could find Zscaler's CA cert (setting the REQUESTS_CA_BUNDLE en...
Hello Stevenayers-bge, checking if you have come across any solution to the above-mentioned issue? If yes, could you please post it here? I'd really appreciate it.
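If anyone else lands here: Databricks Connect talks gRPC, which ignores REQUESTS_CA_BUNDLE, so pointing gRPC at the same Zscaler bundle is one thing to try; if Zscaler's inspection strips the ALPN extension itself, a bypass rule for the workspace host may be the only fix. A minimal sketch with placeholder paths and workspace values:

```python
# Minimal sketch: exporting the Zscaler CA bundle to the clients that do
# not read REQUESTS_CA_BUNDLE, before building a Databricks Connect session.
# All paths and workspace values are placeholders.
import os

CA_BUNDLE = "/etc/ssl/certs/zscaler-ca.pem"  # hypothetical path
os.environ["REQUESTS_CA_BUNDLE"] = CA_BUNDLE                 # HTTP (requests)
os.environ["SSL_CERT_FILE"] = CA_BUNDLE                      # OpenSSL-based clients
os.environ["GRPC_DEFAULT_SSL_ROOTS_FILE_PATH"] = CA_BUNDLE   # gRPC channel

from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.remote(
    host="https://adb-1234567890123456.7.azuredatabricks.net",  # hypothetical
    cluster_id="0123-456789-abcdefgh",
    token="<token>",
).getOrCreate()
```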
- 1741 Views
- 3 replies
- 0 kudos
Generate a Workflow that Waits for Library Installation
I have a process in DBX/DAB and I am using a Service Principal to generate a token for reaching the artifacts feed; for security, this token lasts 1 hour. import requests YOUR_AZURE_TENANT_ID = ... YOUR_SERVICE_PRINCIPAL_CLIENT_ID = ... YOUR_SECRET_S...
Hi @PabloCSD, here are some refined solutions that keep costs low and ensure the main workflow waits until the token is generated: instead of separating the token generation and main tasks, consider generating the token directly within the initializati...
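A minimal sketch of generating the Entra ID (Azure AD) token inside the workflow's first task, so its one-hour lifetime starts when the run does. All IDs and secrets are placeholders, and the Azure DevOps scope shown is an assumption to swap for your feed's resource:

```python
# Minimal sketch: client-credentials token fetch at the start of the job,
# so the hour-long lifetime covers the whole run. IDs/secrets are placeholders.
import requests

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<service-principal-client-id>"
CLIENT_SECRET = "<service-principal-secret>"

resp = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        # Well-known Azure DevOps resource; swap in the scope your feed needs.
        "scope": "499b84ac-1321-427f-aa17-267ca6975798/.default",
    },
)
resp.raise_for_status()
access_token = resp.json()["access_token"]
```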
- 5376 Views
- 7 replies
- 2 kudos
Resolved! Could not access databricks on iphone any webbrowser
Hi, for around the past 2 weeks I have tried to access Databricks via iOS (17.6) Safari and Chrome. Neither browser could access it. I already tried clearing the cache and logging in with a Global Administrator in Azure AD; it's still the same blank page. Could you pleas...
Hi, I can confirm it works as well. According to my info, there was a release to fix some Chrome issues, which had a positive impact on iOS devices. Glad it works again.
- 1981 Views
- 3 replies
- 0 kudos
Resolved! Terraform Destroy not able to prune Databricks Provisioned GKE Cluster on GCP
Hi there, newbie here in Databricks on GCP. I provisioned my Databricks workspace with Terraform and all worked well. Now, when I would like to target-destroy my workspace, issues occur: when I do terraform destroy -target module.workspace, the workspac...
Ha, that's true, too. I forget how long it takes things to delete, but I've run into it many times. Best of luck to you!
- 7368 Views
- 3 replies
- 0 kudos
Resolved! I have questions about "Premium Automated Serverless Compute - Promo DBU."
"Premium Automated Serverless Compute - Promo DBU" expenses arise from what, how can I disable it, and why are the costs so high? In the picture, I am using AzureThank you in advance for the advice
Hello everyone! I had the same problem for two months. I went crazy looking for what was spending my entire subscription amount: Premium Automated Serverless Compute - Promo DBU > €8,236. After searching everywhere for information about it and reading ...
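For anyone else chasing this SKU, the system billing table can at least show which workloads emit it. A minimal sketch, assuming access to system.billing.usage and run in a Databricks notebook:

```python
# Minimal sketch: attributing serverless spend with the system billing
# table, to see which workloads emit the "Promo DBU" SKU.
df = spark.sql("""
    SELECT
        usage_date,
        sku_name,
        usage_metadata.job_id,
        usage_metadata.warehouse_id,
        SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE sku_name ILIKE '%SERVERLESS%'
    GROUP BY usage_date, sku_name, usage_metadata.job_id, usage_metadata.warehouse_id
    ORDER BY dbus DESC
""")
df.show(20, truncate=False)
```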
- 12527 Views
- 2 replies
- 1 kudos
Notebook and folder owner
Hi all, we can use this API https://docs.databricks.com/api/workspace/dbsqlpermissions/transferownership to transfer the ownership of a query. Is there anything similar for notebooks and folders?
This API only allows setting permissions based on permission level, which doesn't include changing OWNER. Any suggestions on this particular request?
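For contrast, this is (as I read the docs page the original post links) the DBSQL ownership-transfer call; it covers queries, dashboards, and alerts, and nothing equivalent is documented for notebooks or folders, whose permissions API tops out at CAN_MANAGE. A minimal sketch with placeholder values:

```python
# Minimal sketch: the DBSQL ownership-transfer call referenced in the
# original post (queries/dashboards/alerts only). Host, query ID, and new
# owner are placeholders.
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # hypothetical
QUERY_ID = "<query-uuid>"

resp = requests.post(
    f"{HOST}/api/2.0/preview/sql/permissions/queries/{QUERY_ID}/transfer",
    headers={"Authorization": "Bearer <token>"},
    json={"new_owner": "new.owner@example.com"},
)
resp.raise_for_status()
```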
- 899 Views
- 3 replies
- 0 kudos
How to get cost per job which runs on ALL_PURPOSE_COMPUTE?
With the system.billing.usage table I could get the cost per job for jobs that run on JOB_COMPUTE, but not for jobs that run on ALL_PURPOSE_COMPUTE.
If DBU usage is not captured anywhere for jobs under ALL_PURPOSE_COMPUTE, then a cost breakdown based on cluster events is very difficult, as more than two jobs can run in parallel. So mapping the cost back to a specific job is very difficult. Let me know if I am missing an...
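One pragmatic workaround is tag-based attribution: tag the workloads that share the all-purpose cluster and aggregate the usage map column. A minimal sketch, assuming a hypothetical JobName tag; this approximates, rather than perfectly isolates, concurrent jobs:

```python
# Minimal sketch: tag-based cost attribution for all-purpose compute.
# custom_tags is a map column in system.billing.usage; the JobName tag is
# a placeholder you would set on the workloads yourself.
df = spark.sql("""
    SELECT
        custom_tags['JobName'] AS job_name,
        usage_metadata.cluster_id,
        SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE sku_name ILIKE '%ALL_PURPOSE%'
    GROUP BY custom_tags['JobName'], usage_metadata.cluster_id
    ORDER BY dbus DESC
""")
df.show(truncate=False)
```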