- 4038 Views
- 1 reply
- 0 kudos
Resolved! Ray cannot detect GPU on the cluster
I am trying to run Ray on Databricks for chunking and embedding tasks. The cluster I'm using is: g4dn.xlarge, 1-4 workers with 4-16 cores, 1 GPU and 16GB memory per node. I have set spark.task.resource.gpu.amount to 0.5 currently. This is how I have set up my ray clus...
I have replicated all your steps and created the ray cluster exactly as you have done. Also, I have set: spark.conf.set("spark.task.resource.gpu.amount", "0.5") And I see a warning that shows that I don't allocate any GPU for Spark (as 1), even tho...
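For anyone landing here, a minimal sketch of the Ray-on-Spark setup being discussed, assuming Ray 2.x where ray.util.spark.setup_ray_cluster takes num_worker_nodes / num_cpus_per_node / num_gpus_per_node (argument names differ in newer Ray releases); the node sizes below are illustrative for a g4dn.xlarge cluster:

```python
# Minimal sketch: start a Ray cluster on a GPU-enabled Databricks cluster.
# Assumes Ray 2.x; spark.task.resource.gpu.amount and
# spark.executor.resource.gpu.amount should be set in the cluster's Spark
# config (not at runtime) so Spark exposes GPUs to the Ray worker tasks.
from ray.util.spark import setup_ray_cluster, shutdown_ray_cluster
import ray

setup_ray_cluster(
    num_worker_nodes=2,    # illustrative: one Ray worker per Spark worker node
    num_cpus_per_node=4,   # g4dn.xlarge has 4 vCPUs
    num_gpus_per_node=1,   # g4dn.xlarge has a single T4 GPU
)

ray.init(ignore_reinit_error=True)   # connects to the cluster started above
print(ray.cluster_resources())       # should list a 'GPU' entry if detection worked

# shutdown_ray_cluster()  # tear down when finished
```

If ray.cluster_resources() shows no GPU entry, the usual suspects are the Spark-level GPU settings not being applied in the cluster configuration, or a non-GPU node type for the workers.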
- 1245 Views
- 4 replies
- 2 kudos
Oauth Token federation
Dear all, has anyone tried OAuth token federation for authentication with Databricks REST APIs? I would appreciate a re-usable code snippet to achieve the same.
@noorbasha534 Here is sample Python code I use for getting an OAuth token from Azure Active Directory and then passing the token to the Databricks API. The prerequisite is that the SPN needs to be an admin in the workspace. import requests # Azure AD credentials tena...
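A minimal sketch of that flow, assuming a client-credentials service principal; 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d is the well-known Azure Databricks resource ID used as the token scope, while the tenant/client values and the clusters list endpoint are placeholders:

```python
# Hedged sketch: get an Azure AD token for a service principal and call a
# Databricks REST API with it. All credential values and the workspace URL
# are placeholders; the SPN must already have access to the workspace.
import requests

tenant_id = "<tenant-id>"
client_id = "<application-id>"
client_secret = "<client-secret>"
workspace_url = "https://adb-1234567890123456.7.azuredatabricks.net"

# Client-credentials grant against the Azure AD token endpoint; the scope is
# the Azure Databricks resource ID.
token_resp = requests.post(
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default",
    },
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

# Pass the token as a Bearer token to any Databricks REST endpoint,
# e.g. listing clusters (illustrative call).
clusters = requests.get(
    f"{workspace_url}/api/2.1/clusters/list",
    headers={"Authorization": f"Bearer {access_token}"},
)
clusters.raise_for_status()
print(clusters.json())
```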
- 475 Views
- 2 replies
- 1 kudos
Resolved! SQLSTATE HY000 after upgrading from Databricks 15.4 to 16.4
After upgrading from Databricks 15.4 to 16.4, without changing our Python code, we suddenly get SQL timeouts, see below. Is there some new timeout default that we don't know about and need to increase with the new version? After a quick search I...
After upgrading to Databricks 16.4, there is a notable change in SQL timeout behavior. The default timeout for SQL statements and objects like materialized views and streaming tables is now set to two days (172,800 seconds). This system-wide default ...
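If the two-day default described above is what is firing, a hedged way to confirm and override it from a notebook is the STATEMENT_TIMEOUT configuration parameter (value in seconds); whether your compute type honours this parameter, and the ceiling you choose, are things to verify for your own workload:

```python
# Hedged sketch: inspect and raise the SQL statement timeout for the current
# session. STATEMENT_TIMEOUT is expressed in seconds; 172800 = 2 days.
# Prefer raising the ceiling over disabling the timeout entirely.

# Show the currently effective value.
spark.sql("SET STATEMENT_TIMEOUT").show(truncate=False)

# Raise it for this session only (illustrative: 4 days).
spark.sql("SET STATEMENT_TIMEOUT = 345600")
```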
- 575 Views
- 2 replies
- 0 kudos
View Refresh Frequency
Dear all, we have around 5000+ finished data products (aka views) in several schemas of Unity Catalog. One question that comes from business users frequently is: how frequently do these get refreshed? The answer is not simple, as the underlying t...
Hi @noorbasha534 just a pseudocode:
for view in all_views:
    lineage = get_lineage(view)  # Use Unity Catalog API
    base_tables = extract_base_tables(lineage)
    refresh_times = []
    for table in base_tables:
        job = find_job_refreshing_table(table)  # Custom logic/met...
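Filling in the lineage lookup from that pseudocode, a hedged sketch against the Unity Catalog table-lineage REST endpoint; the host/token handling, response-field names, and the follow-on "find the refreshing job" step are assumptions to adapt to your environment:

```python
# Hedged sketch: resolve a view's upstream (base) tables via the Unity
# Catalog lineage API, as a building block for the pseudocode above.
import requests

DATABRICKS_HOST = "https://<workspace-url>"   # placeholder
TOKEN = "<pat-or-oauth-token>"                # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def get_upstream_tables(full_view_name: str) -> list[str]:
    """Return fully qualified names of tables feeding the given view."""
    resp = requests.get(
        f"{DATABRICKS_HOST}/api/2.0/lineage-tracking/table-lineage",
        headers=HEADERS,
        params={"table_name": full_view_name, "include_entity_lineage": "true"},
    )
    resp.raise_for_status()
    names = []
    for upstream in resp.json().get("upstreams", []):
        # Field casing has varied across API versions; check the raw payload.
        info = upstream.get("tableInfo") or upstream.get("table_info") or {}
        if info:
            names.append(f"{info['catalog_name']}.{info['schema_name']}.{info['name']}")
    return names

# Usage sketch: a view's effective refresh time is the oldest refresh among
# its base tables (each resolved by your own job-lookup logic).
# for view in all_views:
#     base_tables = get_upstream_tables(view)
```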
- 13669 Views
- 12 replies
- 7 kudos
Resolved! Unable to view Manage Account Option in Databricks UI
Hi All, I want to remove Unity Catalog from the Admin Console, for which I need to navigate to the Manage Account option, but the option is not available in my Databricks workspace. Please help me sort this issue by removing Unity Catalog.
Brilliant! I followed the above and it works seamlessly.
- 805 Views
- 3 replies
- 2 kudos
Databricks OAuth: User-based OAuth (U2M) Databricks Connect in Apps
I'm looking to use a Databricks session in a Databricks app. The session should be able to use user-based OAuth (U2M) to ensure the app has the same privileges as the authenticated user using the app. Databricks apps have the ability to use th...
Thanks for your response and the links. But the documentation doesn't explicitly explain why Spark Connect has been placed out of scope, or what app builders should use to implement proper data governance using on-behalf-of-user permissions.
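For what it's worth, a hedged sketch of the on-behalf-of-user route that does work today: when user authorization is enabled for a Databricks App, the signed-in user's token arrives in the X-Forwarded-Access-Token request header and can be handed to the SDK, so calls run with the user's own Unity Catalog permissions. This is the REST/SDK path rather than Spark Connect / Databricks Connect (which the docs place out of scope); the Flask framing and endpoint are illustrative:

```python
# Hedged sketch: per-user authorization inside a Databricks App.
# Assumes user authorization is enabled so X-Forwarded-Access-Token is set,
# and that DATABRICKS_HOST is provided by the Apps runtime environment.
from flask import Flask, request
from databricks.sdk import WorkspaceClient

app = Flask(__name__)

@app.route("/catalogs")
def list_catalogs():
    # Token of the signed-in user, forwarded by the Apps proxy.
    user_token = request.headers.get("X-Forwarded-Access-Token")
    if not user_token:
        return {"error": "user authorization not enabled for this app"}, 403

    # Authenticate as the user (not the app's service principal), so results
    # reflect the user's own grants.
    w = WorkspaceClient(token=user_token, auth_type="pat")
    return {"catalogs": [c.name for c in w.catalogs.list()]}
```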
- 2520 Views
- 5 replies
- 3 kudos
Resolved! Connecting Azure databricks with firewall enabled Azure storage account
Hi, I am trying to connect from an Azure Databricks workspace to an Azure Gen2 storage account securely. The storage account is set up with these options: 1. Enabled from selected virtual networks and IP addresses - we whitelisted a few IPs. 2. Added Microsoft.Dat...
I am having the exact issue as @trailblazer: if I enable traffic for all networks, I can read/write to the storage account; if I only allow selected networks, including the VNet, then it doesn't work. I am using a Serverless setup. I also followed the firewall ...
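One quick, hedged way to separate a firewall/NCC problem from a permissions problem when testing from serverless (path and names are placeholders): if the probe below fails with an authorization or network error while the same call succeeds with "all networks" enabled, the serverless egress (NCC private endpoints or stable IPs) is most likely not allow-listed on the storage account:

```python
# Minimal sketch: probe the storage firewall from a serverless notebook.
# Access is assumed to go through a Unity Catalog external location or
# storage credential; replace the placeholders with your own values.
path = "abfss://<container>@<storageaccount>.dfs.core.windows.net/<folder>"

try:
    entries = dbutils.fs.ls(path)   # dbutils is available in Databricks notebooks
    print(f"OK - {len(entries)} entries visible")
except Exception as exc:
    # Firewall-style failures typically surface as 403 / AuthorizationFailure
    # only when "selected networks" is enabled on the storage account.
    print(f"Access failed: {exc}")
```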
- 939 Views
- 1 reply
- 1 kudos
Resolved! Job Notifications specifically on Succeeded with Failures
Hi everyone, I have a set of jobs that always execute the last task regardless of whether the previous ones failed or not (using the 'All done' execution dependency). When moving to production and wanting to enable notifications, there is no option to ...
Databricks does not provide a direct way to distinguish or send notifications specifically for a "Succeeded with failures" state at the job level—the job is classified as "Success" even when some upstream tasks have failed, if the last (leaf) task is...
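One hedged workaround consistent with the answer above: have the final leaf task (or an extra notification task) look up its own run via the Jobs API and flag any upstream task that finished FAILED. The run-id wiring through a {{job.run_id}} job parameter and the alerting step are assumptions:

```python
# Hedged sketch: detect "succeeded with failures" from inside the last task.
# Assumes a job parameter named run_id carries {{job.run_id}}; authentication
# uses the notebook's default credentials via the Databricks SDK.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

run_id = int(dbutils.widgets.get("run_id"))
run = w.jobs.get_run(run_id)

failed = [
    task.task_key
    for task in (run.tasks or [])
    if task.state and task.state.result_state and task.state.result_state.value == "FAILED"
]

if failed:
    # Plug in your own alert (webhook, email, etc.). Raising here makes this
    # task fail, so a standard "on failure" notification fires even though
    # the job as a whole would otherwise report success.
    raise RuntimeError(f"Run {run_id} succeeded with failed tasks: {failed}")
```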
- 737 Views
- 2 replies
- 3 kudos
Resolved! Error when trying to destroy databricks_permissions with OpenTofu
Hi, in our company's project we created a databricks_user for a service account (which is needed for our deployment process) via OpenTofu and afterwards adjusted the permissions on that "user's" user folder using the databricks_permissions resource. resour...
Hi @MiriamHundemer , The issue occurs because the owner of the home folder (in this case, the databricks_user.databricks_deployment_sa service account) often has an unremovable CAN_MANAGE permission on its own home directory. When OpenTofu attempts t...
- 703 Views
- 4 replies
- 1 kudos
Resolved! Deploy Databricks workspace on Azure with Terraform - failed state: legacy access
I'm trying to deploy a workspace on Azure via Terraform and I'm getting the following error: "INVALID_PARAMETER_VALUE: Given value cannot be set for workspace~<id>~default_namespace_ws~ because: cannot set default namespace to hive_metastore since leg...
I found the issue. The setting that automatically assigns workspaces to this metastore was checked. Unchecking it and manually assigning the metastore worked.
- 369 Views
- 1 reply
- 1 kudos
Clarification on Unity Catalog Metastore - Metadata and storage
Where does the Unity Catalog metastore metadata actually reside? Is it stored and managed in the Databricks account (control plane)? Or does it get stored in the customer-managed S3 bucket when we create a bucket for the Unity Catalog metastore? I want to c...
@APJESK Replied here https://community.databricks.com/t5/data-governance/clarification-on-unity-catalog-metastore-metadata-and-storage/td-p/133389
- 579 Views
- 3 replies
- 3 kudos
Resolved! Databricks GCP login with company account
I created a Gmail account with my company account, then I logged in to GCP Databricks; till yesterday it was working fine. Yesterday I logged in to the Gmail account, where it asked for another Gmail ID, so I provided a new one, but today I am not able to log in usi...
Hello @xavier_db! Were you able to get this login issue resolved? If yes, it would be great if you could share what worked for you so others facing the same problem can benefit as well.
- 637 Views
- 4 replies
- 6 kudos
Resolved! Is there a way to register S3 compatible tables?
Hi everyone, I have successfully registered AWS S3 tables in Unity Catalog, but I would like to register S3-compatible storage as well. But to create an EXTERNAL LOCATION in Unity Catalog, it seems I must register a credential, and the only supported credentia...
Hey @tabasco, did you check out the External Table documentation in the Databricks AWS docs?
External Location: https://docs.databricks.com/aws/en/sql/language-manual/sql-ref-external-locations
Credentials: https://docs.databricks.com/aws/en/sql/language-m...
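For the plain AWS path those docs cover, a hedged sketch of registering an external location with SQL (run via spark.sql here), assuming a storage credential named my_s3_credential already exists; all names and the bucket URL are placeholders, and whether an S3-compatible endpoint is accepted by this credential type is exactly the open question in this thread:

```python
# Hedged sketch: bind an existing Unity Catalog storage credential to an
# external location and grant read access. Identifiers are placeholders.
spark.sql("""
  CREATE EXTERNAL LOCATION IF NOT EXISTS my_external_location
  URL 's3://my-bucket/landing/'
  WITH (STORAGE CREDENTIAL my_s3_credential)
  COMMENT 'Landing zone registered in Unity Catalog'
""")

# Illustrative grant so a group can list/read files under the location.
spark.sql(
    "GRANT READ FILES ON EXTERNAL LOCATION my_external_location TO `data_engineers`"
)
```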
- 349 Views
- 1 reply
- 1 kudos
Workload identity federation policy
Dear all, can I create a single workload federation policy for all DevOps pipelines? Our set-up: we have code version-controlled in GitHub repos, and we use Azure DevOps pipelines to authenticate with Databricks via a service principal currently and d...
Hi @noorbasha534, the docs give the following example of subject requirements for Azure DevOps. The subject (sub) claim must uniquely identify the workload, so as long as all of your pipelines reside in the same organization, same project ...
- 2421 Views
- 6 replies
- 9 kudos
Resolved! Is it possible restore a deleted catalog and schema
Is it possible to restore a deleted catalog and schema? If CASCADE is used, the catalog will be dropped even though schemas and tables are present in it. Is it possible to restore the catalog, or to restrict the use of the CASCADE command? Thank you.
@Louis_Frolio I cannot click any "Accept as Solution" button, as I was not the one creating the post, I believe