Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.
Hi, I have created a function to anonymize user IDs using a secret. I want to give other users access to this function so they can execute it without having access to the secret. Is this possible in Databricks? I have tested it and see the user is not able ...
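For context, a minimal sketch of the pattern this question describes, assuming Unity Catalog; the catalog, schema, function, secret scope/key, and group names below are all placeholders, and whether `secret()` resolves under the owner's or the caller's permissions is exactly the behavior being asked about:

```sql
-- Hypothetical names throughout; adapt to your workspace.
CREATE OR REPLACE FUNCTION main.tools.anonymize_user(user_id STRING)
RETURNS STRING
RETURN sha2(concat(user_id, secret('anon-scope', 'salt')), 256);

-- Grant execute on the function without granting any ACL on the secret scope.
GRANT EXECUTE ON FUNCTION main.tools.anonymize_user TO `analysts`;
```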
Hi, I am trying to install Mosaic on my cluster, but I get this error once I use 'enable_mosaic': ImportError: cannot import name '_to_java_column' from 'pyspark.sql.functions' (/databricks/spark/python/pyspark/sql/functions/__init__.py)
File <command-14...
When launching a job via "Create and trigger a one-time run" (docs), when using a custom image (docs), what's the lifetime of the container? Does it create the cluster, start the container, run the job, then terminate the container? Or does the runni...
Hi @mrstevegross,
Cluster Creation: When you submit a job using the "Create and trigger a one-time run" API, a new cluster is created if an existing one is not specified. Container Start: The custom Docker image specified in the cluster configuration is us...
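For reference, a one-time run with a custom container is submitted roughly like this via `POST /api/2.1/jobs/runs/submit`; this is a sketch only, and the run name, Spark version, node type, image URL, and notebook path below are all illustrative:

```json
{
  "run_name": "one-time-docker-run",
  "tasks": [
    {
      "task_key": "main",
      "new_cluster": {
        "spark_version": "15.4.x-scala2.12",
        "node_type_id": "i3.xlarge",
        "num_workers": 1,
        "docker_image": {
          "url": "myregistry/myimage:latest"
        }
      },
      "notebook_task": {
        "notebook_path": "/Workspace/jobs/main"
      }
    }
  ]
}
```

The container's lifetime is tied to the cluster defined in `new_cluster`: it starts when that cluster starts and goes away when the cluster terminates.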
We have three source tables used for a streaming dimension table in silver. Around 50K records are changed in one of the source tables, and the DLT pipeline shows that it has updated those 50K records, but they remain unchanged. The only way to pick ...
Hi team, I have scheduled my Databricks Data Engineer Associate exam for 12th Feb 2025 using the below mail ID, but I still have not received any confirmation mail there. I have checked the spam folder too. Could you please resend it to barnitac@kpmg.com ...
Hi team, I have cleared my exam today. Unfortunately, I have not received a single mail, either confirming my exam or confirming test completion and the result. @Cert-Team
Hello @Bala_K!
For information on becoming a Databricks partner, please email partnerops@Databricks.com. They can guide you through the prerequisites and next steps.
I'm working with Databricks Asset Bundles and need to define constants at the bundle level based on the target environment. These constants will be used inside Databricks notebooks. For example, I want a constant gold_catalog to take different values ...
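One common approach is a bundle variable with per-target overrides in `databricks.yml`; this is a sketch, and the variable, catalog, and target names are examples, not values from the original post:

```yaml
# databricks.yml (fragment) -- illustrative names only
variables:
  gold_catalog:
    description: Catalog used for gold tables
    default: gold_dev

targets:
  dev:
    default: true
  prod:
    variables:
      gold_catalog: gold_prod
```

The value can then be passed into a notebook task as a parameter with `${var.gold_catalog}` and read in the notebook via widgets.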
Library installation attempted on the driver node of cluster 0210-115502-3lo6gkwd and failed. Pip could not find a version that satisfies the requirement for the library. Please check your library version and dependencies. Error code: ERROR_NO_MATCHI...
Hi All, could you help resolve an issue with Azure Databricks notifications? The trigger to the webhook is not happening whenever a job fails or succeeds. I have created a webhook in an Azure Automation account and created a Python webhook and ...
I'm having trouble logging in to my Databricks account at databricks.com. Here's what happens:
1. I enter my email address and password.
2. I receive an account verification code via email.
3. I enter the verification code on the login page.
Instead of logging me...
We have set up the integration between Power BI and Databricks (hosted on AWS) using the native Databricks connector. However, we require the Azure Databricks connector to utilize the RBAC from Unity Catalog. We followed all the prerequisites mentioned here, ...
Hi, I have been trying to create an account at Databricks Community Edition but have been unable to for the past few weeks. Can someone from the Databricks support team look into it? I have been getting this for the past few weeks. Unable to attach the HAR file. Do ...
Hi all, I wanted some insight and clarification on the VACUUM LITE command. VACUUM | Databricks on AWS. I am aware that the VACUUM FULL command will delete data files outside of the retention duration and all files in the table directory not refere...
Hello @dbxlearner,
If you set your retention duration to 3 days for a VACUUM LITE operation, it means that the command will use the Delta transaction log to identify and remove files that are no longer referenced by any table versions within the last...
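As a sketch of the syntax under discussion (the table name is a placeholder, and LITE requires a sufficiently recent Databricks Runtime):

```sql
-- Log-based vacuum; identifies removable files from the Delta transaction log.
VACUUM main.sales.dim_customer LITE;

-- With an explicit retention window (72 hours = 3 days).
VACUUM main.sales.dim_customer LITE RETAIN 72 HOURS;
```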
Hi Team, does anyone have a good SQL query I can use for showing usage costs against custom tags, for example on clusters? The Account Console usage report is good, but I can only seem to query one custom tag at a time, and ideally I want a dashboar...
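One way to report on several custom tags at once is to query the billing system tables directly; this sketch assumes system tables are enabled in the workspace, and the tag keys 'team' and 'project' are examples only:

```sql
-- Estimated cost per day broken down by two custom tags at once.
SELECT
  u.usage_date,
  u.custom_tags['team']    AS team,
  u.custom_tags['project'] AS project,
  SUM(u.usage_quantity * p.pricing.default) AS estimated_cost
FROM system.billing.usage u
JOIN system.billing.list_prices p
  ON u.sku_name = p.sku_name
 AND u.usage_start_time >= p.price_start_time
 AND (p.price_end_time IS NULL OR u.usage_start_time < p.price_end_time)
GROUP BY ALL
ORDER BY estimated_cost DESC;
```

Each additional `custom_tags['...']` expression adds another breakdown column, which makes the result easy to feed into a dashboard.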
Facing the below issue: We were not able to find a Community Edition workspace with this email. Please login to accounts.cloud.databricks.com to find the non-community-edition workspaces you may have access to. For help, please see Community Edition Lo...