Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.
Hi, when I run this command on my private cluster (Single User) it works well: dbutils.fs.cp(ituff_file, protocol_local_file_path). When I try to run it on a shared cluster, I get: java.lang.SecurityException: Cannot use com.databricks....
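Since dbutils only exists inside a Databricks runtime, here is a minimal local-filesystem sketch of the copy the poster describes, with shutil standing in for dbutils.fs.cp (the names ituff_file and protocol_local_file_path come from the post; the file contents are made up for the demo):

```python
import shutil
import tempfile
from pathlib import Path

# Local-filesystem analogue of dbutils.fs.cp(ituff_file, protocol_local_file_path).
# (dbutils is only available inside Databricks; on shared Unity Catalog clusters,
# calls into non-allowlisted classes raise java.lang.SecurityException, which is
# the error in the post.)
def copy_file(src: str, dst: str) -> str:
    """Copy src to dst, creating parent directories, and return dst."""
    Path(dst).parent.mkdir(parents=True, exist_ok=True)
    return shutil.copy(src, dst)

with tempfile.TemporaryDirectory() as tmp:
    ituff_file = Path(tmp) / "input.ituff"  # stands in for the real source file
    ituff_file.write_text("sample")
    protocol_local_file_path = Path(tmp) / "out" / "input.ituff"
    copied = copy_file(str(ituff_file), str(protocol_local_file_path))
    copied_text = Path(copied).read_text()
```

On a shared cluster the fix is usually to use Unity Catalog volumes or allowlisted paths rather than to change the copy call itself.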
Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...
I am trying to access a file in Azure data storage using Databricks in Python. When I access it, I get a Py4JSecurityException (py4j.security.Py4JSecurityException: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks...
Hi @dev_4, the py4j.security.Py4JSecurityException error you're encountering is thrown when a method that Azure Databricks has not explicitly marked as safe for Azure Data Lake Storage credential passthrough clusters is accessed. This securit...
I have a Delta table in Databricks named "prod.silver.control_table". It has a few columns, including "table_name" with string data type and "transform_options" with the below structure:
|-- transform_options: map (nullable = true)
|    |-- key: str...
Hi @Mado, Yes, you can update the values in the “order_duplicates_by” field of your Delta table using the withColumn function in PySpark.
Also, please be aware that overwriting a Delta table will replace all existing data in the table. If you want ...
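In plain-Python terms (a dict standing in for the Spark map column, since pyspark is not assumed to be installed here; the sample keys and values are hypothetical), the update the answer describes, rewriting the "order_duplicates_by" entry of transform_options, looks like:

```python
# Plain-dict analogue of updating the "order_duplicates_by" entry of the
# transform_options map column. On the actual Delta table this would be done
# through PySpark (e.g. withColumn plus a map-transforming expression), and
# the rewritten DataFrame would then be written back.
def update_order_duplicates_by(transform_options: dict, new_value: str) -> dict:
    """Return a copy of the map with order_duplicates_by replaced."""
    updated = dict(transform_options)  # copy: Delta rewrites rows, it does not mutate
    updated["order_duplicates_by"] = new_value
    return updated

# Hypothetical row values for illustration only.
row = {"order_duplicates_by": "ingest_ts ASC", "partition_by": "load_date"}
new_row = update_order_duplicates_by(row, "ingest_ts DESC")
```

Working on a copy mirrors the warning above: writing the result back with mode("overwrite") replaces the table's existing data, so a targeted MERGE or UPDATE is safer when only some rows should change.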
Hello there, I have successfully created a Databricks account and went to log in to the Community Edition with the exact same login credentials as my account, but it tells me that the email/password are invalid. I can log in with these same exact creden...
https://jupyter-contrib-nbextensions.readthedocs.io/en/latest/nbextensions/scratchpad/README.html
Is something like this available in Databricks' notebooks UI?
Hi, it would be great if you could submit an idea for this; its progress can then be tracked through the Ideas Portal: https://docs.databricks.com/en/resources/ideas.html
Hello team, I generated a new token via Admin Settings --> Developer --> Access Token --> Manage. Now my token has been deleted/expired after 90 days. I know what my token is (the generated alphanumeric one). Now how can I set or reuse the same token in d...
Hi, to change the default lifetime of 90 days, you can leave the Lifetime (days) box empty (blank).
Refer: https://docs.databricks.com/en/dev-tools/auth/pat.html#databricks-personal-access-tokens-for-workspace-users
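Besides the UI, a personal access token with an explicit lifetime can be created through the Databricks Token API (POST /api/2.0/token/create). A sketch that only builds the request and sends nothing; the workspace URL and admin token are placeholders:

```python
import json
import urllib.request

# Build (but do not send) a Databricks Token API request that creates a PAT
# with an explicit lifetime. The workspace URL and bearer token below are
# placeholders, not real credentials.
def build_token_create_request(workspace_url: str, admin_token: str,
                               lifetime_days: int, comment: str):
    payload = {
        "lifetime_seconds": lifetime_days * 24 * 60 * 60,
        "comment": comment,
    }
    return urllib.request.Request(
        url=f"{workspace_url}/api/2.0/token/create",
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {admin_token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

req = build_token_create_request("https://example.cloud.databricks.com",
                                 "dapiXXXX", 90, "automation token")
```

Note that an expired or deleted token cannot be reactivated or reused; you have to create a new one and update it wherever the old value was stored.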
Welcome to the world of Data Analysis with Databricks!
We are thrilled to introduce our latest course, providing a comprehensive journey into data analysis on the Databricks platform. Whether you're a beginner or looking to enhance your skills, this...
I am trying to iterate over a container (ADLS Gen2) in Azure Databricks using PySpark. Basically, I am using dbutils.fs.ls to list the contents of a folder with a recursive function. This logic works perfectly fine for all folders except for the folders eh...
Hi @Loki466, Special characters like [] in identifiers can indeed cause issues in many systems, including Databricks.
One possible workaround could be to avoid using special characters in your folder names. If changing the folder names is not an op...
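A local sketch of the recursive listing, with os.scandir standing in for dbutils.fs.ls (which only exists inside Databricks). It shows that bracketed folder names work when paths are passed through literally; glob-style APIs instead treat [] as a character class, which is one common source of the failure described above:

```python
import os
import tempfile

# Local analogue of the recursive dbutils.fs.ls walk from the post.
# os.scandir takes paths literally, so names containing [] are handled;
# a glob-based listing would interpret [] as a character class and miss them.
def list_files(root: str) -> list:
    """Recursively collect all file paths under root."""
    found = []
    for entry in os.scandir(root):
        if entry.is_dir():
            found.extend(list_files(entry.path))
        else:
            found.append(entry.path)
    return found

with tempfile.TemporaryDirectory() as tmp:
    weird = os.path.join(tmp, "data [2024]")  # folder name with brackets
    os.makedirs(weird)
    open(os.path.join(weird, "part-0.csv"), "w").close()
    files = list_files(tmp)
```

If the special-character folders fail only under dbutils.fs.ls, escaping or renaming them (as suggested above) remains the practical workaround on Databricks.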
Hi, I have an MLflow model served on serverless GPU that takes an audio file name as input; the file is then passed as a parameter to a Hugging Face model inside the predict method. But I am getting the following error: HFValidationError(\nhuggingface_hub.utils....
Hi @sanjay, The HFValidationError you’re encountering is typically thrown when the Hugging Face model loading function (from_pretrained) cannot find the model you’re trying to load.
This can happen in two scenarios:
If you’re passing a nonexistent ...
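A quick, hypothetical pre-check (plain Python, no transformers dependency assumed) that distinguishes the two scenarios above before a value ever reaches from_pretrained: a valid Hugging Face reference is either an existing local directory or an "org/name"-shaped repo id, while an arbitrary string such as an audio file name is neither:

```python
import os
import re

# Hypothetical guard before calling from_pretrained(): classify the reference.
# A repo id looks like "name" or "org/name"; anything else must be an existing
# local directory, or from_pretrained raises HFValidationError.
REPO_ID_RE = re.compile(r"^[\w.-]+(/[\w.-]+)?$")

def classify_model_ref(ref: str) -> str:
    if os.path.isdir(ref):
        return "local-path"
    if REPO_ID_RE.match(ref):
        return "repo-id"
    return "invalid"

print(classify_model_ref("openai/whisper-small"))   # looks like a repo id
print(classify_model_ref("/tmp/does/not/exist"))    # neither a dir nor a repo id
print(classify_model_ref("my audio file.wav"))      # spaces: invalid as a repo id
```

In the serving scenario above, the likely bug is that the audio file name is being passed where the model name/path is expected; the model reference should be fixed at load time and only the audio payload passed per request.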
Hi all, I am trying to create Databricks storage credentials, an external location, and a catalog with Terraform. Cloud: Azure. My storage credentials code is working correctly, but the external location code is throwing the below error when executing the Terraf...
Hi @Kaniz_Fatma, thanks for the reply. Even after correcting my Databricks workspace provider configuration, I am not able to create 3 external locations in the Databricks workspace. I am using the below code in Terraform. Provider.tf: provider "databricks" {...
I have a task to revise CSV ingestion in Azure Databricks. The current implementation uses the below settings:

source_query = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "csv")
    .schema(defined_schema)
    .option(...
Hi, I created a Terraform script to add the existing workspace to the existing Unity Catalog by reading the metastore ID. When I created a pipeline and tried to use it, it failed with the below error: Planning failed. Terraform encountered an error while ge...
Hi everyone, I'm getting this error while running a DLT pipeline in UC: Failed to execute python command for notebook 'sample/delta_live_table_rules.py' with id RunnableCommandId(5596174851701821390) and error AnsiResult(,None, Map(), Map(),List(),List(...