Community Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

udi_azulay
by New Contributor II
  • 904 Views
  • 2 replies
  • 0 kudos

local filesystem access is forbidden

Hi, when I run this command on my private cluster (Single User) it works well: dbutils.fs.cp(ituff_file, protocol_local_file_path). When I try to run it on a shared cluster, I am getting: java.lang.SecurityException: Cannot use com.databricks....

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

1 More Replies
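The question above is about dbutils.fs.cp raising java.lang.SecurityException on a shared access mode cluster, where direct local filesystem access is restricted. Below is a minimal, hedged sketch of the pattern involved; the file names and the Unity Catalog volume path are placeholders invented for illustration, not values from the thread.

# Databricks notebook sketch; dbutils is provided by the notebook runtime.
# All paths below are illustrative assumptions.
ituff_file = "file:/tmp/example_input.ituff"            # hypothetical local (driver) file
protocol_local_file_path = "dbfs:/tmp/example_output"   # hypothetical DBFS destination

# Works on a single-user cluster, where the driver's local filesystem is accessible:
dbutils.fs.cp(ituff_file, protocol_local_file_path)

# On shared access mode clusters, local filesystem access can be blocked for isolation,
# which surfaces as java.lang.SecurityException. One commonly suggested alternative is to
# stage the data in a Unity Catalog volume (volume name below is made up) so that both
# source and destination are governed storage paths:
volume_copy = "/Volumes/main/default/scratch/example_input.ituff"
dbutils.fs.cp(volume_copy, protocol_local_file_path)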
dev_4
by New Contributor
  • 681 Views
  • 1 reply
  • 0 kudos

Py4JSecurityException for file access in Azure data storage - seeking help

I am trying to access a file in Azure data storage using Databricks in Python. When I access it, I am getting a Py4JSecurityException (py4j.security.Py4JSecurityException: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @dev_4, the py4j.security.Py4JSecurityException error you're encountering is thrown when a method that Azure Databricks has not explicitly marked as safe for Azure Data Lake Storage credential passthrough clusters is accessed. This securit...

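The reply above notes that the exception is raised when a method not allowlisted for credential passthrough clusters is called. As a hedged illustration of the usual workaround, the same data can often be read through the Spark DataFrame reader, which is the allowlisted path; the storage account, container, and file name below are placeholders.

# Sketch for a credential passthrough cluster; all names are placeholders.
# spark is provided by the Databricks notebook runtime.
path = "abfss://mycontainer@mystorageaccount.dfs.core.windows.net/raw/example.csv"

df = (
    spark.read
    .format("csv")
    .option("header", "true")
    .load(path)
)
df.show(5)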
Mado
by Valued Contributor II
  • 2002 Views
  • 2 replies
  • 0 kudos

How to update a value in a column in a delta table with Map of Struct datatype?

I have a Delta table in Databricks named "prod.silver.control_table". It has a few columns, including "table_name" with string data type and "transform_options" with the below structure: |-- transform_options: map (nullable = true) | |-- key: str...

MAP
Struct
Update_table
Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Mado, Yes, you can update the values in the “order_duplicates_by” field of your Delta table using the withColumn function in PySpark.   Also, please be aware that overwriting a Delta table will replace all existing data in the table. If you want ...

1 More Replies
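The accepted direction in the reply above is to rewrite the nested value with PySpark. One way to express that update for a map-of-struct column in Spark 3.1+ is transform_values combined with Column.withField, sketched below; the replacement value, the assumption that order_duplicates_by holds an array of column names, and the staging table name are illustrative, not taken from the thread.

from pyspark.sql import functions as F

df = spark.table("prod.silver.control_table")

# Rewrite one field inside every struct value of the map column "transform_options".
# The new value below is a made-up example.
updated = df.withColumn(
    "transform_options",
    F.transform_values(
        F.col("transform_options"),
        lambda k, v: v.withField("order_duplicates_by", F.array(F.lit("ingest_ts"))),
    ),
)

# Writing to a separate (hypothetical) table avoids overwriting the source while testing;
# a targeted MERGE is usually preferable to a full overwrite in production.
updated.write.format("delta").mode("overwrite").saveAsTable("prod.silver.control_table_updated")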
arun949290
by New Contributor II
  • 1040 Views
  • 3 replies
  • 0 kudos

Unable to login to community edition

Hello there, I have successfully created a Databricks account and went to log in to the Community Edition with the exact same login credentials as my account, but it tells me that the email/password are invalid. I can log in with these same exact creden...

Latest Reply
Erik_L
Contributor II
  • 0 kudos

Databricks Community Edition and Databricks are separate services. You have to create an account specific to Community Edition.

2 More Replies
Omri
by New Contributor
  • 291 Views
  • 1 reply
  • 0 kudos

Is it possible to create a scratchpad ui?

https://jupyter-contrib-nbextensions.readthedocs.io/en/latest/nbextensions/scratchpad/README.html Is something like this available in Databricks' notebook UI?

Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi, it would be great if you could submit this idea to the Ideas Portal, where its progress can be tracked: https://docs.databricks.com/en/resources/ideas.html

SivaPK
by New Contributor II
  • 822 Views
  • 1 reply
  • 1 kudos

Generated access token is deleted/expired after its 90-day lifetime? How to use the old token now?

Hello Team, I have generated a new token via Admin Settings --> Developer --> Access Token --> Manage. Now my token is deleted/expired after 90 days. I know what my token is, the generated alphanumeric one. Now how can I set or reuse the same token in d...

access_token
generate_token
restore_token
settings
Latest Reply
Debayan
Esteemed Contributor III
  • 1 kudos

Hi, to change the default lifetime of 90 days, you can leave the Lifetime (days) box empty (blank). Refer to: https://docs.databricks.com/en/dev-tools/auth/pat.html#databricks-personal-access-tokens-for-workspace-users

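For completeness, an expired personal access token cannot be reused; a replacement has to be created. The sketch below uses the workspace Token REST API (POST /api/2.0/token/create) to mint a new token with an explicit lifetime; the workspace URL and the token used for authentication are placeholders.

import requests

# Placeholders: substitute your workspace URL and a currently valid credential.
WORKSPACE_URL = "https://adb-1234567890123456.7.azuredatabricks.net"
AUTH_TOKEN = "dapiXXXXXXXXXXXXXXXXXXXXXXXX"

resp = requests.post(
    f"{WORKSPACE_URL}/api/2.0/token/create",
    headers={"Authorization": f"Bearer {AUTH_TOKEN}"},
    json={
        "comment": "replacement token",
        "lifetime_seconds": 90 * 24 * 3600,  # omit to request no expiry, subject to workspace policy
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["token_value"])  # shown only once; store it securely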
Sujitha
by Community Manager
  • 5976 Views
  • 1 reply
  • 0 kudos

Exciting Announcement: Launch of New Course - Data Analysis with Databricks!

Welcome to the world of Data Analysis with Databricks! We are thrilled to introduce our latest course, providing a comprehensive journey into data analysis on the Databricks platform. Whether you're a beginner or looking to enhance your skills, this...

Latest Reply
hthiru
New Contributor II
  • 0 kudos

Is this course made available in Home - Databricks Learning?

Loki466
by New Contributor
  • 1200 Views
  • 1 reply
  • 0 kudos

Unable to list a folder with square bracket in name

I am trying to iterate over a container (ADLS Gen2) in Azure Databricks using PySpark. Basically, I am using dbutils.fs.ls to list the contents of a folder with a recursive function. This logic works perfectly fine for all folders except for the folders eh...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Loki466, Special characters like [] in identifiers can indeed cause issues in many systems, including Databricks.   One possible workaround could be to avoid using special characters in your folder names. If changing the folder names is not an op...

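Square brackets are glob metacharacters in Hadoop-style path handling, which is the usual explanation for listings failing on such folder names. The sketch below shows a recursive dbutils.fs.ls walk; the container URL is a placeholder, and the backslash escaping of brackets is an assumption (a commonly suggested workaround, not verified here).

# Recursive listing sketch for a Databricks notebook; dbutils is provided by the runtime.

def escape_brackets(path: str) -> str:
    # Assumption: escape [ and ] so they are treated literally rather than as glob classes.
    return path.replace("[", "\\[").replace("]", "\\]")

def list_recursive(path: str):
    for info in dbutils.fs.ls(escape_brackets(path)):
        if info.isDir():
            yield from list_recursive(info.path)
        else:
            yield info.path

root = "abfss://mycontainer@mystorageaccount.dfs.core.windows.net/"  # placeholder
for file_path in list_recursive(root):
    print(file_path)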
SamAWS
by New Contributor III
  • 861 Views
  • 2 replies
  • 1 kudos

DataFrame vs. Spark SQL

When should I use DataFrame over Spark SQL? If DataFrame is better, then why do I need Spark SQL?

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

1 More Replies
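Both APIs compile to the same Catalyst plan, so the choice is largely about ergonomics: the DataFrame API composes well in Python code, while Spark SQL is convenient for SQL-first users and ad hoc queries. A small self-contained sketch of the same aggregation both ways:

from pyspark.sql import functions as F

# Tiny in-memory example so the snippet runs anywhere a SparkSession named `spark` exists.
df = spark.createDataFrame(
    [("books", 12.0), ("books", 5.5), ("toys", 7.25)],
    ["category", "amount"],
)

# DataFrame API version.
by_api = df.groupBy("category").agg(F.sum("amount").alias("total"))

# Spark SQL version of the same query.
df.createOrReplaceTempView("sales")
by_sql = spark.sql("SELECT category, SUM(amount) AS total FROM sales GROUP BY category")

by_api.show()
by_sql.show()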
sanjay
by Valued Contributor II
  • 1267 Views
  • 3 replies
  • 2 kudos

Error accessing file from dbfs inside mlflow serve endpoint

Hi, I have an MLflow model served using serverless GPU which takes an audio file name as input; the file is then passed as a parameter to a Hugging Face model inside the predict method. But I am getting the following error: HFValidationError(\nhuggingface_hub.utils....

Latest Reply
Kaniz_Fatma
Community Manager
  • 2 kudos

Hi @sanjay, The HFValidationError you’re encountering is typically thrown when the Hugging Face model loading function (from_pretrained) cannot find the model you’re trying to load.   This can happen in two scenarios: If you’re passing a nonexistent ...

2 More Replies
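The reply above traces the HFValidationError to from_pretrained receiving something that is neither a valid Hub repo id nor an existing local directory. A hedged sketch of the two accepted forms follows; the model name and local path are placeholders, and the note about dbfs:/ paths reflects the common cause in serving endpoints rather than anything confirmed in the thread.

# Sketch only; names and paths are placeholders.
from transformers import AutoModel

# 1) A Hugging Face Hub repo id (must exist on the Hub, requires network access):
model = AutoModel.from_pretrained("bert-base-uncased")

# 2) A local directory that actually contains the model files. Inside a serving container,
#    a raw "dbfs:/..." URI is generally not a valid local path, so the artifacts should be
#    logged with the MLflow model or copied to a local directory before loading.
local_dir = "/tmp/my_model"  # hypothetical directory with config.json, weights, etc.
# model = AutoModel.from_pretrained(local_dir)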
manoj_2355ca
by New Contributor III
  • 1547 Views
  • 3 replies
  • 1 kudos

cannot create external location: invalid Databricks Workspace configuration

Hi All, I am trying to create Databricks storage credentials, an external location, and a catalog with Terraform. Cloud: Azure. My storage credentials code is working correctly, but the external location code is throwing the below error when executing the Terraf...

azuredatabricks
Latest Reply
manoj_2355ca
New Contributor III
  • 1 kudos

Hi @Kaniz_Fatma, thanks for the reply. Even after correcting my Databricks workspace provider configuration, I am not able to create the 3 external locations in the Databricks workspace. I am using the below code in Terraform. Provider.tf: provider "databricks" {...

2 More Replies
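Since the Terraform snippet in the thread is cut off, here is a hedged point of comparison using the Databricks SDK for Python instead of the Terraform provider; it is a different tool than the one discussed above, and the location name, URL, and credential name are placeholders.

# Sketch with the databricks-sdk package; not the Terraform code from the thread.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # resolves workspace auth from the environment or a config profile

location = w.external_locations.create(
    name="example_external_location",
    url="abfss://mycontainer@mystorageaccount.dfs.core.windows.net/landing",
    credential_name="example_storage_credential",  # must reference an existing storage credential
    comment="illustrative external location",
)
print(location.name)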
Mado
by Valued Contributor II
  • 4041 Views
  • 4 replies
  • 1 kudos

Resolved! Read CSV files in Azure Databricks notebook, how to read data when columns in CSV files are in the w

I have a task to revise CSV ingestion in Azure Databricks. The current implementation uses the below settings: source_query = ( spark.readStream.format("cloudFiles") .option("cloudFiles.format", "csv") .schema(defined_schema) .option(...

Latest Reply
Mado
Valued Contributor II
  • 1 kudos

 Also, I am looking for a solution that works with both correct files and malformed files using PySpark. 

3 More Replies
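The thread above asks how to keep the existing Auto Loader stream while tolerating malformed files. A hedged sketch of the malformed-row handling part follows, extending the cloudFiles pattern quoted in the post with PERMISSIVE mode and a corrupt-record column; the paths, schema, and option choices are illustrative assumptions, not the thread's actual code.

from pyspark.sql.types import StructType, StructField, StringType, DoubleType

# Placeholder schema; the extra column captures rows the CSV parser cannot parse.
defined_schema = StructType([
    StructField("id", StringType(), True),
    StructField("amount", DoubleType(), True),
    StructField("_corrupt_record", StringType(), True),
])

source_query = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "csv")
    .schema(defined_schema)
    .option("header", "true")                      # the files carry a header row
    .option("mode", "PERMISSIVE")                  # keep bad rows instead of failing the stream
    .option("columnNameOfCorruptRecord", "_corrupt_record")
    .load("abfss://mycontainer@mystorageaccount.dfs.core.windows.net/incoming/")  # placeholder
)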
manoj_2355ca
by New Contributor III
  • 816 Views
  • 1 reply
  • 1 kudos

Azure DevOps pipeline throwing Databricks provider bug when trying to read the metastore

Hi, I created a Terraform script to add the existing workspace to an existing Unity Catalog by reading the metastore ID. When I created a pipeline and tried to use it, it failed with the below error: Planning failed. Terraform encountered an error while ge...

databricks api bug
Latest Reply
manoj_2355ca
New Contributor III
  • 1 kudos

Can anyone please help me with this?

NP7
by New Contributor II
  • 1550 Views
  • 5 replies
  • 0 kudos

DLT pipeline unity catalog error

Hi Everyone, I'm getting this error while running a DLT pipeline in UC: Failed to execute python command for notebook 'sample/delta_live_table_rules.py' with id RunnableCommandId(5596174851701821390) and error AnsiResult(,None, Map(), Map(),List(),List(...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

4 More Replies
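Because the error text in the post is truncated, here is only a minimal hedged sketch of what a Unity-Catalog-targeted DLT notebook typically contains, for orientation; the table names, source path, and expectation are placeholders and this is not the 'sample/delta_live_table_rules.py' referenced in the thread.

# Illustrative DLT notebook body; the pipeline's target catalog and schema are set in the
# pipeline configuration when Unity Catalog is enabled. All names and paths are made up.
import dlt
from pyspark.sql import functions as F

@dlt.table(name="bronze_events", comment="Raw events ingested as-is (illustrative).")
def bronze_events():
    return spark.read.format("json").load("/Volumes/main/default/landing/events/")

@dlt.table(name="silver_events", comment="Events passing a basic quality rule (illustrative).")
@dlt.expect_or_drop("valid_id", "id IS NOT NULL")
def silver_events():
    return dlt.read("bronze_events").withColumn("ingested_at", F.current_timestamp())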