Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

rgualans
by New Contributor
  • 1819 Views
  • 1 reply
  • 1 kudos

Unable to reconstruct state at version ? as the transaction log has been truncated

We have a small table that undergoes a merge operation on a daily basis. As a result, the table currently has 83 versions. When trying to query this table, we receive the following error:      DeltaFileNotFoundException: dbfs:/mnt/XXXXX/warehous...

rgualans_0-1705078443409.png
Latest Reply
Lakshay
Databricks Employee
  • 1 kudos

It is difficult to say exactly what caused the issue, as there are multiple possible underlying causes. As general advice, you can try setting "delta.checkpointRetentionDuration=7 days", and you can also try switching to a different DBR version to se...
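For reference, the retention setting the reply mentions is a Delta table property; a minimal sketch of applying it via Spark SQL (the table name is a placeholder, not from the thread):

```python
# Placeholder table name; on a Databricks cluster this statement would be
# executed with spark.sql(checkpoint_retention_sql).
checkpoint_retention_sql = (
    "ALTER TABLE my_schema.my_table "
    "SET TBLPROPERTIES ('delta.checkpointRetentionDuration' = '7 days')"
)
print(checkpoint_retention_sql)
```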

itzme_rahul
by New Contributor II
  • 1686 Views
  • 4 replies
  • 0 kudos

Need reattempt to Certified Data Engineer Associate

Hello Team/ @Cert-Team, I had a very poor experience while attempting my first Databricks certification. My exam was suspended within 10 minutes of starting. Abruptly, the proctor asked me to show my desk, walls, room, wallet, ID card, adapter,...

Latest Reply
KavonToy
New Contributor II
  • 0 kudos

Any update?

3 More Replies
kiko_roy
by Contributor
  • 3173 Views
  • 2 replies
  • 2 kudos

Resolved! DLT Unity catalog schema no storage location mention details

Hi Team, as part of an earlier discussion I had with the Databricks team, I was told that if one wants to write data to a Unity Catalog schema from a DLT pipeline, that schema's storage location must not be specified; otherwise the DLT pipeline wil...

Latest Reply
kiko_roy
Contributor
  • 2 kudos

Thanks @Walter_C for the explanation and for confirming my understanding. Really appreciate it.

1 More Replies
ivanychev
by Contributor II
  • 1102 Views
  • 1 reply
  • 0 kudos

Spark doesn't register executors when new workers are allocated

Our pipelines sometimes get stuck (example). Some workers get decommissioned due to spot termination, and then new workers get added. However, after (1), Spark doesn't notice the new executors, and I don't know why. I don't understand how to debug this,...

Screenshot 2023-12-11 at 11.12.05.png Screenshot 2023-12-11 at 11.08.56.png Screenshot 2023-12-11 at 11.48.50.png
Community Platform Discussions
decommission
executor
worker
Latest Reply
shan_chandra
Databricks Employee
  • 0 kudos

@ivanychev - Firstly, new workers are added and Spark does notice them; hence there is init script logging in the event log stating that the init script ran on the newly added workers. For debugging, please check the Spark UI - Executors tab. Secondly, fo...

sanjay
by Valued Contributor II
  • 2063 Views
  • 1 reply
  • 1 kudos

Error accessing file from dbfs inside mlflow serve endpoint

Hi, I have an MLflow model served on a serverless GPU endpoint that takes an audio file name as input; the file is then passed as a parameter to a Hugging Face model inside the predict method. But I am getting the following error: HFValidationError(\nhuggingface_hub.utils....

manoj_2355ca
by New Contributor III
  • 2785 Views
  • 1 reply
  • 0 kudos

cannot create external location: invalid Databricks Workspace configuration

Hi All, I am trying to create Databricks storage credentials, an external location, and a catalog with Terraform (cloud: Azure). My storage credentials code is working correctly, but the external location code is throwing the below error when executing the Terraf...

Latest Reply
manoj_2355ca
New Contributor III
  • 0 kudos

Hi @Retired_mod, thanks for the reply. Even after correcting my Databricks workspace provider configuration, I am not able to create the 3 external locations in the Databricks workspace. I am using the below code in Terraform. Provider.tf:  provider "databricks" {...

aman_yadav007
by New Contributor
  • 784 Views
  • 1 reply
  • 0 kudos

Databricks Widget

Hi, I was previously working on Databricks Runtime 10.0 and have now upgraded to Runtime 13.0. I was using a dashboard to display the widgets. Before, it showed just the widget label, but now it shows the widget name below it as well. Also it shows the ...

Latest Reply
SparkJun
Databricks Employee
  • 0 kudos

Hi @aman_yadav007, which widget type did you use? Can you please try a different widget type or check the widget type and its arguments from this example: https://docs.databricks.com/en/notebooks/widgets.html#databricks-widgets
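The linked docs page shows widget creation calls such as `dbutils.widgets.dropdown`; a minimal sketch of the arguments involved (values are placeholders, and `dbutils` exists only on a Databricks cluster):

```python
# Placeholder widget arguments; on a cluster this would be invoked as:
#   dbutils.widgets.dropdown(name, default, choices, label)
name, default, label = "state", "CA", "State"
choices = ["CA", "IL", "MI", "NY", "OR", "VA"]
assert default in choices  # the default must be one of the offered choices
```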

NP7
by New Contributor II
  • 2545 Views
  • 2 replies
  • 0 kudos

DLT pipeline unity catalog error

Hi Everyone, I'm getting this error while running a DLT pipeline in UC: Failed to execute python command for notebook 'sample/delta_live_table_rules.py' with id RunnableCommandId(5596174851701821390) and error AnsiResult(,None, Map(), Map(),List(),List(...

Latest Reply
RaulValMob
New Contributor II
  • 0 kudos

I get a similar error when there is a mistake in the @dlt.table() definition for a table. In my case, the culprit is usually the path.

1 More Replies
Bharathi-Rajen
by New Contributor II
  • 1527 Views
  • 2 replies
  • 0 kudos

Unable to migrate an empty parquet table to delta lake in Databricks

I'm trying to convert my Databricks tables from Parquet to Delta. While most of the tables have data and are successfully converted to Delta, some of the empty Parquet tables fail with an error message as below: CONVERT TO DELTA <schema-name>.parquet_...

Latest Reply
BR_DatabricksAI
Contributor
  • 0 kudos

Hello Bharathi, ideally the ETL job should not generate empty Parquet files in the respective location, as reading empty files is overhead and not a best practice. Assuming this can be easily fixed in the ETL job by getting the row count...

1 More Replies
marketing2
by New Contributor
  • 407 Views
  • 0 replies
  • 0 kudos

The Importance of Databricks in SEO

SEO is a dynamic and complex field that evolves constantly with search technologies and algorithms. The use of Databricks, a cloud-based analytics platform, has revolutionized the way SEO specialists...

pexels-oleksandr-p-9822732.jpg
Mado
by Valued Contributor II
  • 5956 Views
  • 2 replies
  • 0 kudos

Read CSV files in Azure Databricks notebook, how to read data when columns in CSV files are in the w

I have a task to revise CSV ingestion in Azure Databricks. The current implementation uses the below settings: source_query = ( spark.readStream.format("cloudFiles") .option("cloudFiles.format", "csv") .schema(defined_schema) .option(...

Latest Reply
Mado
Valued Contributor II
  • 0 kudos

 Also, I am looking for a solution that works with both correct files and malformed files using PySpark. 
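One hedged sketch of how malformed rows can be tolerated rather than failing the stream, assuming the CSV parser's `PERMISSIVE` mode and a corrupt-record column (option values below are illustrative, not from the thread):

```python
# CSV reader options that keep malformed rows instead of failing the load.
csv_options = {
    "cloudFiles.format": "csv",
    "mode": "PERMISSIVE",                            # keep bad rows...
    "columnNameOfCorruptRecord": "_corrupt_record",  # ...and record them here
}
# On Databricks this would be applied roughly as:
# source_query = (spark.readStream.format("cloudFiles")
#                 .options(**csv_options)
#                 .schema(defined_schema)  # schema must include _corrupt_record
#                 .load(source_path))
```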

1 More Replies
Sujitha
by Databricks Employee
  • 9972 Views
  • 2 replies
  • 0 kudos

Creating High Quality RAG Applications with Databricks

Retrieval-Augmented-Generation (RAG) has quickly emerged as a powerful way to incorporate proprietary, real-time data into Large Language Model (LLM) applications. Today we are excited to launch a suite of RAG tools to help Databricks users build hig...

Screenshot 2023-12-06 at 11.41.22 PM.png
Latest Reply
antsdispute
New Contributor II
  • 0 kudos

It seems like you're sharing an announcement or promotional content related to Databricks and their launch of a suite of tools for Retrieval-Augmented-Generation (RAG) applications. These tools are aimed at helping Databricks users build high-quality...

1 More Replies
SethParker
by New Contributor III
  • 3632 Views
  • 2 replies
  • 1 kudos

Power BI Import Model Refresh from Databricks SQL Whse - Query has been timed out due to inactivity

We have an intermittent issue where occasionally a partition in our Power BI Import Dataset times out at 5 hours.  When I look at Query History in Databricks SQL, I see a query that failed with the following error message:  "Query has been timed out ...

Latest Reply
SethParker
New Contributor III
  • 1 kudos

The only solution we have been able to come up with was to create a Notebook in Databricks that uses the Power BI API to check the status of a Refresh.  We schedule it a bit after we expect the Refresh to complete.  If it is still running, we kill th...
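For anyone sketching the same workaround: the Power BI REST API exposes per-dataset refresh history via the "Get Refreshes In Group" endpoint. A minimal sketch of building the request URL (IDs are placeholders, and a real notebook would also need an Azure AD bearer token):

```python
def refresh_status_url(group_id: str, dataset_id: str) -> str:
    """Build the Power BI 'Get Refreshes In Group' URL for one dataset,
    asking for only the most recent refresh entry."""
    return (
        "https://api.powerbi.com/v1.0/myorg/groups/"
        f"{group_id}/datasets/{dataset_id}/refreshes?$top=1"
    )

# The latest entry's "status" field then indicates whether the refresh is
# still in progress ("Unknown"), "Completed", or "Failed".
url = refresh_status_url("<group-id>", "<dataset-id>")
```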

1 More Replies

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group