Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

elgeo
by Valued Contributor II
  • 3128 Views
  • 6 replies
  • 8 kudos

Clean up _delta_log files

Hello experts. We are trying to clarify how to clean up the large number of files that accumulate in the _delta_log folder (json, crc and checkpoint files). We went through the related posts in the forum and followed the below: SET spark.da...

Latest Reply
Brad
Contributor II
  • 8 kudos

Awesome, thanks for the response.

5 More Replies
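
A minimal sketch of the retention settings this thread is circling: _delta_log files (json, crc, checkpoint) are deleted automatically by Delta when new checkpoints are written, governed by the table's log retention property, while VACUUM only reclaims unreferenced data files. The table name events is hypothetical and the interval values are illustrative.

-- Log files age out per this property; Delta removes them automatically
-- after new checkpoints are written, so no manual deletion is needed.
ALTER TABLE events SET TBLPROPERTIES (
  'delta.logRetentionDuration' = 'interval 30 days'
);

-- VACUUM reclaims unreferenced data files; it does not touch _delta_log.
VACUUM events RETAIN 168 HOURS;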
User16783853501
by Databricks Employee
  • 2660 Views
  • 2 replies
  • 2 kudos

Using Delta Time Travel, what is the scalability limit for the feature, and at what point does time travel become infeasible?


Latest Reply
youssefmrini
Databricks Employee
  • 2 kudos

The scalability limit for using Delta Time Travel depends on several factors, including the size of your Delta tables, the frequency of changes to the tables, and the retention periods for the Delta versions. In general, Delta Time Travel can become i...

1 More Replies
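
A quick way to see where a given table stands against those factors is to inspect its retained history and, if needed, cap it; a sketch assuming a hypothetical table named events, with an illustrative interval value:

-- How many versions are retained, and how old is the oldest one?
DESCRIBE HISTORY events;

-- Capping log retention bounds how far back time travel stays feasible.
ALTER TABLE events SET TBLPROPERTIES (
  'delta.logRetentionDuration' = 'interval 7 days'
);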
vinaykumar
by New Contributor III
  • 1802 Views
  • 3 replies
  • 0 kudos

Resolved! Time travel and version control: can we create custom version control for each day's data load when multiple updates happen in a day?

Time travel and version control: can we create custom version control for each day's data load when multiple updates happen in a day? For example, let's assume we are performing multiple operations on a table every minute and want to keep time travel...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @vinay kumar, hope all is well! Just wanted to check in to see if you were able to resolve your issue, and if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks...

2 More Replies
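
Two common ways to get one durable version per day, sketched with hypothetical table names and an illustrative timestamp:

-- Read the table as of the end of a given day, however many intra-day
-- commits happened.
SELECT * FROM sales TIMESTAMP AS OF '2023-06-01 23:59:59';

-- Or pin one snapshot per day; a deep clone copies the data, so it does
-- not depend on the source table's retention settings.
CREATE TABLE sales_2023_06_01 DEEP CLONE sales TIMESTAMP AS OF '2023-06-01 23:59:59';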
elgeo
by Valued Contributor II
  • 4381 Views
  • 3 replies
  • 5 kudos

Resolved! Delta Table - Reduce time travel storage size

Hello! I am trying to understand the time travel feature. I see with "DESCRIBE HISTORY" that all the transaction history on a specific table is recorded by version and timestamp. However, I understand that this occupies a lot of storage, especiall...

Latest Reply
elgeo
Valued Contributor II
  • 5 kudos

Thank you @Werner Stinckens for your reply. However, I still haven't managed to delete history even after setting the below; the number of history rows remains the same when running "DESCRIBE HISTORY". SET spark.databricks.delta.retentionDurationCheck...

2 More Replies
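
For anyone hitting the same wall: the rows shown by DESCRIBE HISTORY come from the delta log and age out per delta.logRetentionDuration, and only when subsequent checkpoints are written, while VACUUM reclaims the actual data-file storage. A sketch with a hypothetical table name and illustrative values:

-- Shorten how long history entries are kept; they disappear from
-- DESCRIBE HISTORY only after later checkpoints, not immediately.
ALTER TABLE my_table SET TBLPROPERTIES (
  'delta.logRetentionDuration' = 'interval 7 days'
);

-- Reclaim storage from unreferenced data files. The safety check needs
-- disabling only when retaining less than the 7-day default.
SET spark.databricks.delta.retentionDurationCheck.enabled = false;
VACUUM my_table RETAIN 24 HOURS;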
Mark1
by New Contributor II
  • 2031 Views
  • 2 replies
  • 2 kudos

Resolved! Using Delta Tables without Time Travel features?

Hi everyone / experts, is it possible to use Delta tables without the time travel features? We are primarily interested in using the DML features (DELETE, UPDATE, MERGE INTO, etc.). Thanks, Mark

Latest Reply
Mark1
New Contributor II
  • 2 kudos

Thank you Hubert

1 More Replies
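
Delta has no switch that disables time travel outright; the practical approach is to keep retention minimal so the DML features work while little history accumulates. A sketch with a hypothetical table name and illustrative values:

-- DELETE / UPDATE / MERGE INTO work regardless of these settings; shrinking
-- retention just limits how much time-travel history is kept around.
ALTER TABLE my_table SET TBLPROPERTIES (
  'delta.logRetentionDuration' = 'interval 1 days',
  'delta.deletedFileRetentionDuration' = 'interval 1 days'
);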
User16826992666
by Valued Contributor
  • 1603 Views
  • 1 reply
  • 0 kudos
Latest Reply
brickster_2018
Databricks Employee
  • 0 kudos

To time travel to a particular version, it's necessary to have the JSON file for that particular version. The JSON files in the _delta_log have a default retention of 30 days, so by default we can time travel only up to 30 days back. The retention of the D...

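
To keep time travel working beyond that default, both the log entries and the underlying data files must be retained; a sketch assuming a hypothetical table named my_table, with illustrative values:

-- Keep log entries (JSON/checkpoint) and deleted data files long enough
-- for the desired time-travel window.
ALTER TABLE my_table SET TBLPROPERTIES (
  'delta.logRetentionDuration' = 'interval 90 days',
  'delta.deletedFileRetentionDuration' = 'interval 90 days'
);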