
Log files are not getting deleted automatically after logRetentionDuration interval

vinaykumar
New Contributor III

Hi team,

Log files are not getting deleted automatically from the delta log folder after the logRetentionDuration interval. After analysis, I see that checkpoint files are not getting created after 10 commits.

I am using the table properties below:

spark.sql(
    f"""
    ALTER TABLE Table_name
      SET TBLPROPERTIES (
        delta.logRetentionDuration = 'interval 1 hours',
        delta.deletedFileRetentionDuration = 'interval 1 hours',
        delta.checkpointRetentionDuration = '1 hours'
      )
    """
)

No checkpoint.parquet file is getting generated in the delta log.
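Not part of the original post, but one way to confirm whether checkpoints exist is to list the table's `_delta_log` directory and look for `*.checkpoint.parquet` files and the `_last_checkpoint` pointer file. A minimal sketch in plain Python (the directory layout and file names below are fabricated for illustration; on a real table you would point this at the table's `_delta_log` path):

```python
import json
import os
import tempfile

def find_checkpoints(delta_log_dir):
    """Return (checkpoint parquet file names, parsed _last_checkpoint contents or None)."""
    entries = sorted(os.listdir(delta_log_dir))
    checkpoints = [f for f in entries if f.endswith(".checkpoint.parquet")]
    last_checkpoint = None
    pointer = os.path.join(delta_log_dir, "_last_checkpoint")
    if os.path.exists(pointer):
        with open(pointer) as fh:
            last_checkpoint = json.load(fh)  # e.g. {"version": 10, ...}
    return checkpoints, last_checkpoint

# Demo against a fabricated _delta_log layout (assumption: typical Delta naming,
# zero-padded 20-digit version numbers).
with tempfile.TemporaryDirectory() as log_dir:
    for v in range(12):
        open(os.path.join(log_dir, f"{v:020d}.json"), "w").close()
    open(os.path.join(log_dir, "00000000000000000010.checkpoint.parquet"), "w").close()
    with open(os.path.join(log_dir, "_last_checkpoint"), "w") as fh:
        json.dump({"version": 10}, fh)
    cps, last = find_checkpoints(log_dir)
    print(cps, last)  # ['00000000000000000010.checkpoint.parquet'] {'version': 10}
```

If `cps` is empty and `_last_checkpoint` is missing, no checkpoint has ever been written for the table.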

6 REPLIES

Lakshay
Esteemed Contributor

Hi @vinaykumar, could you please run the "SHOW TBLPROPERTIES <table name>" command and attach a screenshot here?

vinaykumar
New Contributor III

Please find the TBLPROPERTIES below.

[screenshots of TBLPROPERTIES output]

Lakshay
Esteemed Contributor

Hi @vinaykumar, could you please answer the questions below as well:

  1. What is the latest version of the delta table?
  2. Are there any checkpoint files created?
  3. Are you doing the commits using the job cluster?

  1. What is the latest version of the delta table? -
  2. Are there any checkpoint files created? A checkpoint is not getting created after 10 commits. [screenshot]
  3. Are you doing the commits using the job cluster? Using an all-purpose cluster.
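As background to the "10 commits" point: by default, Delta writes a checkpoint every `delta.checkpointInterval` commits (commonly 10 in older Delta versions). A small sketch, under that assumption, of which table versions would normally carry a checkpoint:

```python
def expected_checkpoint_versions(latest_version, interval=10):
    """Versions at which a checkpoint would normally be written,
    assuming a checkpoint every `interval` commits (illustrative default: 10)."""
    return [v for v in range(1, latest_version + 1) if v % interval == 0]

print(expected_checkpoint_versions(25))  # [10, 20]
print(expected_checkpoint_versions(9))   # []
```

So a table still below its checkpoint interval would legitimately have no checkpoint yet; the issue reported here is that none appear even well past 10 commits.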

Lakshay
Esteemed Contributor

Hi @vinaykumar, it is difficult to tell what is causing this issue. I advise you to raise a support case with us so we can take a closer look. You can raise a support case using the link: https://help.databricks.com/

Anonymous
Not applicable

Hi @vinaykumar,

Hope all is well! Just wanted to check in on whether you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!
