Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

by Shan3009 (New Contributor III)
  • 4924 Views
  • 3 replies
  • 5 kudos

The transaction log has failed integrity checks. We recommend you contact Databricks support for assistance. Failed verification at version 48 of:

I am trying to write DataFrame data into a Delta table. It was working fine previously, but now it throws "Log has failed integrity checks".

Latest Reply
jcasanella
New Contributor III
  • 5 kudos

@Shanmuganathan Jothikumar​ I have the same exception after upgrading to Unity Catalog. I need to investigate a little more, but adding the following setting makes it work: `spark.conf.set("spark.databricks.delta.state.corruptionIsFatal", False)`

2 More Replies
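The setting in the reply above can also be applied per-session in SQL. This is a workaround sketch, not a fix: `spark.databricks.delta.state.corruptionIsFatal` only downgrades the integrity check so writes can proceed while the root cause is investigated.

```sql
-- Workaround sketch: treat Delta state corruption as non-fatal for this session.
-- This unblocks writes but does not repair the log; investigate the root cause.
SET spark.databricks.delta.state.corruptionIsFatal = false;
```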
by BenzDriver (New Contributor II)
  • 2461 Views
  • 2 replies
  • 1 kudos

Resolved! SQL command FSCK is not found

Hello there, I currently have the problem of deleted files still being in the transaction log when trying to query a Delta table. What I found was this statement: `%sql FSCK REPAIR TABLE table_name [DRY RUN]`. But using it returned the following error: Error in ...

Latest Reply
RKNutalapati
Valued Contributor
  • 1 kudos

Remove the square brackets and try executing the command: `%sql FSCK REPAIR TABLE table_name DRY RUN`

1 More Replies
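For reference, the square brackets in the documented syntax only mark `DRY RUN` as optional; they are not part of the command. A sketch against a hypothetical table name:

```sql
-- Dry run: only report the files that would be removed from the transaction log.
FSCK REPAIR TABLE my_schema.my_table DRY RUN;

-- Actual repair: drop log entries for underlying files that no longer exist.
FSCK REPAIR TABLE my_schema.my_table;
```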
by CalvinCalvert_ (New Contributor)
  • 780 Views
  • 0 replies
  • 0 kudos

How does FSCK work and does it have any negative effects on subsequent notebook executions?

In my environment, there are 3 groups of notebooks that run on their own schedules; however, they all use the same underlying transaction logs (audit logs, as we call them) in S3. From time to time, various notebooks from each of the 3 groups fail wit...

by User16752241457 (New Contributor II)
  • 1150 Views
  • 1 reply
  • 0 kudos

Overwriting Delta Table Using SQL

I have a delta table that is updated nightly, that I drop and recreate at the start of each day. However, this isn't ideal because every time I drop the table I lose all the info in the transaction log. Is there a way that I can do the equivalent of:...

Latest Reply
Ryan_Chynoweth
Esteemed Contributor
  • 0 kudos

I think you are looking for the INSERT OVERWRITE command in Spark SQL. Check out the documentation here: https://docs.databricks.com/spark/latest/spark-sql/language-manual/sql-ref-syntax-dml-insert-overwrite-table.html

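A minimal sketch of that suggestion, assuming a hypothetical staging source named `staging_view`: `INSERT OVERWRITE` replaces the table's contents in a single transaction, so the Delta history survives instead of being lost to a DROP/CREATE.

```sql
-- Replace the table's rows atomically; prior versions stay in the Delta history.
INSERT OVERWRITE nightly_table
SELECT * FROM staging_view;
```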