- 4924 Views
- 3 replies
- 5 kudos
I am trying to write DataFrame data into a Delta table. Previously this worked fine, but now it throws "Log has failed integrity checks".
Latest Reply
@Shanmuganathan Jothikumar I hit the same exception after upgrading to Unity Catalog. It needs a little more investigation, but after adding the following setting, it works:
spark.conf.set("spark.databricks.delta.state.corruptionIsFatal", False)
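A minimal sketch of how this workaround might be applied before the write; the DataFrame `df` and the table name are placeholders, and note that relaxing the corruption check suppresses the integrity error rather than repairing the underlying log:

```python
# Hedged sketch of the workaround above. "df" and "my_schema.my_table"
# are placeholders; adjust to your own environment.
# Treat Delta state corruption as non-fatal so the write can proceed.
spark.conf.set("spark.databricks.delta.state.corruptionIsFatal", False)

# Retry the Delta write that previously failed the integrity check.
df.write.format("delta").mode("append").saveAsTable("my_schema.my_table")
```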
- 2461 Views
- 2 replies
- 1 kudos
Hello there, I currently have the problem of deleted files still being referenced in the transaction log when trying to query a Delta table. What I found was this statement:
%sql
FSCK REPAIR TABLE table_name [DRY RUN]
But using it returned the following error: Error in ...
Latest Reply
Remove the square brackets and try executing the command; the brackets in the documentation only indicate that DRY RUN is an optional clause, they are not part of the syntax:
%sql
FSCK REPAIR TABLE table_name DRY RUN
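A minimal sketch of how this repair might be run from a Python notebook cell, assuming a placeholder table name; FSCK REPAIR TABLE removes entries for files that can no longer be found from the transaction log, and DRY RUN only previews them:

```python
# Hedged sketch; "my_schema.my_table" is a placeholder table name.
# DRY RUN only lists the unreadable file entries that would be removed
# from the transaction log, without changing anything.
preview = spark.sql("FSCK REPAIR TABLE my_schema.my_table DRY RUN")
preview.show(truncate=False)

# Once the preview looks right, run the repair for real (no DRY RUN).
spark.sql("FSCK REPAIR TABLE my_schema.my_table")
```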
- 780 Views
- 0 replies
- 0 kudos
In my environment, there are 3 groups of notebooks that run on their own schedules; however, they all use the same underlying transaction logs (auditlogs, as we call them) in S3. From time to time, various notebooks from each of the 3 groups fail wit...
- 1150 Views
- 1 replies
- 0 kudos
I have a Delta table that is updated nightly and that I drop and recreate at the start of each day. However, this isn't ideal, because every time I drop the table I lose all the info in the transaction log. Is there a way that I can do the equivalent of:...
Latest Reply
I think you are looking for the INSERT OVERWRITE command in Spark SQL. Check out the documentation here: https://docs.databricks.com/spark/latest/spark-sql/language-manual/sql-ref-syntax-dml-insert-overwrite-table.html
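A minimal sketch of the suggested approach, assuming a nightly DataFrame named `new_data` and a placeholder table name; INSERT OVERWRITE replaces the table's contents as a new commit, so earlier versions stay in the transaction log instead of being lost to a DROP:

```python
# Hedged sketch: replace the nightly data without dropping the table,
# so the Delta transaction log (and time travel) is preserved.
# "new_data" and "my_schema.my_table" are placeholders.
new_data.createOrReplaceTempView("nightly_load")

spark.sql("""
    INSERT OVERWRITE TABLE my_schema.my_table
    SELECT * FROM nightly_load
""")

# The overwrite shows up as a new version in the table history.
spark.sql("DESCRIBE HISTORY my_schema.my_table").show(truncate=False)
```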