- 10795 Views
- 5 replies
- 0 kudos
How to remove checkpoints from a Delta Lake table? I see that a few checkpoints exist on my Delta table, and I want to remove the oldest one. Its existence seems to be blocking removal of the oldest _delta_log entries.
Latest Reply
Hi @Pawel Woj Hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we ...
4 More Replies
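For reference: Delta Lake removes old checkpoints together with the _delta_log JSON entries they anchor once both fall outside the log retention window, so the usual approach is to shorten that window rather than delete checkpoint files by hand. A minimal sketch, assuming Databricks Runtime defaults; the table name is a placeholder:

# Hedged sketch: shorten log retention so older _delta_log entries
# (and their checkpoints) become eligible for automatic cleanup.
# 'my_schema.my_table' is a placeholder name.
spark.sql("""
    ALTER TABLE my_schema.my_table
    SET TBLPROPERTIES ('delta.logRetentionDuration' = 'interval 7 days')
""")
# Log cleanup runs when a new checkpoint is written, so entries past
# the window disappear on subsequent commits rather than immediately.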
by sbux • New Contributor
- 2880 Views
- 2 replies
- 0 kudos
Trying to connect the dots in the method below, from a new event on Azure Event Hub through storage, partition, and Avro records (those I can monitor) to my Delta table. How do I trace observe, writeStream, and the trigger? ...
elif TABLE_TYPE == "live":
    print("D...
Latest Reply
Hi @David Martin Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers ...
1 More Replies
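A hedged sketch of how such a pipeline can be traced end to end, assuming the azure-event-hubs-spark connector is installed; the connection string, checkpoint path, and table name below are placeholders, and observe() is just one way to surface per-batch row counts:

from pyspark.sql.functions import count, lit

sc = spark.sparkContext
conn = "Endpoint=sb://...;EntityPath=..."  # placeholder connection string
eh_conf = {
    "eventhubs.connectionString":
        sc._jvm.org.apache.spark.eventhubs.EventHubsUtils.encrypt(conn)
}

raw = spark.readStream.format("eventhubs").options(**eh_conf).load()

# observe() attaches named metrics that show up in each micro-batch's
# progress report, which helps trace rows moving through the stream.
observed = raw.observe("eh_metrics", count(lit(1)).alias("rows_seen"))

query = (observed.writeStream
         .format("delta")
         .option("checkpointLocation", "/tmp/checkpoints/eh_demo")  # placeholder
         .trigger(processingTime="1 minute")  # micro-batch cadence
         .toTable("bronze.eventhub_events"))  # placeholder table

# After batches run, per-batch metrics are visible via:
# query.lastProgress["observedMetrics"]["eh_metrics"]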
- 1455 Views
- 3 replies
- 0 kudos
Why does Auto Loader sometimes lose the checkpoint path and break the stream?
Latest Reply
Hi @Vittorio Antonacci Hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you ...
2 More Replies
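In practice the stream does not so much "lose" the checkpoint as have it moved, deleted, or cleaned up underneath it (storage lifecycle rules and job reconfigurations are common culprits). A minimal sketch of pinning one dedicated checkpoint path per stream; all paths and names are placeholders:

# Hedged sketch: one stable, dedicated checkpoint path per stream.
checkpoint_path = "/mnt/_checkpoints/orders_stream"  # placeholder

df = (spark.readStream
      .format("cloudFiles")
      .option("cloudFiles.format", "json")
      # Schema tracking also lives alongside the checkpoint; keep it stable.
      .option("cloudFiles.schemaLocation", checkpoint_path)
      .load("/mnt/landing/orders"))  # placeholder source

(df.writeStream
   .option("checkpointLocation", checkpoint_path)
   .trigger(availableNow=True)
   .toTable("bronze.orders"))  # placeholder table

# If this directory is deleted or repointed, the stream restarts from
# scratch or fails, which presents as a "lost" checkpoint.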
by Mayank • New Contributor III
- 11526 Views
- 8 replies
- 4 kudos
I am trying to load Parquet files using Auto Loader. Below is the code:
def autoload_to_table(data_source, source_format, table_name, checkpoint_path):
    query = (spark.readStream
             .format('cloudFiles')
             .option('cl...
Latest Reply
Hi again @Mayank Srivastava Thank you so much for getting back to us and marking the answer as best. We really appreciate your time. Wish you a great Databricks journey ahead!
7 More Replies
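The snippet above is truncated; what follows is a hedged reconstruction of a typical Auto Loader ingestion function with the same signature (the parameter names come from the post, everything else is an assumption):

# Hedged reconstruction, not the poster's exact code.
def autoload_to_table(data_source, source_format, table_name, checkpoint_path):
    query = (spark.readStream
             .format("cloudFiles")
             .option("cloudFiles.format", source_format)        # e.g. "parquet"
             .option("cloudFiles.schemaLocation", checkpoint_path)
             .load(data_source)
             .writeStream
             .option("checkpointLocation", checkpoint_path)
             .trigger(availableNow=True)
             .toTable(table_name))
    return query

# Example invocation with placeholder paths:
# autoload_to_table("/mnt/landing/events", "parquet",
#                   "bronze.events", "/mnt/_checkpoints/events")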