As the title suggests, whenever I create a streaming live table, it creates a __apply_changes_storage_mytablename object in the database on Databricks. Is there a way to specify a different cloud location for these files?
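For what it's worth, the __apply_changes_storage_<name> objects are internal backing tables that apply_changes maintains for the target streaming table, and DLT writes pipeline data under the pipeline's storage location. That location can be set in the pipeline settings when the pipeline is created; below is a minimal sketch of the settings JSON, with placeholder names and bucket path (whether this relocates these backing tables specifically is worth verifying against the docs):

{
  "name": "my-pipeline",
  "storage": "s3://my-bucket/dlt/my-pipeline",
  "target": "my_database",
  "libraries": [
    { "notebook": { "path": "/Repos/me/pipelines/my_pipeline" } }
  ]
}

As far as I know, storage can only be set at pipeline creation time and cannot be changed on an existing pipeline.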
I have read that Delta Live Tables will keep a history of 7 days. However, after creating a streaming live table and using the dlt.apply_changes function with this code:

def run_pipeline(table_name, keys, sequence_by):
    lower_table_name = table_name.l...
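For context, here is a hedged reconstruction of what the truncated function might look like end to end. The source path, the file format, and the .lower() completion are my assumptions, not the original code; spark is provided by the DLT runtime in a Databricks notebook:

import dlt
from pyspark.sql.functions import col

def run_pipeline(table_name, keys, sequence_by):
    # Assumed completion of the truncated line: lowercase the target name.
    lower_table_name = table_name.lower()

    # Source view: stream raw files with Auto Loader (path and format are placeholders).
    @dlt.view(name=f"{lower_table_name}_raw")
    def raw():
        return (
            spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "parquet")
            .load(f"s3://my-bucket/landing/{lower_table_name}/")
        )

    # Target streaming table that apply_changes maintains; this is the table
    # that gets a __apply_changes_storage_<name> backing object in the database.
    # (On older runtimes this function was called dlt.create_target_table.)
    dlt.create_streaming_table(name=lower_table_name)

    # CDC merge: rows are matched on `keys` and ordered by `sequence_by`.
    dlt.apply_changes(
        target=lower_table_name,
        source=f"{lower_table_name}_raw",
        keys=keys,
        sequence_by=col(sequence_by),
    )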
I am attempting to create a streaming delta live table. The main issue I am experiencing is the error below:

com.databricks.sql.cloudfiles.errors.CloudFilesIllegalStateException: Found mismatched event: key

I have an AWS AppFlow that is creating a fold...
If anyone comes back to this: I ended up finding the solution on my own. DLT makes it so that if you are streaming files from a location, the folder cannot change. You must drop your files into the same folder. Otherwise it complains about the name of...
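In code terms, the fix is to point the stream at one stable landing folder and always drop new files there, instead of repointing the stream at each run's new folder. A minimal sketch with placeholder paths:

# Don't do this: a load path that changes every run no longer matches what
# the stream's checkpoint recorded, which is when the
# CloudFilesIllegalStateException shows up.
#   .load("s3://my-bucket/appflow/2023-05-02T000000/")

# Do this: one fixed folder that every run writes into.
df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "parquet")   # placeholder format
    .load("s3://my-bucket/appflow/landing/")  # never changes between runs
)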
I've changed some of the code to remove any personal information. The table name is being passed into the pipeline's function from another section of code; my_table was just an example name.
@Debayan Mukherjee Hello! I have gone through the documentation and used it when I set up the DLT pipeline. To be more specific, when I run an initial load, everything works just fine. I run my AppFlow, which creates a folder in S3, and the path would be somethi...
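To illustrate the pattern being described, with placeholder names: each AppFlow run lands in a fresh timestamped subfolder, so the first update works (no checkpoint exists yet) and the trouble only starts once the stream's source path is changed to the next run's folder:

#   s3://my-bucket/appflow/my-flow/2023-05-01T000000/part-0.parquet   <- run 1, initial load succeeds
#   s3://my-bucket/appflow/my-flow/2023-05-02T000000/part-0.parquet   <- run 2
#
# Editing the stream to .load() the run-2 folder no longer matches the source
# recorded in the run-1 checkpoint, and Auto Loader raises
# com.databricks.sql.cloudfiles.errors.CloudFilesIllegalStateException.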