What is the difference between the create_table and create_streaming_table functions in dlt? For example, this is how I have created a table that streams data from Kafka, written as JSON files to a volume. @dlt.table(
name="raw_orders",
table_...
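For context, a minimal sketch of the two approaches, assuming the Databricks `dlt` Python module (the table names and the volume path below are placeholders, not from the question). `@dlt.table` defines a table and its single query together, while `dlt.create_streaming_table` only declares the target table, which one or more `@dlt.append_flow` functions then feed:

```python
import dlt

# Approach 1: @dlt.table -- the decorated function both defines the
# streaming table and supplies its (single) query.
@dlt.table(name="raw_orders")
def raw_orders():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/Volumes/main/bronze/kafka_landing")  # placeholder path
    )

# Approach 2: dlt.create_streaming_table -- declares only the target
# table; data arrives via append flows, which is useful when several
# independent sources must feed the same table.
dlt.create_streaming_table(name="raw_orders_multi")

@dlt.append_flow(target="raw_orders_multi")
def orders_from_landing():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/Volumes/main/bronze/kafka_landing")  # placeholder path
    )
```

This is a pipeline-configuration sketch: it only runs inside a Databricks pipeline, where the `dlt` module and the `spark` session are provided by the runtime.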
I am using Delta Live Tables and have my pipeline defined using the code below. My understanding is that a checkpoint is automatically set when using Delta Live Tables. I am using the Unity Catalog and Schema settings in the pipeline as the storage d...
I am reading JSON files written to ADLS from Kafka, using dlt and spark.readStream to create a streaming table for my raw ingest data. My schema has two arrays at the top level: a NewRecord array and an OldRecord array. I pass the schema and I run a select on Ne...
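To illustrate the shape of that payload, here is a toy version in plain Python (the struct field names are invented for illustration). In Spark the equivalent select would typically explode the `NewRecord` array so each element becomes its own row:

```python
import json

# Toy payload with the two top-level arrays described above; the
# field names inside the structs (order_id, status) are made up.
raw = json.loads(
    '{"NewRecord": [{"order_id": 1, "status": "open"},'
    '               {"order_id": 2, "status": "shipped"}],'
    ' "OldRecord": [{"order_id": 1, "status": "draft"}]}'
)

# Roughly what select(explode(col("NewRecord"))) does in Spark:
# the array of structs becomes a flat list of row-like dicts.
new_rows = list(raw["NewRecord"])
print(new_rows)
```

Selecting only `NewRecord` this way is also why the resulting table's columns differ from a first run that kept the two arrays as-is, which matters for streaming tables that remember their schema between updates.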
I have around 20 PGP files in a folder in my volume that I need to decrypt. I have a decryption function that accepts a file name and writes the decrypted file to a new folder in the same volume. I had thought I could create a Spark DataFrame with th...
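One common alternative to distributing the file names through a Spark DataFrame is to run the per-file decryption through a thread pool on the driver, since roughly 20 I/O-bound files rarely need cluster-scale parallelism. Everything below is a hypothetical sketch: `decrypt_file` stands in for the real PGP decryption routine (here it just copies bytes so the sketch is runnable), and the paths are placeholders:

```python
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def decrypt_file(src: Path, dest_dir: Path) -> Path:
    # Hypothetical stand-in for the real decryption function:
    # it strips the .pgp suffix and copies the bytes unchanged.
    dest = dest_dir / src.with_suffix("").name
    dest.write_bytes(src.read_bytes())
    return dest

def decrypt_folder(src_dir: Path, dest_dir: Path, workers: int = 8) -> list:
    """Decrypt every *.pgp file in src_dir into dest_dir, in parallel."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    files = sorted(src_dir.glob("*.pgp"))
    # Threads suit this workload: each file is independent and the
    # work is dominated by I/O rather than Python-level computation.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda f: decrypt_file(f, dest_dir), files))
```

On Databricks the same function works against volume paths (e.g. `Path("/Volumes/...")`), with the real decryption call swapped in for the placeholder body.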
I did a full refresh of the Delta Live Tables pipeline and that fixed it. I guess it was remembering the first run, where I just had the top-level arrays as two columns in the table.