Hello, I'm currently in the process of upgrading the DBR version in my jobs to 11.3 LTS. After upgrading the PySpark version to 3.3.0 on my local machine, I found that the exceptAll function is broken (it looks like others have a similar problem). It throws ...
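For context, exceptAll is a multiset (bag) difference: each row on the left is dropped once per matching row on the right, and surplus duplicates survive. A minimal pure-Python sketch of those semantics (the helper name and row values are illustrative, not from the post):

```python
from collections import Counter

def except_all(left, right):
    """Multiset difference, mimicking DataFrame.exceptAll semantics.

    Each row in `left` is removed once per matching row in `right`;
    extra duplicates are kept.
    """
    remaining = Counter(right)
    result = []
    for row in left:
        if remaining[row] > 0:
            remaining[row] -= 1  # consume one matching right-side row
        else:
            result.append(row)   # no match left: the row survives
    return result

left = [("a", 1), ("a", 1), ("b", 2)]
right = [("a", 1)]
print(except_all(left, right))  # → [('a', 1), ('b', 2)]
```

Note how one duplicate of `("a", 1)` survives, unlike a plain set difference, which would drop both.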
Hey, do you know if there is an option to implement something like this in DLT?

@dlt.view()
def view_1():
    # some calculations that return a small dataframe with around max 80 rows

@dlt.table()
def table_1():
    result_df = dlt.read("view_1")
    resu...
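If it helps, here is a minimal sketch of the pattern the question seems to be after: `dlt.read()` returns the view as an ordinary DataFrame, so a small (~80 row) view can safely be read and even collected inside a `@dlt.table` function. The source table, column, and function bodies below are hypothetical, and this only runs inside a Databricks DLT pipeline, so it is a sketch rather than a tested snippet:

```python
import dlt
from pyspark.sql import functions as F

@dlt.view()
def view_1():
    # hypothetical calculation producing a small (~80 row) dataframe
    return spark.read.table("source_table").limit(80)

@dlt.table()
def table_1():
    result_df = dlt.read("view_1")   # read the view as a DataFrame
    rows = result_df.collect()       # small dataframe, so collect() is safe
    # hypothetical further use of the collected rows
    return result_df.withColumn("view_row_count", F.lit(len(rows)))
```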
Hi, is there any speed difference between a mounted S3 bucket and direct access when reading/writing Delta tables or other types of files? I tried to find something in the docs, but didn't find anything.
Hi, I would like to know what you think about using Delta Live Tables when the source for the pipeline is not incremental. What I mean is: suppose the data provider creates a new folder with files for me each time it has an update to the...
Hello, I'm wondering if there is an option to define an expectation in DLT that compares the number of records between two stages and, e.g., fails if the counts differ. I mean something like this:

@dlt.table()
def bronze():
    Some...
Thank you for your answer! Do you know how to save that created stream into a Delta table right away? I need to save this stream into a temporary Delta table and then apply some transformations to it.
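A common way to persist a stream straight into a Delta table is `writeStream` with the `delta` format, a checkpoint location, and `toTable()`. The source, paths, and table name below are hypothetical placeholders for whatever stream the thread created earlier; this needs a Spark/Delta runtime, so it is a sketch rather than a runnable snippet:

```python
# Hypothetical stream: replace with the stream created earlier in the thread.
stream_df = (spark.readStream
             .format("cloudFiles")                  # assumed Auto Loader source
             .option("cloudFiles.format", "json")
             .load("/mnt/landing/events/"))         # hypothetical path

(stream_df.writeStream
 .format("delta")
 .option("checkpointLocation", "/mnt/checkpoints/tmp_events")  # hypothetical path
 .outputMode("append")
 .toTable("tmp_events"))  # the temporary Delta table to transform downstream
```

From there, the temporary table can be read back with `spark.read.table("tmp_events")` for the follow-up transformations.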
I dug into the DLT docs and found this: https://docs.databricks.com/workflows/delta-live-tables/delta-live-tables-cookbook.html#validate-row-counts-across-tables . I guess it solves my problem.
Unfortunately, it didn't work that way; the expectation doesn't see scalar values saved as Python variables:

@dlt.expect_or_fail("equal_number_of_records", "qa_silver_row_count == bronze_count")
@dlt.table()
def silver():
    bronze_count = bronze_table.count()
    silver_t...
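For reference, the cookbook pattern linked above sidesteps the scalar-variable problem: instead of computing counts into Python variables, it materializes both counts as columns of a dedicated validation table and attaches the expectation to that table, so the expression only references columns. Table and column names below are illustrative, and this only runs inside a DLT pipeline:

```python
import dlt

@dlt.table()
@dlt.expect_or_fail("equal_number_of_records", "bronze_count == silver_count")
def count_validation():
    # Both counts become columns of a one-row table,
    # so the expectation can compare them as column references.
    return spark.sql("""
        SELECT (SELECT COUNT(*) FROM LIVE.bronze) AS bronze_count,
               (SELECT COUNT(*) FROM LIVE.silver) AS silver_count
    """)
```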