I want to use liquid clustering on a materialised view created via a DLT pipeline; however, there doesn't appear to be a valid way to do this. Via table properties:

@dlt.table(
    name="<table name>",
    comment="<table description>",
    table_propert...
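For reference, on recent DLT channels the @dlt.table decorator also accepts a cluster_by argument, which is the decorator-level way to request liquid clustering keys (as opposed to table properties). A minimal sketch, assuming a recent runtime; the table, column, and source names are placeholders, and `spark` is the session Databricks provides inside a pipeline notebook:

```python
# Sketch only: runs inside a DLT pipeline, not standalone.
import dlt
from pyspark.sql import functions as F

@dlt.table(
    name="sales_by_hour",             # hypothetical materialized view name
    comment="Hourly sales rollup",
    cluster_by=["store_id", "hour"],  # liquid clustering keys
)
def sales_by_hour():
    # `spark` is injected by the Databricks runtime; source table is made up.
    return (
        spark.read.table("raw.sales")
             .groupBy("store_id", "hour")
             .agg(F.sum("amount").alias("total_amount"))
    )
```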
Following best practice, we want to avoid duplicating code by putting commonly used transformations into function libraries and then importing and calling those functions where required. We also want to follow Databricks recommendations to use serverless ...
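A minimal sketch of that pattern: keep pure transformation helpers in a shared module and import them from each pipeline notebook. The module and function names below are hypothetical; helpers written this way have no cluster-specific dependencies, so they should work the same on serverless pipelines.

```python
# utils/transforms.py -- hypothetical shared library module.
# Pure helpers like these can be imported into any pipeline notebook.

def normalize_column_name(name: str) -> str:
    """Lower-case a column name and replace spaces/dashes with underscores."""
    return name.strip().lower().replace(" ", "_").replace("-", "_")

def normalize_columns(names: list[str]) -> list[str]:
    """Apply normalize_column_name to every column in a schema."""
    return [normalize_column_name(n) for n in names]

# In a pipeline notebook you would then do, e.g.:
#   from utils.transforms import normalize_columns
#   df = df.toDF(*normalize_columns(df.columns))
```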
Our business does a LOT of reporting and analysis by time-of-day and clock times, independent of day or date. Databricks does not seem to support the TIME data type, as far as I can see. If I attempt to import data recorded as a time (e.g., 02:59:59.000)...
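One common workaround, since Spark has no TIME type, is to store the clock time as seconds since midnight (e.g. in a DOUBLE column), which preserves ordering and makes time-of-day aggregation straightforward. A sketch using only the Python standard library; the function names are made up:

```python
from datetime import time

def time_to_seconds(t: str) -> float:
    """Convert 'HH:MM:SS.fff' to seconds since midnight."""
    parsed = time.fromisoformat(t)
    return (parsed.hour * 3600 + parsed.minute * 60 + parsed.second
            + parsed.microsecond / 1_000_000)

def seconds_to_time(s: float) -> str:
    """Format seconds since midnight back to 'HH:MM:SS.fff' for reporting."""
    h, rem = divmod(int(s), 3600)
    m, sec = divmod(rem, 60)
    ms = round((s - int(s)) * 1000)
    return f"{h:02d}:{m:02d}:{sec:02d}.{ms:03d}"

print(time_to_seconds("02:59:59.000"))  # 10799.0
```

The same conversion can be done at ingest time in PySpark with built-in date functions, keeping the raw string column alongside for display.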
I'm a newbie and I've just done the "Run your first Delta Live Tables pipeline" tutorial. The tutorial downloads a publicly available CSV baby-names file and creates two new Delta Live tables from it. Now I want to be a good dev and clean up the reso...
Hi; I'm new to Databricks, so apologies if this is a dumb question. I have a notebook with SQL cells that select data from various Delta tables into temporary views. Then I have a query that joins up the data from these temporary views. I'd lik...
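One way to make that kind of notebook self-contained is to fold the intermediate temporary views into common table expressions (CTEs) in a single query; the SQL shape is the same in Databricks SQL. A sketch using the standard-library sqlite3 module purely to illustrate the pattern; the tables, columns, and data are made up:

```python
import sqlite3

# Demo data standing in for the Delta tables; names are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
CREATE TABLE customers (id INTEGER, name TEXT);
INSERT INTO orders VALUES (1, 10, 25.0), (2, 10, 75.0), (3, 11, 40.0);
INSERT INTO customers VALUES (10, 'Ada'), (11, 'Grace');
""")

# Instead of materializing a temp view per cell, each intermediate
# result becomes a CTE and the final join reads from the CTEs.
query = """
WITH order_totals AS (
    SELECT customer_id, SUM(amount) AS total
    FROM orders
    GROUP BY customer_id
)
SELECT c.name, t.total
FROM customers AS c
JOIN order_totals AS t ON t.customer_id = c.id
ORDER BY c.name
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('Ada', 100.0), ('Grace', 40.0)]
```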
Thanks, @aayrm5. I want to use CLUSTER BY AUTO, because the data will get queried and aggregated in several different ways by different business users. I did try your code above anyway, specifying the columns to cluster by. The pipeline ran without er...
Thanks, @BigRoux. My understanding is that DLT only allows for materialized views and streaming tables. When you say, "liquid clustering is supported for Delta Lake tables managed through DLT Preview and Current channels", do you mean that liquid clust...
Hi @Ayushi_Suthar, I don't see "Task options" in the Job UI. If I click on a task, I see Task name, Type, Job, Depends on, Job parameters, Notifications, and Duration threshold. @SS_RATH, did you find an answer?
Thank you, @gchandra. Deleting the pipeline does indeed remove the materialized view definitions from the Catalog. How can I confirm that the underlying S3 storage has also been cleared? Just removing the pointers in the Catalog is not enough, if ...