by Thefan • New Contributor II
- 510 Views
- 0 replies
- 1 kudos
Greetings! I've been trying out DLT for a few days, but I'm running into an unexpected issue when trying to use Koalas dropna in my pipeline. My goal is to drop all columns that contain only null/NA values before writing. Current code is this: @dlt...
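The column-dropping logic itself can be sketched with plain pandas, whose `dropna(axis=1, how="all")` mirrors the Koalas / pandas-on-Spark API (note, though, that pandas-on-Spark's `dropna` has historically supported only `axis=0`, so inside a Spark pipeline you may instead need to count nulls per column and select the surviving columns). The frame below is a made-up example:

```python
import pandas as pd
import numpy as np

# Toy frame: column "b" is entirely null, column "c" is only partly null.
df = pd.DataFrame({"a": [1, 2], "b": [np.nan, np.nan], "c": [3.0, np.nan]})

# Drop columns whose values are ALL null; partly-null columns survive.
cleaned = df.dropna(axis=1, how="all")
print(list(cleaned.columns))  # column "b" is gone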
by Daba • New Contributor III
- 2857 Views
- 6 replies
- 6 kudos
Hi, I'm exploring DLT with the Auto Loader feature and wondering where the schema and checkpoint folders are hidden. I want to wipe these two to reset/reinitialize the flow, but unlike "regular" Auto Loader, the checkpoint and schema folders are not there. Thank...
Latest Reply
Hi @Alexander Plepler, just a friendly follow-up. Do you still need help? Please let us know.
- 885 Views
- 1 reply
- 1 kudos
Databricks Office Hours: Our next Office Hours session is scheduled for April 27, 2022, 8:00 AM PT. Do you have questions about how to set up or use Databricks? Do you want to learn more about best practices for deploying your use case, or tips on da...
Latest Reply
Just registered. Thank you and happy weekend.
- 1071 Views
- 2 replies
- 5 kudos
Databricks is excited to announce the general availability of Delta Live Tables to you, our community. Anxiously awaited, Delta Live Tables (DLT) is the first ETL framework that uses a simple, declarative approach to building reliable streaming or ...
by SM • New Contributor III
- 4614 Views
- 8 replies
- 3 kudos
Hello, I am working with Delta Live Tables. I am trying to create a DLT table from a combination of DataFrames produced in a for loop, which are unioned; the DLT table is then created over the unioned DataFrame. However, I noticed that the Delta table has duplicates. An...
Latest Reply
@Shikha Mathew - Does your last answer mean that your issue is resolved? Would you be happy to mark whichever answer helped as best? Or, if it wasn't a specific one, would you tell us what worked?
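One common cause of this is overlap between the per-iteration DataFrames; deduplicating after the union addresses it. Below is a minimal pandas sketch of the pattern with hypothetical stand-in frames (in PySpark the equivalent shape would be `functools.reduce(DataFrame.unionByName, frames).dropDuplicates()`):

```python
import pandas as pd

# Hypothetical per-iteration frames standing in for the Spark DataFrames
# built inside the for loop; "id" 2 appears in both, creating a duplicate.
frames = [pd.DataFrame({"id": [1, 2]}), pd.DataFrame({"id": [2, 3]})]

combined = pd.concat(frames, ignore_index=True)  # the "union" step: 4 rows
deduped = combined.drop_duplicates()             # duplicate id 2 removed
```

Whether `dropDuplicates` (or a keyed `dropDuplicates([...])`) is the right fix depends on whether the overlap is expected in the source data or a sign the loop reads the same input twice.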
- 1009 Views
- 2 replies
- 0 kudos
I've been experimenting with DLT and it works well. I'd like to understand where I can see details of which records didn't meet the quality criteria.
Latest Reply
Hi @Paresh J, the event log captures data quality metrics based on the expectations defined in your pipelines. (Source)
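As a sketch of what those metrics look like, the event log's `details` field carries per-expectation counts under `flow_progress.data_quality.expectations`. The payload below is illustrative only, not an exact schema, and the expectation name and counts are made up:

```python
import json

# Illustrative event-log `details` payload (hypothetical values):
details = json.dumps({
    "flow_progress": {
        "data_quality": {
            "expectations": [
                {"name": "valid_id", "passed_records": 98, "failed_records": 2}
            ]
        }
    }
})

# Pull out failed-record counts per expectation.
expectations = json.loads(details)["flow_progress"]["data_quality"]["expectations"]
failed = {e["name"]: e["failed_records"] for e in expectations}
print(failed)
```

In practice you would run a query like this over the pipeline's event log (a Delta table under the pipeline's storage location) rather than a literal JSON string.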
- 643 Views
- 1 reply
- 0 kudos
I created several tables in my DLT pipeline but didn't specify a location to save them on creation. The pipeline seems to have run, but I don't know where the tables actually are. How can I find them?
Latest Reply
Check out the storage configuration under Settings. If you didn't specify the storage setting, the system defaults to a location under dbfs:/pipelines/.
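For reference, a minimal pipeline-settings fragment with an explicit storage location might look like this (the pipeline name and path here are hypothetical):

```json
{
  "name": "my_pipeline",
  "storage": "dbfs:/pipelines/my_pipeline"
}
```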
- 883 Views
- 1 reply
- 0 kudos
Where do you specify what database the DLT tables land in?
Latest Reply
The target key, set when creating the pipeline, specifies the database the tables get published to. Documented here: https://docs.databricks.com/data-engineering/delta-live-tables/delta-live-tables-user-guide.html#publish-tables
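A minimal settings fragment using `target` might look like this (the pipeline and database names are hypothetical):

```json
{
  "name": "my_pipeline",
  "target": "my_database"
}
```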
- 644 Views
- 1 reply
- 1 kudos
I would like to run a DLT pipeline with the 8.2 runtime.
Latest Reply
You can add the below JSON property to the Delta Live Tables pipeline specification at the parent level: "dbr_version": "8.2"
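In context, at the top level of the pipeline settings JSON, that might look like this (the pipeline name is hypothetical):

```json
{
  "name": "my_pipeline",
  "dbr_version": "8.2"
}
```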