by
gyapar
• New Contributor II
- 7261 Views
- 0 replies
- 0 kudos
Hi all, I'm trying to create one job cluster with a single configuration or specification that runs a workflow, and this workflow needs to have 3 dependent tasks in a straight line, for example t1 -> t2 -> t3. In Databricks there are some constraints also...
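One way to express this is a single job spec (Jobs API 2.1 shape) with one shared job cluster and `depends_on` links forming the chain; a minimal sketch, where the job name, notebook paths, node type, and Spark version are illustrative assumptions:

```python
# Sketch of a Databricks Jobs 2.1 job spec: one shared job cluster,
# three tasks chained t1 -> t2 -> t3 via depends_on.
# Notebook paths, node type, and Spark version are illustrative assumptions.
job_spec = {
    "name": "linear-three-task-job",
    "job_clusters": [
        {
            "job_cluster_key": "shared_cluster",
            "new_cluster": {
                "spark_version": "13.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 2,
            },
        }
    ],
    "tasks": [
        {
            "task_key": "t1",
            "job_cluster_key": "shared_cluster",
            "notebook_task": {"notebook_path": "/Jobs/t1"},
        },
        {
            "task_key": "t2",
            "depends_on": [{"task_key": "t1"}],
            "job_cluster_key": "shared_cluster",
            "notebook_task": {"notebook_path": "/Jobs/t2"},
        },
        {
            "task_key": "t3",
            "depends_on": [{"task_key": "t2"}],
            "job_cluster_key": "shared_cluster",
            "notebook_task": {"notebook_path": "/Jobs/t3"},
        },
    ],
}
```

Because every task references the same `job_cluster_key`, all three run on one job cluster created from a single configuration, and the `depends_on` entries enforce the strict t1 -> t2 -> t3 ordering.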
- 1423 Views
- 0 replies
- 0 kudos
Hello, we are using DLT pipelines for many of our jobs, with notifications to Slack on failures. Wondering if there is a clean way to disable the alerts when in development mode. It does make sense to have them turned off in dev, doesn't it?
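One approach worth sketching: build the pipeline settings programmatically and only attach the `notifications` block when not in development mode. This is a sketch under assumptions, not a confirmed DLT feature for suppressing alerts; the pipeline name and recipient address are hypothetical, and Slack delivery via an email-to-Slack address is one common pattern rather than an official integration:

```python
# Sketch: build DLT pipeline settings so failure notifications are only
# attached outside development mode. Pipeline name and recipient address
# are illustrative assumptions.
def pipeline_settings(development: bool) -> dict:
    settings = {
        "name": "example-pipeline",
        "development": development,
    }
    if not development:
        # Notifications are simply omitted in dev mode.
        settings["notifications"] = [
            {
                "email_recipients": ["data-alerts@example.com"],
                "alerts": ["on-update-failure", "on-update-fatal-failure"],
            }
        ]
    return settings
```

The idea is to keep one settings template per pipeline and let the dev/prod flag decide whether the alert block is present at deploy time.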
- 2028 Views
- 0 replies
- 0 kudos
Hi Team, I have attended the "Advantage Lakehouse: Fueling Innovation in the Era of Data and AI" webinar. I also completed Databricks Lakehouse Fundamentals and the feedback survey, but I still have not received the Databricks voucher. Could you please look i...
- 3420 Views
- 2 replies
- 0 kudos
1. How do we use the cloudFiles.backfillInterval option in a notebook?
2. Does any other property need to be set?
3. Where exactly is it placed, in the readStream portion of the code or the writeStream portion?
4. Do you have any sample code?
5. Where do we find ...
Latest Reply
1. Is the following code correct for specifying the .option("cloudFiles.backfillInterval", 300)?

df = spark.readStream.format("cloudFiles") \
    .option("cloudFiles.format", "csv") \
    .option("cloudFiles.schemaLocation", f"dbfs:/FileStore/xyz/back_fill_opti...
1 More Replies
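For reference, a minimal sketch of how this option is usually wired into the readStream side of an Auto Loader pipeline. The paths, the CSV format, and the one-day interval are illustrative assumptions, not the original poster's values; note that `cloudFiles.backfillInterval` takes a time-interval string (e.g. "1 day") rather than a bare number:

```python
# Sketch: cloudFiles.backfillInterval belongs with the other cloudFiles.*
# options on the readStream side; the writeStream side is unchanged.
# The value is a time-interval string, and all paths here are
# illustrative assumptions.
autoloader_options = {
    "cloudFiles.format": "csv",
    "cloudFiles.schemaLocation": "dbfs:/tmp/example/schema",
    # Ask Auto Loader to run an asynchronous backfill once per day to
    # pick up any files missed by event notifications.
    "cloudFiles.backfillInterval": "1 day",
}

def read_with_autoloader(spark, source_path):
    # Apply every cloudFiles option on the stream reader before .load().
    return (
        spark.readStream.format("cloudFiles")
        .options(**autoloader_options)
        .load(source_path)
    )
```

The backfill interval only affects file discovery on the read side, so the writeStream/checkpoint portion of the pipeline needs no change for it.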
- 2062 Views
- 0 replies
- 0 kudos
Is there a way to set up a workflow with multiple tasks so that different tasks can share the same compute resource at the same time? I understand that an instance pool may be an option here. I wasn't sure if there were other possible options to cons...
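Besides instance pools, one built-in option is to have every task in the job reference the same `job_cluster_key`, so they all run on a single shared job cluster. A minimal sketch of that part of a job spec, where the cluster key, settings, and notebook paths are illustrative assumptions:

```python
# Sketch: within a single Databricks job, tasks that reference the same
# job_cluster_key share one job cluster, so an instance pool is not the
# only way to share compute. Names and cluster settings are
# illustrative assumptions.
shared = {"job_cluster_key": "etl_shared"}

workflow = {
    "job_clusters": [
        {
            "job_cluster_key": "etl_shared",
            "new_cluster": {
                "spark_version": "13.3.x-scala2.12",
                "num_workers": 4,
            },
        }
    ],
    "tasks": [
        {"task_key": "extract", **shared,
         "notebook_task": {"notebook_path": "/Jobs/extract"}},
        {"task_key": "load", **shared,
         "notebook_task": {"notebook_path": "/Jobs/load"}},
    ],
}
```

Tasks without dependency edges between them can then run concurrently on that one cluster, whereas an instance pool only shares the underlying VMs, not a single running cluster.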