05-30-2024 07:10 AM
Hello everyone,
We are building an ML pipeline on Databricks using Databricks Workflows. The pipeline currently has three major components: Data Ingestion, Model Training, and Model Testing. My question is whether it is possible to share the output of one task with another (e.g. to pass the data generated by the ingestion task to the model-training task). At the moment we save the data to DBFS volumes and read it back from there, but I suspect this approach will break down if the dataset gets too big. Is there a more elegant way to pass output from one task to another, perhaps something similar to what we can do when creating an Azure ML pipeline? Our current hand-off looks roughly like the sketch below.
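For reference, a minimal sketch of our current volume-based hand-off (the source table and volume path here are made up for illustration):

# ingest.py -- write the processed data to a volume so the next task can read it
df = spark.read.table("raw_events")  # hypothetical source table
processed = df.dropna()
processed.write.mode("overwrite").parquet("/Volumes/main/default/ml_data/processed")

# train.py -- read the same volume path back in the training task
train_df = spark.read.parquet("/Volumes/main/default/ml_data/processed")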
#MachineLearning #DataScience #MLOps
05-30-2024 11:38 PM
Hi,
There is a way to share a value from one task to another, but it only works when the pipeline is executed as a Databricks workflow (job).
# Code in the notebook that produces the value:
dbutils.jobs.taskValues.set(key="first_notebook_list", value=<value or variable you want to pass>)

# Code in the notebook that consumes the value from the previous task:
list_object = dbutils.jobs.taskValues.get(taskKey="<task_name_from_which_value_is_fetched>", key="first_notebook_list", default=0, debugValue=0)
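A note on the two fallback arguments, as I understand the docs: default is returned during a job run if the key cannot be found, while debugValue is what get() returns when the notebook is run interactively outside a job.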
05-31-2024 05:50 AM
Hi @Retired_mod, thanks for your quick reply. I will test it in our scenario and let you know. Just to confirm: if I have two scripts (e.g. ingest.py and train.py), and in my task named "ingest" I run the following inside ingest.py:
dbutils.jobs.taskValues.set(key="processed_data", value=data)  # set() takes only key and value; the task name comes from the job task itself
should I then pass {{tasks.ingest.values.processed_data}} as a parameter to the train.py task in the pipeline?
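Or, if I understand the API correctly, train.py could fetch the value directly instead of going through a task parameter; a minimal sketch, reusing the task and key names from my example above:

# train.py -- fetch the value the "ingest" task published
# debugValue is what get() returns when the notebook runs outside a job
data = dbutils.jobs.taskValues.get(taskKey="ingest", key="processed_data", default="", debugValue="")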
05-31-2024 11:44 AM
@Retired_mod I looked into your solution, and it seems the value you set or get must be JSON-serializable, which means I cannot pass e.g. a Spark or pandas DataFrame from one step to another directly; I would have to serialize and deserialize it. Is there a recommended way to pass big data between the various steps of a job?
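The workaround I am considering in the meantime (a sketch, with a hypothetical table name) is to persist the large dataset as a table in the ingestion task and pass only the small, JSON-serializable table name through taskValues:

# ingest.py -- persist the large dataset as a table; pass only its name onward
# `processed` is the DataFrame produced earlier in this task
table_name = "main.default.processed_events"  # hypothetical catalog.schema.table
processed.write.mode("overwrite").saveAsTable(table_name)
dbutils.jobs.taskValues.set(key="processed_table", value=table_name)

# train.py -- look up the table name, then load the data with Spark
table_name = dbutils.jobs.taskValues.get(taskKey="ingest", key="processed_table", default="", debugValue="")
train_df = spark.read.table(table_name)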
06-13-2024 04:41 AM
@Retired_mod @Hkesharwani Any updates?