04-26-2024 07:37 AM
Hello,
I have a Job A that triggers a Job B. Job A defines a global temp view, and I would like to somehow access it in the child job.
Is that possible in any way? Can the same cluster be used for both jobs? If it is not possible, does anyone know of a workaround? Thank you very much!
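For reference, here is a minimal sketch of what I mean (table and view names are just illustrative, not my real ones):

```python
# Job A -- registers a global temp view in its own Spark application
# (table and view names below are placeholders).
df = spark.read.table("my_catalog.my_schema.source_table")
df.createOrReplaceGlobalTempView("shared_view")

# Job B -- this lookup only resolves if it runs in the *same* Spark application,
# because global temp views live in the application-scoped global_temp database
# and are dropped when that application ends.
df_b = spark.table("global_temp.shared_view")
```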
05-03-2024 12:49 AM
Hello @Retired_mod,
thank you for the very detailed answer. If I understand correctly, there is no way to do this with temp views when using a Job Cluster? In that case, would I need to use the same All-purpose cluster for all my tasks in order to remain in the same Spark application?
07-11-2024 08:41 AM
Hello @Retired_mod and @erigaud
I am having the same issue. We have multiple tasks inside Databricks jobs that need to share dictionaries of DataFrames with one another. Is there any way to do this? Initially we thought task values might help, but it seems you cannot send large payloads through them even if they are JSON serializable. Any ideas on how we can do this?
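To show what we tried (a rough sketch; the key and task names below are made up), task values seem fine for small JSON-serializable metadata, but not for the DataFrames themselves:

```python
# Upstream task -- pass only small, JSON-serializable metadata through task values
# (key and task names are illustrative).
dbutils.jobs.taskValues.set(key="table_names", value=["stg_orders", "stg_customers"])

# Downstream task -- read the small value back; the actual data would still have
# to live somewhere else (e.g. a table), since task values are size-limited.
names = dbutils.jobs.taskValues.get(taskKey="prepare_task", key="table_names", default=[])
```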
07-12-2024 12:13 AM
Just curious, would using the same job cluster within the same workflow work? Theoretically it should. Across jobs with different job clusters it may not, and persistent tables are the solution; you could drop the table at the end of the flow with a generic cleanup task, roughly as in the sketch below.
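Something along these lines is what I have in mind (a sketch only; catalog, schema, and table names are placeholders):

```python
# Task 1 -- persist each DataFrame that downstream tasks need
# (catalog/schema/table names are placeholders; spark.range stands in for real data).
dataframes = {
    "orders": spark.range(10),
    "customers": spark.range(5),
}
for name, df in dataframes.items():
    df.write.mode("overwrite").saveAsTable(f"my_catalog.staging.{name}")

# Task 2 -- possibly on a different cluster -- read them back by name
orders = spark.table("my_catalog.staging.orders")

# Final generic cleanup task -- drop the staging tables once the flow is done
for name in ["orders", "customers"]:
    spark.sql(f"DROP TABLE IF EXISTS my_catalog.staging.{name}")
```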
07-12-2024 01:15 AM
Hi @ranged_coop
Yes, we are using the same job compute for the different workflows. But I think different tasks run like different Docker containers, which is why this becomes an issue. It would be nice if you could explain a bit more about the approach you think would work for accessing data from one task in another.