Hello Kevin3,
In PySpark, the line below gets the context:
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
But, this "dbutils.notebook.setContext(ctx)" doesn't set the Context()
I was searching the directory and found the method somewhat like this.
dbutils.notebook.entry_point.getDbutils().notebook().setContext(ctx)
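For reference, this is how I am calling it from the non-main thread (a simplified sketch; the child notebook path and timeout are placeholders, not my real values):

# ctx was captured earlier, on the main thread and in the same cell
dbutils.notebook.entry_point.getDbutils().notebook().setContext(ctx)
dbutils.notebook.run("/path/to/child_notebook", 600)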
But even after setting the context this way, I am unable to start the child notebook from the parent notebook. It fails with the error below:
"Exception due to : Context not valid. If you are calling this outside the main thread, you must set the Notebook context via dbutils.notebook.setContext(ctx), where ctx is a value retrieved from the main thread (and the same cell) via dbutils.notebook.getContext()."
Maybe I am not using it correctly. Let me know if you have a workaround for this situation:
1. I am attaching the function I am using to run the child notebook (see the sketch after this list).
2. I am running the function "execute_child_nb" in the stream microbatch (the foreachBatch handler).
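Here is a trimmed-down sketch of the setup (the notebook path, timeout, and the streaming DataFrame "df" are placeholders, not my real values; I am assuming the standard foreachBatch hook):

# captured on the main thread, in the same cell that starts the stream
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()

def execute_child_nb(batch_df, batch_id):
    # foreachBatch invokes this on a streaming worker thread,
    # so the notebook context is re-applied before run()
    dbutils.notebook.entry_point.getDbutils().notebook().setContext(ctx)
    dbutils.notebook.run("/path/to/child_notebook", 600, {"batch_id": str(batch_id)})

(df.writeStream
    .foreachBatch(execute_child_nb)
    .start())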
Thanks,
Abhijit