Hello,
I cloned a repo, my_repo, into the Repos section of my Databricks workspace.
Inside my_repo I created a notebook, new_experiment, from which I can import functions defined in my_repo, which is really handy.
When I want to modify a function in my_repo, I open my local IDE, make the changes, push them to GitHub, and then pull the changes in Databricks. The problem is that when I import the function in new_experiment, the changes are not picked up.
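For illustration, the import in new_experiment looks roughly like this (my_repo.utils and my_function are just placeholders for my actual module and function):

from my_repo.utils import my_function  # placeholder names

my_function()  # after pulling the changes, this still runs the old version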
I tried:
%load_ext autoreload
%autoreload 2
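Concretely, the cell order I used was roughly this (same placeholder names as above):

%load_ext autoreload
%autoreload 2

from my_repo.utils import my_function  # still resolves to the old code after the pull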
But the only thing that seems to work is detaching and reattaching the cluster, which is a pain because I then have to rerun all the code from scratch. Is there a better solution?