
Latest pushed code is not taken into account by Notebook

pgagliardi
New Contributor II

Hello,

I cloned a repo, my_repo, into the Databricks Repos space.

Inside my_repo, I created a notebook, new_experiment, from which I can import functions defined in my_repo, which is really handy.

When I want to modify a function in my_repo, I open my local IDE, make the changes, push them to GitHub, then pull them on Databricks. The problem is that when I import the function in new_experiment, the changes are not picked up.

I tried:

%load_ext autoreload
%autoreload 2

But the only thing that seems to work is detaching and reattaching the cluster, which is a pain because I then have to rerun the code from scratch. Is there a better solution?

1 REPLY

Jnguyen
New Contributor II

Use

%reload_ext autoreload
%autoreload 2

instead; it gives the behavior you expect. You only need to run it once, just as with %load_ext autoreload.
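For anyone curious why detaching the cluster "fixes" it: Python caches imported modules in sys.modules, so a git pull rewrites the files on disk but the already-imported module stays stale. The autoreload magic works around this by reloading modules before each cell runs; importlib.reload is the underlying mechanism. A minimal sketch of the staleness and the fix (my_repo_utils is a hypothetical module standing in for your repo code):

```python
import importlib
import sys
import tempfile
from pathlib import Path

sys.dont_write_bytecode = True  # avoid stale .pyc caches in this demo

# Simulate a repo module on disk.
repo = Path(tempfile.mkdtemp())
(repo / "my_repo_utils.py").write_text("def version():\n    return 1\n")
sys.path.insert(0, str(repo))

import my_repo_utils
assert my_repo_utils.version() == 1

# A git pull rewrites the source file...
(repo / "my_repo_utils.py").write_text("def version():\n    return 42\n")

# ...but the imported module is still the old one until reloaded.
assert my_repo_utils.version() == 1
importlib.reload(my_repo_utils)  # what %autoreload automates per cell
assert my_repo_utils.version() == 42
```

So if the magics ever misbehave, an explicit importlib.reload(your_module) in a cell is a targeted fallback that avoids detaching the cluster.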
