Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Latest pushed code is not taken into account by Notebook

pgagliardi
New Contributor II

Hello,

I cloned a repo, my_repo, into the Databricks Repos space.

Inside my_repo, I created a notebook new_experiment where I can import functions from my_repo, which is really handy.

When I want to modify a function in my_repo, I open my local IDE, make the changes, push them to GitHub, and then pull them in Databricks. The problem is that when I import the function in new_experiment, the changes are not picked up.

I tried:

%load_ext autoreload
%autoreload 2

But the only solution seems to be detaching and reattaching the cluster, which is a pain because I have to rerun the code from scratch. Is there a better solution?


Jnguyen
Databricks Employee

Use 

%reload_ext autoreload

instead; it will give you the behavior you expect.
You only need to run it once, just as with %load_ext autoreload and %autoreload 2.
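Under the hood, autoreload re-executes a module's source when it changes on disk, which is the same thing the standard library's importlib.reload does. A minimal sketch of that mechanism, runnable outside a notebook (the module name my_utils and its contents are made up for illustration; the file rewrite stands in for a git pull into Repos):

```python
import importlib
import sys
import tempfile
from pathlib import Path

# Avoid stale .pyc caches interfering with the reload in this demo.
sys.dont_write_bytecode = True

# Create a throwaway module on disk (hypothetical name: my_utils).
workdir = Path(tempfile.mkdtemp())
module_file = workdir / "my_utils.py"
module_file.write_text("def answer():\n    return 1\n")

sys.path.insert(0, str(workdir))
import my_utils

assert my_utils.answer() == 1  # first import sees the original code

# Simulate pulling new code from GitHub: the source file changes on disk.
module_file.write_text("def answer():\n    return 2\n")

# Re-execute the module's (new) source in place, without restarting Python.
importlib.reload(my_utils)

assert my_utils.answer() == 2  # the reloaded module sees the change
```

The autoreload extension simply automates this reload step before each cell runs, which is why a detach/reattach (a full interpreter restart) is not needed once it is active.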
