Data Engineering

How to import local python file in notebook?

jsaddam28
New Contributor III

For example, I have one.py and two.py in Databricks, and I want to use one of the modules from one.py in two.py. On my local machine I usually do this with an import statement, like below.

two.py:

from one import module1
...

How do I do this in Databricks?
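For readers following along, a minimal sketch of the local-machine pattern the question describes (the name module1 is taken from the snippet above; its body here is hypothetical):

# one.py -- an ordinary Python module
def module1():
    return "hello from one.py"

# two.py -- works locally because both files sit in the same
# directory, which Python puts on sys.path
from one import module1
print(module1())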

24 REPLIES

vida
Contributor II

Just do a %run command on the other notebook, which will import all the functions defined there.

%run your_folder/run2.py

-Vida

JavierOrozco
New Contributor III

Hi Vida,

I have the same problem. I followed your solution, but I get an error that the file cannot be found. I tried with and without the full path when run2.py is in the same folder as the notebook.

Stacktrace: /Users/username/Test NB: python

Thanks for your assistance

vida
Contributor II

Hi,

You need the full path - we don't support relative paths yet. I can do the following:

%run "/Users/XXXXXXX@databricks.com/InnerNotebook"

Please make sure you copy the notebook name correctly without any spaces, etc.

-Vida

JavierOrozco
New Contributor III

This is an exact copy of the code:

%run '/Users/admin/s3_handling_poc.py'
Notebook not found: /Users/admin/s3_handling_poc.py

Whereas with a Notebook it does work:

%run /Users/admin/test2
Command took 0.18s
Hello World

vida
Contributor II

Can you try without the ".py" at the end?

What do you mean by "with a notebook it does work"?

JavierOrozco
New Contributor III

It doesn't seem to work either. I created a "Hello World" inside a Python script. If I run it with ".py" it shows the same error, but without ".py" it looks like it's executing and never ends until I cancel it.

vida
Contributor II

Sorry - I'm confused - is your file, s3_handling_poc.py, uploaded to Databricks?

%run is for running one notebook within another Databricks notebook.

To get local Python code into Databricks, you'll need to either import your Python file as a Databricks notebook, or create an egg from your Python code and upload that as a library. If it's a single Python file, importing it as a Databricks notebook is going to be the easier route.
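For the egg route, a minimal sketch of the packaging step, assuming a hypothetical package directory my_module/ with an __init__.py (bdist_egg is the standard setuptools command for producing an egg):

# setup.py -- placed next to the my_module/ package directory
from setuptools import setup, find_packages

setup(
    name="my_module",          # hypothetical package name
    version="0.1.0",
    packages=find_packages(),  # finds my_module/ via its __init__.py
)

Running python setup.py bdist_egg then produces an egg under dist/, which can be uploaded as a library in the Databricks workspace.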

JavierOrozco
New Contributor III

Yes, it is Python code. I let it run and it managed to finish.

I was following the initial instructions on how to import third-party Python modules; the instruction was to run them. However, the same code runs faster within a notebook. The problem is: how do I use my own existing Python code in new Databricks Python notebooks? Do I have to embed all of it into notebooks?

%run "/Users/admin/test_hello"

Command took 746.62s

HELLO WORLD FROM PYTHON

PaulAgnew
New Contributor II

@javier.orozco, did you ever get clarity or a resolution to this problem? It's as if the thread dies with your last statement. The last two sentences of your comment sum up the issue very nicely.

dchokkadi1_5588
New Contributor II

@vida, how to run a notebook that is mounted with dbfs?

%run 'dbfs:/mnt/<<pathtopyfile>>' keeps reporting "Notebook not found". I tried with and without the dbfs: prefix and get the same error.

vida
Contributor II

@Deepak Chokkadi %run doesn't take a dbfs path - it takes the path to the notebook from the workspace.

dchokkadi1_5588
New Contributor II

Thanks @Vida Ha! Is there any other solution for executing a notebook from a mounted S3 path?
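One possible workaround, offered here as an assumption rather than anything confirmed in this thread: DBFS mounts are also exposed on the driver's local filesystem under /dbfs, so a plain .py file on a mounted S3 path can be imported as a regular module instead of being %run. The mount path below is hypothetical:

import sys

# DBFS mounts appear under /dbfs on the driver's local filesystem.
sys.path.append("/dbfs/mnt/my-bucket/python")  # hypothetical mount path

# Imports /dbfs/mnt/my-bucket/python/s3_handling_poc.py as a module.
import s3_handling_poc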

jurmu
New Contributor II

Replying here instead of raising a new question.

%run seems to be working OK, but what it does is run the whole file, which means that all the variables/functions become global. In this case the module cannot be used as an object, i.e. "module.function" etc.

What would be a workaround for that? I guess rewriting the module as a class could work, but that just doesn't "feel" right: writing a class where it should just be a module.

Creating an egg has also been suggested, but again this would result in a package/library that cannot be easily edited.

Any thoughts?
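For what it's worth, a minimal sketch of the class-as-namespace workaround mentioned above (the class and function names are hypothetical). The helper notebook is still pulled in via %run, but afterwards call sites read like module.function:

# Contents of a helper notebook, e.g. /Users/you/helpers (hypothetical)
class Helpers:
    # Static methods, so the class acts as a namespace; no instance needed.
    @staticmethod
    def clean(text):
        return text.strip().lower()

    @staticmethod
    def shout(text):
        return text.upper() + "!"

# In the calling notebook, after %run /Users/you/helpers:
#   Helpers.clean("  Hello ")  ->  "hello"
#   Helpers.shout("hi")        ->  "hi!"

This at least contains the %run globals problem: only the class name lands in the calling notebook's global namespace.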

I have the exact same question. Databricks seems nice for some initial exploration / playing around, but once it moves toward professional use, being able to use only notebooks sets severe limits...
