Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How do I import classes/functions so they work in Databricks and in my IDE?

avnerrhh
New Contributor III

I already saw this post

I want my code to work on both platforms (Databricks and PyCharm), is there any way to do it?

1 ACCEPTED SOLUTION


-werners-
Esteemed Contributor III

Yes.

One way is to develop everything locally on your PC, which means you also need Spark installed locally.

This is of course not ideal, as you will miss some of the interesting features Databricks provides.

But it can be done: what you have to do is build a wheel (.whl) and install it on Databricks.
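As a minimal sketch of the wheel approach (the package name `mylib` is made up for illustration, not from the thread):

```python
# setup.py - minimal config for building a wheel of shared code.
# "mylib" is a hypothetical package name; substitute your own.
from setuptools import setup, find_packages

setup(
    name="mylib",
    version="0.1.0",
    packages=find_packages(),
)
```

You would then build the wheel with `python setup.py bdist_wheel` (or `pip wheel .`) and upload the resulting .whl file as a library on your Databricks cluster.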

A much better way imo is to use databricks-connect. This tool lets you work in your IDE (PyCharm) while the code runs on Databricks, not locally:

https://docs.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect

Keep in mind that databricks-connect releases lag behind the fast release schedule of the Databricks platform.

More recently, you can also use Databricks Repos for modules and functions:

https://docs.microsoft.com/en-us/azure/databricks/repos#work-with-python-and-r-modules
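The Repos approach works because Databricks puts the repo root on `sys.path`, so a plain .py file in the repo is importable like any local module. A generic sketch of that mechanism, runnable anywhere (the module name `helpers` and its contents are made up for illustration):

```python
import importlib
import sys
import tempfile
from pathlib import Path

# Simulate a repo checkout containing a plain .py module (hypothetical "helpers").
repo_root = Path(tempfile.mkdtemp())
(repo_root / "helpers.py").write_text(
    "def double(x):\n"
    "    return 2 * x\n"
)

# Databricks Repos does the equivalent of this for the repo root,
# which is why `import helpers` works both in a notebook inside the
# repo and in a local checkout of the same repo.
sys.path.append(str(repo_root))

helpers = importlib.import_module("helpers")
print(helpers.double(21))  # -> 42
```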


6 REPLIES


avnerrhh
New Contributor III

Yes, I've used databricks-connect to do so.

When I work with PyCharm, '%run' doesn't work but 'import' does; when I work with Databricks, 'import' doesn't work but '%run' does.

How can I resolve this?

avnerrhh
New Contributor III

Never mind, I found that I need to use 'Files' to do this. Thanks 😄

Hi @Avner Huri,

Would you mind elaborating on what you mean by using 'Files'?

It seems I have the same challenge:

1) I already use databricks-connect

2) I still have an issue exploring some notebooks in PyCharm, because the %run magic is not treated as an import

avnerrhh
New Contributor III

Files - if you want the 'import' statement to import code into your notebook, your code must be in a 'File' (that's what it's called in the Databricks UI). If you want to import a notebook, you have to use %run.

Yes, the %run command is a problem. I didn't try to solve it; I just avoided using it where I could.

But I think you can create a wrapper that uses %run when you are on Databricks and import otherwise.
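A minimal sketch of that wrapper idea, assuming you detect Databricks via the `DATABRICKS_RUNTIME_VERSION` environment variable that Databricks clusters set (the branch bodies are placeholders, not from the thread):

```python
import os

def running_on_databricks(env=None):
    """Return True when executing on a Databricks cluster.

    Databricks runtimes set DATABRICKS_RUNTIME_VERSION; a local
    PyCharm interpreter normally does not.
    """
    env = os.environ if env is None else env
    return "DATABRICKS_RUNTIME_VERSION" in env

if running_on_databricks():
    # On Databricks, pull in notebook code here. Note that %run is a
    # notebook magic, so it only works inside a notebook cell, not in
    # a plain .py file.
    pass
else:
    # Locally, fall back to an ordinary import of the same code.
    pass

print(running_on_databricks({}))                                     # False
print(running_on_databricks({"DATABRICKS_RUNTIME_VERSION": "9.1"}))  # True
```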

Ok, thanks.

That wrapper idea is interesting; maybe I could do something like that.
