Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How to run a notebook in a .py file in databricks

New Contributor

The situation is that my colleague was using PyCharm and now needs to adapt to Databricks. They currently do their work by connecting VS Code to Databricks and running .py files on Databricks clusters.

The problem is that they want to call a Databricks notebook from the .py file in VS Code, but the %run command does not work because it is only available inside notebooks. Is there a way to do this?

Thanks so much!


Contributor II

One way I can think of to achieve this is to call the Databricks Jobs REST API: create a job with a single notebook task, then trigger that job from your .py file.
Check out this documentation for more details.
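A minimal sketch of that approach, assuming you already created a one-task job: trigger it with the Jobs 2.1 `run-now` endpoint. `host`, `token`, and `job_id` are placeholders you would fill in from your workspace; the helper that builds the request body is factored out so it can be checked on its own.

```python
import json
import urllib.request


def build_run_now_payload(job_id, notebook_params=None):
    """Build the JSON body for the Jobs 2.1 run-now endpoint."""
    payload = {"job_id": job_id}
    if notebook_params:
        # Passed through to the notebook's widgets as string parameters.
        payload["notebook_params"] = notebook_params
    return payload


def run_notebook_job(host, token, job_id, notebook_params=None):
    """Trigger the job and return the run_id of the new run."""
    req = urllib.request.Request(
        f"{host}/api/2.1/jobs/run-now",
        data=json.dumps(build_run_now_payload(job_id, notebook_params)).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["run_id"]
```

You can then poll `/api/2.1/jobs/runs/get` with the returned run_id to wait for the notebook to finish.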

Harshit Kesharwani
Self-taught Data Engineer | Seeking Remote Full-time Opportunities

Contributor III

You can do so by adding your own dbutils helper function to your .py file:


def get_dbutils():
    """This is to make your local env (and flake8) happy."""
    from pyspark.sql import SparkSession

    spark = SparkSession.getActiveSession()
    if spark.conf.get("spark.databricks.service.client.enabled") == "true":
        # Running via Databricks Connect: build DBUtils from the Spark session.
        from pyspark.dbutils import DBUtils

        return DBUtils(spark)
    else:
        # Running inside a Databricks notebook: dbutils already exists
        # in the IPython user namespace.
        import IPython

        return IPython.get_ipython().user_ns["dbutils"]


And then run the notebook from the .py file like this:


get_dbutils().notebook.run('path/to/notebook', <timeout_in_seconds>)


Good luck!
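To make the call pattern concrete, here is a hedged usage sketch. On a real cluster, get_dbutils() returns the live DBUtils handle; below, a tiny stub stands in for it so the pattern (passing parameters in, getting the notebook's exit value back) can be shown end to end. The path, timeout, and parameter names are placeholders.

```python
class _StubNotebook:
    def run(self, path, timeout_seconds, arguments=None):
        # dbutils.notebook.run returns whatever the notebook passes to
        # dbutils.notebook.exit(); this stub just echoes its inputs.
        return f"ran {path} with {arguments or {}}"


class _StubDBUtils:
    notebook = _StubNotebook()


def run_notebook(dbutils, path, timeout_seconds=600, arguments=None):
    """Run a Databricks notebook and return its exit value as a string."""
    return dbutils.notebook.run(path, timeout_seconds, arguments=arguments)


# On Databricks you would pass get_dbutils() instead of the stub.
result = run_notebook(
    _StubDBUtils(),
    "/path/to/notebook",
    arguments={"run_date": "2024-01-01"},
)
```

The `arguments` dict surfaces in the target notebook as widget values, and the string returned by run is whatever the notebook handed to dbutils.notebook.exit, which is handy for chaining results back into your .py script.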



