Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Pass variable from one notebook to another

ADB0513
New Contributor III

I have a main notebook where I set a Python variable to the name of the catalog I want to work in. I then call another notebook with %run, and that notebook runs an INSERT INTO SQL statement in which I want to specify the catalog using the variable from the main notebook. What is the best way to accomplish this? Is there a way to pass the variable to the notebook called by %run? Should I use a widget? Should I use spark.conf?

Thanks in advance for the help.

1 REPLY

Kaniz_Fatma
Community Manager

Hi @ADB0513, to pass variables between notebooks in Databricks you have three main options: **Widgets**, where the calling notebook passes a parameter and the called notebook retrieves it with `dbutils.widgets.get()`; **spark.conf**, where the calling notebook sets a value with `spark.conf.set()` and the called notebook reads it with `spark.conf.get()`; and **global variables**, which are less recommended because of scoping issues. Widgets and `spark.conf` are generally the more robust and maintainable choices, especially in CI/CD setups.
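
For example, here is a minimal sketch of the widget and `spark.conf` approaches (the notebook path `./child_notebook`, the widget name `catalog_name`, the config key `myapp.catalog`, and the table names are placeholders for illustration):

```python
# Parent notebook (sketch): choose the catalog and hand it to the child.
catalog = "dev_catalog"

# Option A - widget argument on %run. Note that %run arguments must be
# literal strings; they cannot reference the Python variable above.
# Run this in its own cell:
#   %run ./child_notebook $catalog_name="dev_catalog"

# Option B - spark.conf: set a runtime config the child can read after %run.
spark.conf.set("myapp.catalog", catalog)
```

```python
# Child notebook (sketch): pick up the value and use it in the INSERT.
dbutils.widgets.text("catalog_name", "")          # Option A: declare the widget with a default
catalog = dbutils.widgets.get("catalog_name")     # value supplied by the parent's %run arguments

# catalog = spark.conf.get("myapp.catalog")       # Option B: read the runtime config instead

spark.sql(f"USE CATALOG {catalog}")
spark.sql("""
    INSERT INTO my_schema.my_table
    SELECT * FROM my_schema.staging_table
""")
```

Because %run arguments have to be literals, `spark.conf` is usually the more flexible choice when the catalog name is computed at runtime rather than hard-coded.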

Would you like more details on any of these methods? Or do you have any other questions about your CI/CD setup with Databricks Asset Bundles?
