Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Pass variable from one notebook to another

ADB0513
New Contributor III

I have a main notebook where I set a Python variable to the name of the catalog I want to work in. I then call another notebook with %run, which runs an INSERT INTO via a SQL command, and I want that SQL to target the catalog named by the variable from the main notebook. What is the best way to accomplish this? Is there a way to pass the variable to the notebook called by %run? Should I use a widget? Should I use spark.conf?
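A minimal sketch of the setup being described (the variable value, notebook path, and table names are made up for illustration):

```python
# Main notebook: choose the catalog to work in (value is hypothetical).
catalog = "dev_catalog"

# The next cell would hand off to the child notebook, e.g.:
#   %run ./insert_notebook
# The child notebook runs an INSERT INTO statement and needs to target
# f"{catalog}.some_schema.some_table" using the variable set above.
print(catalog)
```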


Thanks in advance for the help.

1 REPLY

Retired_mod
Esteemed Contributor III

Hi @ADB0513, To pass variables between notebooks in Databricks, you can use three main methods: **Widgets**, where you create and retrieve parameters using `dbutils.widgets` in both notebooks; **spark.conf**, where you set and get configuration parameters; and **Global Variables**, which are less recommended due to scope issues. Widgets and `spark.conf` are generally more robust and maintainable, especially for CI/CD setups.
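A rough sketch of the two recommended handoffs (the conf key, widget name, and table names are made up; `spark` and `dbutils` exist only inside a Databricks notebook, so those calls appear as comments while the plain-Python string-building is runnable):

```python
# --- Main notebook (hypothetical) ---
catalog = "dev_catalog"

# Option 1: spark.conf -- set before the %run, read in the child notebook:
#   spark.conf.set("app.catalog", catalog)    # main notebook
#   catalog = spark.conf.get("app.catalog")   # child notebook
#
# Option 2: widgets -- the child declares a widget and reads its value:
#   dbutils.widgets.text("catalog", "dev_catalog")  # child notebook
#   catalog = dbutils.widgets.get("catalog")        # child notebook

# Either way the child ends up with `catalog` as a Python string and can
# build its SQL from it; that part is plain Python:
def build_insert(catalog: str) -> str:
    """Fully qualify the INSERT target with the passed-in catalog name."""
    return f"INSERT INTO {catalog}.sales.orders SELECT * FROM staging.orders"

print(build_insert(catalog))
```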

Would you like more details on any of these methods?
