Pass variable from one notebook to another
08-06-2024 08:23 AM
I have a main notebook where I set a Python variable to the name of the catalog I want to work in. I then call another notebook using %run, which runs an INSERT INTO SQL statement where I want to specify the catalog using the catalog variable from the main notebook. What is the best way to accomplish this? Is there a way to pass the variable to the notebook being called with %run? Should I use a widget? Should I use spark.conf?
Thanks in advance for the help.
Labels: Spark
08-08-2024 01:07 AM
Hi @ADB0513, to pass variables between notebooks in Databricks you have three main options: **widgets**, where the called notebook declares a widget and reads it with `dbutils.widgets.get`; **`spark.conf`**, where the main notebook sets a session configuration value and the called notebook reads it back; and **global variables**, which are less recommended because of scoping issues. Widgets and `spark.conf` are generally the more robust and maintainable choices, especially in CI/CD setups. A sketch of the first two approaches follows below.
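To make this concrete, here is a minimal sketch of the widget and `spark.conf` approaches. It relies on the fact that `%run` executes the child notebook in the same execution context as the caller. The notebook path `./insert_child`, the config key `mypipeline.catalog`, and the table names are placeholders for illustration, not names from your setup.

```python
# --- Main notebook ---
catalog = "dev_catalog"  # placeholder catalog name

# Approach 1: spark.conf
# Because %run executes the child notebook in the same Spark session,
# a session config value set here is readable in the child notebook.
spark.conf.set("mypipeline.catalog", catalog)

# Approach 2: widgets
# A widget defined in this context can be read in the child with
# dbutils.widgets.get. Note: if the widget already exists, its current
# value is kept rather than reset to the default passed here.
dbutils.widgets.text("catalog", catalog)

# In a separate cell (must be on its own line in its own cell):
# %run ./insert_child
```

```python
# --- Child notebook (invoked via %run ./insert_child) ---

# Approach 1: read the config set by the main notebook
catalog = spark.conf.get("mypipeline.catalog")

# Approach 2: read the widget instead
# catalog = dbutils.widgets.get("catalog")

# Build the SQL in Python so the catalog name is substituted explicitly,
# rather than relying on a particular SQL substitution syntax.
spark.sql(f"USE CATALOG {catalog}")
spark.sql(f"""
    INSERT INTO {catalog}.my_schema.target_table
    SELECT * FROM {catalog}.my_schema.source_table
""")
```

If you prefer to keep the SQL in a `%sql` cell instead of `spark.sql`, the widget approach is usually the easier fit, since widget values can be referenced directly in SQL cells.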
Would you like more details on any of these methods? Or do you have any other questions about your CI/CD setup with Databricks Asset Bundles?

