Global or environment parameters.
01-23-2024 08:57 AM
Hi All,
I need help creating a utility file that can be used in PySpark notebooks.
The utility file contains variables such as database and schema names, and I need to pass these variables to other notebooks wherever I reference the database and schema.
Thanks
01-23-2024 01:24 PM
Hi. Have you looked at using cluster Spark configurations? We're using one to set the default catalog, for example, and it works quite well. I don't know for certain, but there might be an option to pre-set the database as well.
Here's a topic about setting catalog: Re: Set default database thru Cluster Spark Config... - Databricks - 47645
01-23-2024 02:39 PM
You can pass the actual value into the notebook either through a job parameter named "param_catalog" or through a text widget (dbutils.widgets.text) named "param_catalog".
01-23-2024 02:46 PM
Oh yeah, using widgets, of course!

