Need help creating a utility file that can be used in a PySpark notebook.
The utility file contains variables such as database and schema names, and I need to pass these variables to other notebooks wherever I reference the database and schema.
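One common pattern for this (a minimal sketch; the notebook path, variable names, and table name below are hypothetical, not from this thread) is to keep the variables in a small utility notebook and pull them into each consumer notebook with the %run magic, which executes the utility notebook in the current session so its variables become available:

# Notebook: ./utils/config  (hypothetical path)
# Central place for shared names used across notebooks.
database_name = "sales_db"   # example database
schema_name = "bronze"       # example schema

# Consumer notebook, first cell (%run must be alone in its cell):
%run ./utils/config

# Later cells can use the variables defined in ./utils/config:
df = spark.table(f"{database_name}.{schema_name}.orders")  # example table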
Hi. Have you looked at using cluster Spark configurations? We're using one for the default catalog, for example, and it works quite well. I don't know for certain, but there might be an option to pre-set databases as well.
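A sketch of that approach, assuming you define your own custom keys in the cluster's Spark config (the key names spark.myapp.database and spark.myapp.schema are made-up examples, not built-in settings):

# In the cluster's Spark config (Advanced options), set custom keys, e.g.:
#   spark.myapp.database sales_db
#   spark.myapp.schema   bronze
# Any notebook attached to that cluster can then read them:
database_name = spark.conf.get("spark.myapp.database")
schema_name = spark.conf.get("spark.myapp.schema")

df = spark.table(f"{database_name}.{schema_name}.orders")  # example table

This keeps the values in one place per environment (dev/prod clusters can carry different values), at the cost of requiring a cluster edit to change them.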
Reference tables as ${param_catalog}.schema.tablename, then pass the actual value into the notebook either through a job parameter named "param_catalog" or through a text widget (dbutils.widgets) called "param_catalog".
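A minimal sketch of the widget variant (the default value and table name are illustrative; a job parameter with the same name overrides the widget's default at run time):

# Create a text widget named "param_catalog" with a default for interactive runs
dbutils.widgets.text("param_catalog", "dev_catalog")

# Read the current value (job parameter or widget input)
param_catalog = dbutils.widgets.get("param_catalog")

# Use it when building table references in Python...
df = spark.table(f"{param_catalog}.schema.tablename")

# ...or, in a %sql cell, via ${param_catalog} substitution as above:
# SELECT * FROM ${param_catalog}.schema.tablename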