05-09-2024 10:57 PM
@SamGreene
Simply write your SQL queries as Python variables and then run them through
spark.sql(qry)
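For illustration, a cell following that pattern might look like this (the catalog, schema, table, and source path are all hypothetical, and spark is the session object Databricks predefines in notebooks):

# Hypothetical parameter values; in practice these could come from
# widgets or job parameters rather than literals.
catalog = "dev"
schema = "bronze"

# Compose the statement as an ordinary Python string, then run it.
qry = f"""
COPY INTO {catalog}.{schema}.my_table
FROM '/Volumes/landing/files'
FILEFORMAT = PARQUET
"""
spark.sql(qry)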
05-10-2024 04:31 PM
Thanks for the suggestion, but we are using SQL in these notebooks, and the Databricks documentation says COPY INTO supports the IDENTIFIER clause. I need to find a way to parameterize SQL notebooks so they can run against a different catalog/schema.
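For reference, the IDENTIFIER clause can be combined with named parameter markers so the SQL text itself stays static; a minimal sketch run from a Python cell (the table name is hypothetical):

# Named parameter markers (Spark 3.4+) let IDENTIFIER resolve the
# qualified table name at run time without string interpolation.
spark.sql(
    "SELECT * FROM IDENTIFIER(:tbl)",
    args={"tbl": "dev.bronze.my_table"},
)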
05-15-2024 11:05 AM
I would use widgets in the notebook, which Jobs can populate. SQL in notebooks can use parameters, as can the SQL in Jobs, now that parameterized queries are supported.
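As a sketch of that approach (the widget names here are illustrative, not from the thread), a Python cell declares the widgets and a Job can override them through its notebook parameters:

# Declare text widgets with defaults; a Job run can override both
# values via its notebook parameters.
dbutils.widgets.text("catalog", "dev")
dbutils.widgets.text("schema", "bronze")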
06-05-2024 10:33 AM
The solution that worked was adding this Python cell to the notebook:
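(The cell itself did not survive in this copy of the thread; what follows is only a plausible sketch consistent with the widget suggestion above, with hypothetical widget names, not the poster's actual cell.)

# Read the catalog/schema passed in via widgets or job parameters.
catalog = dbutils.widgets.get("catalog")
schema = dbutils.widgets.get("schema")

# Point the session at that environment so the notebook's SQL cells
# run against the chosen catalog and schema.
spark.sql(f"USE CATALOG {catalog}")
spark.sql(f"USE SCHEMA {schema}")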