Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Workflow parameters

BradSheridan
Valued Contributor

Hey everyone! I'm close but can't seem to figure this out. I'm trying to add two notebooks to a Databricks Job. Instead of the first command in both notebooks being a connection to an RDS/Redshift cluster, I'd prefer to make that connection once and share it across both notebooks without using the %run command (I'd rather see both notebooks as separate tasks in my job/workflow).

thanks!

Brad

UPDATE: I set two parameters at the Job level, and in each notebook I just added myvariablename = dbutils.widgets.get("param1name") and, likewise, myvariablename2 = dbutils.widgets.get("param2name"). BUT: is there a better/best-practice way of doing this?
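
A minimal sketch of that pattern, assuming hypothetical parameter names (redshift_host, redshift_db), a hypothetical secret scope for the credentials, and a hypothetical table name. Since each task runs in its own notebook context, it's the job parameters that get shared, not the connection object itself; each task rebuilds its connection from them:

# Sketch of the job-parameter pattern above. The parameter names, secret
# scope/keys, and table name are illustrative assumptions, not fixed names.

# Job-level parameters arrive in each task's notebook as widgets.
host = dbutils.widgets.get("redshift_host")
database = dbutils.widgets.get("redshift_db")

jdbc_url = f"jdbc:redshift://{host}:5439/{database}"

# Credentials would typically come from a secret scope, not a widget.
user = dbutils.secrets.get(scope="my-scope", key="redshift-user")
password = dbutils.secrets.get(scope="my-scope", key="redshift-password")

# Each task builds its own reader from the shared parameters.
df = (spark.read
      .format("jdbc")
      .option("url", jdbc_url)
      .option("dbtable", "public.some_table")
      .option("user", user)
      .option("password", password)
      .load())

This keeps both notebooks visible as separate tasks in the workflow while centralizing the connection details at the Job level.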

