Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Is it possible to have customized task parameter variables

yzhang
New Contributor III

I found some Task parameter variables in this document: https://community.databricks.com/s/feed/0D58Y0000A7AUWhSQO. It sounds like the double-curly-brace Task parameter variables would make jobs much more flexible. However, from the documentation, it seems that only a whitelisted set of variables is supported. Is it possible to have customized (i.e., created by myself) task variables?

1 REPLY

Anonymous
Not applicable

@Yanan Zhang:

As per the documentation you shared, Databricks Task parameter variables are used to parameterize notebook tasks in a Databricks workspace. These variables pass runtime context values into the notebook being executed as a task. However, it appears that Databricks currently only supports a predefined set of whitelisted Task parameter variables, and the documentation does not mention support for custom variables.

The whitelisted Task parameter variables that are currently supported in Databricks include:

  • {{task_id}}: The unique identifier of the task.
  • {{task_run_number}}: The number of times the task has been run.
  • {{task_run_id}}: The unique identifier of the current task run.
  • {{task_retry_number}}: The number of times the task has been retried.
  • {{task_max_retries}}: The maximum number of retries allowed for the task.

These variables can be used in notebook cells or in the notebook parameters to dynamically pass values during the execution of notebook tasks.
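To illustrate why custom names are not picked up, here is a minimal sketch (plain Python, not Databricks internals) of whitelist-based substitution: only recognized variable names are replaced, and anything else is left untouched. The variable names and values are hypothetical examples.

```python
import re

# Hypothetical whitelist mapping variable names to runtime values.
# In Databricks, the platform supplies these; here they are hard-coded.
WHITELIST = {
    "task_id": "task_42",
    "task_run_id": "run_1001",
}

def resolve(template: str, values: dict = WHITELIST) -> str:
    """Replace {{name}} only when name is in the whitelist;
    unrecognized names pass through unresolved."""
    def sub(match: re.Match) -> str:
        name = match.group(1)
        return str(values[name]) if name in values else match.group(0)
    return re.sub(r"\{\{(\w+)\}\}", sub, template)

print(resolve("path=/runs/{{task_run_id}}"))    # whitelisted -> substituted
print(resolve("path=/runs/{{my_custom_var}}"))  # custom -> left as-is
```

This mirrors the behavior the question is running into: a `{{my_custom_var}}` placeholder has no entry in the whitelist, so the literal text survives into the task.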

If you need to use custom variables or parameters in your Databricks tasks, you may need to implement your own logic within the notebooks to handle these custom variables, such as passing them as input parameters or reading them from external sources like configuration files or environment variables. You can use standard Python or Scala code within the notebooks to handle these custom variables as needed.
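As one sketch of the workaround described above, the helper below reads a custom parameter from an environment variable, falls back to a JSON configuration file, and finally to a default. The function name, config path, and parameter names are all hypothetical; inside a Databricks notebook you could equally source the value from a notebook widget.

```python
import json
import os

def get_custom_param(name, config_path="config.json", default=None):
    """Resolve a custom parameter: environment variable first,
    then a JSON config file, then a caller-supplied default."""
    if name in os.environ:
        return os.environ[name]
    if os.path.exists(config_path):
        with open(config_path) as f:
            cfg = json.load(f)
        if name in cfg:
            return cfg[name]
    return default

# With no MY_ENV variable set and no config.json present,
# the default is used.
print(get_custom_param("MY_ENV", default="dev"))
```

The layered fallback keeps notebooks runnable both in a job (where the parameter is injected) and interactively (where the default applies).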
