Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
In the VS Code Databricks extension, how can one specify notebook parameters to pass to a workflow job?

matkap
New Contributor II

I have successfully used the VS Code extension for Databricks to run a notebook on a cluster from my IDE. However, in order to test effectively without changing the source, I need a way to pass parameters to the workflow job.

I have tried various ways of specifying notebook_params in launch.json (see the example below), but none have been received in the run, as shown in the Job Run's task run details.

Is there a way to pass parameters (to be read via dbutils.widgets) in the VS Code Databricks extension?

{
  // Use IntelliSense to learn about possible attributes.
  // Hover to view descriptions of existing attributes.
  // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
  "version": "0.2.0",
  "configurations": [
    {
      "type": "databricks-workflow",
      "request": "launch",
      "name": "Run on Databricks as a Workflow",
      "program": "${file}",
      "parameters": {"name": "John Doe"},
      "args": []
    }
  ]
}
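One workaround inside the notebook itself, independent of whether the extension injects parameters, is to read widgets defensively so the same code runs both on Databricks and locally. This is only a sketch: the widget name "name" and the default value are illustrative, not something the extension docs prescribe.

```python
def get_param(name: str, default: str) -> str:
    """Return the widget value when running on Databricks, else a local default."""
    try:
        # dbutils is injected into Databricks notebook runtimes; it is not importable
        return dbutils.widgets.get(name)
    except NameError:
        # Running outside Databricks (e.g. plain local Python): fall back to the default
        return default

user = get_param("name", "John Doe")
print(f"Hello, {user}")
```

With this pattern, a job run that does pass notebook_params picks up the widget value, while an F5 run from the IDE that passes nothing still executes with the default instead of failing.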

2 REPLIES

AsphaltDataRide
New Contributor III

Is there a solution for this?

AsphaltDataRide
New Contributor III

@matthew kaplan I am not using widgets, but what works for me is pressing F5 in the Python file you want to run.
