
In the VSCode Databricks extension, how can one specify notebook parameters to pass to a workflow job?

New Contributor II

I have successfully used the VSCode extension for Databricks to run a notebook on a cluster from my IDE. However, in order to test effectively without changing the source, I need a way to pass parameters to the workflow job.

I have tried various ways of specifying notebook_params in launch.json (see an example below), but none of them are received by the run, as shown in the job run's task details.

Is there a way to pass parameters (to be read via dbutils.widgets) in the VSCode Databricks extension?
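For context, here is a minimal sketch of the notebook side of this pattern. The helper get_param and the fallback behavior are my own illustration, not part of the extension: it reads a widget via dbutils.widgets.get when running on Databricks and falls back to a default otherwise, so the same file can run locally.

```python
def get_param(name, default=None):
    """Read a notebook parameter via dbutils.widgets, falling back to a
    default when the widget (or dbutils itself) is unavailable.

    On Databricks, dbutils is injected into the notebook's globals; outside
    a Databricks runtime the lookup raises NameError, and a missing widget
    raises its own error, so a broad except covers both cases.
    """
    try:
        return dbutils.widgets.get(name)
    except Exception:
        return default


# Example: read the "name" parameter the job is expected to supply.
name = get_param("name", default="fallback-name")
```

This only helps once parameters actually reach the run; the question of how to make the extension deliver them remains.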


{
  // Use IntelliSense to learn about possible attributes.
  // Hover to view descriptions of existing attributes.
  // For more information, visit:
  "version": "0.2.0",
  "configurations": [
    {
      "type": "databricks-workflow",
      "request": "launch",
      "name": "Run on Databricks as a Workflow",
      "program": "${file}",
      "parameters": {"name": "John Doe"},
      "args": []
    }
  ]
}





New Contributor III

Is there a solution for this?

New Contributor III

@matthew kaplan I am not using widgets, but what works for me is pressing F5 in the Python file you want to run.
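If you go the non-widget route the answer above suggests, one pattern worth trying is plain command-line arguments: with the file-run ("databricks") configuration type, the "args" array in launch.json is, as far as I can tell, forwarded to the script, so parameters can be read with argparse instead of dbutils.widgets. This is a sketch under that assumption; verify the behavior against your extension version.

```python
import argparse


def parse_params(argv=None):
    """Parse parameters passed as command-line arguments.

    Assumption: entries in launch.json's "args" reach the script as
    sys.argv when the file is run via the extension (argv=None makes
    argparse read sys.argv[1:] automatically).
    """
    parser = argparse.ArgumentParser()
    parser.add_argument("--name", default="default-name")
    return parser.parse_args(argv)


# Simulates "args": ["--name", "John Doe"] in launch.json.
params = parse_params(["--name", "John Doe"])
```

The same script still runs unmodified from a terminal, which makes local testing easier than widget-based parameters.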
