05-03-2023 04:10 PM
I have successfully used the Databricks extension for VSCode to run a notebook on a cluster from my IDE. However, to test effectively without changing the source, I need a way to pass parameters to the workflow job.
I have tried various ways of specifying notebook parameters in launch.json (see the example below), but none of them are received by the run, as shown in the Job Run's Task run details.
Is there a way to pass parameters (to be read via dbutils.widgets) in the VSCode Databricks extension?
{
    // Use IntelliSense to learn about possible attributes.
    // Hover to view descriptions of existing attributes.
    // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
    "version": "0.2.0",
    "configurations": [
        {
            "type": "databricks-workflow",
            "request": "launch",
            "name": "Run on Databricks as a Workflow",
            "program": "${file}",
            "parameters": {"name": "John Doe"},
            "args": []
        }
    ]
}
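On the notebook side, one way to make the run robust regardless of whether the `parameters` field reaches the job is to register the widget with a default value. The sketch below assumes the widget key `name` matches the parameter name in the launch.json above; the `try`/`except` fallback is only there so the same cell also runs outside a Databricks runtime, where `dbutils` is not defined.

```python
# Sketch: read a job parameter via dbutils.widgets, with a default so the
# notebook still works when no parameter is passed to the run.
try:
    # Register the widget with a default; returns the passed value if one
    # was supplied, otherwise the default. "name" is an assumed widget key.
    dbutils.widgets.text("name", "default-name")
    name = dbutils.widgets.get("name")
except NameError:
    # dbutils only exists inside a Databricks runtime; fall back locally.
    name = "default-name"

print(f"Hello, {name}")
```

Outside Databricks (e.g. a plain local run), this prints the default value; inside a job run it would print whatever value actually reached the widget.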
06-16-2023 07:34 AM
Is there a solution for this?
06-19-2023 03:58 AM
@matthew kaplan I am not using widgets, but what works for me is pressing F5 in the Python file you want to run.