Hello.
Do you know how to solve an HTTPSConnectionPool issue that occurs when using the SDK WorkspaceClient in a notebook that runs as a workflow task?
I would like to trigger a job when certain conditions are met. The conditions are evaluated in Python, and I use the SDK to trigger the job (run_now() + WorkspaceClient()). When I run the notebook manually with the 'Run all' button, everything works fine: the host and port are recognised correctly, and I can use run_now() to trigger another job.
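For context, the triggering logic is essentially the following sketch (the job name is a placeholder, and the lookup-then-trigger structure is my own framing of what the notebook does):

```python
def trigger_job(job_name: str) -> bool:
    # Imported inside the function so the snippet is importable even where
    # the databricks-sdk package is not installed.
    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient()  # no args: relies on default notebook authentication
    for job in w.jobs.list(name=job_name):
        w.jobs.run_now(job_id=job.job_id)  # trigger the target job
        return True
    return False  # no job with that name found
```

Run manually via 'Run all', this works; run as a workflow task, the jobs.list call fails with the error below.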
However, when I added the notebook (with the whole logic) to an existing workflow as another task (dependent on a different one), HTTPSConnectionPool had no information about the host, and the error below appeared:
HTTPSConnectionPool(host='none', port=443): Max retries exceeded with url: /api/2.1/jobs/list?name=XXX (Caused by NewConnectionError(': Failed to establish a new connection: [Errno -2] Name or service not known'))
Do you know how we can use a notebook with the SDK inside a workflow? Should I set up some additional environment variables or credentials?
According to the Databricks documentation, the Databricks SDK for Python uses default Databricks notebook authentication, with no special requirements. In my case, that doesn't work.
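To make the question concrete, by "additional environment variables or credentials" I mean something like the following sketch, where the workspace URL and token are placeholders, not values from my setup:

```python
import os

# Assumed placeholders -- the real values would come from the workspace URL
# and, e.g., a token stored in a secret scope.
os.environ.setdefault("DATABRICKS_HOST", "https://<workspace-url>")
os.environ.setdefault("DATABRICKS_TOKEN", "<personal-access-token>")


def make_client():
    # Imported inside the function so the snippet is importable even where
    # the databricks-sdk package is not installed.
    from databricks.sdk import WorkspaceClient

    # Pass host/token explicitly instead of relying on notebook-native auth.
    return WorkspaceClient(
        host=os.environ["DATABRICKS_HOST"],
        token=os.environ["DATABRICKS_TOKEN"],
    )
```

Is explicit configuration like this the expected approach for notebooks running as workflow tasks, or should the default authentication work there too?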
Any suggestions?