Hi Everyone,
I'm using the Databricks VS Code Extension to develop and deploy Asset Bundles. Usually we work with notebooks and use the "Run File as Workflow" function. Now I'm trying to use a pure Python file for a new use case and tried the "Upload and Run File" function in the VS Code Extension. However, I get a success message almost immediately saying the job is done, without the job actually executing the code (at least that's how it looks).
In the log4j output I can see that the REPL session is stopped after a few milliseconds (even if I add a 10-second sleep to the file).
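For reference, this is roughly the kind of minimal repro file I'm testing with (names and exact contents are illustrative, not my actual use case). If "Upload and Run File" really executed it, the run should take at least the sleep duration and both prints should show up in the output:

```python
import time

SLEEP_SECONDS = 10  # the run reports success long before this could elapse

print(f"start: {time.strftime('%H:%M:%S')}")
time.sleep(SLEEP_SECONDS)
print(f"end:   {time.strftime('%H:%M:%S')}")
```

Instead, the extension reports success within milliseconds and neither print appears.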
Running the same file directly on the platform works as expected, and execution via Databricks Connect works as well. The workspace we are using is deployed in our Azure VNet and has external access disabled; however, I'm in a peered network.
How can I get this to work?