Hi All,
I am using a Python wheel to run ingestions with Databricks workflows, with a separate entry point in the wheel for each workflow. The .whl file also contains a module named functions.py with several shared functions that the different ingestion scripts import. An import in an ingestion script looks like this:
from apps.functions import some_function
The functions import and work correctly when I use a custom cluster for compute. However, when the workflow runs on serverless compute, the functions don't seem to be imported at all. For example, one of the functions appends a load date column to a dataframe: when the data is loaded on a compute cluster the load date is appended correctly, but when the same job runs on serverless compute no load date gets appended.
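For context, the shared helper is along these lines. This is a hypothetical sketch (the name `append_load_date` and the use of pandas here are my illustration; the real code presumably operates on a Spark DataFrame), just to show the kind of function that works on the custom cluster but silently has no effect on serverless:

```python
from datetime import date

import pandas as pd


def append_load_date(df: pd.DataFrame, column: str = "load_date") -> pd.DataFrame:
    """Return a copy of df with today's date added as a new column.

    Hypothetical example of a helper in apps/functions.py; the real
    version would likely use Spark (e.g. withColumn + current_date()).
    """
    out = df.copy()
    out[column] = date.today()
    return out


# Minimal usage example
df = append_load_date(pd.DataFrame({"id": [1, 2]}))
print(list(df.columns))  # expect ['id', 'load_date']
```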
What am I missing here to make sure the functions module gets imported correctly across the different ingestion entry points when running on serverless compute?