I have a spark-submit job that runs a single Python file, main.py. main.py imports another file, alert.py, and also reads several config files.
alert.py is passed via --py-files and the config files via --files. All of the files live in S3.
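For reference, the current invocation looks roughly like this (the bucket and file names below are placeholders, not the real paths):

```sh
spark-submit \
  --py-files s3://my-bucket/code/alert.py \
  --files s3://my-bucket/conf/app.conf,s3://my-bucket/conf/db.conf \
  s3://my-bucket/code/main.py
```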
I need help running the same job in Databricks.
I have tried both the Spark Submit task and the Spark Python task, but I'm not sure how to pass --py-files and --files. Even when I pass them in the task parameters, I get "No module named alert", which suggests none of the other files are being picked up.
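For example, in the Spark Submit task I entered parameters roughly like this (again with placeholder S3 paths, since I'm not sure this is the right way to pass them):

```json
[
  "--py-files", "s3://my-bucket/code/alert.py",
  "--files", "s3://my-bucket/conf/app.conf,s3://my-bucket/conf/db.conf",
  "s3://my-bucket/code/main.py"
]
```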