Hi @AmandaOhare ,
Yes, exactly. To trigger a workflow based on some kind of event, you have to use an event-driven cloud-native service: Azure Functions on Azure, or AWS Lambda on AWS.
Another approach is to use Structured Streaming in continuous mode. Then, whenever something is put on the queue, Spark will consume it automatically. But that has a major drawback: your cluster needs to run constantly.
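A minimal sketch of that second approach, assuming a Kafka-compatible queue (the broker address `broker:9092` and topic name `events` are placeholders, not from the original post):

```python
# Always-on Structured Streaming job that consumes a queue as messages arrive.
# Requires a running Spark cluster and a reachable Kafka broker.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("queue-consumer").getOrCreate()

# Read new messages from the queue as an unbounded stream.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Process each message as it lands; the query runs until stopped,
# which is why the cluster has to stay up the whole time.
query = (
    events.selectExpr("CAST(value AS STRING) AS payload")
    .writeStream
    .format("console")
    .trigger(continuous="1 second")  # continuous processing mode
    .start()
)
query.awaitTermination()
```

Since this job never terminates on its own, you pay for the cluster 24/7, which is exactly the trade-off against the event-driven Functions/Lambda route.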