Hi Community, I'm looking to initialize Sentry in all notebooks that are used across multiple jobs. My goal is to capture exceptions with Sentry whenever a job runs a notebook. What's the recommended approach for initializing the Sentry packages in this c...
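A common pattern is to put the initialization in one shared helper notebook (pulled in with `%run`) or in a small wheel installed on the cluster, so every job notebook calls a single function. A minimal sketch, assuming `sentry_sdk` is installed as a cluster or notebook-scoped library; the function and environment names are placeholders, not an official Databricks or Sentry recipe:

```python
def init_sentry(dsn, environment="jobs"):
    """Initialize Sentry once per notebook run; safe to call repeatedly.

    Returns True when sentry_sdk was initialized, False when the package
    is not available (e.g. a cluster without the library installed), so
    notebooks degrade gracefully instead of failing at import time.
    """
    try:
        import sentry_sdk
    except ImportError:
        return False
    sentry_sdk.init(
        dsn=dsn,
        environment=environment,
        # unhandled exceptions in the notebook are captured automatically
        traces_sample_rate=0.0,
    )
    return True
```

Each job notebook can then start with `%run ./sentry_init` followed by something like `init_sentry(dbutils.secrets.get("monitoring", "sentry-dsn"))`, so the DSN lives in a secret scope rather than in code.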
Hi community, my team and I are using a job that is triggered on a dynamic schedule, with the schedule defined inside some of the job's tasks. However, this job is attached to a cluster that is always running and never terminates. I understand th...
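If that cluster stays up only for this job, one option is to move the job onto a job cluster, which is created for each run and torn down when the run finishes, instead of an always-on all-purpose cluster. A hedged sketch in Databricks Asset Bundle YAML; the job name, node type, and notebook path are placeholders for your own values:

```yaml
resources:
  jobs:
    dynamic_schedule_job:
      job_clusters:
        - job_cluster_key: main
          new_cluster:
            spark_version: 14.3.x-scala2.12   # placeholder runtime
            node_type_id: i3.xlarge           # placeholder node type
            num_workers: 2
      tasks:
        - task_key: send_exports
          job_cluster_key: main
          notebook_task:
            notebook_path: ../src/send_exports.ipynb
```

With this shape the dynamic schedule only pays for compute while a run is active, since the cluster no longer exists between runs.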
Hi community, I created a job using a Databricks Asset Bundle, but I'm worried about whether I'm installing its dependencies the right way: when I tested the job, it doesn't seem to install the torch library properly.
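In a bundle, libraries are usually declared per task, so the cluster installs them before the task starts. A sketch assuming a classic job cluster; the job name, paths, and the pinned torch version are placeholders:

```yaml
resources:
  jobs:
    train_job:
      tasks:
        - task_key: train
          job_cluster_key: main
          notebook_task:
            notebook_path: ../src/train.ipynb
          libraries:
            - pypi:
                package: torch==2.3.0   # pin a version you have tested
```

Pinning the version makes failures reproducible; if the install still misbehaves, the task's cluster event log shows whether the library install step succeeded.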
Hello Community, I'm facing an issue with a job that runs a notebook task. When I run the same join condition through the job pipeline, it produces different results compared to running the notebook interactively (outside the job). Why might this be ha...
Hello community, I was searching for a way to pass secrets to a spark_python_task. With a notebook file it's easy: you just call dbutils.secrets.get(...). But how do you do the same thing in a spark_python_task running on serverless compute? Kind regards,
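In a `spark_python_task` the notebook globals aren't injected, but `dbutils` can be imported from the Databricks SDK's runtime module. A minimal sketch, assuming the `databricks-sdk` package is available to the task; the scope and key names are placeholders, and the broad fallback is only there so the same file can be imported locally:

```python
def get_secret(scope: str, key: str, default=None):
    """Read a Databricks secret from inside a spark_python_task.

    On Databricks (including serverless compute) dbutils is reachable
    through the SDK's runtime module. Outside Databricks the lookup
    fails, and we return `default` so local runs don't crash.
    """
    try:
        from databricks.sdk.runtime import dbutils
        return dbutils.secrets.get(scope=scope, key=key)
    except Exception:
        return default
```

Called as `get_secret("my-scope", "api-token")` at the top of the task's Python file, this keeps the secret out of job parameters, which are visible in the job UI.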
Hi, thanks for your answer, but I was talking about a different issue. In our case, serverless compute takes several minutes to install the packages, and that's a problem because our job is made up of different tasks; if they are serverless, every run takes so...
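For serverless tasks, declaring the dependencies up front in a job-level environment lets the platform resolve and reuse that environment across tasks and runs, instead of installing packages from scratch each time. A hedged sketch in bundle YAML; the environment client version, package pin, and paths are assumptions to adapt to your workspace:

```yaml
resources:
  jobs:
    export_job:
      environments:
        - environment_key: default
          spec:
            client: "1"
            dependencies:
              - torch==2.3.0   # pinned so the cached environment stays stable
      tasks:
        - task_key: export
          environment_key: default
          spark_python_task:
            python_file: ../src/export.py
```

Tasks sharing the same `environment_key` also avoid paying the install cost once per task within a single run.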
Hi, thanks for your answer! We're trying to develop a job that sends emails to our clients, but these exports need to be sent at the time each client chooses. For example, if I want to receive an email every day, this job needs to be resched...
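Rescheduling per client usually means updating the job's cron schedule through the Jobs API after each send. Databricks job schedules use Quartz cron syntax, which puts seconds first. A minimal sketch of building the expression; the commented Jobs API call via `databricks-sdk` is an assumption about your setup, not part of the runnable code:

```python
def daily_cron(hour: int, minute: int = 0) -> str:
    """Quartz cron for 'every day at hour:minute', as Databricks job
    schedules expect.

    Quartz field order: second minute hour day-of-month month day-of-week.
    """
    if not (0 <= hour < 24 and 0 <= minute < 60):
        raise ValueError("hour must be 0-23 and minute 0-59")
    return f"0 {minute} {hour} * * ?"

# Applying it (assumption: databricks-sdk installed and authenticated):
#   from databricks.sdk import WorkspaceClient
#   w = WorkspaceClient()
#   w.jobs.update(job_id=JOB_ID, new_settings=...)  # schedule built from
#   daily_cron(client_hour) with the client's timezone_id
```

A final task in the job can call this with the client's chosen hour, so the schedule always reflects the latest preference without anyone editing the job by hand.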