Thursday
Hello Community,
Suddenly I'm hitting an issue: when I deploy a new version of the bundle to Databricks after changing the Python script, the cluster keeps pointing to an old version of the .py script uploaded by the Databricks Asset Bundle. Why is this happening?
Thursday
It seems the problem is tied to using an existing cluster: deploying again and again, the job keeps pointing to the first upload of the same script. Is this a Databricks bug?
Thursday
Hi @jeremy98,
What is the error that you are hitting?
Thursday
Hello Alberto,
Yes, essentially, when I was deploying a workflow that runs on an existing cluster and is triggered manually inside Databricks, I encountered an issue. After updating a Python script used by a notebook task, the job still pointed to the initial version of the script instead of the updated one, even though I redeployed the DAB.
Let me know if this makes things clearer.
Thursday
Thanks for the comments!
Just to confirm, are you following this deployment sequence?
databricks bundle validate
databricks bundle deploy -t dev
databricks bundle run -t dev hello_job
Also, since it looks like the existing cluster is not picking up the new version, you can either restart the cluster manually or automate the restart as part of your deployment workflow; see the sketch below.
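For example, a possible sequence could look like this (the cluster ID is just a placeholder, and I'm assuming a recent Databricks CLI where clusters restart takes the cluster ID as an argument):
databricks bundle validate
databricks bundle deploy -t dev
databricks clusters restart 0123-456789-abcdefgh
databricks bundle run -t dev hello_job
You can also double-check what the deploy actually uploaded by exporting the script from the bundle's workspace path (by default under /Users/<you>/.bundle/<bundle_name>/<target>/files/...) with databricks workspace export and comparing it against your local copy.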
Thursday
Hello Alberto,
Thanks for your answer. Yes, basically I followed those steps. Sometimes I skipped validate before deploy (and used -t stg instead of the dev target), and I ran the last step manually through the portal instead of using the run command.
I restarted it, but again, after 3 deploys for example, the cluster still doesn't pick up the correct version of the .py script.
yesterday
Hi @jeremy98,
Could you please send me your databricks.yml file to review it?
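For reference, here is a minimal sketch of the structure I'd expect for a notebook task on an existing cluster; the bundle name, host, paths, and cluster ID below are placeholders, not your real values:
bundle:
  name: my_bundle                                   # placeholder bundle name

targets:
  dev:
    workspace:
      host: https://<your-workspace-url>            # placeholder workspace host

resources:
  jobs:
    hello_job:
      name: hello_job
      tasks:
        - task_key: main
          existing_cluster_id: 0123-456789-abcdefgh # placeholder existing cluster ID
          notebook_task:
            notebook_path: ./src/my_notebook.py     # placeholder path, relative to databricks.yml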