02-22-2022 05:51 AM
Hi All,
I have code in dev and production running on Databricks Runtime (DBR) 7.3 LTS. I would like to upgrade the environment to 9.1 LTS, as support for 7.3 is ending. I have gone through the documentation at the following link.
https://docs.databricks.com/release-notes/runtime/9.1.html#new-features-and-improvements
As I am new to the process, I would like to understand what precautions should be taken during the migration.
Thank you in advance for the help.
Regards
Kiran
Accepted Solutions
02-22-2022 10:48 PM
@Kiran Chalasani You can start with the DBR 9.1 LTS migration guide and the Apache Spark 3.1.2 migration guide.
02-22-2022 10:21 AM
Hi @Kiran Chalasani! My name is Piper, and I'm a moderator for Databricks. Welcome! It's nice to meet you!
Thanks for your question. We will give your peers a chance to respond and then we'll circle back if we need to.
02-22-2022 01:41 PM
Hi Piper,
Thank you for welcoming me.
02-23-2022 09:55 AM
Hi Ravi,
Thanks for the reply and the useful links. Correct me if I am wrong: I am thinking of creating a new cluster and attaching the job to it. During this process I will choose the new 9.1 LTS runtime, which automatically picks up Spark 3.1.2 and the new Python 3.8.8 environment. Then I will run the code in dev to check for potential errors and fix them. Is that right, or should I do anything differently?
Thanks in advance.
Kiran
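The plan described in the post above can be sketched as a cluster spec pinning the 9.1 LTS runtime. This is a minimal sketch, not a tested production config: the field names follow the Databricks Clusters API 2.0, while the cluster name and node type are hypothetical placeholders (the right node type depends on your cloud provider).

```python
# Sketch of a new-cluster spec targeting DBR 9.1 LTS.
# Field names follow the Databricks Clusters API 2.0; "cluster_name" and
# "node_type_id" below are illustrative assumptions, not recommended values.
new_cluster = {
    "cluster_name": "dev-migration-test",  # hypothetical name
    "spark_version": "9.1.x-scala2.12",    # DBR 9.1 LTS (Spark 3.1.2, Python 3.8)
    "node_type_id": "Standard_DS3_v2",     # example Azure node type
    "num_workers": 2,
}

def runtime_version(spec):
    """Return the (major, minor) DBR version declared in a cluster spec."""
    major, minor = spec["spark_version"].split(".")[:2]
    return int(major), int(minor)

# Sanity check before attaching the job: the cluster must land on 9.1.
assert runtime_version(new_cluster) == (9, 1)
```

The same spec can be reused as the `new_cluster` block of a job definition, so the dev job and the prod job are guaranteed to run on the same runtime.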
02-23-2022 10:27 AM
@Kiran Chalasani Yes, that is the right approach. Always test in dev before moving the changes to prod!
02-23-2022 10:28 AM
Thank you Ravi.
03-03-2022 09:25 AM
Hi Ravi,
So, if I update DBR to 9.1 LTS, the NumPy version will be 1.19, and later I would update it to 1.21 in the notebooks. The cluster's Spark version, tied to 9.1 LTS, supports 1.19, while the notebooks will use 1.21. Should one expect any compatibility issues? Is there anything else to take care of? Any suggestions?
Thank you
Kiran C
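Regarding the NumPy mismatch described above: one simple safeguard is to check the imported version at the top of each notebook after the notebook-scoped install, so that a cluster restart silently reverting to the runtime-default 1.19 is caught early. A minimal sketch; the helper only compares version strings, so on a cluster you would pass it `numpy.__version__`:

```python
def meets_min_version(version_string, required=(1, 21)):
    """Return True if a dotted version string meets the required (major, minor).

    On DBR 9.1 LTS the runtime default is NumPy 1.19.x; a notebook-scoped
    `%pip install numpy==1.21.*` overrides it for that notebook only, so a
    check like this guards against a restart reverting to the default.
    """
    installed = tuple(int(part) for part in version_string.split(".")[:2])
    return installed >= required

# Runtime-default NumPy on a bare 9.1 LTS cluster:
assert meets_min_version("1.19.2") is False
# After the notebook-scoped upgrade:
assert meets_min_version("1.21.0") is True
```

In a notebook this would typically be `assert meets_min_version(numpy.__version__)` right after the import, failing fast instead of producing subtle behavior differences later in the job.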
02-22-2022 11:51 PM
The guides Ravi mentioned are definitely a must-read.
The migration from 7.3 to 9.1 is pretty transparent, in my opinion; nothing compared to the Spark 2.x to 3.x migration, for example.
05-19-2022 07:51 AM
@Kiran Chalasani Hey, were you ever able to run the 7.3 runtime with multiple GPUs before you migrated to 9.1?