
Runtime Spark 13.2 no longer available

bedant
New Contributor II

I'm enrolled in the course Get Started with Databricks for Data Engineering.

I was trying to work on one of the demos, which asked me to import the code base from

https://labs.training.databricks.com/import/get-started-with-data-engineering-on-databricks/v2.0.3/g...

I tried to run the code under Module 2.3 Demo: Data Management, but the first cell would not run. It executes Includes/Classroom-Setup-05, which in turn calls the code under /Includes/_common.

That code requires the Databricks Runtime (DBR) to be 13.2, but Azure no longer offers 13.2; only 13.3 and higher are available.

How can I fix the _common code, or is there another solution?
1 ACCEPTED SOLUTION

Accepted Solutions

feiyun0112
Contributor

I think you should change the _common code to use your actual Databricks Runtime version:

course_config = CourseConfig(course_code = "gswdeod",
                             course_name = "get-started-with-data-engineering-on-databricks",
                             data_source_name = "get-started-with-data-engineering-on-databricks",
                             data_source_version = "v01",
                             install_min_time = "1 min",
                             install_max_time = "5 min",
                             remote_files = remote_files,
                             supported_dbrs = ["13.2.x-scala2.12", "13.2.x-photon-scala2.12", "13.2.x-cpu-ml-scala2.12"],
                             expected_dbrs = "13.2.x-scala2.12, 13.2.x-photon-scala2.12, 13.2.x-cpu-ml-scala2.12")


2 REPLIES


bedant
New Contributor II

Yes, I updated it to my exact Spark version and it now works 🙂

supported_dbrs = ["13.3.x-scala2.12", "13.3.x-photon-scala2.12", "13.3.x-cpu-ml-scala2.12"],
expected_dbrs = "13.3.x-scala2.12, 13.3.x-photon-scala2.12, 13.3.x-cpu-ml-scala2.12")