Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

Databricks runtime version Error

Solide
New Contributor

Hello,

I'm following courses on Databricks Academy, and for that purpose I'm using Databricks Community Edition with the 12.2 LTS runtime (includes Apache Spark 3.3.2, Scala 2.12), which I believe can't be changed.

I'm following the Data Engineering course, and some commands are not working, throwing the error:

The Databricks Runtime is expected to be one of ['11.3.x-scala2.12', '11.3.x-photon-scala2.12', '11.3.x-cpu-ml-scala2.12'], found "12.2.x-scala2.12".

Is there a way around it? I'd just like to finish the course.

Thanks

8 REPLIES

Anonymous
Not applicable

Hi @Hakim Ennouni,

Great to meet you, and thanks for your question!

Let's see if your peers in the community have an answer to your question. Thanks.

Kumaran
Valued Contributor III

Hi @Solide,

Thank you for posting the question in the Databricks community.

Please go ahead and create your own cluster with one of the suggested runtimes; this should resolve the issue.
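If you're on a full (non-Community) workspace, you can also pin the runtime when creating the cluster programmatically. Here is a minimal sketch using the Clusters REST API; the host, token, and node type below are placeholders you'd replace with your own values:

```python
# Hedged sketch: create a cluster pinned to one of the runtimes the course expects,
# via the Clusters REST API. Not available on Community Edition, which only offers
# the cluster-creation UI. Host, token, and node_type_id are placeholders.
import requests

DATABRICKS_HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                                   # placeholder

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "cluster_name": "academy-dbr-11.3",
        "spark_version": "11.3.x-scala2.12",  # one of the runtimes listed in the error
        "node_type_id": "i3.xlarge",          # pick a node type available in your workspace
        "num_workers": 1,
        "autotermination_minutes": 60,
    },
)
resp.raise_for_status()
print(resp.json())  # contains the new cluster_id
```

The key field is spark_version, which must match one of the strings listed in the error message.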

Please let us know how it works.

emergentpropert
New Contributor II

In CE there is no option to select the 12.2.x-cpu-ml-scala2.12 runtime, only 12.2 LTS (Apache Spark 3.3.2, Scala 2.12).

I need it to follow the training Scalable Machine Learning with Apache Spark.

Hi @emergentpropert, Databricks Community Edition users can get more capacity and gain production-grade functionality by upgrading their subscription to the complete Databricks platform. To upgrade, sign up for a 14-day free trial or contact us.

Hello @Kaniz, thank you for your answer. When I click your Contact link (https://live-databricksinc.pantheonsite.io/company/contact) I get an Access Denied error. I would like to use Community Edition to do my Machine Learning Practitioner courses, and it will take more than 14 days. I cannot use our .gov account for training. Thanks.

Hi @emergentpropert, thank you for bringing this to our attention. We have updated our contact link.

Community Edition has limited functionality. Please try the complete Databricks platform free for 14 days on your choice of AWS, Microsoft Azure, or Google Cloud. Contact our sales team here if you have other questions about functionality.

 

V2dha
New Contributor II

I was facing the same error. It can be resolved by adding the version you are currently working with to the config function in the '_common' notebook in the 'Includes' folder. (This was the folder structure I downloaded for the Apache Spark Programming with Databricks course; your file structure may be different.) Below is the function where you have to add the version.

 

# The following attributes are externalized to make them easy
# for content developers to update with every new course.

course_config = CourseConfig(course_code = "asp",
                             course_name = "apache-spark-programming-with-databricks",
                             data_source_name = "apache-spark-programming-with-databricks",
                             data_source_version = "v03",
                             install_min_time = "2 min",
                             install_max_time = "5 min",
                             remote_files = remote_files,
                             supported_dbrs = ["11.3.x-scala2.12", "11.3.x-photon-scala2.12",  "12.2.x-scala2.12"], #add your version
                             expected_dbrs = "11.3.x-scala2.12, 11.3.x-photon-scala2.12, 12.2.x-scala2.12") #add your version

 

gyulook
New Contributor II

This was helpful. I added my runtime version (`13.3.x-aarch64-scala2.12` shown in the error message) to the `_common` Notebook > CMD 4 > `supported_dbrs` and `expected_dbrs` lists. This helped me pass the `__validate_spark_version()` check.

It seems like the `Version Info` notebook needs to be updated, though, since it didn't contain any information related to this particular issue:

```
Please see the "Troubleshooting | Spark Version" section of the "Version Info" notebook for more information.
```
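
In case it helps anyone else: if you're not sure which exact runtime string to add (it can carry suffixes like -photon or -aarch64), you can read it from the cluster itself. A minimal sketch for a Databricks notebook cell, assuming the standard environment variable and Spark conf key are available on your cluster:

```python
# Minimal sketch for a Databricks notebook cell (the `spark` session object is
# predefined in notebooks). Prints the runtime identifiers the cluster reports,
# so you can copy the exact string into supported_dbrs / expected_dbrs.
import os

# Short form, e.g. "12.2" (environment variable set on Databricks clusters)
print(os.environ.get("DATABRICKS_RUNTIME_VERSION"))

# Long form, e.g. "12.2.x-scala2.12" -- assumed to match the string that the
# course's __validate_spark_version() check compares against
print(spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion"))
```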
