Databricks runtime version Error
06-22-2023 04:21 AM
Hello,
I'm following courses on Databricks Academy using the Databricks Community Edition with runtime 12.2 LTS (includes Apache Spark 3.3.2, Scala 2.12), which I believe can't be changed.
In the Data Engineering course, some commands are not working and throw this error:
The Databricks Runtime is expected to be one of ['11.3.x-scala2.12', '11.3.x-photon-scala2.12', '11.3.x-cpu-ml-scala2.12'], found "12.2.x-scala2.12".
Is there a way around it? I'd just like to finish the course.
Thanks
Labels: Courses, Databricks Community, Runtime
06-22-2023 10:21 PM
Hi @Hakim Ennouni
Great to meet you, and thanks for your question!
Let's see if your peers in the community have an answer to your question. Thanks.
07-12-2023 10:18 AM
Hi @Solide,
Thank you for posting the question in the Databricks community.
Please go ahead and create your own cluster with the suggested runtime; this seems to resolve the issue.
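If you have access to a full (non-Community) workspace, the same advice can be followed programmatically by pinning the new cluster to one of the expected runtimes via the Clusters REST API. Below is a minimal sketch of a create-cluster payload, assuming the public Clusters API 2.0; the cluster name and node type are placeholders, not values from this thread:

```python
import json

# Hedged sketch: a Clusters API 2.0 create-cluster payload pinned to one of
# the runtimes listed in the error message. Cluster name and node type are
# placeholders; pick whatever is available in your workspace.
payload = {
    "cluster_name": "course-cluster",
    "spark_version": "11.3.x-scala2.12",  # one of the expected runtimes
    "node_type_id": "i3.xlarge",          # placeholder node type
    "num_workers": 0,                     # single-node is enough for coursework
}
print(json.dumps(payload, indent=2))
# To create the cluster, POST this payload to
# https://<workspace-url>/api/2.0/clusters/create with a bearer token.
```

Note that the Community Edition does not expose this API or a runtime picker beyond the default, which is what the rest of this thread works around.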
Please let us know how it works.
08-01-2023 07:59 AM
In the CE there is no option to select the 12.2.x-cpu-ml-scala2.12 runtime, only 12.2 LTS (Apache Spark 3.3.2, Scala 2.12).
I need it to follow the training Scalable Machine Learning with Apache Spark.
08-02-2023 08:36 AM
Hello @Retired_mod, thank you for your answer. When I click your Contact link (https://live-databricksinc.pantheonsite.io/company/contact) I get an Access Denied error. I'd like to use the Community Edition to do my Machine Learning Practitioner courses, and it will take more than 14 days. I cannot use our .gov account for training. Thanks.
12-20-2023 06:40 AM - edited 12-20-2023 06:45 AM
I was facing the same error. It can be resolved by adding the runtime version you are currently working with to the config function in the '_common' notebook inside the 'Includes' folder. (This was the folder structure I downloaded for the Apache Spark Programming with Databricks course; your file structure may differ.) Below is the function where you have to add the version.
```
# The following attributes are externalized to make them easy
# for content developers to update with every new course.
course_config = CourseConfig(course_code="asp",
                             course_name="apache-spark-programming-with-databricks",
                             data_source_name="apache-spark-programming-with-databricks",
                             data_source_version="v03",
                             install_min_time="2 min",
                             install_max_time="5 min",
                             remote_files=remote_files,
                             supported_dbrs=["11.3.x-scala2.12", "11.3.x-photon-scala2.12", "12.2.x-scala2.12"],  # add your version
                             expected_dbrs="11.3.x-scala2.12, 11.3.x-photon-scala2.12, 12.2.x-scala2.12")  # add your version
```
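To see why this edit works, here is a hedged sketch (illustrative names only, not the actual course code) of how a runtime check like the one raising this error presumably behaves: the current runtime string is compared against the `supported_dbrs` list, so appending your version makes the check pass.

```python
# Illustrative sketch of a runtime-version check, not the real course code.
def validate_spark_version(current_dbr, supported_dbrs):
    """Raise AssertionError if the current runtime is not supported."""
    assert current_dbr in supported_dbrs, (
        f"The Databricks Runtime is expected to be one of {supported_dbrs}, "
        f'found "{current_dbr}".'
    )

supported = ["11.3.x-scala2.12", "11.3.x-photon-scala2.12"]
# Before the edit, 12.2.x-scala2.12 fails; after appending it, the check passes.
validate_spark_version("12.2.x-scala2.12", supported + ["12.2.x-scala2.12"])
```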
01-15-2024 05:55 AM
This was helpful. I added my runtime version (`13.3.x-aarch64-scala2.12` shown in the error message) to the `_common` Notebook > CMD 4 > `supported_dbrs` and `expected_dbrs` lists. This helped me pass the `__validate_spark_version()` check.
It seems like the `Version Info` notebook needs to be updated though, since it didn't contain any information related to this particular issue:
```
Please see the "Troubleshooting | Spark Version" section of the "Version Info" notebook for more information.
```
07-05-2024 12:02 PM - edited 07-06-2024 08:55 AM
This was helpful, but I got another error; see the picture below. How can I get past this error?
Edit: problem solved. I hadn't imported all the folders into Databricks.