02-06-2024 10:11 AM
Do asset bundles support Scala notebooks? I am trying to run a simple job that uses a Scala notebook and I am getting this error: "run failed with error message Your administrator has only allowed sql and python commands on this cluster. This execution contained at least one disallowed language."
I can run the same notebook as a regular job. Do I need to provide some special cluster configuration when configuring the job?
Currently I just have this:
job_clusters:
  - job_cluster_key: job_cluster
    new_cluster:
      spark_version: 13.3.x-scala2.12
      node_type_id: i3.xlarge
      autoscale:
        min_workers: 1
        max_workers: 4
Labels: Spark
Accepted Solutions
02-06-2024 09:46 PM
Hi @VitaliiK, thanks for bringing up your concerns; I am always happy to help 😁
The error means that the cluster is configured to allow only SQL and Python in notebooks, and the notebook you are trying to run contains Scala, which is not allowed.
To resolve this, you can either modify the cluster configuration to allow the language you are using, or modify the notebook to use only the allowed languages.
To modify the cluster configuration:
- Go to the cluster configuration page in your workspace.
- Click Edit.
- Expand Advanced Options.
- In the Spark Config field, add the following configuration: spark.databricks.repl.allowedLanguages python,r,scala,sql
- Click Confirm and Restart. This allows the use of Python, R, Scala, and SQL in notebooks on the cluster.
Also, please check the cluster access mode to see which languages are supported on the cluster:
https://docs.databricks.com/en/compute/configure.html#access-modes
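Since your job cluster is defined in an asset bundle, the same setting can be applied in the bundle's new_cluster spec instead of through the UI. Here is a minimal sketch based on the configuration you posted (the spark_conf entry mirrors the Spark Config step above, and data_security_mode: SINGLE_USER is shown only as one example of an access mode that permits Scala; check the access-modes page linked above for what applies in your workspace):

job_clusters:
  - job_cluster_key: job_cluster
    new_cluster:
      spark_version: 13.3.x-scala2.12
      node_type_id: i3.xlarge
      data_security_mode: SINGLE_USER   # example access mode; see the link above
      spark_conf:
        spark.databricks.repl.allowedLanguages: "python,r,scala,sql"   # mirrors the UI Spark Config step
      autoscale:
        min_workers: 1
        max_workers: 4

After updating the bundle, redeploying with databricks bundle deploy and re-running with databricks bundle run should apply the new cluster settings.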
Leave a like if this helps; follow-ups are appreciated.
Kudos
Ayushi