<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: databricks-connect 14.3 spark error against 14.3 cluster with data_security_mode = NONE in Administration &amp; Architecture</title>
    <link>https://community.databricks.com/t5/administration-architecture/databricks-connect-14-3-spark-error-against-14-3-cluster-with/m-p/81019#M1436</link>
    <description>&lt;P&gt;Are you running the latest version of Databricks Connect?&lt;/P&gt;</description>
    <pubDate>Mon, 29 Jul 2024 20:56:21 GMT</pubDate>
    <dc:creator>Walter_C</dc:creator>
    <dc:date>2024-07-29T20:56:21Z</dc:date>
    <item>
      <title>databricks-connect 14.3 spark error against 14.3 cluster with data_security_mode = NONE</title>
      <link>https://community.databricks.com/t5/administration-architecture/databricks-connect-14-3-spark-error-against-14-3-cluster-with/m-p/80611#M1421</link>
      <description>&lt;P&gt;I am running into an issue when trying to use a 14.3 cluster with databricks-connect 14.3.&lt;/P&gt;&lt;P&gt;My cluster config:&lt;/P&gt;&lt;LI-CODE lang="json"&gt;{
  "autoscale": {
    "min_workers": 2,
    "max_workers": 10
  },
  "cluster_name": "Developer Cluster",
  "spark_version": "14.3.x-scala2.12",
  "spark_conf": {
    "spark.databricks.delta.preview.enabled": "true",
    "spark.databricks.service.server.enabled": "true"
  },
  "azure_attributes": {
    "first_on_demand": 1,
    "availability": "ON_DEMAND_AZURE",
    "spot_bid_max_price": -1
  },
  "node_type_id": "Standard_DS3_v2",
  "driver_node_type_id": "Standard_DS3_v2",
  "ssh_public_keys": [],
  "custom_tags": {},
  "spark_env_vars": {},
  "autotermination_minutes": 60,
  "enable_elastic_disk": true,
  "cluster_source": "UI",
  "init_scripts": [],
  "enable_local_disk_encryption": false,
  "data_security_mode": "NONE",
  "runtime_engine": "STANDARD"
}&lt;/LI-CODE&gt;&lt;P&gt;Running databricks-connect test, I get the following output:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;databricks-connect test
* PySpark is installed at /Users/user/projects/github.com/org/repo/.python/repo/lib/python3.10/site-packages/pyspark
* Checking SPARK_HOME
* Checking java version
openjdk version "21.0.4" 2024-07-16 LTS
OpenJDK Runtime Environment Temurin-21.0.4+7 (build 21.0.4+7-LTS)
OpenJDK 64-Bit Server VM Temurin-21.0.4+7 (build 21.0.4+7-LTS, mixed mode)
WARNING: Java versions &amp;gt;8 are not supported by this SDK
* Testing scala command
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
24/07/25 12:27:24 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://localhost:4040
Spark context available as 'sc' (master = local[*], app id = local-1721924845141).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.5.1
      /_/

Using Scala version 2.12.18 (OpenJDK 64-Bit Server VM, Java 21.0.4)
Type in expressions to have them evaluated.
Type :help for more information.

scala&amp;gt;

scala&amp;gt; import com.databricks.service.SparkClientManager
&amp;lt;console&amp;gt;:22: error: object databricks is not a member of package com
import com.databricks.service.SparkClientManager
^

scala&amp;gt; val serverConf = SparkClientManager.getForCurrentSession().getServerSparkConf
&amp;lt;console&amp;gt;:22: error: not found: value SparkClientManager
val serverConf = SparkClientManager.getForCurrentSession().getServerSparkConf
^

scala&amp;gt; val processIsolation = serverConf .get("spark.databricks.pyspark.enableProcessIsolation")
&amp;lt;console&amp;gt;:22: error: not found: value serverConf
val processIsolation = serverConf .get("spark.databricks.pyspark.enableProcessIsolation")
^

scala&amp;gt; if (!processIsolation.toBoolean) {
| spark.range(100).reduce((a,b) =&amp;gt; Long.box(a + b))
| } else {
| spark.range(99*100/2).count()
| }
&amp;lt;console&amp;gt;:23: error: not found: value processIsolation
if (!processIsolation.toBoolean) {
^

scala&amp;gt;
|
scala&amp;gt; :quit

Traceback (most recent call last):
  File "/Users/user/projects/github.com/org/repo/.python/repo/bin/databricks-connect", line 8, in &amp;lt;module&amp;gt;
    sys.exit(main())
  File "/Users/user/projects/github.com/org/repo/.python/repo/lib/python3.10/site-packages/pyspark/databricks_connect.py", line 311, in main
    test()
  File "/Users/user/projects/github.com/org/repo/.python/repo/lib/python3.10/site-packages/pyspark/databricks_connect.py", line 267, in test
    raise ValueError("Scala command failed to produce correct result")
ValueError: Scala command failed to produce correct result&lt;/LI-CODE&gt;&lt;P&gt;Trying to run tests against the cluster tells me that a Spark session isn't running. However, I can run spark.sparkContext.getConf().getAll() in a notebook and successfully get a list of configs.&lt;/P&gt;</description>
      <pubDate>Thu, 25 Jul 2024 17:02:03 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/databricks-connect-14-3-spark-error-against-14-3-cluster-with/m-p/80611#M1421</guid>
      <dc:creator>rhammonds1</dc:creator>
      <dc:date>2024-07-25T17:02:03Z</dc:date>
    </item>
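A note on the output above: the test subcommand and the spark.databricks.service.* configs belong to the legacy Databricks Connect client (for DBR 12.2 and below), which is why the new Scala shell cannot find com.databricks.service classes. As a hedged illustration only, a local sanity check over the posted cluster spec can flag the two settings that the documented Spark Connect based client (DBR 13 and above) does not support; the function name and messages here are hypothetical:

```python
# Sketch only: flag cluster-spec settings that the Databricks Connect 13+
# (Spark Connect based) client is documented not to support.
import json

CLUSTER_SPEC = json.loads("""
{
  "spark_version": "14.3.x-scala2.12",
  "spark_conf": {"spark.databricks.service.server.enabled": "true"},
  "data_security_mode": "NONE"
}
""")

def find_connect_conflicts(spec):
    """Return human-readable conflicts between a cluster spec and Databricks Connect 13+."""
    conflicts = []
    conf = spec.get("spark_conf", {})
    if conf.get("spark.databricks.service.server.enabled") == "true":
        # This flag belongs to the legacy client for DBR 12.2 and below.
        conflicts.append("legacy spark.databricks.service.server.enabled is set")
    if spec.get("data_security_mode", "NONE") == "NONE":
        # Docs require a Unity Catalog access mode: SINGLE_USER or USER_ISOLATION.
        conflicts.append("data_security_mode NONE (No isolation shared) is unsupported")
    return conflicts

for conflict in find_connect_conflicts(CLUSTER_SPEC):
    print(conflict)
```

Under these assumptions, both the legacy service flag and the NONE access mode would need to change before the new client can attach to the cluster.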
    <item>
      <title>Re: databricks-connect 14.3 spark error against 14.3 cluster with data_security_mode = NONE</title>
      <link>https://community.databricks.com/t5/administration-architecture/databricks-connect-14-3-spark-error-against-14-3-cluster-with/m-p/81019#M1436</link>
      <description>&lt;P&gt;Are you running the latest version of the Databricks connect?&lt;/P&gt;</description>
      <pubDate>Mon, 29 Jul 2024 20:56:21 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/databricks-connect-14-3-spark-error-against-14-3-cluster-with/m-p/81019#M1436</guid>
      <dc:creator>Walter_C</dc:creator>
      <dc:date>2024-07-29T20:56:21Z</dc:date>
    </item>
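On the version question in the reply: Databricks Connect 13+ clients are versioned to track Databricks Runtime releases, and pinning the client to the cluster's major.minor (14.3 here) is the documented baseline. A minimal sketch of that alignment check, with a hypothetical helper name:

```python
# Sketch: compare the installed databricks-connect version against the
# cluster's spark_version on major.minor only. Helper name is made up.
def versions_aligned(client_version, spark_version):
    """True when the client and the cluster DBR share the same major.minor."""
    client_mm = tuple(client_version.split(".")[:2])
    dbr_mm = tuple(spark_version.split(".")[:2])
    return client_mm == dbr_mm

print(versions_aligned("14.3.2", "14.3.x-scala2.12"))  # True
print(versions_aligned("13.3.0", "14.3.x-scala2.12"))  # False
```

A mismatch here (for example, a leftover 12.x install, which is what the legacy test subcommand in the traceback suggests) would be worth ruling out first with pip show databricks-connect.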
  </channel>
</rss>