
SQL Error on MultiNode cluster, but fine on SingleNode

brandocomando8
New Contributor III

If I run the following code on a cluster in SingleNode mode it works fine, but if I run the exact same cell on a MultiNode cluster, it throws:

SparkConnectGrpcException: (java.sql.SQLTransientConnectionException) Could not connect to address=(host=HOSTm)(port=PORT)(type=master) : (conn=715518) Connections using insecure transport are prohibited while --require_secure_transport=ON.

code:

secret_scope="myScope"
user = dbutils.secrets.get(secret_scope, "MyUser")
password = dbutils.secrets.get(secret_scope, "MyPass")
url = dbutils.secrets.get(secret_scope, "MyURL") #format: jdbc:mysql://DOMAIN:PORT/SCHEMA
host = dbutils.secrets.get(secret_scope, "MyHost")

options = {
    "url": url,
    "query": "select 1 as id",
    "user": user,
    "password": password,
    "useSSL": "True",
    "sslmode": "required",
    "ssl" : "{ \"ca\" = \"/dbfs/databricks/certs/aws-global-bundle.pem\"}",
    "serverSslCert": "dbfs:/databricks/certs/aws-global-bundle.pem",
    "isolationLevel":"READ_UNCOMMITTED",
    "enabledSslProtocolSuites":"TLSv1.2",
}
df = spark.read.format('JDBC').options(**options).load()
df.display()

 

Any ideas? It seems like maybe it's some Spark setting I'm missing.

15 REPLIES

gchandra
Databricks Employee

Try moving the .pem files from DBFS to WSFS or a Volume.
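
For example, something like this should copy the bundle into a Volume (a sketch; the catalog/schema/volume names are placeholders):

# Copy the CA bundle out of DBFS into a Unity Catalog Volume (placeholder path)
dbutils.fs.cp(
    "dbfs:/databricks/certs/aws-global-bundle.pem",
    "/Volumes/catalog/schema/volume/aws-global-bundle.pem"
)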



brandocomando8
New Contributor III

I tried both and get the same error.

brandocomando8
New Contributor III

Changed to:

"serverSslCert": "/Volumes/catalog/schema/volume/aws-global-bundle.pem"

and got the same error. To confirm it's not an issue with the .pem file, I was able to view its contents with:
dbutils.fs.head("/Volumes/catalog/schema/volume/aws-global-bundle.pem")

gchandra
Databricks Employee

Try this code from Shared Compute:

dbutils.fs.ls("/dbfs/databricks/")

I see you are using /dbfs/databricks under the SSL key.

 




brandocomando8
New Contributor III

Thanks for the quick reply.

 

dbutils.fs.ls("dbfs:/databricks/certs/")
[FileInfo(path='dbfs:/databricks/certs/aws-global-bundle.pem', name='aws-global-bundle.pem', size=152872, modificationTime=1728053118000)]

 

 
current state:

 

secret_scope="myScope"
user = dbutils.secrets.get(secret_scope, "MyUser")
password = dbutils.secrets.get(secret_scope, "MyPass")
url = dbutils.secrets.get(secret_scope, "MyURL") #format: jdbc:mysql://DOMAIN:PORT/SCHEMA
host = dbutils.secrets.get(secret_scope, "MyHost")

options = {
    "url": url,
    "query": "select 1 as id",
    "user": user,
    "password": password,
    "useSSL": "True",
    "sslmode": "required",
    "serverSslCert": "dbfs:/databricks/certs/aws-global-bundle.pem",
    "isolationLevel":"READ_UNCOMMITTED",
    "enabledSslProtocolSuites":"TLSv1.2",
}
df = spark.read.format('JDBC').options(**options).load()
df.display()


Same error:

SparkConnectGrpcException: (java.sql.SQLTransientConnectionException) Could not connect to address=(host=HOSTm)(port=PORT)(type=master) : (conn=715518) Connections using insecure transport are prohibited while --require_secure_transport=ON.

 

gchandra
Databricks Employee

I see you gave a thumbs up; if the solution worked, can you please Accept it?




brandocomando8
New Contributor III

Sorry, I was just trying to get your attention back on this post. No, the issue is not resolved.

gchandra
Databricks Employee

Have you granted ANY FILE access?

https://docs.databricks.com/en/dbfs/unity-catalog.html#how-does-dbfs-work-in-shared-access-mode

 

GRANT SELECT ON ANY FILE TO `<user@domain-name>`
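
(The same grant should also work from a notebook cell via spark.sql; the user name here is a placeholder:)

# Run the ANY FILE grant from a Python notebook cell (placeholder user)
spark.sql("GRANT SELECT ON ANY FILE TO `user@domain-name`")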

 




brandocomando8
New Contributor III

Thanks for looking into this.
I was able to grant myself SELECT ON ANY FILE, but it did not resolve the issue.

gchandra
Databricks Employee

Would it be possible to paste the cluster config screenshots of the ones that work and the ones that fail?




gchandra
Databricks Employee

Can you try with this config?

 

options = {
    "url": url,
    "query": "select 1 as id",
    "user": user,
    "password": password,
    "useSSL": "true",  # use lowercase 'true'
    "sslmode": "VERIFY_CA",
    "serverSslCert": "/dbfs/databricks/certs/aws-global-bundle.pem",
    "isolationLevel": "READ_UNCOMMITTED",
    "enabledSslProtocolSuites": "TLSv1.2",
}

There are some minor changes; please see whether this works in both cluster modes.



brandocomando8
New Contributor III

Unfortunately, same error.
I emailed the cluster config JSON files to our SA, whom you are working with.
The difference is:
Works:

"data_security_mode": "NONE",
"spark_conf": {
    "spark.master": "local[*, 4]",
    "spark.databricks.cluster.profile": "singleNode"
},

Doesn't work:

"spark_conf": {},
"data_security_mode": "USER_ISOLATION"

gchandra
Databricks Employee

data_security_mode": "NONE":  This is a non-Unity Catalog Cluster. No Governance enforced.

"data_security_mode": "USER_ISOLATION": This is a UC Shared Compute cluster that has certain limitations when accessing Low-Level APIs, RDDs, and dbfs/data bricks folders. 

If the .pem files are copied under /Workspace/Shared or /Volumes, you should be able to access them via:

/Workspace/Shared/file.pem
/Volumes/path/file.pem 

Please make sure READ access to these folders is available.
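
A quick sanity check from the shared cluster might look like this (a sketch; both paths are placeholders):

# Both reads should succeed on a USER_ISOLATION cluster if READ access is in place
dbutils.fs.head("/Volumes/catalog/schema/volume/aws-global-bundle.pem")
with open("/Workspace/Shared/global-bundle.pem") as f:
    print(f.read()[:200])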



brandocomando8
New Contributor III

So I copied the .pem file into a volume:

/Volumes/catalog/schema/volume/aws-global-bundle.pem


and to the workspace:

/Workspace/Shared/global-bundle.pem



With data security mode NONE, the cluster cannot read this file and so throws the error:

Failed to find serverSslCert file. serverSslCert=/Volumes/catalog/schema/volume.....


This is expected, since UC is basically not enabled in this mode. I get the same thing if I try to reference it at the workspace location: /Workspace/Shared/global-bundle.pem.

But really I want it to work with data security mode USER_ISOLATION. The result is the same error as before, using both locations:

SparkConnectGrpcException: (java.sql.SQLTransientConnectionException) Could not connect to address=(host=HOSTm)(port=PORT)(type=master) : (conn=715518) Connections using insecure transport are prohibited while --require_secure_transport=ON.

What is interesting to me is that it's either not getting to the part where it looks for this .pem file, OR it's getting past it but erroring out afterwards. To test this, I tried a bogus file location in the volume:

/Volumes/catalog/schema/volume/bad.pem

Same error, which makes me think it's erroring before it even goes to look for that file....
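
One more thing that might be worth trying (a sketch, not a confirmed fix: the parameter names assume the MariaDB Connector/J driver, which the error format suggests, and the Volume path is a placeholder) is to embed the SSL settings directly in the JDBC URL so the driver is guaranteed to see them:

# Sketch: pass the SSL settings in the JDBC URL itself instead of as reader options.
# sslMode/serverSslCert assume the MariaDB Connector/J driver; the path is a placeholder.
url_ssl = (
    "jdbc:mysql://DOMAIN:PORT/SCHEMA"
    "?sslMode=verify-ca"
    "&serverSslCert=/Volumes/catalog/schema/volume/aws-global-bundle.pem"
)
df = (
    spark.read.format("jdbc")
    .option("url", url_ssl)
    .option("query", "select 1 as id")
    .option("user", user)
    .option("password", password)
    .load()
)
df.display()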
