If I run the following code on a single-node cluster it works fine, but if I run the exact same cell on a multi-node cluster it throws:
SparkConnectGrpcException: (java.sql.SQLTransientConnectionException) Could not connect to address=(host=HOSTm)(port=PORT)(type=master) : (conn=715518) Connections using insecure transport are prohibited while --require_secure_transport=ON.
Code:
secret_scope = "myScope"
user = dbutils.secrets.get(secret_scope, "MyUser")
password = dbutils.secrets.get(secret_scope, "MyPass")
url = dbutils.secrets.get(secret_scope, "MyURL")  # format: jdbc:mysql://DOMAIN:PORT/SCHEMA
host = dbutils.secrets.get(secret_scope, "MyHost")

options = {
    "url": url,
    "query": "select 1 as id",
    "user": user,
    "password": password,
    # TLS-related settings
    "useSSL": "true",
    "sslmode": "required",
    "ssl": '{"ca": "/dbfs/databricks/certs/aws-global-bundle.pem"}',
    "serverSslCert": "dbfs:/databricks/certs/aws-global-bundle.pem",
    "isolationLevel": "READ_UNCOMMITTED",
    "enabledSslProtocolSuites": "TLSv1.2",
}
df = spark.read.format("jdbc").options(**options).load()
df.display()
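
For reference, here is a minimal sketch of the same query with the TLS settings folded into the JDBC URL itself rather than passed as separate reader options. The parameter names (useSsl, serverSslCert, enabledSslProtocolSuites) assume the MariaDB Connector/J driver, which I believe handles jdbc:mysql:// URLs on Databricks; I haven't confirmed whether this changes anything in the multi-node case:

# Sketch: append TLS settings as URL query parameters so they reach the
# driver even if the separate reader options are forwarded differently.
# Parameter names are MariaDB Connector/J style -- an assumption; adjust
# for whatever driver/version your cluster actually resolves.
ssl_suffix = (
    "?useSsl=true"
    "&serverSslCert=/dbfs/databricks/certs/aws-global-bundle.pem"
    "&enabledSslProtocolSuites=TLSv1.2"
)
df_ssl = (
    spark.read.format("jdbc")
    .option("url", url + ssl_suffix)  # url comes from the secret above
    .option("query", "select 1 as id")
    .option("user", user)
    .option("password", password)
    .load()
)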
Any ideas? It seems like maybe it's some Spark setting I'm missing.