Foreign catalog - Connections using insecure transport are prohibited --require_secure_transport=ON

TimB
New Contributor II

I have added a connection to a MySQL database in Azure, and I have created a foreign catalog in Databricks. But when I go to query the database, I get the following error:

Connections using insecure transport are prohibited while --require_secure_transport=ON

I appreciate that I can turn off this setting on my database, but I would rather do it properly and configure it from the Databricks side. Is it possible to configure Databricks so it uses secure transport on external connections?

4 REPLIES

Kaniz
Community Manager

Hi @TimB, yes, it is possible to configure Databricks to use secure transport on external connections. You will need to include the SSL properties in your connection options.

The SSL properties usually include:

- sslmode: the SSL mode to use when connecting to the target. To enforce SSL, set this to 'require'.
- sslcert: the path to the SSL client certificate file.
- sslkey: the path to the SSL private key file.
- sslrootcert: the path to the SSL root certificate file.
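For illustration, here is roughly how such properties can be passed as data source options on a JDBC read. This is a sketch only: the host, database, credentials, and certificate path are placeholders, and these particular property names follow the PostgreSQL JDBC driver's convention, so check your driver's documentation for the exact spelling.

# Rough sketch: passing SSL properties as Spark JDBC data source options.
# Any option Spark does not recognize is forwarded to the JDBC driver as
# a connection property. All values below are placeholders.
df = (spark.read.format("jdbc")
      .option("url", "jdbc:postgresql://myhostname:5432/mydatabase")
      .option("dbtable", "mytable")
      .option("user", "myusername")
      .option("password", "mypassword")
      .option("sslmode", "require")
      .option("sslrootcert", "/dbfs/certs/root.crt")
      .load())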

TimB
New Contributor II

Thanks Kaniz, but if you view the screenshot below, you will see the options I get, and I'm not sure where I should be adding those extra details. Can you advise?

Kaniz
Community Manager

Hi @TimB,

Yes, it is possible to configure Databricks to use secure transport on external connections. You can set the trustServerCertificate option to true in the JDBC connection URL. This option tells the driver to trust the server certificate for SSL connections instead of validating it against a certificate authority.

Here's an example of how to configure a JDBC connection to use SSL in Databricks via the JDBC URL and connection properties:

# Placeholder connection details; replace with your own values
jdbcUsername = "myusername"
jdbcPassword = "mypassword"
jdbcHostname = "myhostname"
jdbcPort = "3306"  # default MySQL port
jdbcDatabase = "mydatabase"

# useSSL/requireSSL enforce encrypted transport; trustServerCertificate=true
# with verifyServerCertificate=false accepts the server certificate without
# CA validation, so the link is encrypted but the server is not verified
jdbcUrl = "jdbc:mysql://{0}:{1}/{2}?useSSL=true&requireSSL=true&verifyServerCertificate=false&trustServerCertificate=true".format(jdbcHostname, jdbcPort, jdbcDatabase)

connectionProperties = {
    "user": jdbcUsername,
    "password": jdbcPassword,
    # MySQL Connector/J 8.x class name; older versions use com.mysql.jdbc.Driver
    "driver": "com.mysql.cj.jdbc.Driver",
}

df = spark.read.jdbc(url=jdbcUrl, table="mytable", properties=connectionProperties)

TimB
New Contributor II

Hi Kaniz,

That is the way I normally work when querying data from the Azure SQL database in DBX. But I am trying to use the new Lakehouse Federation from within DBX itself, which should abstract away those connection properties.

https://learn.microsoft.com/en-us/azure/databricks/query-federation/mysql
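For reference, the setup from that page looks roughly like this. It's a sketch only; the connection name, catalog name, host, and credentials below are all placeholders.

# Sketch of the Lakehouse Federation setup from the linked page.
# All names and credentials below are placeholders.
spark.sql("""
  CREATE CONNECTION IF NOT EXISTS mysql_connection TYPE mysql
  OPTIONS (
    host 'myhostname.mysql.database.azure.com',
    port '3306',
    user 'myusername',
    password 'mypassword'
  )
""")

# The foreign catalog mirrors the MySQL database inside Unity Catalog,
# so queries go through the connection without per-query JDBC properties
spark.sql("""
  CREATE FOREIGN CATALOG IF NOT EXISTS mysql_catalog
  USING CONNECTION mysql_connection
  OPTIONS (database 'mydatabase')
""")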
