04-05-2022 07:55 AM
Hi all,
we are seeing a random error when pushing data from Databricks to an Azure SQL Database.
Has anyone else run into this problem? Any ideas are appreciated.
See the attached stack trace.
Target: Azure SQL Database, Standard S6: 400 DTUs
Databricks cluster config:
"spark_version": "9.1.x-scala2.12",
"spark_conf": {
    "spark.driver.extraJavaOptions": "-Dlog4j2.formatMsgNoLookups=true",
    "spark.sql.session.timeZone": "UTC",
    "spark.driver.maxResultSize": "6g",
    "spark.executor.extraJavaOptions": "-Dlog4j2.formatMsgNoLookups=true",
    "spark.databricks.io.cache.enabled": "true"
},
"node_type_id": "Standard_E4ds_v4",
"driver_node_type_id": "Standard_E8ds_v4"
Labels:
- Azure SQL Database
- SQL Server
- UTC
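For context, a minimal sketch of the kind of write involved, assuming the spark-mssql-connector mentioned later in the thread; df and dbutils come from the notebook session, and the server, table, secret scope and key names are hypothetical placeholders:

# Minimal sketch (not from the original post): writing a DataFrame from Databricks
# to Azure SQL via the spark-mssql-connector. Server, table and secret names are placeholders.
jdbc_url = "jdbc:sqlserver://<server>.database.windows.net:1433;databaseName=<database>"

(df.write
    .format("com.microsoft.sqlserver.jdbc.spark")
    .mode("append")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.target_table")
    .option("user", dbutils.secrets.get(scope="kv", key="sql-user"))
    .option("password", dbutils.secrets.get(scope="kv", key="sql-password"))
    .save())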
Accepted Solutions
04-05-2022 10:01 AM
This is related to the cipher algorithm configuration occasionally failing during the handshake. As an immediate workaround, I'd recommend updating to the latest JDBC driver, which ships with a newer JRE that includes TLS cipher suite configuration updates that might mitigate the issue (cf. https://java.com/en/configure_crypto.html#TLSCipherSuiteOrder), and also setting the TLS version explicitly to 1.1 in the connection string.
Here is the latest version: https://github.com/microsoft/mssql-jdbc/releases/tag/v10.2.0
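As a sketch of what pinning the TLS version in the connection string could look like, assuming the sslProtocol property of mssql-jdbc; note that later replies in this thread settle on TLS 1.2 rather than the deprecated 1.1, so that value is used here:

# Sketch only (not part of the original reply): pin the TLS version through the
# mssql-jdbc "sslProtocol" connection property; server/database are placeholders.
jdbc_url = (
    "jdbc:sqlserver://<server>.database.windows.net:1433;"
    "databaseName=<database>;"
    "encrypt=true;"
    "sslProtocol=TLSv1.2;"
)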
04-06-2022 01:02 AM
Thanks Pearl. We already use mssql-jdbc 10.2.0 (via the Maven library com.microsoft.azure:spark-mssql-connector_2.12:1.2.0).
04-19-2022 11:07 PM
@Pearl Ubaru
TLS 1.1 is already deprecated.
Are there any concerns on your side with setting TLS 1.2 in the connection string instead?
04-21-2022 07:43 AM
Hi @Michael Galli. No, there should be no concerns. Which DBR version are you using?
04-21-2022 07:58 AM
@Pearl Ubaru DBR 9.1 LTS, because we are using com.microsoft.azure:spark-mssql-connector_2.12:1.2.0
04-21-2022 08:56 AM
Yes, then you are fine. The DBR must be 8.4+ for TLS 1.2.
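For anyone checking their own cluster, a quick way to confirm the runtime version from a notebook; this assumes the DATABRICKS_RUNTIME_VERSION environment variable that Databricks sets on cluster nodes:

import os

# Sketch: confirm the cluster is on DBR 8.4+ before relying on TLS 1.2 behaviour.
# Assumes the DATABRICKS_RUNTIME_VERSION environment variable set by Databricks.
print("Databricks Runtime:", os.environ.get("DATABRICKS_RUNTIME_VERSION", "unknown"))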

