Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
How can I resolve this SSL error which occurs when calling databricks-sql-connector/databricks.sql.connect() from my Python app?

mattmunz
New Contributor III

Error: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:997)

>  python --version

Python 3.10.4

This error seems to be coming from the thrift backend. I suspect, but have not confirmed, that the Python version is involved: this app was previously able to connect under a configuration that used Python 3.8.

I'd prefer not to use self-signed certificates. Any help would be appreciated!
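A "self signed certificate in certificate chain" error when moving to a newer Python often means a TLS-inspecting corporate proxy is injecting its own root certificate, which Python's default trust store does not know. A minimal sketch of keeping verification on while trusting that extra root (the filename `corp_root_ca.pem` is a placeholder for an exported proxy root certificate, not something from this thread):

```python
import os
import ssl

# Sketch, assuming a corporate proxy root CA exported to PEM. Instead of
# disabling verification, add the extra root to a default SSL context.
ctx = ssl.create_default_context()
if os.path.exists("corp_root_ca.pem"):  # placeholder filename
    ctx.load_verify_locations(cafile="corp_root_ca.pem")

# Verification stays fully enabled: certificates are required and the
# hostname is checked.
assert ctx.verify_mode == ssl.CERT_REQUIRED
assert ctx.check_hostname

# Many TLS stacks also honour these environment variables as the default
# CA bundle, which can be enough without touching library code:
# os.environ["SSL_CERT_FILE"] = "/absolute/path/to/corp_root_ca.pem"
# os.environ["REQUESTS_CA_BUNDLE"] = "/absolute/path/to/corp_root_ca.pem"
```

This keeps certificate checking intact, unlike the `_create_unverified_context` workaround discussed later in the thread.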

4 REPLIES

mattmunz
New Contributor III

@Kaniz Fatma​ Thanks for this workaround. Ultimately I do want to verify the certificates so I'd be interested in hearing about any solutions that would allow that as well.

@Matt Munz​ Were you ever able to find a solution to this issue?

ziggy
New Contributor II

I have the same issue and tried the solution mentioned above, but it still did not work. I am getting the error below:

Error: ('HY000', '[HY000] [Simba][ThriftExtension] (14) Unexpected response from server during a HTTP connection: SSL_connect: certificate verify failed. (14) (SQLDriverConnect)')

I am on JupyterHub running on Linux.

import pyodbc
import ssl

try:
    _create_unverified_https_context = ssl._create_unverified_context
except AttributeError:
    # Legacy Python that doesn't verify HTTPS certificates by default
    pass
else:
    # Handle target environment that doesn't support HTTPS verification
    ssl._create_default_https_context = _create_unverified_https_context

conn = pyodbc.connect("Driver=/opt/simba/spark/lib/64/libsparkodbc_sb64.so;" +
                      "HOST=;" +
                      "PORT=443;" +
                      "Schema=default;" +
                      "SparkServerType=3;" +
                      "AuthMech=3;" +
                      "UID=token;" +
                      "PWD=;" +
                      "ThriftTransport=2;" +
                      "SSL=1;" +
                      "HTTPPath=;" +
                      "ssl_ca=rootdbcert.cer;" +
                      "sslverify=0",
                      autocommit=True)

Where should the SSL certificate reside? I uploaded it to the same project folder where the Python script is running. I used the driver path mentioned in https://docs.databricks.com/dev-tools/pyodbc.html.
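One thing worth checking: an ODBC driver resolves a relative `ssl_ca` path against its own working directory, not necessarily the folder the script runs from, so an absolute path is safer. Also note that `sslverify=0` in the snippet above disables verification entirely, which makes `ssl_ca` moot. A small sketch (reusing the `rootdbcert.cer` filename from the snippet; `conn_fragment` is a hypothetical helper, not a complete connection string):

```python
import os

# Assumption: the certificate sits next to the running script. Build an
# absolute path so the ODBC driver can locate it regardless of its own
# working directory.
cert_path = os.path.abspath("rootdbcert.cer")
assert os.path.isabs(cert_path)

# Use the absolute path, and turn verification back on so the CA file is
# actually consulted (hypothetical fragment of the connection string):
conn_fragment = "ssl_ca=" + cert_path + ";sslverify=1"
```

With `sslverify=1` the driver should validate the server chain against the supplied CA file rather than silently skipping the check.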

twole
Databricks Employee

One way to resolve this could be to ensure the values in your connection configuration are surrounded by quotes.

  • host: "hostname.databricks.com" # Required
  • http_path: "/sql/1.0/warehouses/aaaabbbccc" 
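For reference, a minimal sketch of passing those quoted values to `databricks.sql.connect()` (the hostname and HTTP path below are the placeholders from this reply, and the token is a placeholder too; the actual `connect` call is commented out because it needs live credentials):

```python
# Sketch: every connection value is a quoted string. Note that
# server_hostname should not include an "https://" prefix.
params = {
    "server_hostname": "hostname.databricks.com",   # placeholder
    "http_path": "/sql/1.0/warehouses/aaaabbbccc",  # placeholder
    "access_token": "dapi-placeholder-token",       # placeholder
}
assert all(isinstance(v, str) for v in params.values())

# from databricks import sql
# with sql.connect(**params) as conn:
#     with conn.cursor() as cursor:
#         cursor.execute("SELECT 1")
```

Unquoted or empty values (as in the pyodbc snippet earlier in the thread, where HOST and PWD are blank) are a common cause of confusing connection-time errors.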
 
