Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How can I resolve this SSL error which occurs when calling databricks-sql-connector/databricks.sql.connect() from my Python app?

mattmunz
New Contributor III

Error: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:997)

>  python --version

Python 3.10.4

This error seems to be coming from the thrift backend. I suspect, but have not confirmed, that the Python version is involved. This app was previously able to connect under a configuration that used Python 3.8.

I'd prefer not to use self-signed certificates. Any help would be appreciated!
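A common cause of this error is a TLS-intercepting proxy or corporate root CA that is missing from Python's trust store (Python 3.10 also applies stricter TLS defaults than 3.8, which can surface previously hidden chain issues). One way to keep verification on is to build an SSLContext that trusts the extra CA bundle. A minimal sketch, assuming the bundle path comes from a hypothetical CORP_CA_BUNDLE environment variable:

```python
import os
import ssl

def make_verifying_context(cafile=None):
    """Build an SSLContext that still verifies certificates, additionally
    trusting a corporate/proxy root CA when a PEM bundle path is supplied."""
    if cafile and os.path.exists(cafile):
        # cafile should be the PEM export of the CA that signed the
        # certificate your network presents for the Databricks host.
        return ssl.create_default_context(cafile=cafile)
    # Fall back to the system trust store.
    return ssl.create_default_context()

# CORP_CA_BUNDLE is an illustrative name, not a connector setting.
ctx = make_verifying_context(os.environ.get("CORP_CA_BUNDLE"))
```

Whether and how the connector accepts a custom CA file varies by version, so check the databricks-sql-connector README for the exact TLS parameter names before relying on this.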

6 REPLIES

mattmunz
New Contributor III

@Kaniz Fatma​ Thanks for this workaround. Ultimately I do want to verify the certificates so I'd be interested in hearing about any solutions that would allow that as well.

@Matt Munz​ Were you ever able to find a solution to this issue?

ziggy
New Contributor II

I have the same issue and tried the solution mentioned above, but it still did not work. I am getting the error below:

Error: ('HY000', '[HY000] [Simba][ThriftExtension] (14) Unexpected response from server during a HTTP connection: SSL_connect: certificate verify failed. (14) (SQLDriverConnect)')

I am on Jupyterhub running on Linux

import pyodbc
import ssl

try:
    _create_unverified_https_context = ssl._create_unverified_context
except AttributeError:
    # Legacy Python that doesn't verify HTTPS certificates by default
    pass
else:
    # Handle target environment that doesn't support HTTPS verification
    ssl._create_default_https_context = _create_unverified_https_context

conn = pyodbc.connect("Driver=/opt/simba/spark/lib/64/libsparkodbc_sb64.so;" +
                      "HOST=;" +
                      "PORT=443;" +
                      "Schema=default;" +
                      "SparkServerType=3;" +
                      "AuthMech=3;" +
                      "UID=token;" +
                      "PWD=;" +
                      "ThriftTransport=2;" +
                      "SSL=1;" +
                      "HTTPPath=;" +
                      "ssl_ca=rootdbcert.cer;" +
                      "sslverify=0",
                      autocommit=True)

Where should the SSL certificate reside? I uploaded it to the same project folder where the Python script is running, and used the driver path mentioned in

https://docs.databricks.com/dev-tools/pyodbc.html
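If the driver cannot locate rootdbcert.cer, a relative path is one likely culprit: it is not clear what directory the Simba driver resolves relative ssl_ca paths against, so an absolute path is the safer bet. A sketch (HOST, PWD, and HTTPPath values still need to be filled in as in the post above), assuming the certificate sits in the script's working directory:

```python
import os

# Resolve the CA file to an absolute path so the connection string does not
# depend on the driver's working directory.
ca_file = os.path.abspath("rootdbcert.cer")

conn_str = (
    "Driver=/opt/simba/spark/lib/64/libsparkodbc_sb64.so;"
    "PORT=443;Schema=default;SparkServerType=3;"
    "AuthMech=3;UID=token;ThriftTransport=2;SSL=1;"
    f"ssl_ca={ca_file};"
    "sslverify=1"  # verify against the CA file instead of disabling checks
)
```

With a trusted CA file in place, sslverify=1 keeps certificate verification on rather than bypassing it with sslverify=0.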

twole
Databricks Employee

One way to resolve this could be to ensure your connection values are surrounded by quotes:

  • host: "hostname.databricks.com" # Required
  • http_path: "/sql/1.0/warehouses/aaaabbbccc" 
 

alpha_mann
New Contributor II

I have just found a way 😁

Try creating an SSL context without certificate verification. To accomplish this, use the commands given below:

import ssl
import urllib.request

context = ssl._create_unverified_context()
urllib.request.urlopen(req, context=context)  # req is the Request you already built

I have tried some steps from here 

Hardy_M
New Contributor II

You can set up an SSL context that skips certificate verification with the following command:

import ssl
ssl._create_default_https_context = ssl._create_unverified_context

I have followed some steps from this source.
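Both of the snippets above replace the process-wide default, so every HTTPS connection in the application stops verifying certificates. A less invasive sketch is to scope the unverified context to the one connection that needs it; note that ssl._create_unverified_context is a private CPython helper, so this is a workaround rather than a stable API:

```python
import ssl

# Per-connection unverified context: pass this only to the call that needs
# it, instead of patching ssl._create_default_https_context globally.
unverified = ssl._create_unverified_context()

# Everything else in the process keeps the verifying default.
verifying = ssl.create_default_context()

# The unverified context skips both chain and hostname checks:
print(unverified.verify_mode == ssl.CERT_NONE)    # True
print(unverified.check_hostname)                  # False
print(verifying.verify_mode == ssl.CERT_REQUIRED) # True
```

Disabling verification should be a stopgap; the durable fix is to get the proxy or self-signed root CA into the trust store the client actually consults.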
