Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Importing CA certificate into a Databricks cluster

Direo
Contributor

Hi!

I was following the guide outlined here:

https://kb.databricks.com/en_US/python/import-custom-ca-cert

(also tried this: https://stackoverflow.com/questions/73043589/configuring-tls-ca-on-databricks)

to add a CA root certificate to a Databricks cluster, but without any luck.
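
For reference, my understanding of the approach in the KB article is a cluster-scoped init script that appends the custom root CA to the cluster's trust store, roughly like the sketch below (the DBFS paths and file names are placeholders of my own, not taken from the article):

# Notebook cell: write a cluster-scoped init script to DBFS.
# Assumes the custom root CA has already been uploaded to /dbfs/certs/my-root-ca.pem.
dbutils.fs.put(
    "dbfs:/databricks/init-scripts/add-custom-ca.sh",
    """#!/bin/bash
# Append the custom root CA to the OS certificate bundle on each node.
cat /dbfs/certs/my-root-ca.pem >> /etc/ssl/certs/ca-certificates.crt
""",
    overwrite=True,
)

The script is then attached to the cluster as an init script, and REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt is set as a cluster environment variable so that Python's requests library uses the updated bundle.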

Maybe someone has managed to do it and could share specifics of how it is achieved?

I would really appreciate your help.

5 REPLIES

Debayan
Databricks Employee

Hi, what was the error you received?

Also, please tag @Debayan in your next update so that I get notified.

Direo
Contributor

@Debayan Mukherjee

Hi!

Getting "SSLError: HTTPSConnectionPool(host='host', port=443): Max retries exceeded with url: url (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1131)')))

request:

response = requests.post(url, headers=headers, data=json_data, verify="/dbfs/path.pem")

Btw, with verify=False, I am able to get a response.

Debayan
Databricks Employee

Hi Direo, thanks for the confirmation. SSL verify=False can be a good option to try in this case as a workaround.

(Spark config:

spark.databricks.pip.ignoreSSL true)
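
As a sketch of that workaround applied to the request above (the url, headers, and json_data values below are placeholders, not from this thread; verify=False disables certificate verification entirely, so treat it as temporary):

import requests
import urllib3

# Placeholders standing in for the real endpoint and payload.
url = "https://example.com/api"
headers = {"Content-Type": "application/json"}
json_data = '{"key": "value"}'

# Silence the InsecureRequestWarning that urllib3 emits when verify=False.
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

response = requests.post(url, headers=headers, data=json_data, verify=False)
print(response.status_code)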

Anonymous
Not applicable

Hi @Direo Direo

Hope all is well! Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!

Direo
Contributor

In the end it turned out that I had tried to add the wrong certificate. To check a certificate's Distinguished Name (DN), which helps identify the organization the certificate was issued to, run

%sh openssl s_client -connect <hostname>:<port> -showcerts -CAfile <path to the .pem file>

The guide outlined here: https://kb.databricks.com/en_US/python/import-custom-ca-cert did work once I got the right certificate.
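
For anyone who prefers to run the check from a notebook cell, a rough Python equivalent of that openssl command is sketched below (it assumes the cryptography package is available on the cluster, which it normally is on Databricks Runtime; the hostname and port are placeholders):

import ssl
from cryptography import x509

hostname = "example.com"  # placeholder: replace with the host from the SSL error
port = 443                # placeholder

# Fetch the server's leaf certificate without verifying it, then decode its DN fields.
pem_cert = ssl.get_server_certificate((hostname, port))
cert = x509.load_pem_x509_certificate(pem_cert.encode("utf-8"))

print("Subject:", cert.subject.rfc4514_string())
print("Issuer: ", cert.issuer.rfc4514_string())

The Issuer line shows which CA actually signed the server certificate, which tells you which root (or chain) needs to be added to the cluster's trust store.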
