03-22-2023 01:35 AM
Hi!
I was following the guide outlined here:
https://kb.databricks.com/en_US/python/import-custom-ca-cert
(I also tried this: https://stackoverflow.com/questions/73043589/configuring-tls-ca-on-databricks)
to add a CA root certificate to a Databricks cluster, but without any luck.
Has anyone managed to do it who could share the specifics of how it is achieved?
I would really appreciate your help.
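For reference, the approach from the guide, as I understood and tried it, is a cluster init script along these lines (a sketch only; the paths and file names are placeholders, not the guide's exact script):

#!/bin/bash
# Install the custom root CA into the cluster's OS trust store.
cp /dbfs/certs/my-root-ca.pem /usr/local/share/ca-certificates/my-root-ca.crt
update-ca-certificates

# Point Python's requests library at the updated system bundle.
echo "export REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt" >> /databricks/spark/conf/spark-env.sh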
- Labels:
  - Databricks Cluster
  - Guide
Accepted Solutions
04-07-2023 05:10 AM
In the end it turned out that I had tried to add the wrong certificate. To check a certificate's Distinguished Name (DN), which helps identify the organization the certificate was issued to, run:
%sh openssl s_client -connect <hostname>:<port> -showcerts -CAfile <path to the .pem file>
The guide outlined here: https://kb.databricks.com/en_US/python/import-custom-ca-cert did work once I got the right certificate.
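For completeness, a sketch of comparing DNs directly from a %sh cell (the hostname and .pem path are placeholders):

# Subject/issuer DN of the certificate you plan to trust
openssl x509 -in /dbfs/certs/my-root-ca.pem -noout -subject -issuer

# Subject/issuer DN of the certificate the server actually presents
openssl s_client -connect myserver.example.com:443 -showcerts </dev/null 2>/dev/null | openssl x509 -noout -subject -issuer

If the two DNs don't match, you are trusting the wrong certificate, which was exactly my problem.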
03-22-2023 10:55 PM
Hi, what was the error you received?
Also, please tag @Debayan in your next update, which will notify me.
03-23-2023 12:12 AM
@Debayan Mukherjee
Hi!
Getting "SSLError: HTTPSConnectionPool(host='host', port=443): Max retries exceeded with url: url (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1131)')))
request:
response = requests.post(url, headers=headers, data=json_data, verify="/dbfs/path.pem")
btw, with verify=False, I am able to get response.
04-03-2023 12:18 AM
Hi Direo, thanks for the confirmation. SSL verify=False can be a good option to try in this case as a workaround.
(Spark config:
spark.databricks.pip.ignoreSSL true)
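For a quick connectivity check from a %sh cell, curl's -k flag mirrors verify=False (the URL is a placeholder):

# -k skips certificate verification, like verify=False in requests
curl -k https://myserver.example.com/api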

03-27-2023 09:59 PM
Hi @Direo Direo
Hope all is well! Just wanted to check in to see whether you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!