03-22-2023 01:35 AM
Hi!
I was following the guide outlined here:
https://kb.databricks.com/en_US/python/import-custom-ca-cert
(also tried this: https://stackoverflow.com/questions/73043589/configuring-tls-ca-on-databricks)
to add a CA root certificate to a Databricks cluster, but without any luck.
Maybe someone has managed to do it and could share the specifics of how it is achieved?
I would really appreciate your help.
03-22-2023 10:55 PM
Hi, what was the error you received?
Also, please tag @Debayan in your next update, which will notify me.
03-23-2023 12:12 AM
@Debayan Mukherjee
Hi!
Getting "SSLError: HTTPSConnectionPool(host='host', port=443): Max retries exceeded with url: url (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1131)')))"
request:
response = requests.post(url, headers=headers, data=json_data, verify="/dbfs/path.pem")
btw, with verify=False, I am able to get a response.
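In case it helps narrow this down, the same verification can be tested outside of Python with curl (a quick sketch; the hostname and the /dbfs path are placeholders matching the request above):
%sh curl --cacert /dbfs/path.pem https://<hostname>
If the chain is wrong, curl fails with a similar "unable to get local issuer certificate" error to the one requests reports.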
04-03-2023 12:18 AM
Hi Direo, thanks for the confirmation. SSL verify=False can be a good workaround to try in this case.
(Spark config:
spark.databricks.pip.ignoreSSL true)
03-27-2023 09:59 PM
Hi @Direo Direo
Hope all is well! Just wanted to check in to see whether you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!
04-07-2023 05:10 AM
In the end it turned out that I had tried to add the wrong certificate. To check a certificate's Distinguished Name (DN), which helps identify the organization the certificate was issued to, run
%sh openssl s_client -connect <hostname>:<port> -showcerts -CAfile <path to the .pem file>
The guide outlined here: https://kb.databricks.com/en_US/python/import-custom-ca-cert did work once I got the right certificate.
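If the certificate is already on disk as a PEM, its subject and issuer DNs can also be read directly from the file (a sketch; the /dbfs path is a placeholder):
%sh openssl x509 -in /dbfs/path.pem -noout -subject -issuer
For anyone landing here later, the linked KB guide's approach boils down to installing the CA into the cluster's trust store from an init script. A minimal sketch of that pattern, assuming an Ubuntu-based runtime and a hypothetical /dbfs/custom-ca.pem upload (the article's exact script may differ):
#!/bin/bash
# Register the custom root CA system-wide (update-ca-certificates expects a .crt extension).
cp /dbfs/custom-ca.pem /usr/local/share/ca-certificates/custom-ca.crt
update-ca-certificates
# Point Python's requests library at the updated system bundle.
echo "export REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt" >> /databricks/spark/conf/spark-env.sh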
Sunday - last edited Sunday
@Debayan One question - will the same approach work for a JKS file containing a private key certificate for X.509 authentication to a MongoDB Atlas database?
The usual way of adding the Spark configs below is not working:
spark.driver.extraJavaOptions -Djavax.net.ssl.keyStore=/tmp/keystore.jks -Djavax.net.ssl.keyStorePassword=<pass>
spark.executor.extraJavaOptions -Djavax.net.ssl.keyStore=/tmp/keystore.jks -Djavax.net.ssl.keyStorePassword=<pass>
I have created the following discussion - https://community.databricks.com/t5/data-engineering/issues-when-adding-keystore-spark-config-for-py...
Error in cluster logs - Caused by: java.sql.SQLNonTransientConnectionException: Could not connect to address=(host=mdb7sywh50xhpr.chkweekm4xjq.us-east-1.rds.amazonaws.com)(port=3306)(type=master) : readHandshakeRecord
25/05/10 14:19:06 WARN HiveClientImpl: HiveClient got thrift or connection reset exception, destroying client and retrying (13 tries remaining)
It continuously shows the above error and I can't execute any command on the cluster.
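One sanity check worth doing first: -Djavax.net.ssl.keyStore is a JVM-wide setting, so a keystore that fails to load can break unrelated TLS connections, which is what the Hive metastore errors above look like. Verifying that the file opens with the given password (placeholder path and password):
%sh keytool -list -v -keystore /tmp/keystore.jks -storepass <pass>
If keytool cannot read it, the JVM will not be able to use it either.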