12-08-2021 08:26 AM
How do we add a certificate file in Databricks for a spark-submit type of job?
Labels: ETL, Sparksubmit Type
Accepted Solutions
12-13-2021 11:08 AM
Here are directions on how to add a custom CA Cert. Does this help?
https://kb.databricks.com/python/import-custom-ca-cert.html
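For context here in the thread: the general pattern that article describes is a cluster init script that adds your custom CA cert to the cluster's CA bundle. A minimal sketch, assuming the PEM lives at the placeholder path /dbfs/certs/custom-ca.pem:

```bash
#!/bin/bash
# Sketch of the linked article's general approach: append a custom CA cert
# to the cluster's system CA bundle so clients on the cluster trust it.
# The PEM path below is a placeholder, not taken from the article.
cat /dbfs/certs/custom-ca.pem >> /etc/ssl/certs/ca-certificates.crt
```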
Strategic Data and AI Advisor
12-08-2021 03:44 PM
Hello, @geetha Venkatesan! Welcome and thank you for asking. Let's give the community a bit longer to answer. We'll circle back if we need to.
12-12-2021 04:58 PM
@geetha Venkatesan, can you explain what sort of certificate you're trying to use? spark-submit works over the Jobs API, which does not require submitting any cert.
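For reference, a spark-submit job is created through the Jobs API with a spark_submit_task, with no certificate involved. A minimal sketch, where the workspace URL, token, jar path, and main class are all placeholders:

```bash
# Sketch: create a spark-submit job via the Jobs API 2.0 (all values are
# placeholders -- substitute your own workspace URL, token, jar, and class).
curl -X POST https://<workspace-url>/api/2.0/jobs/create \
  -H "Authorization: Bearer <personal-access-token>" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "spark-submit-example",
    "new_cluster": {
      "spark_version": "9.1.x-scala2.12",
      "node_type_id": "i3.xlarge",
      "num_workers": 2
    },
    "spark_submit_task": {
      "parameters": ["--class", "com.example.Main", "dbfs:/jars/app.jar"]
    }
  }'
```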
12-13-2021 10:55 PM
ECS is hosted on our private cloud and uses an internal certificate that we own. So when we need to connect to ECS from Databricks, we need to import our internal certificate into Databricks so that the connection between Databricks and ECS can be established.
12-13-2021 11:02 PM
As per that document, we need to use dbutils, which is only available in a notebook. But ours is a Spark Scala application jar that runs in spark-submit mode, so we want to know how we can import the certificate and have it applied at cluster startup.
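Essentially we're looking for something that runs at cluster startup, along these lines (just a sketch of what we have in mind; the cert path, alias, and the default "changeit" truststore password are assumptions):

```bash
#!/bin/bash
# Sketch of a cluster-scoped init script that imports an internal CA cert
# into the JVM truststore at startup, so a Spark Scala jar launched via
# spark-submit trusts it. /dbfs/certs/internal-ca.pem is a placeholder path.

CERT=/dbfs/certs/internal-ca.pem

# Locate the active JDK's cacerts truststore (path differs across JDK layouts).
JAVA_BIN=$(readlink -f "$(which java)")
CACERTS=$(find "$(dirname "$JAVA_BIN")/.." -name cacerts | head -n 1)

# Import the cert; "changeit" is the JVM's default truststore password.
keytool -importcert -noprompt \
  -alias internal-ca \
  -file "$CERT" \
  -keystore "$CACERTS" \
  -storepass changeit

# Also append the PEM to the OS bundle so non-JVM clients trust it too.
cat "$CERT" >> /etc/ssl/certs/ca-certificates.crt
```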
12-07-2023 12:02 PM
Since DBFS-based files are being deprecated, is there a newer method for this?
12-13-2023 08:22 AM
I have the same problem. When I worked with the hive_metastore in the past, I was able to use the file system and also use API certs.
Now I'm using Unity Catalog and I can't upload a certificate. Can somebody help me?