Hello,
I'm trying to connect Databricks to our own JFrog Artifactory instance.
The objective is to resolve both pip and JAR dependencies from it instead of reaching out to Maven Central/PyPI.
I'm struggling with the JARs.
My approach to the problem is:
1. Create an init script that builds a new truststore containing the CA that signed the Artifactory certificate and saves it in /tmp.
2. Create a new Ivy settings file with the resolvers for the Artifactory repositories.
3. Configure the Spark conf so the cluster picks everything up. The properties set are:
spark.driver.extraJavaOptions -Djavax.net.ssl.trustStore=<path to the JKS> -Djavax.net.ssl.trustStorePassword=changeit
spark.executor.extraJavaOptions -Djavax.net.ssl.trustStore=<path to the JKS> -Djavax.net.ssl.trustStorePassword=changeit
spark.databricks.library.ivySettings /Volumes/XXX/init_scripts/ivysettings.xml
spark.jars.packages com.microsoft.azure:azure-eventhubs-spark_2.12:2.3.22
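And for step 2, a minimal sketch of the Ivy settings file that `spark.databricks.library.ivySettings` points at. The resolver name and repository URL are placeholders for my Artifactory remote-Maven repo:

```xml
<!-- Hypothetical ivysettings.xml: route all resolution through Artifactory.
     The hostname and repository path below are placeholders. -->
<ivysettings>
  <settings defaultResolver="artifactory"/>
  <resolvers>
    <chain name="artifactory">
      <!-- m2compatible lets Ivy resolve against a Maven-layout repository -->
      <ibiblio name="artifactory-maven" m2compatible="true"
               root="https://artifactory.example.com/artifactory/maven-remote/"/>
    </chain>
  </resolvers>
</ivysettings>
```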
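For step 1, here is a sketch of the kind of init script I mean. The CA path and truststore name are placeholders, not my actual values; it starts from the JVM's default cacerts so public CAs keep working and then adds the internal CA on top:

```shell
#!/bin/bash
# Hypothetical init script: build a truststore that includes the internal CA
# used by the Artifactory deployment. CA_CERT and TRUSTSTORE are placeholders.
set -euo pipefail

CA_CERT="/Volumes/XXX/init_scripts/artifactory-ca.pem"   # assumed CA location
TRUSTSTORE="/tmp/artifactory-truststore.jks"
STOREPASS="changeit"

if [ -f "$CA_CERT" ]; then
  # Copy the JVM's default cacerts so standard public CAs still validate,
  # then import the internal CA under its own alias.
  cp "$JAVA_HOME/lib/security/cacerts" "$TRUSTSTORE"
  keytool -importcert -noprompt \
    -keystore "$TRUSTSTORE" -storepass "$STOREPASS" \
    -alias artifactory-ca -file "$CA_CERT"
  echo "truststore written to $TRUSTSTORE"
else
  echo "CA certificate not found at $CA_CERT; skipping truststore creation"
fi
```

The `spark.driver.extraJavaOptions`/`spark.executor.extraJavaOptions` properties would then point `-Djavax.net.ssl.trustStore` at the `/tmp/...` path this script writes.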
The cluster is a Standard one running Databricks Runtime 16.4 LTS.
If anyone can help, I would appreciate it.
Thanks in advance!