<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Adding JAR from Azure DevOps Artifacts feed to Databricks job in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/adding-jar-from-azure-devops-artifacts-feed-to-databricks-job/m-p/13751#M8375</link>
    <description>&lt;P&gt;Hey, thanks for the reply!&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Yeah, I read the same about internal Maven libraries, but I'm not sure what "internal" means in this context. If I try to resolve the repository URL locally using user:token, it authenticates perfectly fine from a public address.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;The problem is that Databricks returns the same error even when you remove the user:token from the URL, so you get no feedback on what's going wrong.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Your idea is indeed a possible solution, but it's similar to putting the JAR on ADLS and installing from there.&lt;/P&gt;</description>
    <pubDate>Wed, 13 Oct 2021 07:14:55 GMT</pubDate>
    <dc:creator>yannickmo</dc:creator>
    <dc:date>2021-10-13T07:14:55Z</dc:date>
    <item>
      <title>Adding JAR from Azure DevOps Artifacts feed to Databricks job</title>
      <link>https://community.databricks.com/t5/data-engineering/adding-jar-from-azure-devops-artifacts-feed-to-databricks-job/m-p/13748#M8372</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;We have some Scala code that is compiled and published to an Azure DevOps Artifacts feed.&lt;/P&gt;&lt;P&gt;We're now trying to add this JAR to a Databricks job (through Terraform) to automate its creation.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;To do this, I'm trying to authenticate using a generated token, but I'm getting an error when adding the library:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;Run result unavailable: job failed with error message
 Library installation failed for library due to user error for maven {
  coordinates: "groupId:artifactId:version"
  repo: "https://user:token@ORG.pkgs.visualstudio.com/FEED_ID/_packaging/FEED_NAME/maven/v1"
}&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;Any idea how to set this up properly? I know people have figured this out for PyPI, but I haven't found anything for Maven.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;The other option is putting the JAR on ADLS and adding it from there, but I'd prefer direct integration without a middleman.&lt;/P&gt;</description>
      <pubDate>Tue, 12 Oct 2021 15:01:59 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/adding-jar-from-azure-devops-artifacts-feed-to-databricks-job/m-p/13748#M8372</guid>
      <dc:creator>yannickmo</dc:creator>
      <dc:date>2021-10-12T15:01:59Z</dc:date>
    </item>
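The repo URL in the error above embeds credentials in the user:token form. As a minimal sketch, a small helper can build that URL from its parts, which makes it easy to check locally whether the feed endpoint authenticates before handing it to Databricks; ORG, FEED_ID, and FEED_NAME are placeholders from the error message, not real values.

```python
# Hypothetical helper that builds the Azure DevOps Artifacts Maven v1 endpoint
# in the user:token@host form shown in the error message. All names below
# (ORG, FEED_ID, FEED_NAME, user, token) are placeholders.

def feed_repo_url(org: str, feed_id: str, feed_name: str,
                  user: str = "", token: str = "") -> str:
    """Return the Maven v1 repo URL, optionally with embedded credentials."""
    auth = f"{user}:{token}@" if token else ""
    return (f"https://{auth}{org}.pkgs.visualstudio.com/"
            f"{feed_id}/_packaging/{feed_name}/maven/v1")

# Matches the repo string from the error message above.
print(feed_repo_url("ORG", "FEED_ID", "FEED_NAME", "user", "token"))
```

The same URL (without credentials) can then be probed with a plain HTTP client to confirm the token works outside Databricks.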
    <item>
      <title>Re: Adding JAR from Azure DevOps Artifacts feed to Databricks job</title>
      <link>https://community.databricks.com/t5/data-engineering/adding-jar-from-azure-devops-artifacts-feed-to-databricks-job/m-p/13750#M8374</link>
      <description>&lt;P&gt;In the past, the documentation said that Databricks doesn't support internal Maven libraries. I no longer see that wording in the documentation, but I suspect it's still the case.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;The only idea I have is to use Azure Pipelines plus:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;databricks libraries install --cluster-id 1234-567890-lest123 --jar dbfs:/test-dir/test.jar&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 12 Oct 2021 15:45:51 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/adding-jar-from-azure-devops-artifacts-feed-to-databricks-job/m-p/13750#M8374</guid>
      <dc:creator>Hubert-Dudek</dc:creator>
      <dc:date>2021-10-12T15:45:51Z</dc:date>
    </item>
    <item>
      <title>Re: Adding JAR from Azure DevOps Artifacts feed to Databricks job</title>
      <link>https://community.databricks.com/t5/data-engineering/adding-jar-from-azure-devops-artifacts-feed-to-databricks-job/m-p/13751#M8375</link>
      <description>&lt;P&gt;Hey, thanks for the reply!&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Yeah, I read the same about internal Maven libraries, but I'm not sure what "internal" means in this context. If I try to resolve the repository URL locally using user:token, it authenticates perfectly fine from a public address.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;The problem is that Databricks returns the same error even when you remove the user:token from the URL, so you get no feedback on what's going wrong.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Your idea is indeed a possible solution, but it's similar to putting the JAR on ADLS and installing from there.&lt;/P&gt;</description>
      <pubDate>Wed, 13 Oct 2021 07:14:55 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/adding-jar-from-azure-devops-artifacts-feed-to-databricks-job/m-p/13751#M8375</guid>
      <dc:creator>yannickmo</dc:creator>
      <dc:date>2021-10-13T07:14:55Z</dc:date>
    </item>
    <item>
      <title>Re: Adding JAR from Azure DevOps Artifacts feed to Databricks job</title>
      <link>https://community.databricks.com/t5/data-engineering/adding-jar-from-azure-devops-artifacts-feed-to-databricks-job/m-p/13752#M8376</link>
      <description>&lt;P&gt;Hey, thanks for the welcome, and likewise!&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I do get that Scala isn't as popular as Python on Databricks, and assembling jobs from JARs makes it an even more niche use case, but if this is possible it would fit nicely into our automation.&lt;/P&gt;</description>
      <pubDate>Wed, 13 Oct 2021 07:17:30 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/adding-jar-from-azure-devops-artifacts-feed-to-databricks-job/m-p/13752#M8376</guid>
      <dc:creator>yannickmo</dc:creator>
      <dc:date>2021-10-13T07:17:30Z</dc:date>
    </item>
    <item>
      <title>Re: Adding JAR from Azure DevOps Artifacts feed to Databricks job</title>
      <link>https://community.databricks.com/t5/data-engineering/adding-jar-from-azure-devops-artifacts-feed-to-databricks-job/m-p/13753#M8377</link>
      <description>&lt;P&gt;Any other ideas?&lt;/P&gt;</description>
      <pubDate>Mon, 18 Oct 2021 12:58:51 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/adding-jar-from-azure-devops-artifacts-feed-to-databricks-job/m-p/13753#M8377</guid>
      <dc:creator>yannickmo</dc:creator>
      <dc:date>2021-10-18T12:58:51Z</dc:date>
    </item>
    <item>
      <title>Re: Adding JAR from Azure DevOps Artifacts feed to Databricks job</title>
      <link>https://community.databricks.com/t5/data-engineering/adding-jar-from-azure-devops-artifacts-feed-to-databricks-job/m-p/13754#M8378</link>
      <description>&lt;P&gt;As of right now, Databricks can't use non-public Maven repositories, because resolution of the Maven coordinates happens in the control plane. That's different from the R &amp;amp; Python libraries. As a workaround, you can try installing the library via an init script, or uploading the JAR to ADLS or S3 and referring to it by URL in the cluster config.&lt;/P&gt;</description>
      <pubDate>Thu, 25 Nov 2021 18:47:59 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/adding-jar-from-azure-devops-artifacts-feed-to-databricks-job/m-p/13754#M8378</guid>
      <dc:creator>alexott</dc:creator>
      <dc:date>2021-11-25T18:47:59Z</dc:date>
    </item>
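Once the JAR is staged in DBFS or cloud storage as described above, installing it can also be scripted against the Libraries REST API (`POST /api/2.0/libraries/install`) rather than the CLI. A minimal sketch, assuming placeholder values for the workspace host, token, cluster ID, and JAR path:

```python
# Sketch of an install call to the Databricks Libraries API
# (POST /api/2.0/libraries/install). The host, token, cluster ID, and JAR
# path below are placeholders, not real values. The request is only built
# here, not sent.
import json
import urllib.request

def build_install_request(host: str, token: str,
                          cluster_id: str, jar_path: str) -> urllib.request.Request:
    """Build (but do not send) the library-install request for a JAR."""
    payload = json.dumps({
        "cluster_id": cluster_id,
        "libraries": [{"jar": jar_path}],
    }).encode("utf-8")
    return urllib.request.Request(
        url=f"{host}/api/2.0/libraries/install",
        data=payload,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

req = build_install_request(
    "https://adb-123.azuredatabricks.net", "TOKEN",
    "1234-567890-lest123",
    "abfss://container@account.dfs.core.windows.net/jars/app.jar",
)
```

Sending the request with `urllib.request.urlopen(req)` (or any HTTP client) would trigger the install on a running cluster.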
    <item>
      <title>Re: Adding JAR from Azure DevOps Artifacts feed to Databricks job</title>
      <link>https://community.databricks.com/t5/data-engineering/adding-jar-from-azure-devops-artifacts-feed-to-databricks-job/m-p/13755#M8379</link>
      <description>&lt;P&gt;@Alex Ott&amp;nbsp;do you have an example of an init script that can copy the JAR file from ADLS to /databricks/jars? I can't seem to connect to ADLS over abfss or https from an init script.&lt;/P&gt;</description>
      <pubDate>Tue, 16 Aug 2022 15:55:07 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/adding-jar-from-azure-devops-artifacts-feed-to-databricks-job/m-p/13755#M8379</guid>
      <dc:creator>BobGeor_68322</dc:creator>
      <dc:date>2022-08-16T15:55:07Z</dc:date>
    </item>
    <item>
      <title>Re: Adding JAR from Azure DevOps Artifacts feed to Databricks job</title>
      <link>https://community.databricks.com/t5/data-engineering/adding-jar-from-azure-devops-artifacts-feed-to-databricks-job/m-p/13756#M8380</link>
      <description>&lt;P&gt;They're two different things:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Installing via init script - in it you can just do:&lt;/LI&gt;&lt;/UL&gt;&lt;PRE&gt;&lt;CODE&gt;cp /dbfs/FileStore/jars/name1.jar /databricks/jars/&lt;/CODE&gt;&lt;/PRE&gt;&lt;UL&gt;&lt;LI&gt;Installing the library from ADLS - just specify the ADLS URL in the cluster UI, and make sure the cluster has a service principal attached so the file can be downloaded.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 19 Aug 2022 12:28:03 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/adding-jar-from-azure-devops-artifacts-feed-to-databricks-job/m-p/13756#M8380</guid>
      <dc:creator>alexott</dc:creator>
      <dc:date>2022-08-19T12:28:03Z</dc:date>
    </item>
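The copy step in the init-script approach above can be expressed as a small, testable function. On a real cluster the source would be `/dbfs/FileStore/jars/name1.jar` and the destination `/databricks/jars/`; both are parameters here, since those paths only exist on a Databricks node.

```python
# A testable sketch of the init-script copy step above. The DBFS source and
# /databricks/jars destination are passed in as parameters because they only
# exist on a Databricks cluster node.
import shutil
from pathlib import Path

def copy_jar(src: str, dest_dir: str) -> Path:
    """Copy a JAR into the cluster jar directory, creating it if needed."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    return Path(shutil.copy(src, dest))
```

On a cluster this would be called as `copy_jar("/dbfs/FileStore/jars/name1.jar", "/databricks/jars/")`, matching the `cp` one-liner in the reply above.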
  </channel>
</rss>

