Using JARs from Google Cloud Artifact Registry in Databricks for Job Execution
06-06-2024 12:24 AM
We have our CI/CD pipelines set up in Google Cloud using Cloud Build, and we are publishing our artifacts to a private repository in Google Cloud's Artifact Registry. I want to use these JAR files to create and run jobs in Databricks.
However, when I enter the repository in the job's library configuration, I get the following error: "Repository must be a valid URL". The repository URL is of the format "artifactregistry://". I am looking for guidance on the best way to achieve this.
Also, if this is not possible, can we integrate our CI/CD pipeline to upload the JARs directly to DBFS and reference them from the job? Could you please point to an example where the databricks-cli is used in this fashion?
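For context, here is a minimal sketch of what such a Cloud Build step might look like, uploading the built JAR to DBFS with the Databricks CLI. The container image, JAR path, DBFS destination, and secret names are all assumptions for illustration, not values from this post:

```yaml
# Hypothetical cloudbuild.yaml fragment: after the build step produces the JAR,
# install the Databricks CLI and copy the artifact to DBFS.
steps:
  - name: 'python:3.11-slim'          # assumed builder image
    entrypoint: 'bash'
    args:
      - '-c'
      - |
        pip install databricks-cli
        # DATABRICKS_HOST and DATABRICKS_TOKEN are picked up from the environment
        databricks fs cp target/my-app.jar dbfs:/FileStore/jars/my-app.jar --overwrite
    secretEnv: ['DATABRICKS_HOST', 'DATABRICKS_TOKEN']

availableSecrets:
  secretManager:
    # assumed Secret Manager entries holding the workspace URL and a PAT
    - versionName: projects/$PROJECT_ID/secrets/databricks-host/versions/latest
      env: 'DATABRICKS_HOST'
    - versionName: projects/$PROJECT_ID/secrets/databricks-token/versions/latest
      env: 'DATABRICKS_TOKEN'
```

A Databricks job could then reference the library by its DBFS path (e.g. `dbfs:/FileStore/jars/my-app.jar`), which the jobs UI and API accept directly.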
Thanks