<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: What are some best practices for CICD? in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26072#M18197</link>
    <description>&lt;P&gt;If you use PowerShell for your CI/CD pipelines, you may want to have a look at my DatabricksPS module, which is a wrapper for the Databricks REST API.&lt;/P&gt;&lt;P&gt;It has dedicated cmdlets to export (e.g. from DEV) and import again (e.g. to TEST/PROD) in an automated fashion. This includes notebooks, job definitions, cluster definitions, secrets, ... (with SQL objects to come soon!)&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;A href="https://www.powershellgallery.com/packages/DatabricksPS" target="_blank"&gt;https://www.powershellgallery.com/packages/DatabricksPS&lt;/A&gt;&lt;/P&gt;</description>
    <pubDate>Fri, 03 Sep 2021 06:39:34 GMT</pubDate>
    <dc:creator>gbrueckl</dc:creator>
    <dc:date>2021-09-03T06:39:34Z</dc:date>
    <item>
      <title>What are some best practices for CICD?</title>
      <link>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26070#M18195</link>
      <description>&lt;P&gt;A number of people have questions on using Databricks in a production environment. What are the best practices to enable CI/CD automation?&lt;/P&gt;</description>
      <pubDate>Mon, 07 Jun 2021 17:50:07 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26070#M18195</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2021-06-07T17:50:07Z</dc:date>
    </item>
    <item>
      <title>Re: What are some best practices for CICD?</title>
      <link>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26071#M18196</link>
      <description>&lt;P&gt;Databricks enables CI/CD through its REST API, which allows build servers (such as Jenkins, GitHub Actions, etc.) to update artifacts.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;In addition, you should use CI/CD practices to store Databricks job configs, and use the REST API to update and refresh those in the appropriate environment.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;A href="https://docs.databricks.com/dev-tools/api/latest/index.html" target="_blank"&gt;https://docs.databricks.com/dev-tools/api/latest/index.html&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 07 Jun 2021 17:51:47 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26071#M18196</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2021-06-07T17:51:47Z</dc:date>
    </item>
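The REST-API-driven flow described in the reply above can be sketched in Python. This is a minimal sketch, not code from the thread: the host, token, and notebook path are placeholder values, and the request is only built, not sent. It targets the Workspace Import endpoint (POST /api/2.0/workspace/import), which expects base64-encoded notebook source.

```python
# Sketch: pushing a notebook from a build server to a workspace via the
# Databricks REST API. Host and token below are placeholders that a CI
# system would normally inject as secrets.
import base64
import json
import urllib.request

def build_import_request(host, token, workspace_path, source):
    """Build the HTTP request for POST /api/2.0/workspace/import."""
    payload = {
        "path": workspace_path,
        "format": "SOURCE",
        "language": "PYTHON",
        "overwrite": True,
        # Notebook source must be base64-encoded in the request body.
        "content": base64.b64encode(source).decode("ascii"),
    }
    return urllib.request.Request(
        f"{host}/api/2.0/workspace/import",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

req = build_import_request(
    "https://adb-1234567890123456.7.azuredatabricks.net",  # placeholder host
    "dapi-example-token",                                   # placeholder token
    "/Shared/etl/my_notebook",                              # placeholder path
    b"print('hello from CI')",
)
# urllib.request.urlopen(req) would perform the actual import.
```

A build server would call this once per notebook in the repository, with the token scoped to the target (TEST/PROD) workspace.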
    <item>
      <title>Re: What are some best practices for CICD?</title>
      <link>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26072#M18197</link>
      <description>&lt;P&gt;If you use PowerShell for your CI/CD pipelines, you may want to have a look at my DatabricksPS module, which is a wrapper for the Databricks REST API.&lt;/P&gt;&lt;P&gt;It has dedicated cmdlets to export (e.g. from DEV) and import again (e.g. to TEST/PROD) in an automated fashion. This includes notebooks, job definitions, cluster definitions, secrets, ... (with SQL objects to come soon!)&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;A href="https://www.powershellgallery.com/packages/DatabricksPS" target="_blank"&gt;https://www.powershellgallery.com/packages/DatabricksPS&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 03 Sep 2021 06:39:34 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26072#M18197</guid>
      <dc:creator>gbrueckl</dc:creator>
      <dc:date>2021-09-03T06:39:34Z</dc:date>
    </item>
    <item>
      <title>Re: What are some best practices for CICD?</title>
      <link>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26073#M18198</link>
      <description>&lt;P&gt;Some time ago I was looking for the very same answers, and this is what I found/did back then:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;A href="https://menziess.github.io/howto/" target="_blank"&gt;https://menziess.github.io/howto/&lt;/A&gt; - my source of inspiration (tech part)&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://databricks.com/blog/2017/10/30/continuous-integration-continuous-delivery-databricks.html" target="_blank"&gt;https://databricks.com/blog/2017/10/30/continuous-integration-continuous-delivery-databricks.html&lt;/A&gt; - based on this I tried to set up a process baseline&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://github.com/pawelmitrus/azure-databricks-cicd-notebooks" target="_blank"&gt;https://github.com/pawelmitrus/azure-databricks-cicd-notebooks&lt;/A&gt; - my own idea for solving the CI/CD problem for &lt;B&gt;very common&lt;/B&gt; projects that use notebooks for running ETLs (it's a mix of all of the above)&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I'd be happy to discuss.&lt;/P&gt;</description>
      <pubDate>Tue, 14 Sep 2021 09:11:22 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26073#M18198</guid>
      <dc:creator>pawelmitrus</dc:creator>
      <dc:date>2021-09-14T09:11:22Z</dc:date>
    </item>
    <item>
      <title>Re: What are some best practices for CICD?</title>
      <link>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26074#M18199</link>
      <description>&lt;P&gt;I don't know if it's best practice, but perhaps it can serve as inspiration.&lt;/P&gt;&lt;P&gt;We do CI/CD with unit tests of PySpark code using GitHub Actions. Have a look:&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;A href="https://github.com/Energinet-DataHub/geh-aggregations#databricks-workspace" alt="https://github.com/Energinet-DataHub/geh-aggregations#databricks-workspace" target="_blank"&gt;https://github.com/Energinet-DataHub/geh-aggregations#databricks-workspace&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;A href="https://github.com/Energinet-DataHub/geh-aggregations/actions/workflows/aggregation-job-infra-cd.yml" alt="https://github.com/Energinet-DataHub/geh-aggregations/actions/workflows/aggregation-job-infra-cd.yml" target="_blank"&gt;https://github.com/Energinet-DataHub/geh-aggregations/actions/workflows/aggregation-job-infra-cd.yml&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 14 Sep 2021 10:06:42 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26074#M18199</guid>
      <dc:creator>Kristian_Schnei</dc:creator>
      <dc:date>2021-09-14T10:06:42Z</dc:date>
    </item>
    <item>
      <title>Re: What are some best practices for CICD?</title>
      <link>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26075#M18200</link>
      <description>&lt;P&gt;Check out the &lt;A href="https://github.com/databrickslabs/cicd-templates" alt="https://github.com/databrickslabs/cicd-templates" target="_blank"&gt;Databricks Labs CI/CD Templates&lt;/A&gt;. This repository provides a template for automated Databricks CI/CD pipeline creation and deployment.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Table of Contents&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;A href="https://github.com/databrickslabs/cicd-templates#sample-project-structure-with-github-actions" alt="https://github.com/databrickslabs/cicd-templates#sample-project-structure-with-github-actions" target="_blank"&gt;Sample project structure (with GitHub Actions)&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://github.com/databrickslabs/cicd-templates#sample-project-structure-with-azure-devops" alt="https://github.com/databrickslabs/cicd-templates#sample-project-structure-with-azure-devops" target="_blank"&gt;Sample project structure (with Azure DevOps)&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://github.com/databrickslabs/cicd-templates#sample-project-structure-with-gitlab" alt="https://github.com/databrickslabs/cicd-templates#sample-project-structure-with-gitlab" target="_blank"&gt;Sample project structure (with GitLab)&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://github.com/databrickslabs/cicd-templates#note-on-dbx" alt="https://github.com/databrickslabs/cicd-templates#note-on-dbx" target="_blank"&gt;Note on dbx&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://github.com/databrickslabs/cicd-templates#quickstart" alt="https://github.com/databrickslabs/cicd-templates#quickstart" target="_blank"&gt;Quickstart&lt;/A&gt;&lt;UL&gt;&lt;LI&gt;&lt;A href="https://github.com/databrickslabs/cicd-templates#local-steps" alt="https://github.com/databrickslabs/cicd-templates#local-steps" target="_blank"&gt;Local steps&lt;/A&gt;&lt;UL&gt;&lt;LI&gt;&lt;A href="https://github.com/databrickslabs/cicd-templates#setting-up-cicd-pipeline-on-gitlab" 
alt="https://github.com/databrickslabs/cicd-templates#setting-up-cicd-pipeline-on-gitlab" target="_blank"&gt;Setting up CI/CD pipeline on Gitlab&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://github.com/databrickslabs/cicd-templates#deployment-file-structure" alt="https://github.com/databrickslabs/cicd-templates#deployment-file-structure" target="_blank"&gt;Deployment file structure&lt;/A&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://github.com/databrickslabs/cicd-templates#setting-up-cicd-pipeline-on-github-actions" alt="https://github.com/databrickslabs/cicd-templates#setting-up-cicd-pipeline-on-github-actions" target="_blank"&gt;Setting up CI/CD pipeline on GitHub Actions&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://github.com/databrickslabs/cicd-templates#setting-up-cicd-pipeline-on-azure-devops" alt="https://github.com/databrickslabs/cicd-templates#setting-up-cicd-pipeline-on-azure-devops" target="_blank"&gt;Setting up CI/CD pipeline on Azure DevOps&lt;/A&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://github.com/databrickslabs/cicd-templates#different-deployment-types" alt="https://github.com/databrickslabs/cicd-templates#different-deployment-types" target="_blank"&gt;Different deployment types&lt;/A&gt;&lt;UL&gt;&lt;LI&gt;&lt;A href="https://github.com/databrickslabs/cicd-templates#deployment-for-run-submit-api" alt="https://github.com/databrickslabs/cicd-templates#deployment-for-run-submit-api" target="_blank"&gt;Deployment for Run Submit API&lt;/A&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 08 Nov 2021 18:38:21 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26075#M18200</guid>
      <dc:creator>MadelynM</dc:creator>
      <dc:date>2021-11-08T18:38:21Z</dc:date>
    </item>
    <item>
      <title>Re: What are some best practices for CICD?</title>
      <link>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26076#M18201</link>
      <description>&lt;P&gt;Two additional resources come to mind:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;If using Jenkins, there's a &lt;A href="https://docs.databricks.com/dev-tools/ci-cd/ci-cd-jenkins.html?_ga=2.43022053.1957089013.1636433871-567565675.1587488609" alt="https://docs.databricks.com/dev-tools/ci-cd/ci-cd-jenkins.html?_ga=2.43022053.1957089013.1636433871-567565675.1587488609" target="_blank"&gt;best practice guide for CI/CD using Jenkins&lt;/A&gt; that was written based on numerous successful implementations. &lt;/LI&gt;&lt;LI&gt;There is an on-demand webinar focused on &lt;A href="https://databricks.com/p/webinar/data-arch-best-practices" alt="https://databricks.com/p/webinar/data-arch-best-practices" target="_blank"&gt;Getting Workloads to Production&lt;/A&gt; from a DevOps and CI/CD perspective.&lt;/LI&gt;&lt;/OL&gt;</description>
      <pubDate>Tue, 09 Nov 2021 05:21:03 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26076#M18201</guid>
      <dc:creator>User16859945835</dc:creator>
      <dc:date>2021-11-09T05:21:03Z</dc:date>
    </item>
    <item>
      <title>Re: What are some best practices for CICD?</title>
      <link>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26078#M18203</link>
      <description>&lt;P&gt;There is no such REST API (yet).&lt;/P&gt;</description>
      <pubDate>Thu, 25 Nov 2021 18:44:05 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26078#M18203</guid>
      <dc:creator>alexott</dc:creator>
      <dc:date>2021-11-25T18:44:05Z</dc:date>
    </item>
    <item>
      <title>Re: What are some best practices for CICD?</title>
      <link>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26079#M18204</link>
      <description>&lt;P&gt;We have DBFS APIs, but I'm not sure if they solve your purpose, @Erik Parmann. You can check out &lt;A href="https://docs.databricks.com/dev-tools/api/latest/dbfs.html#list" target="_blank"&gt;https://docs.databricks.com/dev-tools/api/latest/dbfs.html#list&lt;/A&gt;.&lt;/P&gt;</description>
      <pubDate>Sat, 27 Nov 2021 14:57:24 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26079#M18204</guid>
      <dc:creator>Atanu</dc:creator>
      <dc:date>2021-11-27T14:57:24Z</dc:date>
    </item>
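The DBFS List call suggested in the reply above can be sketched in Python. This is a minimal sketch, not code from the thread: host, token, and mount path are placeholders, and the request is built without being sent. It targets GET /api/2.0/dbfs/list, which takes the path as a query parameter.

```python
# Sketch: checking what is present at a mount point via the DBFS List API.
# Host and token are placeholders a CI system would supply.
import urllib.parse
import urllib.request

def build_dbfs_list_request(host, token, path):
    """Build the HTTP request for GET /api/2.0/dbfs/list."""
    query = urllib.parse.urlencode({"path": path})
    return urllib.request.Request(
        f"{host}/api/2.0/dbfs/list?{query}",
        headers={"Authorization": f"Bearer {token}"},
    )

req = build_dbfs_list_request(
    "https://adb-1234567890123456.7.azuredatabricks.net",  # placeholder host
    "dapi-example-token",                                   # placeholder token
    "/mnt/my-mount",                                        # placeholder mount
)
# urllib.request.urlopen(req) returns a JSON body with a "files" array
# for an existing path, or an error for a missing one.
```

As the later replies point out, this only tells you whether something is present at the path, not whether it is actually a mount or where it is mounted from.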
    <item>
      <title>Re: What are some best practices for CICD?</title>
      <link>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26082#M18207</link>
      <description>&lt;P&gt;I guess one could say the same for all the SQL meta objects, for which you also need to have a cluster up and running.&lt;/P&gt;&lt;P&gt;But some just need a running cluster to validate and check them,&lt;/P&gt;&lt;P&gt;and some even rely on a cluster, e.g. for authentication - if you used a mount point with OAuth, for example.&lt;/P&gt;</description>
      <pubDate>Mon, 29 Nov 2021 08:18:22 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26082#M18207</guid>
      <dc:creator>gbrueckl</dc:creator>
      <dc:date>2021-11-29T08:18:22Z</dc:date>
    </item>
    <item>
      <title>Re: What are some best practices for CICD?</title>
      <link>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26083#M18208</link>
      <description>&lt;P&gt;Hello there, I would like to revive this thread to ask for good practices for data processes in Databricks. I have 2 cloud accounts with one Databricks env in each (one for dev, another for prod).&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I was thinking of creating my own CI/CD pipeline to move notebooks from the dev env to prod and schedule them with GitHub and Azure DevOps, but would like to see what the community recommends.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I've seen that &lt;A href="https://github.com/databrickslabs/cicd-templates" alt="https://github.com/databrickslabs/cicd-templates" target="_blank"&gt;cicd-templates&lt;/A&gt; is deprecated and it is now recommended to use &lt;A href="https://github.com/databrickslabs/dbx" alt="https://github.com/databrickslabs/dbx" target="_blank"&gt;dbx&lt;/A&gt; - is it a tool for that purpose?&lt;/P&gt;</description>
      <pubDate>Wed, 17 Aug 2022 11:22:14 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26083#M18208</guid>
      <dc:creator>LorenRD</dc:creator>
      <dc:date>2022-08-17T11:22:14Z</dc:date>
    </item>
    <item>
      <title>Re: What are some best practices for CICD?</title>
      <link>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26084#M18209</link>
      <description>&lt;P&gt;&lt;A href="https://github.com/databrickslabs/cicd-templates" target="_blank"&gt;https://github.com/databrickslabs/cicd-templates&lt;/A&gt; is legacy now; the updated one is here: &lt;A href="https://dbx.readthedocs.io/en/latest/" alt="https://dbx.readthedocs.io/en/latest/" target="_blank"&gt;dbx&lt;/A&gt;.&lt;/P&gt;&lt;P&gt;You need to walk through the docs carefully - there is a lot of information inside - and, by the way, `dbx init` can create a template.&lt;/P&gt;</description>
      <pubDate>Fri, 18 Nov 2022 15:12:27 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26084#M18209</guid>
      <dc:creator>xiangzhu</dc:creator>
      <dc:date>2022-11-18T15:12:27Z</dc:date>
    </item>
    <item>
      <title>Re: What are some best practices for CICD?</title>
      <link>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26077#M18202</link>
      <description>&lt;P&gt;We are using the &lt;A href="https://github.com/databrickslabs/terraform-provider-databricks" alt="https://github.com/databrickslabs/terraform-provider-databricks" target="_blank"&gt;Databricks Terraform provider&lt;/A&gt; to handle... everything, really. Then we use a CI runner (in our case Azure Pipelines) to deploy to dev/test/prod depending on branches in git (you might prefer tags or branches, whatever your branching strategy is).&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;It works pretty well, EXCEPT that validating mounts takes a long time (10-15 min) because it needs to spin up a cluster. That is pretty lame, and the only fix seems to be for Databricks to make a REST API letting you list/modify mounts, but this is nowhere on any roadmap.&lt;/P&gt;</description>
      <pubDate>Thu, 11 Nov 2021 14:41:58 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26077#M18202</guid>
      <dc:creator>Erik</dc:creator>
      <dc:date>2021-11-11T14:41:58Z</dc:date>
    </item>
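The Terraform setup described above can be sketched as a minimal provider config. This is a hedged sketch, not the poster's actual code: the job name, notebook path, node type, and variable names are placeholders, and the registry source shown is the provider's current address (the post links the older databrickslabs repository).

```hcl
terraform {
  required_providers {
    databricks = {
      source = "databricks/databricks"
    }
  }
}

provider "databricks" {
  # The CI runner injects a different workspace URL per environment
  # (dev/test/prod), selected by branch or tag.
  host = var.workspace_url
}

# Hypothetical job; all names and paths are placeholders.
resource "databricks_job" "nightly_etl" {
  name = "nightly-etl"

  new_cluster {
    num_workers   = 2
    spark_version = "9.1.x-scala2.12"
    node_type_id  = "Standard_DS3_v2"
  }

  notebook_task {
    notebook_path = "/Shared/etl/nightly"
  }

  schedule {
    quartz_cron_expression = "0 0 2 * * ?"
    timezone_id            = "UTC"
  }
}
```

Because the whole workspace definition lives in source control, promoting a change is just applying the same config against the next environment's workspace URL.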
    <item>
      <title>Re: What are some best practices for CICD?</title>
      <link>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26080#M18205</link>
      <description>&lt;P&gt;(Note: the mount issue is not mine alone; it's a problem for everyone using the &lt;A href="https://github.com/databrickslabs/terraform-provider-databricks" alt="https://github.com/databrickslabs/terraform-provider-databricks" target="_blank"&gt;terraform databricks provider&lt;/A&gt;.)&lt;/P&gt;&lt;P&gt;I guess one could use the DBFS API to determine if anything is present at the mount point, but it won't tell you whether that is because something is actually mounted there, or where it is mounted from. So one would still have to start a cluster to check those things. :-/&lt;/P&gt;</description>
      <pubDate>Sun, 28 Nov 2021 21:33:18 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26080#M18205</guid>
      <dc:creator>Erik</dc:creator>
      <dc:date>2021-11-28T21:33:18Z</dc:date>
    </item>
    <item>
      <title>Re: What are some best practices for CICD?</title>
      <link>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26081#M18206</link>
      <description>&lt;P&gt;Unfortunately not. Where do I vote to make it happen faster?&lt;/P&gt;</description>
      <pubDate>Sun, 28 Nov 2021 21:34:11 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/26081#M18206</guid>
      <dc:creator>Erik</dc:creator>
      <dc:date>2021-11-28T21:34:11Z</dc:date>
    </item>
    <item>
      <title>Re: What are some best practices for CICD?</title>
      <link>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/67193#M33303</link>
      <description>&lt;P&gt;Any leads/posts for Databricks CI/CD integration with Bitbucket Pipelines? I am facing the below error while creating my CI/CD pipeline:&lt;/P&gt;&lt;PRE&gt;pipelines:
  branches:
    master:
      - step:
          name: Deploy Databricks Changes
          image: docker:19.03.12
          services:
            - docker
          script:
            # Update and install required packages
            - apk add --no-cache py-pip expect git
            - pip install --upgrade pip
            - pip install databricks-cli
            - databricks --version

            # Create Databricks CLI configuration file for Source Workspace
            - echo -e "[DEFAULT]\nhost = $DB_HOST_SOURCE\ntoken = $DB_TOKEN_QA" &amp;gt; ~/.databrickscfg_source
            # Create Databricks CLI configuration file for Destination Workspace
            - echo -e "[DEFAULT]\nhost = $DB_HOST_DESTINATION\ntoken = $DB_TOKEN_PROD" &amp;gt; ~/.databrickscfg_destination

            # Display the contents of the configuration files for verification
            - cat ~/.databrickscfg_source
            - cat ~/.databrickscfg_destination

            # Configure Databricks CLI for source
            - |
              expect -c "
              spawn /usr/local/bin/databricks configure --host $DB_HOST_SOURCE --token $DB_TOKEN_QA
              expect eof
              "

            # Configure Databricks CLI for destination
            - |
              expect -c "
              spawn /usr/local/bin/databricks configure --host $DB_HOST_DESTINATION --token $DB_TOKEN_PROD
              expect eof
              "

            # Clone the latest code from the master branch using an App Password
            - git clone --depth 1 https://&amp;lt;&amp;gt;username:&amp;lt;token&amp;gt;@bitbucket.org/ttec-digital-ip/insights_databricks_qa.git /tmp/source_workspace_export/

            # Configure Databricks CLI with token for destination using expect
            - |
              expect -c "
              spawn /usr/local/bin/databricks configure --profile destination
              expect \"Databricks Host (should begin with https://): \"
              send \"$DB_HOST_DESTINATION\r\"
              expect \"Token: \"
              send \"$DB_TOKEN_PROD\r\"
              expect eof
              "

            # Copy files from the local file system to the Destination Workspace
            - databricks fs cp -r /tmp/source_workspace_export /Workspace/Shared/ --profile destination&lt;/PRE&gt;&lt;P&gt;Error: InvalidConfigurationError: You haven't configured the CLI yet for the profile destination! Please configure by entering `/usr/bin/databricks configure --profile destination`&lt;/P&gt;</description>
      <pubDate>Wed, 24 Apr 2024 15:49:08 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/what-are-some-best-practices-for-cicd/m-p/67193#M33303</guid>
      <dc:creator>BaivabMohanty</dc:creator>
      <dc:date>2024-04-24T15:49:08Z</dc:date>
    </item>
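For context on the error above: with the legacy databricks-cli, a named profile such as `--profile destination` is looked up as an INI section inside the single ~/.databrickscfg file, not in separately named files. A minimal sketch of that layout, with placeholder hosts and tokens, looks like this:

```ini
# ~/.databrickscfg - named profiles as INI sections in one file
[DEFAULT]
host  = https://source-workspace.cloud.databricks.com
token = dapi-example-source-token

[destination]
host  = https://destination-workspace.cloud.databricks.com
token = dapi-example-destination-token
```

Commands then select a workspace with `--profile destination` (or fall back to [DEFAULT] when no profile is given).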
  </channel>
</rss>

