<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Import a notebook in a Release Pipeline with a Python script in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/import-a-notebook-in-a-release-pipeline-with-a-python-script/m-p/35067#M25748</link>
    <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;I would like to import a Python file to Databricks with an Azure DevOps release pipeline.&lt;/P&gt;&lt;P&gt;Within the pipeline I execute a Python script containing this code:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;import sys
import os
import base64
import requests

dbw_url = sys.argv[1] # &lt;A href="https://adb-XXXXXXXXXXXXX.XX.azuredatabricks.net/" target="_blank"&gt;https://adb-XXXXXXXXXXXXX.XX.azuredatabricks.net/&lt;/A&gt;
token = sys.argv[2] # databricks PAT
root_source = os.path.join(os.environ.get('SYSTEM_DEFAULTWORKINGDIRECTORY'), '_Build Notebook Artifact', 'artifact_dir_path') # This is a result from a build pipeline
target_dir_path = '/Shared'
file = os.listdir(root_source)[0]
print(file)
with open(os.path.join(root_source, file), 'rb') as f:
    data = base64.standard_b64encode(f.read()).decode('utf-8')

payload = {  # renamed from 'json' to avoid shadowing the json module
    "content": data,
    "path": os.path.join(target_dir_path, file),
    "language": "PYTHON",
    "overwrite": True,
    "format": "SOURCE"
}

import_notebook = requests.post(
    '{}/api/2.0/workspace/import'.format(dbw_url),
    headers={'Authorization': 'Bearer {}'.format(token)},
    json=payload
)

print(import_notebook.status_code) # -&amp;gt; 200&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;The status code is 200, but nothing has been imported into my Databricks workspace.&lt;/P&gt;&lt;P&gt;Here is what I have in my pipeline logs:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;2021-11-15T14:39:54.0421229Z ##[section]Starting: Run a Python script
2021-11-15T14:39:54.0429015Z ==============================================================================
2021-11-15T14:39:54.0429348Z Task         : Python script
2021-11-15T14:39:54.0429590Z Description  : Run a Python file or inline script
2021-11-15T14:39:54.0429815Z Version      : 0.182.0
2021-11-15T14:39:54.0430025Z Author       : Microsoft Corporation
2021-11-15T14:39:54.0430358Z Help         : &lt;A href="https://docs.microsoft.com/azure/devops/pipelines/tasks/utility/python-script" target="_blank"&gt;https://docs.microsoft.com/azure/devops/pipelines/tasks/utility/python-script&lt;/A&gt;
2021-11-15T14:39:54.0430694Z ==============================================================================
2021-11-15T14:39:54.1829847Z [command]/opt/hostedtoolcache/Python/3.10.0/x64/bin/python /home/vsts/work/_temp/2dfc3151-6ce7-4c6d-a74e-59125c767241.py *** ***
2021-11-15T14:39:54.4797958Z ingest_csv.py
2021-11-15T14:39:54.4798454Z 200
2021-11-15T14:39:54.5018448Z ##[section]Finishing: Run a Python script&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;This works fine when I execute it from my local machine.&lt;/P&gt;&lt;P&gt;Note: oddly, when I put a ***** value in the 'token' variable I obtain the same result, status code 200. From my local machine, however, I obtain an error as expected.&lt;/P&gt;&lt;P&gt;Thanks for your help.&lt;/P&gt;</description>
    <pubDate>Mon, 15 Nov 2021 14:49:52 GMT</pubDate>
    <dc:creator>RantoB</dc:creator>
    <dc:date>2021-11-15T14:49:52Z</dc:date>
    <item>
      <title>Import a notebook in a Release Pipeline with a Python script</title>
      <link>https://community.databricks.com/t5/data-engineering/import-a-notebook-in-a-release-pipeline-with-a-python-script/m-p/35067#M25748</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;I would like to import a Python file to Databricks with an Azure DevOps release pipeline.&lt;/P&gt;&lt;P&gt;Within the pipeline I execute a Python script containing this code:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;import sys
import os
import base64
import requests

dbw_url = sys.argv[1] # &lt;A href="https://adb-XXXXXXXXXXXXX.XX.azuredatabricks.net/" target="_blank"&gt;https://adb-XXXXXXXXXXXXX.XX.azuredatabricks.net/&lt;/A&gt;
token = sys.argv[2] # databricks PAT
root_source = os.path.join(os.environ.get('SYSTEM_DEFAULTWORKINGDIRECTORY'), '_Build Notebook Artifact', 'artifact_dir_path') # This is a result from a build pipeline
target_dir_path = '/Shared'
file = os.listdir(root_source)[0]
print(file)
with open(os.path.join(root_source, file), 'rb') as f:
    data = base64.standard_b64encode(f.read()).decode('utf-8')

payload = {  # renamed from 'json' to avoid shadowing the json module
    "content": data,
    "path": os.path.join(target_dir_path, file),
    "language": "PYTHON",
    "overwrite": True,
    "format": "SOURCE"
}

import_notebook = requests.post(
    '{}/api/2.0/workspace/import'.format(dbw_url),
    headers={'Authorization': 'Bearer {}'.format(token)},
    json=payload
)

print(import_notebook.status_code) # -&amp;gt; 200&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;The status code is 200, but nothing has been imported into my Databricks workspace.&lt;/P&gt;&lt;P&gt;Here is what I have in my pipeline logs:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;2021-11-15T14:39:54.0421229Z ##[section]Starting: Run a Python script
2021-11-15T14:39:54.0429015Z ==============================================================================
2021-11-15T14:39:54.0429348Z Task         : Python script
2021-11-15T14:39:54.0429590Z Description  : Run a Python file or inline script
2021-11-15T14:39:54.0429815Z Version      : 0.182.0
2021-11-15T14:39:54.0430025Z Author       : Microsoft Corporation
2021-11-15T14:39:54.0430358Z Help         : &lt;A href="https://docs.microsoft.com/azure/devops/pipelines/tasks/utility/python-script" target="_blank"&gt;https://docs.microsoft.com/azure/devops/pipelines/tasks/utility/python-script&lt;/A&gt;
2021-11-15T14:39:54.0430694Z ==============================================================================
2021-11-15T14:39:54.1829847Z [command]/opt/hostedtoolcache/Python/3.10.0/x64/bin/python /home/vsts/work/_temp/2dfc3151-6ce7-4c6d-a74e-59125c767241.py *** ***
2021-11-15T14:39:54.4797958Z ingest_csv.py
2021-11-15T14:39:54.4798454Z 200
2021-11-15T14:39:54.5018448Z ##[section]Finishing: Run a Python script&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;This works fine when I execute it from my local machine.&lt;/P&gt;&lt;P&gt;Note: oddly, when I put a ***** value in the 'token' variable I obtain the same result, status code 200. From my local machine, however, I obtain an error as expected.&lt;/P&gt;&lt;P&gt;Thanks for your help.&lt;/P&gt;</description>
      <pubDate>Mon, 15 Nov 2021 14:49:52 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/import-a-notebook-in-a-release-pipeline-with-a-python-script/m-p/35067#M25748</guid>
      <dc:creator>RantoB</dc:creator>
      <dc:date>2021-11-15T14:49:52Z</dc:date>
    </item>
    <item>
      <title>Re: Import a notebook in a Release Pipeline with a Python script</title>
      <link>https://community.databricks.com/t5/data-engineering/import-a-notebook-in-a-release-pipeline-with-a-python-script/m-p/35068#M25749</link>
      <description>&lt;P&gt;Recently I wrote about an alternative way to export/import notebooks in Python: &lt;A href="https://community.databricks.com/s/question/0D53f00001TgT52CAF/import-notebook-with-python-script-using-api" target="_blank"&gt;https://community.databricks.com/s/question/0D53f00001TgT52CAF/import-notebook-with-python-script-using-api&lt;/A&gt;. This way you will get a more readable error message (often it is related to the host name or access rights).&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;pip install databricks-cli&lt;/CODE&gt;&lt;/PRE&gt;&lt;PRE&gt;&lt;CODE&gt;from databricks_cli.workspace.api import WorkspaceApi
from databricks_cli.sdk.api_client import ApiClient

client = ApiClient(
    host='https://your.databricks-url.net',
    token=api_key  # your Databricks personal access token
)
workspace_api = WorkspaceApi(client)
workspace_api.import_workspace(
    source_path="/your/dir/here/hello.py",
    target_path="/Repos/test/hello.py",
    overwrite=True
)&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Mon, 15 Nov 2021 15:43:46 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/import-a-notebook-in-a-release-pipeline-with-a-python-script/m-p/35068#M25749</guid>
      <dc:creator>Hubert-Dudek</dc:creator>
      <dc:date>2021-11-15T15:43:46Z</dc:date>
    </item>
    <item>
      <title>Re: Import a notebook in a Release Pipeline with a Python script</title>
      <link>https://community.databricks.com/t5/data-engineering/import-a-notebook-in-a-release-pipeline-with-a-python-script/m-p/35069#M25750</link>
      <description>&lt;P&gt;Okay, I will use the dedicated Python API.&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Mon, 15 Nov 2021 17:54:00 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/import-a-notebook-in-a-release-pipeline-with-a-python-script/m-p/35069#M25750</guid>
      <dc:creator>RantoB</dc:creator>
      <dc:date>2021-11-15T17:54:00Z</dc:date>
    </item>
  </channel>
</rss>

