<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Apart from notebook , is it possible to deploy an application (Pyspark , or R+spark) as a package or file and execute them in Databricks ? in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/apart-from-notebook-is-it-possible-to-deploy-an-application/m-p/34515#M25257</link>
    <description>&lt;P&gt;So we definitely need a notebook just to execute our Python file.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Is it possible to deploy these Python files in the Workspace instead of in Repos?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="image"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/2284iD84E85AA2C03A052/image-size/large?v=v2&amp;amp;px=999" role="button" title="image" alt="image" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
    <pubDate>Wed, 24 Nov 2021 16:06:42 GMT</pubDate>
    <dc:creator>sunil_smile</dc:creator>
    <dc:date>2021-11-24T16:06:42Z</dc:date>
    <item>
      <title>Apart from notebook , is it possible to deploy an application (Pyspark , or R+spark) as a package or file and execute them in Databricks ?</title>
      <link>https://community.databricks.com/t5/data-engineering/apart-from-notebook-is-it-possible-to-deploy-an-application/m-p/34511#M25253</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;With the help of Databricks Connect, I was able to connect the cluster to my local IDE (PyCharm, and the RStudio desktop version), develop the application, and commit the code to Git.&lt;/P&gt;&lt;P&gt;When I tried to add that repo to the Databricks workspace, I noticed that the Python files I created in PyCharm are not displayed; I see only the notebook files.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Is there any option to deploy those Python files to the Databricks cluster and execute them?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Files present in PyCharm:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="image"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/2294i3DACA1C77C6AC7D2/image-size/large?v=v2&amp;amp;px=999" role="button" title="image" alt="image" /&gt;&lt;/span&gt;But those files are not displayed in the Databricks workspace.&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="image"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/2287i98DC86306A3252C9/image-size/large?v=v2&amp;amp;px=999" role="button" title="image" alt="image" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Most of my developers prefer to work in the IDE they are familiar and comfortable with; it helps them develop and debug quickly.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Kindly suggest how I can address this problem.&lt;/P&gt;</description>
      <pubDate>Tue, 23 Nov 2021 16:04:53 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/apart-from-notebook-is-it-possible-to-deploy-an-application/m-p/34511#M25253</guid>
      <dc:creator>sunil_smile</dc:creator>
      <dc:date>2021-11-23T16:04:53Z</dc:date>
    </item>
    <item>
      <title>Re: Apart from notebook , is it possible to deploy an application (Pyspark , or R+spark) as a package or file and execute them in Databricks ?</title>
      <link>https://community.databricks.com/t5/data-engineering/apart-from-notebook-is-it-possible-to-deploy-an-application/m-p/34512#M25254</link>
      <description>&lt;P&gt;Please enable the "Files in Repos" option in Settings -&amp;gt; Admin Console.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="image.png"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/2292i40B39DC7EB7D2DC7/image-size/large?v=v2&amp;amp;px=999" role="button" title="image.png" alt="image.png" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 23 Nov 2021 16:59:46 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/apart-from-notebook-is-it-possible-to-deploy-an-application/m-p/34512#M25254</guid>
      <dc:creator>Hubert-Dudek</dc:creator>
      <dc:date>2021-11-23T16:59:46Z</dc:date>
    </item>
    <item>
      <title>Re: Apart from notebook , is it possible to deploy an application (Pyspark , or R+spark) as a package or file and execute them in Databricks ?</title>
      <link>https://community.databricks.com/t5/data-engineering/apart-from-notebook-is-it-possible-to-deploy-an-application/m-p/34513#M25255</link>
      <description>&lt;P&gt;@Hubert Dudek&amp;nbsp;Thanks for your response.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Yes, now I can see the Python files.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;But how can I execute that Python file? I see that the Job accepts only a DBFS file location.&lt;/P&gt;&lt;P&gt;Is there any other way to execute that Python file?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="image"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/2283i84A08F745557510F/image-size/large?v=v2&amp;amp;px=999" role="button" title="image" alt="image" /&gt;&lt;/span&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Our requirement is to do the development in an IDE (PyCharm or RStudio) using Databricks Connect, deploy the final version of the code to Databricks, and execute it as a Job. Is there any option available in Databricks?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 24 Nov 2021 08:51:53 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/apart-from-notebook-is-it-possible-to-deploy-an-application/m-p/34513#M25255</guid>
      <dc:creator>sunil_smile</dc:creator>
      <dc:date>2021-11-24T08:51:53Z</dc:date>
    </item>
    <item>
      <title>Re: Apart from notebook , is it possible to deploy an application (Pyspark , or R+spark) as a package or file and execute them in Databricks ?</title>
      <link>https://community.databricks.com/t5/data-engineering/apart-from-notebook-is-it-possible-to-deploy-an-application/m-p/34514#M25256</link>
      <description>&lt;P&gt;I just import the class from the .py file in a notebook and then use it:&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;from&amp;nbsp;folder.file_folder.file_name import class&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;In the notebook I instantiate the class.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I have everything in Repos (notebooks and .py files). I have the repo open in Visual Studio and I can edit the notebooks there as well. I haven't tested other IDEs, but it is just a git repo with notebooks, so you can edit them in PyCharm too. Then, in the job, you can just set the notebook, which makes it much easier.&lt;/P&gt;</description>
      <pubDate>Wed, 24 Nov 2021 11:23:29 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/apart-from-notebook-is-it-possible-to-deploy-an-application/m-p/34514#M25256</guid>
      <dc:creator>Hubert-Dudek</dc:creator>
      <dc:date>2021-11-24T11:23:29Z</dc:date>
    </item>
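    <!-- Editor's note: the import pattern described in the reply above can be sketched as follows. This is a self-contained demo that builds a stand-in package on disk to play the role of the repo checkout; all module and class names (folder, file_folder, file_name, MyTransform) are hypothetical placeholders, matching the reply's "from folder.file_folder.file_name import class" shape. In an actual Databricks Repo, the notebook's directory is on sys.path once "Files in Repos" is enabled, so the plain import line alone suffices. -->

```python
import os
import sys
import tempfile

# Create a small package on disk that stands in for the repo checkout:
#   <repo_root>/folder/file_folder/file_name.py
repo_root = tempfile.mkdtemp()
pkg_dir = os.path.join(repo_root, "folder", "file_folder")
os.makedirs(pkg_dir)

# Mark both directories as packages so the dotted import resolves.
for d in ("folder", os.path.join("folder", "file_folder")):
    open(os.path.join(repo_root, d, "__init__.py"), "w").close()

# A plain .py module holding a class, as developed in the IDE.
with open(os.path.join(pkg_dir, "file_name.py"), "w") as f:
    f.write("class MyTransform:\n    def run(self):\n        return 'ok'\n")

# The repo root plays the role of the notebook's directory on sys.path.
sys.path.insert(0, repo_root)

from folder.file_folder.file_name import MyTransform

# In the notebook, instantiate the class and call it.
result = MyTransform().run()
print(result)  # -> ok
```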
    <item>
      <title>Re: Apart from notebook , is it possible to deploy an application (Pyspark , or R+spark) as a package or file and execute them in Databricks ?</title>
      <link>https://community.databricks.com/t5/data-engineering/apart-from-notebook-is-it-possible-to-deploy-an-application/m-p/34515#M25257</link>
      <description>&lt;P&gt;So we definitely need a notebook just to execute our Python file.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Is it possible to deploy these Python files in the Workspace instead of in Repos?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="image"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/2284iD84E85AA2C03A052/image-size/large?v=v2&amp;amp;px=999" role="button" title="image" alt="image" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 24 Nov 2021 16:06:42 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/apart-from-notebook-is-it-possible-to-deploy-an-application/m-p/34515#M25257</guid>
      <dc:creator>sunil_smile</dc:creator>
      <dc:date>2021-11-24T16:06:42Z</dc:date>
    </item>
    <item>
      <title>Re: Apart from notebook , is it possible to deploy an application (Pyspark , or R+spark) as a package or file and execute them in Databricks ?</title>
      <link>https://community.databricks.com/t5/data-engineering/apart-from-notebook-is-it-possible-to-deploy-an-application/m-p/34516#M25258</link>
      <description>&lt;P&gt;Maybe you will be interested in our Databricks Connect. I'm not sure if it resolves your issue of connecting with a third-party tool and setting up your supported IDE as a notebook server.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;A href="https://docs.databricks.com/dev-tools/databricks-connect.html" target="test_blank"&gt;https://docs.databricks.com/dev-tools/databricks-connect.html&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 30 Nov 2021 06:52:16 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/apart-from-notebook-is-it-possible-to-deploy-an-application/m-p/34516#M25258</guid>
      <dc:creator>Atanu</dc:creator>
      <dc:date>2021-11-30T06:52:16Z</dc:date>
    </item>
  </channel>
</rss>