Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

by Dean_Lovelace (New Contributor III)
  • 17768 Views
  • 11 replies
  • 2 kudos

How can I deploy workflow jobs to another Databricks workspace?

I have created a number of workflows in the Databricks UI. I now need to deploy them to a different workspace. How can I do that? Code can be deployed via Git, but the job definitions are stored in the workspace only.

Latest Reply by cpradeep (New Contributor II)

@Dean_Lovelace did you implement the solution? Please share how you implemented CI/CD for the workflows.

  • 2 kudos
10 More Replies
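
Since the accepted approach in this thread is not shown in full, here is a minimal sketch of one common pattern: export the job's settings through the Jobs 2.1 REST API and recreate them in the target workspace. Hostnames, tokens, and the job ID are placeholders; workspace-specific references (cluster IDs, notebook paths, instance pools) may need remapping, and Databricks Asset Bundles or Terraform are alternatives for fuller CI/CD.

```python
# Sketch: copy a job definition between workspaces via the Jobs 2.1 REST API.
# All hosts, tokens, and the job ID below are placeholders.
import requests

SOURCE_HOST = "https://source-workspace.cloud.databricks.com"  # placeholder
TARGET_HOST = "https://target-workspace.cloud.databricks.com"  # placeholder
SOURCE_TOKEN = "..."  # PAT for the source workspace (placeholder)
TARGET_TOKEN = "..."  # PAT for the target workspace (placeholder)
JOB_ID = 123          # job to copy (placeholder)

# Fetch the job definition from the source workspace.
resp = requests.get(
    f"{SOURCE_HOST}/api/2.1/jobs/get",
    headers={"Authorization": f"Bearer {SOURCE_TOKEN}"},
    params={"job_id": JOB_ID},
)
resp.raise_for_status()
settings = resp.json()["settings"]  # name, tasks, clusters, schedule, ...

# Recreate the job in the target workspace from those settings.
resp = requests.post(
    f"{TARGET_HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {TARGET_TOKEN}"},
    json=settings,
)
resp.raise_for_status()
print("Created job:", resp.json()["job_id"])
```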
by Oliver_Angelil (Valued Contributor II)
  • 10372 Views
  • 6 replies
  • 6 kudos

In what circumstances are both UAT/DEV and PROD environments actually necessary?

I wanted to ask this question yesterday in the Q&A session with Mohan Mathews, but didn't get around to it (@Kaniz Fatma, do you know his handle here so I can tag him?). We (and most development teams) have two environments: UAT/DEV and PROD. For those that d...

Latest Reply by Anonymous

Hi @Oliver Angelil, hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!

  • 6 kudos
5 More Replies
by source2sea (Contributor)
  • 3454 Views
  • 1 reply
  • 0 kudos

Resolved! What mode is the deploy-mode when calling Spark in Databricks?

See https://spark.apache.org/docs/latest/submitting-applications.html. I mainly want to know whether an extra classpath can be used when I submit a job.

Latest Reply by Anonymous

@min shi: In Databricks, when you run a job, you are submitting a Spark application to run on the cluster. The deploy-mode used by default depends on the type of job you are running. For interactive clusters, the deploy-mode is client. This m...

  • 0 kudos
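
To make the reply concrete, here is a minimal sketch of checking the deploy mode from a notebook and where extra classpath entries usually go on Databricks. It assumes a standard interactive cluster; the config keys shown are stock Spark settings, not a confirmed answer from this thread.

```python
# Sketch: inspect the effective deploy mode from a notebook.
# In Databricks notebooks `spark` is predefined; getOrCreate() returns it.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# On interactive clusters this typically prints "client".
print(spark.conf.get("spark.submit.deployMode", "client"))

# There is no direct spark-submit --driver-class-path step on Databricks;
# extra JARs are normally attached through the cluster's Spark config
# (set in the cluster UI before startup, not at runtime), e.g.:
#   spark.driver.extraClassPath   /dbfs/path/to/extra.jar
#   spark.executor.extraClassPath /dbfs/path/to/extra.jar
# or by installing the JAR as a cluster library.
```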
by dslin (New Contributor III)
  • 2437 Views
  • 3 replies
  • 2 kudos

How to deploy a Python script with its dependencies using dbx?

Hi, I'm quite new here. I'm trying to deploy a Python file with the dbx command. The file contains libraries to be installed. How may I deploy the file (together with its dependencies) to Databricks? Here are the commands I currently run: `db...

Latest Reply by Vidula (Honored Contributor)

Hi @Di Lin, hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!

  • 2 kudos
2 More Replies
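
Because the commands in the question are truncated, only a general sketch is possible. dbx builds the project as a Python package, so one common way to carry dependencies is to declare them in setup.py's install_requires; the project name and libraries below are hypothetical.

```python
# setup.py: a minimal sketch, assuming the project is packaged with
# setuptools so `dbx deploy` can build it as a wheel. The package name
# and dependencies below are hypothetical.
from setuptools import setup, find_packages

setup(
    name="my_dbx_project",   # hypothetical project name
    version="0.1.0",
    packages=find_packages(),
    install_requires=[
        # Third-party libraries the script imports; these are installed
        # on the cluster together with the package itself.
        "pandas>=1.5",
        "requests>=2.28",
    ],
)
```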
by thushar (Contributor)
  • 3694 Views
  • 4 replies
  • 3 kudos

Resolved! Deploy a tar.gz package from private GitHub

We created a Python package (.tar.gz) and keep it in a private GitHub repo. We are able to connect to that repo (using a PAT) from an Azure Databricks notebook. Our requirement is to install that package from the .tar.gz file for that notebook: "pip install https://USE...

Latest Reply by Rahul_Samant (Contributor)

For installing the package using pip you need to package the repo using setup.py. Check this link for more details: https://packaging.python.org/en/latest/tutorials/packaging-projects/. Alternatively, you can pass the tar.gz using --py-files while submi...

  • 3 kudos
3 More Replies
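
As a complement to the reply, here is a minimal sketch of installing a setup.py-packaged private GitHub repo from a Databricks Python notebook. The secret scope, key, org, and repo names are placeholders; `dbutils` is the object Databricks predefines in notebooks.

```python
# Sketch: install a private, setup.py-packaged repo from a Databricks
# Python notebook. Scope/key, org, and repo names are placeholders.
import subprocess
import sys

# Pull the GitHub PAT from a secret scope rather than hard-coding it.
# (`dbutils` is predefined in Databricks notebooks.)
pat = dbutils.secrets.get(scope="ci-secrets", key="github-pat")  # noqa: F821

# pip can install straight from git, building the package via its setup.py.
url = f"git+https://{pat}@github.com/my-org/my-package.git"
subprocess.check_call([sys.executable, "-m", "pip", "install", url])

# Notebook-magic equivalent (in its own cell):
#   %pip install git+https://<PAT>@github.com/my-org/my-package.git
```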
by sunil_smile (Contributor)
  • 6337 Views
  • 5 replies
  • 6 kudos

Apart from notebooks, is it possible to deploy an application (PySpark or R+Spark) as a package or file and execute it in Databricks?

Hi, with the help of databricks-connect I was able to connect the cluster to my local IDE (PyCharm and the RStudio desktop version), develop the application, and commit the code to Git. When I try to add that repo to the Databricks workspac...

Latest Reply by Atanu (Databricks Employee)

Maybe you will be interested in our Databricks Connect. Not sure if that resolves your issue of connecting with a 3rd-party tool and setting up your supported IDE: https://docs.databricks.com/dev-tools/databricks-connect.html

  • 6 kudos
4 More Replies
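
For reference, here is a minimal sketch of the legacy databricks-connect flow the reply links to. It assumes `pip install databricks-connect` and `databricks-connect configure` (workspace URL, token, cluster ID) have already been run locally; newer runtimes use a different Databricks Connect API, so treat this as illustrative.

```python
# Sketch of the legacy databricks-connect flow: Spark code written in a
# local IDE executes on the remote Databricks cluster configured earlier.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # session routed to Databricks

df = spark.range(10)   # the computation runs on the remote cluster
print(df.count())      # prints 10
```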
by User16826992666 (Valued Contributor)
  • 1212 Views
  • 1 reply
  • 0 kudos

Resolved! What is the point of the model staging and promotion functions in MLflow?

Why not just directly deploy the model where you need it in production?

Latest Reply by sean_owen (Databricks Employee)

The Model Registry is mostly a workflow tool. It helps 'gate' the process, so that (for example) only authorized users can set a model to be the newest Production version of a model - that's not something just anyone should be able to do! The Registry...

  • 0 kudos
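
A minimal sketch of the gating workflow described above, using the MLflow client API. The model name and version are placeholders, and newer MLflow versions favor aliases over stages:

```python
# Sketch: promotion is an explicit, permissioned registry action, and
# consumers load by stage. Model name and version are placeholders.
import mlflow.pyfunc
from mlflow.tracking import MlflowClient

client = MlflowClient()

# Only users with registry permissions can promote a version to Production.
client.transition_model_version_stage(
    name="churn-model",   # placeholder registered-model name
    version="3",          # placeholder version
    stage="Production",
)

# Consumers reference the stage, so a promotion swaps the model they get
# without any code or deployment change on their side.
model = mlflow.pyfunc.load_model("models:/churn-model/Production")
```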