Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

naga_databricks
by Contributor
  • 3047 Views
  • 2 replies
  • 1 kudos

Using init scripts with dbx

I specify init scripts in my deployment.conf, as below: basic-static-cluster: &basic-static-cluster new_cluster: spark_version: "13.0.x-scala2.12" num_workers: 1 node_type_id: "n2-highmem-2" init_scripts: - worksp...
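
A minimal sketch of the cluster block described above, reconstructed from the truncated snippet and assuming a workspace-file init script in the Jobs API 2.1 `init_scripts` format; the destination path is illustrative:

```yaml
# conf/deployment.yml (sketch) -- the init script path is illustrative
basic-static-cluster: &basic-static-cluster
  new_cluster:
    spark_version: "13.0.x-scala2.12"
    num_workers: 1
    node_type_id: "n2-highmem-2"
    init_scripts:
      - workspace:
          destination: "/Shared/init-scripts/setup.sh"
```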

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Naga Vaibhav Elluru, we haven't heard from you since the last response from @Debayan Mukherjee, and I was checking back to see if their suggestions helped you. Otherwise, if you have any solution, please share it with the community, as it can be he...

1 More Replies
Dave_B_
by New Contributor III
  • 1839 Views
  • 2 replies
  • 0 kudos

DBX injected V-Net and Deployment

Due to the need for Azure storage private endpoints, we switched our Databricks deployment to use an injected VNet. Now, when our deployment pipeline tries to re-create the workspace (e.g. az databricks workspace delete), it seems to leave the MS cre...
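
For orientation on the re-creation step mentioned above, a rough sketch of the delete/re-create flow with VNet injection; the resource names are hypothetical and the flags should be verified against your azure-cli `databricks` extension (for example via `az databricks workspace create --help`):

```sh
# Sketch only -- resource names are illustrative
az databricks workspace delete --resource-group my-rg --name my-workspace

az databricks workspace create \
  --resource-group my-rg \
  --name my-workspace \
  --location westeurope \
  --sku premium \
  --vnet my-injected-vnet \
  --public-subnet my-public-subnet \
  --private-subnet my-private-subnet
```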

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @David Benedict, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answer...

1 More Replies
Arunsundar
by New Contributor III
  • 2402 Views
  • 4 replies
  • 3 kudos

Automating the initial configuration of dbx

Hi Team, good morning. As of now, for the deployment of our code to Databricks, dbx is configured by providing parameters such as the cloud provider, git provider, etc. Say I have a code repository in any one of the git providers. Can this process of co...
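
On scripting the initial dbx setup in general, a hedged sketch of a non-interactive bootstrap; the template, project, and profile names are illustrative and should be checked against the dbx documentation for your version:

```sh
# Sketch only -- template, project and profile names are illustrative
# Scaffold a new project from the built-in Python template without prompts
dbx init --template=python_basic -p "project_name=my_project" --no-input

# Bind the project's default environment to an existing Databricks CLI profile
dbx configure --environment default --profile my-databricks-profile
```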

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Arunsundar Muthumanickam, hope all is well! Just wanted to check in to see if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear fr...

3 More Replies
agagrins
by New Contributor III
  • 2382 Views
  • 3 replies
  • 2 kudos

How to speed up `dbx launch --from-assets`

Hiya, I'm trying to follow the testing workflow of
```
$ dbx deploy test --assets-only
$ dbx launch test --from-assets --trace --include-output stdout
```
But I find the turnaround time is quite long, even with an instance pool. The `deployment.yaml` looks ...
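
Not a confirmed fix for this thread, but the usual lever for launch latency is pointing both the workers and the driver at a warm instance pool in the workflow's cluster spec; a sketch assuming the dbx 0.7+ multi-task deployment format, with illustrative ids and names:

```yaml
# conf/deployment.yml (sketch) -- pool id, package and entry point are illustrative
environments:
  default:
    workflows:
      - name: "test"
        tasks:
          - task_key: "main"
            new_cluster:
              spark_version: "13.0.x-scala2.12"
              num_workers: 1
              instance_pool_id: "1234-567890-pool1"        # workers come from the warm pool
              driver_instance_pool_id: "1234-567890-pool1" # the driver too, otherwise it still cold-starts
            python_wheel_task:
              package_name: "my_package"
              entry_point: "entrypoint"
```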

Latest Reply
tonkol
New Contributor II
  • 2 kudos

Hi, I have no solution; actually, I've just registered to open a very similar ticket when I saw yours. According to my experiments, getting an already running VM from the pool (time between the events CREATING and INIT_SCRIPTS_STARTED) can take anything betw...

2 More Replies
sasidhar
by New Contributor II
  • 8300 Views
  • 4 replies
  • 8 kudos

Custom Python module not found while using dbx in PyCharm

I am new to Databricks and PySpark. I am building a PySpark application using the PyCharm IDE. I have tested the code locally and wanted to run it on a Databricks cluster from the IDE itself. Following the dbx documentation, I am able to run a single Python file succes...
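
One frequent cause of this symptom (not confirmed from the truncated post) is that the custom module sits outside the package that dbx builds into a wheel and installs on the cluster; a minimal packaging sketch with hypothetical names:

```python
# setup.py (sketch) -- project and package names are hypothetical
from setuptools import find_packages, setup

setup(
    name="my_project",
    version="0.1.0",
    # Custom modules must live inside a package directory picked up here;
    # otherwise they exist only on the local machine, not on the cluster.
    packages=find_packages(exclude=["tests", "tests.*"]),
)
```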

Latest Reply
Meghala
Valued Contributor II
  • 8 kudos

I got this error as well.

3 More Replies
aka1
by New Contributor II
  • 2509 Views
  • 1 reply
  • 3 kudos

dbx - run unit test error (java.lang.NoSuchMethodError)

I am setting up dbx for the first time on Windows 10, strictly following https://dbx.readthedocs.io/en/latest/guides/python/python_quickstart/. OpenJDK is installed (conda install -c conda-forge openjdk=11.0.15), winutils.exe for Hadoop 3 is downloaded, pat...
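
For reference, the Windows prerequisite from that quickstart boils down to an environment-variable setup along these lines (paths are illustrative); note that a java.lang.NoSuchMethodError usually points to a library or JVM version mismatch rather than a missing winutils.exe:

```bat
:: Sketch only -- adjust the paths to wherever winutils.exe was placed
setx HADOOP_HOME "C:\hadoop"
setx PATH "%PATH%;C:\hadoop\bin"
```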

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 3 kudos

This seems to be a code issue only.

mickniz
by Contributor
  • 2498 Views
  • 3 replies
  • 3 kudos

dbx installation for local development on VS Code

Hi folks, since Databricks is now asking to use dbx instead of databricks-connect, we are trying to set up our local environment following the guide "dbx by Databricks Labs - Azure Databricks | Microsoft Learn". We have created conf/deployment.yml and dbx/pro...

Latest Reply
mickniz
Contributor
  • 3 kudos

I fixed this issue, but I am getting another issue while syncing the local repo with the Workspace in the Databricks UI. When I run the command dbx sync repo -d workspace name --source, the command runs fine. I can see it under dbfs but not under Workspace on the Databricks page. An...
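
For anyone comparing notes, the sync command discussed here typically takes this shape; the repo name and source directory are illustrative, and the destination is a Databricks Repo under /Repos rather than a plain Workspace folder:

```sh
# Sketch only -- repo name and source directory are illustrative
dbx sync repo -d my-dev-repo --source .
```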

2 More Replies
dslin
by New Contributor III
  • 2703 Views
  • 4 replies
  • 0 kudos

FileNotFoundError when running dbx execute

Hi, I'm very new to Databricks, so this might be a basic question. I can't find a way to run my local Python file with Databricks successfully. When I run the following `execute` command, I get a FileNotFoundError: `dbx execute --cluster-id=*** --job=Sampl...
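
For orientation, a hedged sketch of the deployment entry that `dbx execute` resolves the job from; in dbx deployment files, local files are referenced with a `file://` prefix relative to the project root, and a mismatch there is a common source of FileNotFoundError (all names below are illustrative):

```yaml
# Fragment of conf/deployment.yml (sketch) -- job and file names are illustrative
- name: "SampleJob"
  spark_python_task:
    python_file: "file://my_project/entrypoint.py"  # resolved relative to the project root
```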

Latest Reply
Vidula
Honored Contributor
  • 0 kudos

Hi @Di Lin, thanks for the quick response. Regards

3 More Replies
isaac_gritz
by Databricks Employee
  • 2092 Views
  • 0 replies
  • 4 kudos

CI/CD Best Practices

Best Practices for CI/CD on Databricks: For CI/CD and software engineering best practices with Databricks notebooks, we recommend checking out this best practices guide (AWS, Azure, GCP). For CI/CD and local development using an IDE, we recommend dbx, a ...

sage5616
by Valued Contributor
  • 6478 Views
  • 2 replies
  • 3 kudos

Resolved! Running local python code with arguments in Databricks via dbx utility.

I am trying to execute a local PySpark script on a Databricks cluster via the dbx utility to test how passing arguments to Python works in Databricks when developing locally. However, the test arguments I am passing are not being read for some reason. Co...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 3 kudos

You can pass parameters using dbx launch --parameters. If you want to define them in the deployment template, please try to follow exactly the Databricks API 2.1 schema: https://docs.databricks.com/dev-tools/api/latest/jobs.html#operation/JobsCreate (for examp...
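
To make the reply above concrete, a hedged sketch of parameters defined in the deployment template following the Jobs API 2.1 task schema; the workflow, file, and argument names are illustrative, and `dbx launch --parameters` can override them at launch time:

```yaml
# conf/deployment.yml (sketch) -- names and arguments are illustrative
environments:
  default:
    workflows:
      - name: "sample-workflow"
        tasks:
          - task_key: "main"
            new_cluster:
              spark_version: "13.0.x-scala2.12"
              num_workers: 1
              node_type_id: "n2-highmem-2"
            spark_python_task:
              python_file: "file://my_project/entrypoint.py"
              parameters: ["--env", "test", "--run-date", "2023-01-01"]
```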

1 More Replies