- 3419 Views
- 2 replies
- 1 kudos
I specify init scripts in my deployment.conf, as below:

```
basic-static-cluster: &basic-static-cluster
  new_cluster:
    spark_version: "13.0.x-scala2.12"
    num_workers: 1
    node_type_id: "n2-highmem-2"
    init_scripts:
      - worksp...
```
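For context, a complete `init_scripts` entry under the Jobs API 2.1 schema that dbx follows might look like the sketch below; the destination path is a hypothetical placeholder, not the poster's actual value.

```
basic-static-cluster: &basic-static-cluster
  new_cluster:
    spark_version: "13.0.x-scala2.12"
    num_workers: 1
    node_type_id: "n2-highmem-2"
    init_scripts:
      # hypothetical workspace-file source; substitute your own script path
      - workspace:
          destination: "/Shared/init-scripts/setup.sh"
```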
Latest Reply
Hi @Naga Vaibhav Elluru, we haven't heard from you since the last response from @Debayan Mukherjee, and I was checking back to see if their suggestions helped you. Otherwise, if you have a solution, please share it with the community, as it can be he...
1 More Replies
- 2287 Views
- 2 replies
- 0 kudos
Due to the need for Azure storage private endpoints, we switched our Databricks deployment to use an injected VNet. Now, when our deployment pipeline tries to re-create the workspace (e.g. az databricks workspace delete), it seems to leave the MS cre...
Latest Reply
Hi @David Benedict, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answer...
1 More Replies
- 2703 Views
- 4 replies
- 3 kudos
Hi Team, good morning. As of now, for the deployment of our code to Databricks, dbx is configured by providing parameters such as the cloud provider, git provider, etc. Say I have a code repository with any one of the git providers. Can this process of co...
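For illustration, if the goal is jobs that run code straight from a git provider, the Jobs API 2.1 `git_source` block is one route; a minimal sketch in a dbx deployment file might look like this (the repo URL, branch, and paths are all placeholders, and this assumes your dbx version passes the 2.1 payload through).

```
# hypothetical deployment.yml fragment; all names and URLs are placeholders
environments:
  default:
    workflows:
      - name: "repo-backed-job"
        git_source:
          git_url: "https://github.com/example-org/example-repo"
          git_provider: "gitHub"
          git_branch: "main"
        tasks:
          - task_key: "main"
            notebook_task:
              notebook_path: "notebooks/entrypoint"  # path relative to the repo root
```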
Latest Reply
Hi @Arunsundar Muthumanickam, hope all is well! Just wanted to check in to see if you were able to resolve your issue, and if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear fr...
3 More Replies
- 2760 Views
- 3 replies
- 2 kudos
Hiya, I'm trying to follow the testing workflow of

```
$ dbx deploy test --assets-only
$ dbx launch test --from-assets --trace --include-output stdout
```

But I find the turnaround time is quite long, even with an instance pool. The `deployment.yaml` looks ...
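For reference, pinning the job cluster to a pool in `deployment.yaml` might look like the sketch below (the pool ID is a placeholder). Note that even with a pooled VM, cluster start still runs Spark initialization and any init scripts, which matches the timings discussed in the reply.

```
# hypothetical cluster block; the pool ID is a placeholder
new_cluster:
  spark_version: "13.0.x-scala2.12"
  num_workers: 1
  instance_pool_id: "0000-000000-pool000000"  # replaces node_type_id when using a pool
```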
Latest Reply
Hi, I have no solution; actually, I had just registered to open a very similar ticket when I saw yours. According to my experiments, getting an already-running VM from the pool (time between the CREATING and INIT_SCRIPTS_STARTED events) can take anything betw...
2 More Replies
- 9302 Views
- 4 replies
- 8 kudos
I am new to Databricks and PySpark and am building a PySpark application using the PyCharm IDE. I have tested the code locally and want to run it on a Databricks cluster from the IDE itself. Following the dbx documentation, I am able to run a single Python file succes...
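For context, the usual dbx pattern for a single local Python file is a `spark_python_task` whose `python_file` uses the `file://` prefix so dbx uploads it on deploy; a minimal sketch, with all names and paths as placeholders:

```
# hypothetical single-task workflow; names and paths are placeholders
environments:
  default:
    workflows:
      - name: "sample-etl"
        spark_python_task:
          python_file: "file://jobs/sample_etl.py"  # local file, uploaded by dbx
```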
by aka1 • New Contributor II
- 2913 Views
- 1 reply
- 3 kudos
Latest Reply
This seems to be a code issue only.
- 2934 Views
- 3 replies
- 3 kudos
Hi folks, since Databricks is now asking us to use dbx instead of databricks-connect, we are trying to set up our local environment following the guide dbx by Databricks Labs - Azure Databricks | Microsoft Learn. We have created conf/deployment.yml and dbx/pro...
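For reference, a skeletal `conf/deployment.yml` along the lines of the guide might look like the sketch below; the workflow name and notebook path are placeholders, and the environment name must match one defined in the project settings file.

```
# skeletal hypothetical conf/deployment.yml
environments:
  default:
    workflows:
      - name: "sample-workflow"
        tasks:
          - task_key: "main"
            notebook_task:
              notebook_path: "/Repos/user@example.com/project/notebooks/main"  # placeholder
```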
Latest Reply
I fixed this issue, but I am getting another issue while syncing the local repo with the Workspace in the Databricks UI. When I run the command `dbx sync repo -d workspace name --source`, the command runs fine; I can see the files under DBFS but not under Workspace on the Databricks page. An...
2 More Replies
by dslin • New Contributor III
- 3054 Views
- 4 replies
- 0 kudos
Hi, I'm very new to Databricks, so this might be a basic question. I can't find a way to run my local Python file with Databricks successfully. When I run the following `execute` command, I get a FileNotFoundError: `dbx execute --cluster-id=*** --job=Sampl...
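One common cause of a FileNotFoundError here (an assumption, since the command is truncated) is that the job's `python_file` does not resolve; in the deployment file, local files need the `file://` prefix and a path relative to the project root. A hypothetical job definition:

```
# hypothetical job definition; the job name and path are placeholders
environments:
  default:
    workflows:
      - name: "SampleJob"
        spark_python_task:
          python_file: "file://sample_project/main.py"  # must exist relative to the project root
```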
- 2332 Views
- 0 replies
- 4 kudos
Best Practices for CI/CD on Databricks: For CI/CD and software engineering best practices with Databricks notebooks, we recommend checking out this best practices guide (AWS, Azure, GCP). For CI/CD and local development using an IDE, we recommend dbx, a ...
- 7453 Views
- 2 replies
- 3 kudos
I am trying to execute a local PySpark script on a Databricks cluster via the dbx utility, to test how passing arguments to Python works in Databricks when developing locally. However, the test arguments I am passing are not being read for some reason. Co...
Latest Reply
You can pass parameters using `dbx launch --parameters`. If you want to define them in the deployment template, please try to follow the Databricks API 2.1 schema exactly: https://docs.databricks.com/dev-tools/api/latest/jobs.html#operation/JobsCreate (for examp...
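For illustration, a `spark_python_task` carrying parameters in the deployment template, following the Jobs API 2.1 schema the reply points to (names, paths, and values are placeholders); the arguments arrive in the script via `sys.argv`.

```
# hypothetical workflow with static parameters
environments:
  default:
    workflows:
      - name: "parametrized-job"
        spark_python_task:
          python_file: "file://jobs/entrypoint.py"
          parameters: ["--input", "/tmp/in", "--output", "/tmp/out"]
```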
1 More Replies