Data Engineering

Forum Posts

Sulfikkar
by Contributor
  • 2725 Views
  • 8 replies
  • 4 kudos

Cluster Scoped init script through pulumi

I am trying to run a cluster-scoped init script through Pulumi. I have referred to this documentation: https://learn.microsoft.com/en-us/azure/databricks/clusters/configure#spark-configuration However, the documentation is not very clear. I ...

Latest Reply
Vivian_Wilfred
Honored Contributor
  • 4 kudos

Hi @Sulfikkar Basheer Shylaja​, why don't you store the init script on DBFS and just pass the dbfs:/ path of the init script in Pulumi? You could just run this code in a notebook: %python dbutils.fs.put("/databricks/init-scripts/set-private-pip-repos...

  • 4 kudos
7 More Replies
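The approach suggested in the reply can be sketched as follows. This is a minimal, hypothetical example: the script contents, the DBFS path, and the private index URL are placeholders, and `dbutils` only exists inside a Databricks notebook, so the script body is shown here as a plain string.

```python
# Hypothetical sketch: an init script that points pip at a private index,
# to be uploaded to DBFS and then referenced by path from Pulumi.
# All names and URLs below are placeholders, not values from the thread.
init_script = """#!/bin/bash
# Configure a private pip repository for every node in the cluster.
cat > /etc/pip.conf <<EOF
[global]
index-url = https://pypi.example.com/simple
EOF
"""

dbfs_path = "dbfs:/databricks/init-scripts/set-private-pip-repos.sh"

# Step 1 (in a Databricks notebook):
#   dbutils.fs.put("/databricks/init-scripts/set-private-pip-repos.sh",
#                  init_script, overwrite=True)
#
# Step 2 (in the Pulumi cluster definition), reference the same path, e.g.:
#   init_scripts=[{"dbfs": {"destination": dbfs_path}}]
```

The key point from the reply is that Pulumi then only needs the `dbfs:/` destination string, decoupling the script's contents from the infrastructure code.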
Jonas89
by New Contributor
  • 1437 Views
  • 2 replies
  • 0 kudos

Databricks Devops Release Pipeline Abort

We've built a release pipeline for our Databricks workspaces using the VNET template. It works end-to-end, but intermittent aborts occur when the workspace is recreated. For example, on the 4th of April (Monday) we recreated the workspaces and no abor...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Jonas Oliveira de Souza​, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that be...

  • 0 kudos
1 More Replies
Chris_Shehu
by Valued Contributor III
  • 1981 Views
  • 1 reply
  • 5 kudos

Resolved! Getting errors while following Microsoft Databricks Best-Practices for DevOps Integration

I'm currently trying to follow the Software engineering best practices for notebooks - Azure Databricks guide, but I keep running into the following during step 4.5 (Run the test): ============================= test session starts =======================...

Latest Reply
Chris_Shehu
Valued Contributor III
  • 5 kudos

Closing the loop on this in case anyone gets stuck in the same situation. You can see in the images that transforms_test.py shows a different icon than testdata.csv. This is because it was saved as a Jupyter notebook, not a .py file. When the ...

  • 5 kudos
MadelynM
by New Contributor III
  • 550 Views
  • 0 replies
  • 1 kudos

vimeo.com

Repos let you use Git functionality such as cloning a remote repo, managing branches, pushing and pulling changes and visually comparing differences upon commit. Here's a quick video (3:56) on setting up a repo for Databricks on AWS. Pre-reqs: Git in...

SEOCO
by New Contributor II
  • 1640 Views
  • 3 replies
  • 3 kudos

Passing parameters from DevOps Pipeline/release to DataBricks Notebook

Hi, this is all a bit new to me. Does anybody have any idea how to pass a parameter to a Databricks notebook? I have a DevOps pipeline/release that moves my Databricks notebooks to the QA and production environments. The only problem I am facing is th...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

@Mario Walle​ - If @Hubert Dudek​'s answer solved the issue, would you be happy to mark his answer as best so that it will be more visible to other members?

  • 3 kudos
2 More Replies
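One common way to pass parameters from a release pipeline into a notebook is to trigger it as a job run through the Databricks Jobs REST API, using `notebook_params`, and read the values inside the notebook with widgets. A minimal sketch; the job ID, parameter names, and values below are placeholders, not details from the thread:

```python
import json

# Hypothetical sketch: the request body the pipeline would POST to
# /api/2.1/jobs/run-now (with a bearer token) to start the notebook job.
payload = {
    "job_id": 123,  # placeholder job ID
    "notebook_params": {"environment": "QA"},  # placeholder parameter
}
body = json.dumps(payload)

# Inside the notebook, the same parameter is read via a widget:
#   env = dbutils.widgets.get("environment")
```

The notebook stays unchanged across environments; only the `notebook_params` the pipeline sends differ between the QA and production stages.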
User16137833804
by New Contributor III
  • 822 Views
  • 1 reply
  • 1 kudos
Latest Reply
sajith_appukutt
Honored Contributor II
  • 1 kudos

You could have the single-node cluster where the proxy is installed monitored by a tool such as CloudWatch, Azure Monitor, or Datadog, and configure it to send alerts on node failure.

  • 1 kudos
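Whichever monitoring tool is used, the underlying check is a liveness probe against the proxy endpoint. A minimal sketch of such a probe that a scheduler or monitoring agent could run; the host and port are placeholders:

```python
import socket

def proxy_is_alive(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to the proxy succeeds within `timeout`."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, unreachable host, or timeout: treat as down.
        return False
```

A failing probe is what the monitoring tool would translate into the node-failure alert mentioned in the reply.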
Labels