I'm wondering if you can help me with a Google auth issue related to Structured Streaming and long-running Databricks jobs in general. I get this error after running for 8+ hours. Any tips on this? GCP auth issues for long-running jobs? Caused by...
I have uploaded parquet files to Hive metastore tables, then performed some transformations on the data and generated some visualizations. All of this is done in a notebook. I have scheduled the notebook for every morning so that I get a refreshed view of d...
Hi @Sneha Mulrajani, does @Prabakar Ammeappin's response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!
I have set up a Spring Boot application which works as expected as a standalone Spring Boot app. When I build the JAR and try to set it up as a Databricks job, I am facing these issues. I am getting the same error locally as well. I have tried using maven-s...
I have created an Azure AD group of type "Microsoft 365" with its own email address, which is added to the notifications of a Databricks job (on failure). But no mail is sent to the Azure group mailbox when the job fails. I am able to send a d...
Hi @Md Tahseen Anam, just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best? If not, please tell us so we can help you. Thanks!
Hey there @Rajesh Kannan R, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from...
Hi, I have an Azure HBase cluster and Databricks. I want to run jobs on Databricks that write data to HBase. To connect to HBase I need to get hbase-site.xml and have it on the classpath or in the environment of a job. Question: How can I run the Databricks job with an...
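Not from the original thread, but one common pattern for getting a config file like hbase-site.xml onto a job cluster is a cluster-scoped init script that copies the file from DBFS into a local directory, which is then added to the classpath via Spark conf. A minimal sketch, assuming the file has already been uploaded to an example DBFS path (all paths below are placeholders):

```python
# Hypothetical sketch: write a cluster-scoped init script that copies
# hbase-site.xml (assumed to already be on DBFS) into a local directory
# on each node at cluster start.
dbutils.fs.put(
    "dbfs:/databricks/init/copy-hbase-conf.sh",
    """#!/bin/bash
mkdir -p /databricks/hbase-conf
cp /dbfs/FileStore/hbase/hbase-site.xml /databricks/hbase-conf/
""",
    overwrite=True,
)
# On the job cluster, reference the init script and put the directory on the
# classpath with Spark configuration, for example:
#   spark.driver.extraClassPath   /databricks/hbase-conf
#   spark.executor.extraClassPath /databricks/hbase-conf
```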
Hello, we have some Scala code which is compiled and published to an Azure DevOps Artifacts feed. The issue is that we're now trying to add this JAR to a Databricks job (through Terraform) to automate the creation. To do this I'm trying to authenticate using...
As of right now, Databricks can't use non-public Maven repositories, because resolution of the Maven coordinates happens in the control plane. That's different from the R and Python libraries. As a workaround you may try to install libraries via an init script or ...
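To sketch the init-script workaround mentioned above (not an official recipe): the script can fetch the JAR from the private feed at cluster start and drop it into /databricks/jars so it ends up on the cluster classpath. The feed URL, artifact coordinates, and the DEVOPS_PAT environment variable (assumed to be populated from a secret in the cluster spec) are all placeholders:

```python
# Hypothetical sketch: generate an init script that downloads a JAR from a
# private Azure DevOps Artifacts (Maven) feed into /databricks/jars.
dbutils.fs.put(
    "dbfs:/databricks/init/fetch-private-jar.sh",
    """#!/bin/bash
curl -sfL -u "user:${DEVOPS_PAT}" \\
  -o /databricks/jars/my-library.jar \\
  "https://pkgs.dev.azure.com/<org>/_packaging/<feed>/maven/v1/com/example/my-library/1.0.0/my-library-1.0.0.jar"
""",
    overwrite=True,
)
```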
Hey everyone! I'm close but can't seem to figure this out. I'm trying to add 2 notebooks to a Databricks Job. Instead of the first command in both notebooks being a connection to an RDS/Redshift cluster, I'd prefer to make that connection once and ha...
I have configured a Databricks job to send email alerts to me whenever my job fails. However, I would very much like to change the text in the alert email to something a little more bespoke. Is there any way to alter the text in the email, or even just t...
Hey there @Peter Mayers, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from yo...
Using the above cluster configuration, when I run the Databricks job in parallel with multiple requests at the same time, I get a mount/unmount issue. For example: when I make three requests to the Databricks job, it runs 3 jobs in parallel, but somet...
Hi @rahul upadhyay, we haven't heard from you on the last response from @Prabakar Ammeappin. If you have any solution, please share it with the community as it can be helpful to others. Otherwise, we will respond with more details and try to help.
This is part of the configuration of the task itself, so if no timeout is specified, it can theoretically run forever (e.g. a streaming use case). Please refer to the timeout section in the link below: https://docs.databricks.com/dev-tools/api/latest/jobs.html#ope...
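For illustration, a minimal sketch of setting a per-task timeout through the Jobs API (the timeout_seconds field from the linked docs); the workspace URL, token, job id, and notebook path are placeholders:

```python
# Hypothetical sketch: update an existing job so its notebook task fails if it
# runs longer than one hour (timeout_seconds = 3600).
import requests

payload = {
    "job_id": 123,  # placeholder job id
    "new_settings": {
        "tasks": [
            {
                "task_key": "main",
                "notebook_task": {"notebook_path": "/Repos/demo/etl"},
                "timeout_seconds": 3600,  # omitting this means no timeout
            }
        ]
    },
}
resp = requests.post(
    "https://<workspace-url>/api/2.1/jobs/update",
    headers={"Authorization": "Bearer <personal-access-token>"},
    json=payload,
)
resp.raise_for_status()
```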
Hi, I wondered if some of you have had this issue before and how it can be solved. In a Databricks job, we have a UBQ with a Painless script for ES. These are the options. Staging and prod have the same configuration, but staging is failing with the ...
I have a Databricks job on the E2 architecture in which I want to retrieve the workspace instance name within a notebook running in a job cluster context, so that I can use it further in my use case. While the call dbutils.notebook.entry_point.getDbutils(...
Found a workaround for the Azure Databricks question above: dbutils.notebook.getContext().apiUrl will return the regional URI, but this forwards to the workspace-specific one if the workspace id is specified with o=.
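A minimal Python version of that workaround, assuming the usual py4j accessors on the notebook context (the accessor names here are the commonly used ones, not something confirmed in the thread):

```python
# Read the regional API URL from the notebook context and append the workspace
# (org) id with o= so it forwards to the workspace-specific URL.
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
api_url = ctx.apiUrl().get()          # regional URI, e.g. https://<region>.azuredatabricks.net
org_id = ctx.tags().apply("orgId")    # workspace id taken from the context tags
workspace_url = f"{api_url}/?o={org_id}"
print(workspace_url)
```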
I created a Databricks job with multiple tasks. Is there a way to pass variable values from one task to another? For example, if I have tasks A and B as Databricks notebooks, can I create a variable (e.g. x) in notebook A and later use that value in ...
You could also consider using an orchestration tool like Data Factory (Azure) or Glue (AWS); there you can inject and use parameters from notebooks. The job scheduling in Databricks also has the possibility to add parameters, but I do not know if yo...
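To illustrate the parameters route mentioned above (note that this passes a value into each task, rather than from one task to another): the job or run supplies base parameters, and each notebook reads them through a widget. The parameter name run_date is just an example, not something from the original question:

```python
# Minimal sketch: a notebook task receives a job parameter via a widget.
dbutils.widgets.text("run_date", "")          # default used when run interactively
run_date = dbutils.widgets.get("run_date")    # value injected by the job run
print(f"Processing data for {run_date}")
```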