- 4907 Views
- 2 replies
- 3 kudos
A user is running a job in Databricks triggered from ADF. The job needs custom libraries that are packaged as JARs. Most of the time the jobs run fine, but sometimes one fails with: java.lang.NoClassDefFoundError: Could not initialize... Any s...
Latest Reply
Can you please check whether more than one JAR contains this class? If multiple JARs of the same type are available on the cluster, there is no guarantee the JVM will pick the proper classes for processing, which results in the intermittent...
1 More Replies
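As a quick way to test the duplicate-JAR theory from a notebook, a sketch like the following can scan the cluster's JAR directory; the /databricks/jars path and the class name are assumptions:

```python
import glob
import zipfile

class_file = "com/example/MyClass.class"  # hypothetical class to look for

matches = []
for jar_path in glob.glob("/databricks/jars/*.jar"):
    try:
        with zipfile.ZipFile(jar_path) as jar:
            if class_file in jar.namelist():
                matches.append(jar_path)
    except zipfile.BadZipFile:
        pass  # skip unreadable archives

print(f"{class_file} found in {len(matches)} jar(s): {matches}")
```

If more than one JAR shows up, classpath ordering decides which copy the JVM loads, which matches the intermittent failures described above.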
- 4506 Views
- 3 replies
- 0 kudos
I would like to set permissions on jobs, such as granting "CAN_VIEW" or "CAN_MANAGE" to specific groups, for jobs that run from ADF. It appears that we need to set permissions in the pipeline from which the job runs in ADF, but I could not figure it out.
Latest Reply
Thank you @Debayan Mukherjee and @Vidula Khanna for getting back to me, but it didn't help my case. I am specifically looking to set permissions on the job so that our team can see the job cluster, including the Spark UI, with that privilege. ...
2 More Replies
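Since ADF itself cannot grant Databricks job permissions, one option is a one-time call to the Databricks Permissions REST API. A minimal sketch; the workspace URL, token, job ID, and group name are all placeholders:

```python
import requests

host = "https://<workspace-instance>.azuredatabricks.net"  # placeholder
token = "<personal-access-token>"                          # placeholder
job_id = "123"                                             # placeholder

resp = requests.patch(
    f"{host}/api/2.0/permissions/jobs/{job_id}",
    headers={"Authorization": f"Bearer {token}"},
    json={"access_control_list": [
        {"group_name": "my-team", "permission_level": "CAN_VIEW"}
    ]},
)
resp.raise_for_status()
print(resp.json())
```

With CAN_VIEW on the job, group members should be able to open the job's runs, including the job cluster's Spark UI.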
- 9398 Views
- 2 replies
- 3 kudos
Hi Team, I am using a job cluster when setting up the Linked Service in ADF to call the Databricks Notebook activity. Cluster details: Policy - Unrestricted; Access Mode - Single user; Unity Catalog enabled; Databricks runtime - 12.2 LTS (includes Apache Spark 3.3.2...
Latest Reply
Hi @Akshay Patni, we haven't heard from you since the last response from @Debayan Mukherjee. Kindly share the information with us, and in return we will provide you with the necessary solution. Thanks and regards
1 More Replies
- 1100 Views
- 0 replies
- 0 kudos
In my particular use case, the creation of the mount is initiated through a notebook activity in Azure Data Factory (ADF). This activity utilizes a job cluster for the current execution. However, it has come to my attention that the mounts generated ...
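For context, DBFS mounts are workspace-level objects, so a mount created from a job cluster should generally be visible to other clusters as well; guarding the creation also makes the notebook safe to re-run. A minimal sketch with placeholder paths (dbutils is only defined inside a Databricks notebook):

```python
# Idempotent mount creation; source path and auth configs are placeholders.
mount_point = "/mnt/mydata"  # placeholder

if not any(m.mountPoint == mount_point for m in dbutils.fs.mounts()):
    dbutils.fs.mount(
        source="abfss://<container>@<account>.dfs.core.windows.net/",  # placeholder
        mount_point=mount_point,
        extra_configs={},  # auth settings (e.g. OAuth configs) go here
    )
```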
by KVNARK • Honored Contributor II
- 3744 Views
- 2 replies
- 1 kudos
The Notebook activity in an ADF pipeline is timing out after a certain running time (5 hours) with a timeout error. It simply times out. The problem is that this will process TBs of data daily. Does anyone have any idea how to fix this?
by Jkb • New Contributor II
- 3812 Views
- 0 replies
- 1 kudos
- 1419 Views
- 2 replies
- 0 kudos
Hi, I want to pass the runtime date from ADF as @utcnow() (a base parameter of the notebook activity in ADF), read it in ADB using widgets as runtime_date, and then have that column added to my table X, populated with the value from the widget. Eve...
Latest Reply
sher • Valued Contributor II
You can use current_timestamp() or now(). Refer to this link: https://docs.databricks.com/sql/language-manual/functions/current_timestamp.html
1 More Replies
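Combining the reply with the widget flow from the question, a minimal sketch (runnable only inside a Databricks notebook; table name X comes from the question):

```python
from pyspark.sql.functions import current_timestamp, lit

# Widget filled by the ADF base parameter (@utcnow())
dbutils.widgets.text("runtime_date", "")
runtime_date = dbutils.widgets.get("runtime_date")

# Option A (per the reply): stamp rows with the cluster clock
df = spark.table("X").withColumn("runtime_date", current_timestamp())

# Option B: carry the exact ADF value through (stored as a string here)
df = spark.table("X").withColumn("runtime_date", lit(runtime_date))
```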
by KVNARK • Honored Contributor II
- 3664 Views
- 4 replies
- 6 kudos
How can we parameterize the key of the spark-config in the job cluster linked service from Azure Data Factory? We can parameterize the values, but any idea how we can parameterize the key so that when deploying to another environment it takes the PROD/QA v...
Latest Reply
@KVNARK, you can use Databricks Secrets (create a secret scope from AKV: https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secret-scopes) and then reference the secret in the Spark configuration (https://learn.microsoft.com/en-us/azure/d...
3 More Replies
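Putting the two links together, a short sketch of both halves; the scope and key names are placeholders:

```python
# In a Spark config (cluster or ADF linked service), a secret is referenced as:
#   spark.my.password {{secrets/my-scope/my-key}}

# In notebook code, the same secret is read with:
value = dbutils.secrets.get(scope="my-scope", key="my-key")
```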
- 3833 Views
- 4 replies
- 4 kudos
How do I run a Databricks notebook through ADF?
Latest Reply
Hi @Punit Chauhan, you can use the Databricks Notebook activity in ADF to trigger your Databricks notebook via ADF.
3 More Replies
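For illustration, a minimal Databricks Notebook activity in an ADF pipeline definition might look roughly like this; the activity name, linked service name, notebook path, and parameter are placeholders:

```json
{
  "name": "RunDatabricksNotebook",
  "type": "DatabricksNotebook",
  "linkedServiceName": {
    "referenceName": "AzureDatabricksLinkedService",
    "type": "LinkedServiceReference"
  },
  "typeProperties": {
    "notebookPath": "/Shared/my_notebook",
    "baseParameters": { "runtime_date": "@utcnow()" }
  }
}
```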
by g96g • New Contributor III
- 6920 Views
- 8 replies
- 0 kudos
I have a project where I have to read data from NETSUITE using an API. The Databricks notebook runs perfectly when I manually insert the table names I want to read from the source. I have a dataset (CSV) file in ADF with all the table names that I need to r...
Latest Reply
Have you tried to debug the JSON payload of the ADF trigger? Maybe it wrongly conveys the table names.
7 More Replies
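One way to sanity-check the payload the reply mentions: pass the table list from ADF as a single JSON-encoded base parameter and print it before doing any work. A sketch with an assumed widget name and payload shape:

```python
import json

dbutils.widgets.text("table_names", "[]")
tables = json.loads(dbutils.widgets.get("table_names"))  # e.g. '["customers", "orders"]'
print(f"Received {len(tables)} table name(s): {tables}")

for table in tables:
    ...  # call the NetSuite API for each table here
```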
- 748 Views
- 0 replies
- 0 kudos
Hi everyone, my business requirement is to schedule a single job from the 1st to the 10th of the month at 12 AM, 3 AM, 12 PM, 4 PM, and 8 PM, and from the 10th to month end at 1 AM, 12 PM, 4 PM, and 8 PM. Right now we have created 2 schedules to meet the requirement and are using c...
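Since the question is truncated, here is one reading of the requirement as Databricks Jobs API cron objects; the exact day boundaries and timezone are assumptions. Quartz cron cannot vary the hour set by day-of-month range within a single expression, which is why two schedules (as the poster already set up) are needed:

```python
# Hypothetical sketch: the two schedules as Jobs API cron_schedule objects.
schedule_days_1_to_10 = {
    "quartz_cron_expression": "0 0 0,3,12,16,20 1-10 * ?",  # 12AM, 3AM, 12PM, 4PM, 8PM
    "timezone_id": "UTC",  # assumption
}
schedule_days_11_to_eom = {
    "quartz_cron_expression": "0 0 1,12,16,20 11-31 * ?",   # 1AM, 12PM, 4PM, 8PM
    "timezone_id": "UTC",
}
```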
- 8061 Views
- 4 replies
- 1 kudos
I have noticed that my orchestrated pipelines (in ADF) sometimes fail due to this error: ErrorCode=FailedToReadFromAzureDatabricksDeltaLake, Failed to read from Azure Databricks Delta Lake. Error message: Failed to send request to Azure Databricks Clu...
Latest Reply
Hi @Oscar Dyremyhr, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Tha...
3 More Replies
- 12972 Views
- 10 replies
- 10 kudos
Hi, I would like to capture custom notebook log exceptions (Python) from an ADF pipeline; based on the exceptions, the pipeline should succeed or fail. Is there any mechanism to implement this? In my testing, the ADF pipeline is successful irrespective of the log...
Latest Reply
Hi SailajaB, try this out. A notebook, once executed successfully, returns a long JSON-formatted output. We need to specify the appropriate nodes to fetch the output. In the screenshot below, we can see that when the notebook ran it returned empName & empCity as output...
9 More Replies
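A minimal sketch of the pattern the reply describes: the notebook exits with a JSON document, and ADF reads it back (for example via @activity('Notebook1').output.runOutput, with an If Condition deciding whether the pipeline succeeds; the activity name is a placeholder):

```python
import json

try:
    # ... notebook business logic ...
    result = {"status": "succeeded", "empName": "A", "empCity": "B"}  # example payload
except Exception as e:
    result = {"status": "failed", "error": str(e)}

# Returned to ADF as the activity's runOutput
dbutils.notebook.exit(json.dumps(result))
```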
- 2242 Views
- 2 replies
- 1 kudos
Hi Team, how do we check the existence of a table in an ADF container using a SQL query in Databricks? Thanks in advance.
Latest Reply
Hi, please elaborate on the issue so we can help you resolve it.
1 More Replies
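While the thread asks for clarification, the existence check itself is straightforward in Databricks. A sketch with a placeholder table name (tableExists is available in recent PySpark versions):

```python
# Programmatic check
exists = spark.catalog.tableExists("my_db.my_table")  # placeholder name
print(exists)
```

Or equivalently in SQL, SHOW TABLES IN my_db LIKE 'my_table' returns a non-empty result if the table exists.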
- 7448 Views
- 2 replies
- 3 kudos
I am trying to execute a local PySpark script on a Databricks cluster via the dbx utility to test how passing arguments to Python works in Databricks when developing locally. However, the test arguments I am passing are not being read for some reason. Co...
Latest Reply
You can pass parameters using dbx launch --parameters. If you want to define them in the deployment template, please try to follow exactly the Databricks API 2.1 schema: https://docs.databricks.com/dev-tools/api/latest/jobs.html#operation/JobsCreate (for examp...
1 More Replies
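On the receiving side, parameters passed this way land in the script's sys.argv, so standard argparse works. A sketch with illustrative flag names (parse_known_args tolerates any extra arguments a launcher may inject):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--env", default="dev")          # illustrative flag
parser.add_argument("--run-date", dest="run_date")   # illustrative flag
args, _ = parser.parse_known_args()

print(f"env={args.env}, run_date={args.run_date}")
```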