Data Engineering

Forum Posts

RajeshRK
by Contributor
  • 4932 Views
  • 6 replies
  • 0 kudos

Resolved! Need help to analyze databricks logs for a long-running job.

Hi Team, We have a job that completes in 3 minutes on one Databricks cluster, but the same job takes 3 hours to complete on another cluster. I am quite new to Databricks and need your guidance on how to find out where Databricks s...

Latest Reply
AmitKP
New Contributor II
  • 0 kudos

Hi @Kaniz, I am saving the logs of my Databricks job compute from ADF. How can I open the files present in the DBFS location?

5 More Replies
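For the follow-up question about opening log files saved to DBFS: one option is the legacy Databricks CLI. A minimal sketch that only builds the command, assuming `databricks-cli` (`pip install databricks-cli`) is installed and configured with a host and token; the log path shown is hypothetical:

```python
def dbfs_cat_cmd(dbfs_path: str) -> list[str]:
    """Build a legacy Databricks CLI command that prints a DBFS file to stdout."""
    return ["databricks", "fs", "cat", dbfs_path]

cmd = dbfs_cat_cmd("dbfs:/cluster-logs/driver/log4j-active.log")
print(" ".join(cmd))
# To actually run it: subprocess.run(cmd, check=True)
```

The same CLI also supports `databricks fs cp` for copying files locally when the log is too large to page through a terminal.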
Binesh
by New Contributor II
  • 1287 Views
  • 2 replies
  • 0 kudos

Databricks Logs some error messages while trying to read data using databricks-jdbc dependency

I have tried to read data from Databricks using the following java code.String TOKEN = "token..."; String url = "url...";   Properties properties = new Properties(); properties.setProperty("user", "token"); properties.setProperty("PWD", TOKEN);   Con...

Labels: Logger Errors
Latest Reply
shan_chandra
Honored Contributor III
  • 0 kudos

@Binesh J​ - The issue could be that the column's data type is not compatible with the getString() method at line 17. Use getObject() to retrieve the value as a generic object and then convert it to a string.

1 More Reply
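The accepted fix above is Java-side (getObject() and then convert). The same retrieve-then-convert idea, sketched in Python against a stubbed result set (the rows below are invented stand-ins for `cursor.fetchall()`):

```python
def cell_as_string(value) -> str:
    """Take whatever native type came back from the driver and stringify it
    afterwards, mirroring getObject() + String conversion in Java."""
    return "" if value is None else str(value)

rows = [(1, "alice", 3.14), (2, None, 2.5)]  # stand-in for cursor.fetchall()
as_text = [[cell_as_string(v) for v in row] for row in rows]
print(as_text)
```

The point of the pattern is that type coercion happens after retrieval, so a non-string column never hits a string-typed accessor.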
psps
by New Contributor III
  • 2171 Views
  • 3 replies
  • 4 kudos

Databricks Job run logs only shows prints/logs from driver and not executors

Hi, In the Databricks job run output, only logs from the driver are displayed. We have a function parallelized to run on executor nodes, and the logs/prints from that function are not displayed in the job run output. Is there a way to configure and show those logs i...

Latest Reply
psps
New Contributor III
  • 4 kudos

Thanks @Debayan Mukherjee​. That enables executor logging; however, the executor logs still do not appear in the Databricks job run output. Only driver logs are displayed.

2 More Replies
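One workaround consistent with the replies above: since the job run output only shows driver-side logs, have the parallelized function collect its messages and return them to the driver, which logs them. A self-contained sketch; with Spark, `process_partition` would be the function handed to `rdd.mapPartitions`, but here it is exercised on a plain list so the example runs anywhere:

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("job")

def process_partition(rows):
    """Runs on executors; collects messages instead of printing them."""
    msgs, out = [], []
    for r in rows:
        out.append(r * 2)
        msgs.append(f"processed {r}")
    return out, msgs

results, messages = process_partition([1, 2, 3])
for m in messages:  # executed on the driver, so it reaches the job run output
    logger.info(m)
```

This trades a little driver memory for visibility; for high-volume executor logging, delivering cluster logs to DBFS is the more scalable route.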
Dean_Lovelace
by New Contributor III
  • 1781 Views
  • 3 replies
  • 0 kudos

How to filter the Spark UI for a notebook

When running Spark under YARN, each script has its own self-contained set of logs. In Databricks, all I see is a list of jobs and stages that have been run on the cluster. From a support perspective this is a nightmare. How can notebook logs be grou...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Dean Lovelace​, Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers...

2 More Replies
jtorr
by New Contributor
  • 1059 Views
  • 1 reply
  • 0 kudos

What are executeAdhocQuery and executeFastQuery operations in the Azure SQL Logs?

Hi, I'm performing some analysis using the Databricks SQL logs and seeing these operation names. I notice these events don't seem to have a duration or query text, unlike commandSubmit operations. Any explanation of what these operations mean exactl...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Jose Torres​, executeAdhocQuery and executeFastQuery are two types of operations that can appear in the Azure SQL logs. executeAdhocQuery refers to the execution of an ad hoc query, which is a one-time query that is not stored as a prepared statem...

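The observation in the question (some operations carry no duration or query text) can be checked mechanically once the logs are in hand. A rough sketch with plain Python; the field names mirror those mentioned in the thread, but the sample records themselves are invented:

```python
import json

# Hypothetical log lines shaped like the fields discussed in the thread.
raw = """
{"operationName": "commandSubmit", "durationMs": 120, "queryText": "SELECT 1"}
{"operationName": "executeAdhocQuery"}
{"operationName": "executeFastQuery"}
""".strip().splitlines()

events = [json.loads(line) for line in raw]
missing = [e["operationName"] for e in events if "durationMs" not in e]
print(missing)  # operations observed without a duration field
```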
RajeshRK
by Contributor
  • 4551 Views
  • 7 replies
  • 3 kudos

Resolved! Download event, driver, and executor logs

Hi Team, I can see logs in the Databricks console by navigating Workflows -> job name -> logs. These logs are very generic, like stdout, stderr, and log4j-active.log. How can I download the event, driver, and executor logs at once for a job? Regards, Rajesh.

Latest Reply
RajeshRK
Contributor
  • 3 kudos

@Kaniz Fatma​ @John Lourdu​ @Vidula Khanna​ Hi Team, I managed to download the logs using the Databricks command line as follows: installed the Databricks CLI on my desktop (pip install databricks-cli), then configured the Databricks cluster URL and perso...

6 More Replies
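The accepted approach above can be sketched as the command the legacy CLI would accept. The `dbfs:/cluster-logs` prefix assumes that is where cluster log delivery was configured, and the cluster ID below is made up:

```python
def download_cluster_logs_cmd(cluster_id: str, dest: str) -> list[str]:
    """Legacy Databricks CLI command to recursively pull a cluster's
    delivered logs (event, driver, and executor) to a local folder."""
    return ["databricks", "fs", "cp", "-r",
            f"dbfs:/cluster-logs/{cluster_id}", dest]

cmd = download_cluster_logs_cmd("0123-456789-abc123", "./logs")
print(" ".join(cmd))
# To actually run it: subprocess.run(cmd, check=True)
```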
sagiatul
by New Contributor II
  • 1535 Views
  • 2 replies
  • 3 kudos

Databricks driver logs

I am running jobs on Databricks clusters. When the cluster is running, I can find the executor logs by going to the Spark cluster UI master dropdown, selecting a worker, and going through the stderr logs. However, once the job is finished and the cluste...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Atul Arora​ Thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking "Select As Best" if it does. Your feedback w...

1 More Reply
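The usual fix for logs disappearing with the cluster is to configure cluster log delivery, so driver and executor logs are copied to DBFS periodically and survive termination. A minimal sketch of the relevant fragment of a Clusters API payload, written as a Python dict; the Spark version, node type, and destination path are placeholder values:

```python
# Hypothetical new-cluster payload fragment for the Databricks Clusters API.
# With cluster_log_conf set, logs are delivered to the DBFS destination and
# remain readable after the cluster is terminated.
new_cluster = {
    "spark_version": "13.3.x-scala2.12",  # placeholder
    "node_type_id": "i3.xlarge",          # placeholder
    "num_workers": 2,
    "cluster_log_conf": {
        "dbfs": {"destination": "dbfs:/cluster-logs"}
    },
}
print(new_cluster["cluster_log_conf"])
```

The same setting is exposed in the cluster UI under Advanced Options -> Logging.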
Mado
by Valued Contributor II
  • 858 Views
  • 1 reply
  • 1 kudos

Resolved! How to query Databricks audit logs?

Hi, I would like to ask where the Databricks audit log files are stored on DBFS. And is there any way that I can query the log files? Thanks.

Latest Reply
Ajay-Pandey
Esteemed Contributor III
  • 1 kudos

Hi @Mohammad Saber​, I think you first need to configure audit logging in Databricks before you can use it. Please refer to the blog below, which will help you with this: Configure audit logging | Databricks on AWS

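Once audit log delivery is configured, each delivered file is a sequence of JSON records. A rough sketch of slicing them with plain Python; the two field names follow the documented audit-log schema, but the sample records themselves are invented:

```python
import json

sample = [
    '{"serviceName": "jobs", "actionName": "runNow"}',
    '{"serviceName": "clusters", "actionName": "start"}',
    '{"serviceName": "jobs", "actionName": "cancel"}',
]
events = [json.loads(s) for s in sample]
job_events = [e for e in events if e["serviceName"] == "jobs"]
print(len(job_events))
```

At scale you would point `spark.read.json` at the configured delivery path instead of parsing lines by hand.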
Murthy1
by Contributor II
  • 3128 Views
  • 5 replies
  • 4 kudos

Send custom logs to AWS cloudwatch from Notebook

I would like to send some custom logs (in Python) from my Databricks notebook to AWS CloudWatch. For example: df = spark.read.json(".......................") logger.info("Successfully ingested data from json") Has someone succeeded in doing this before...

Latest Reply
Debayan
Esteemed Contributor III
  • 4 kudos

Hi, you can integrate them; please refer to https://aws.amazon.com/blogs/mt/how-to-monitor-databricks-with-amazon-cloudwatch/. You can also configure audit logging to S3 and redirect it to CloudWatch from AWS; refer to: https://aws.amazon.com/blogs/mt/how...

4 More Replies
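A minimal, self-contained sketch of the pattern: route Python logging through a custom handler that buffers formatted records. In a real setup the handler's flush would call boto3's CloudWatch Logs `put_log_events`; the boto3 call is assumed and not shown here, so the sketch runs without AWS credentials:

```python
import logging

class BufferingHandler(logging.Handler):
    """Collects formatted records; a real CloudWatch handler would
    periodically ship self.buffer via boto3 logs.put_log_events."""
    def __init__(self):
        super().__init__()
        self.buffer = []

    def emit(self, record):
        self.buffer.append(self.format(record))

logger = logging.getLogger("ingest")
logger.setLevel(logging.INFO)
handler = BufferingHandler()
handler.setFormatter(logging.Formatter("%(levelname)s %(message)s"))
logger.addHandler(handler)

logger.info("Successfully ingested data from json")
print(handler.buffer[0])
```

Keeping the shipping logic inside a `logging.Handler` means notebook code only ever calls `logger.info(...)`, so the destination can change without touching the notebooks.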
Optum
by New Contributor III
  • 5298 Views
  • 10 replies
  • 4 kudos

Resolved! Databricks JDBC & Remote Write

Hello, I'm trying to write to a Delta table in my Databricks instance from a remote Spark session on a different cluster with the Simba Spark driver. I can do reads, but when I attempt a write, I get the following error: {  df.write.format("jdbc...

Latest Reply
Atanu
Esteemed Contributor
  • 4 kudos

Could you try setting the flag to ignore transactions? I’m not sure what the exact flag is, but there should be more details in the JDBC manual on how to do this

9 More Replies
vs_29
by New Contributor II
  • 1272 Views
  • 2 replies
  • 3 kudos

Custom Log4j logs are not being written to the DBFS storage.

I used a custom Log4j appender to write custom logs through the init script, and I can see the custom log file in the driver logs, but Databricks is not writing those custom logs to DBFS. I have configured the logging destination in the Advanced sec...

Labels: init script, driver logs, logs destination
Latest Reply
Kaniz
Community Manager
  • 3 kudos

Hi @VIjeet Sharma​, We haven't heard from you since the last response from @Debayan Mukherjee​, and I was checking back to see if his suggestions helped you. Otherwise, if you have found a solution, please share it with the community, as it can be helpful to...

1 More Reply
User16869510359
by Esteemed Contributor
  • 975 Views
  • 1 reply
  • 1 kudos

Resolved! Cluster logs missing

On the Databricks cluster UI, when I click on the driver logs, sometimes I see historic logs and sometimes I see logs only for the last few hours. Why do we see this inconsistency?

Latest Reply
User16869510359
Esteemed Contributor
  • 1 kudos

This is working per design and is the expected behavior. When the cluster is in a terminated state, the logs are served by the Spark History Server hosted on the Databricks control plane. When the cluster is up and running, the logs are served by ...

aladda
by Honored Contributor II
  • 3773 Views
  • 1 reply
  • 1 kudos
Latest Reply
aladda
Honored Contributor II
  • 1 kudos

The Databricks Add-on for Splunk, built as part of Databricks Labs, can be leveraged for Splunk integration. It's a bi-directional framework that allows in-place querying of data in Databricks from within Splunk by running queries, notebooks, or jobs ...
