Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

RajeshRK
by Contributor
  • 10350 Views
  • 7 replies
  • 3 kudos

Resolved! Download event, driver, and executor logs

Hi Team, I can see logs in the Databricks console by navigating Workflows -> job name -> logs. These logs are very generic, like stdout, stderr, and log4j-active.log. How can I download the event, driver, and executor logs at once for a job? Regards, Rajesh.

Latest Reply
RajeshRK
Contributor
  • 3 kudos

@Kaniz Fatma @John Lourdu @Vidula Khanna Hi Team, I managed to download the logs using the Databricks command line as follows: installed the Databricks CLI on my desktop (pip install databricks-cli), configured the Databricks cluster URL and perso...

6 More Replies
RajeshRK
by Contributor
  • 7975 Views
  • 3 replies
  • 0 kudos

Need help to analyze databricks logs for a long-running job.

Hi Team, we have a job that completes in 3 minutes on one Databricks cluster, but the same job takes 3 hours to complete on another Databricks cluster. I am quite new to Databricks and need your guidance on how to find out where Databricks s...

Latest Reply
AmitKP
New Contributor II
  • 0 kudos

Hi @Retired_mod, I am saving logs from my Databricks job compute, run from ADF. How can I open those files present in the DBFS location?

2 More Replies
Binesh
by New Contributor II
  • 7777 Views
  • 2 replies
  • 0 kudos

Databricks logs some error messages while trying to read data using the databricks-jdbc dependency

I have tried to read data from Databricks using the following Java code.
String TOKEN = "token...";
String url = "url...";
Properties properties = new Properties();
properties.setProperty("user", "token");
properties.setProperty("PWD", TOKEN);
Con...

Latest Reply
shan_chandra
Databricks Employee
  • 0 kudos

@Binesh J - The issue could be that the data type of the column is not compatible with the getString() method on line 17. Use the getObject() method to retrieve the value as a generic object and then convert it to a string.

1 More Replies
psps
by New Contributor III
  • 4385 Views
  • 3 replies
  • 4 kudos

Databricks Job run logs only shows prints/logs from driver and not executors

Hi, in Databricks job run output, only logs from the driver are displayed. We have a function parallelized to run on executor nodes. The logs/prints from that function are not displayed in the job run output. Is there a way to configure and show those logs i...

Latest Reply
psps
New Contributor III
  • 4 kudos

Thanks @Debayan Mukherjee. That enables executor logging, but the executor logs still do not appear in the Databricks job run output; only driver logs are displayed.

2 More Replies
Dean_Lovelace
by New Contributor III
  • 4050 Views
  • 3 replies
  • 0 kudos

How to filter the Spark UI for a notebook

When running Spark under YARN, each script has its own self-contained set of logs. In Databricks, all I see is a list of jobs and stages that have been run on the cluster. From a support perspective this is a nightmare. How can notebook logs be grou...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Dean Lovelace, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers...

2 More Replies
sagiatul
by New Contributor II
  • 4991 Views
  • 2 replies
  • 3 kudos

Databricks driver logs

I am running jobs on Databricks clusters. When the cluster is running, I am able to find the executor logs by going to the Spark Cluster UI Master dropdown, selecting a worker, and going through the stderr logs. However, once the job is finished and cluste...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Atul Arora, thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedback w...

1 More Replies
Mado
by Valued Contributor II
  • 2088 Views
  • 1 reply
  • 1 kudos

Resolved! How to query Databricks audit logs?

Hi, I would like to ask where the Databricks audit log files are stored on DBFS, and whether there is any way that I can query the log files. Thanks.

Latest Reply
Ajay-Pandey
Esteemed Contributor III
  • 1 kudos

Hi @Mohammad Saber, I think you first need to configure audit logging in Databricks before you can use it. Please refer to the blog below, which will help you with this: Configure audit logging | Databricks on AWS

Murthy1
by Contributor II
  • 5954 Views
  • 5 replies
  • 4 kudos

Send custom logs to AWS cloudwatch from Notebook

I would like to send some custom logs (in Python) from my Databricks notebook to AWS CloudWatch. For example:
df = spark.read.json(".......................")
logger.info("Successfully ingested data from json")
Has someone succeeded in doing this before...

Latest Reply
Debayan
Databricks Employee
  • 4 kudos

Hi, you can integrate them; please refer to https://aws.amazon.com/blogs/mt/how-to-monitor-databricks-with-amazon-cloudwatch/. You can also configure audit logging to S3 and redirect it to CloudWatch from AWS; refer to https://aws.amazon.com/blogs/mt/how...

4 More Replies
Optum
by New Contributor III
  • 9026 Views
  • 8 replies
  • 4 kudos

Databricks JDBC & Remote Write

Hello, I'm trying to write to a Delta table in my Databricks instance from a remote Spark session on a different cluster with the Simba Spark driver. I can do reads, but when I attempt a write, I get the following error: {  df.write.format("jdbc...

Latest Reply
Atanu
Databricks Employee
  • 4 kudos

Could you try setting the flag to ignore transactions? I'm not sure what the exact flag is, but there should be more details in the JDBC manual on how to do this.

7 More Replies
vs_29
by New Contributor II
  • 2620 Views
  • 1 reply
  • 3 kudos

Custom Log4j logs are not being written to the DBFS storage.

I used a custom Log4j appender to write custom logs through the init script, and I can see the custom log file in the driver logs, but Databricks is not writing those custom logs to DBFS. I have configured the logging destination in the Advanced sec...

Latest Reply
Debayan
Databricks Employee
  • 3 kudos

Hi @VIjeet Sharma, do you receive any error? This can be an issue with using the DBFS mount point /dbfs in an init script: the DBFS mount point is installed asynchronously, so at the very beginning of init script execution that mount point might not be ava...

brickster_2018
by Databricks Employee
  • 1698 Views
  • 1 reply
  • 1 kudos

Resolved! Cluster logs missing

On the Databricks cluster UI, when I click on the driver logs, sometimes I see historic logs and sometimes I see logs for only the last few hours. Why do we see this inconsistency?

Latest Reply
brickster_2018
Databricks Employee
  • 1 kudos

This is working as designed and is the expected behavior. When the cluster is in a terminated state, the logs are served by the Spark History Server hosted on the Databricks control plane. When the cluster is up and running, the logs are served by ...

aladda
by Databricks Employee
  • 8751 Views
  • 1 reply
  • 1 kudos
Latest Reply
aladda
Databricks Employee
  • 1 kudos

The Databricks Add-on for Splunk, built as part of Databricks Labs, can be leveraged for Splunk integration. It's a bi-directional framework that allows for in-place querying of data in Databricks from within Splunk by running queries, notebooks, or jobs ...
