Warehousing & Analytics

Where are driver logs for SQL Pro Warehouse?

BamBam
New Contributor II

In an All-Purpose Cluster, it is pretty easy to get at the Driver logs.  Where do I find the Driver logs for a SQL Pro Warehouse?  The reason I ask is that sometimes in the SQL Editor we get generic error messages like "Task failed while writing rows to abfss://...".  I am guessing the rest of the error message (i.e., the full stack trace) is in the Spark driver log.  Where are those driver logs located for a SQL Pro Warehouse?

I was just in a Databricks Open Office Hours webinar and Clayton mentioned you can find this in the Query Profile.  I looked at the query profile and the 3 dots at the top-right corner only have the option to "Download" which downloads the query profile as JSON.
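In case it helps, here is a rough sketch of how I poke through that downloaded JSON for anything error-related. As far as I know the query profile's schema isn't documented, so this just walks the structure and prints any keys whose names mention "error"; the filename query_profile.json is simply whatever you saved the download as.

import json

def find_error_fields(node, path=""):
    """Recursively yield (path, value) pairs whose key name contains 'error'."""
    if isinstance(node, dict):
        for key, value in node.items():
            child = f"{path}.{key}" if path else key
            if "error" in key.lower():
                yield child, value
            yield from find_error_fields(value, child)
    elif isinstance(node, list):
        for i, item in enumerate(node):
            yield from find_error_fields(item, f"{path}[{i}]")

# Load the profile downloaded from the 3-dots menu and print error-related fields.
with open("query_profile.json") as f:
    profile = json.load(f)

for path, value in find_error_fields(profile):
    print(path, "=>", value)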

2 REPLIES

Kaniz
Community Manager

Hi @BamBam,

In a SQL Pro Warehouse, the driver logs are not directly accessible the way they are on an All-Purpose Cluster.

SQL warehouses run as a managed service, so the driver node is not exposed to users. As a result, you cannot access the driver logs directly for a SQL Pro Warehouse.

When a query is run in SQL Pro, it is processed by the driver node, which then distributes the work to the worker nodes in the cluster. The driver node logs are collected by the managed service and stored on the backend in a location that is not accessible from the user interface.

That being said, you can still investigate and troubleshoot errors by using the Query Profile feature in SQL Pro. The Query Profile provides detailed information about the execution of SQL statements in your cluster, including the duration, stages, and tasks involved. To locate the full stack trace or error message for your failed query, follow these steps:

  1. In SQL Pro, navigate to the "Activity" tab and select the failed query from the list.

  2. Click on the "Query Profile" tab to open the query profile.

  3. Scroll down to the bottom of the page to find the "Error Message" section, which contains the full error message associated with the failed query.
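If you would rather pull that error text programmatically than click through the UI, here is a minimal sketch against the Query History REST API (GET /api/2.0/sql/history/queries). Please treat the response field names (res, status, error_message) and the DATABRICKS_HOST / DATABRICKS_TOKEN environment variables as assumptions to verify against the API docs for your workspace:

# Minimal sketch: list recent warehouse queries via the Query History API and
# print whatever error text the service recorded for the failed ones.
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-<workspace-id>.<n>.azuredatabricks.net
TOKEN = os.environ["DATABRICKS_TOKEN"]  # a personal access token

resp = requests.get(
    f"{HOST}/api/2.0/sql/history/queries",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"max_results": 100},        # most recent queries first
)
resp.raise_for_status()

# Field names below follow my reading of the Query History API response shape.
for q in resp.json().get("res", []):
    if q.get("status") == "FAILED":
        print(q.get("query_id"), "-", q.get("error_message"))

Keep in mind this surfaces the same error message you see in the Query Profile, not the underlying driver log.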

If you still need to access the driver logs for troubleshooting purposes, you can contact Databricks Support and they can help you get access to the managed service logs for your SQL Pro Warehouse cluster.

BamBam
New Contributor II

Hi Kaniz,

Thanks for providing this information.  Yes, I am able to see the error in the Query Profile, but it is the same error I saw in the SQL Editor UI.  Most errors in the SQL Editor are helpful, but some are very vague and don't provide any insight without more log information.  For example:


"org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation cannot be cast to org.apache.spark.sql.execution.datasources.LogicalRelation"

 Or

[TASK_WRITE_FAILED] Task failed while writing rows to abfss://silver@<REDACTED>.dfs.core.windows.net/__unitystorage/catalogs/<REDACTED>/tables/<REDACTED>. 

I was able to write to the ADLS storage account just fine; it turned out to be a different error in the SQL.  So these generic errors didn't provide much insight, and I was looking for more detailed information in the SQL Pro Warehouse driver logs, which it sounds like are only available with the help of Databricks Support.
