Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How can I get Databricks notebooks to stop cutting off the explain plans?

User15787040559
New Contributor III

Since Spark 3.0,

Dataset.queryExecution.debug.toFile

will dump the full plan to a file without first materializing the entire output as a single Java string in memory.
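A minimal sketch of how this might look in a Scala notebook cell (the DataFrame and the DBFS path are illustrative; `spark` is the session Databricks provides):

```scala
// Illustrative only: any Dataset/DataFrame works the same way.
val df = spark.range(1000000L)
  .selectExpr("id", "id % 10 AS bucket")
  .groupBy("bucket")
  .count()

// Since Spark 3.0: stream the full query plan to a file instead of
// building it as one giant in-memory string. Path is an example.
df.queryExecution.debug.toFile("/dbfs/tmp/full_plan.txt")
```

Writing under /dbfs/ makes the file visible on DBFS, so it can be downloaded or inspected afterwards.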

2 REPLIES

Kaniz_Fatma
Community Manager

Hi @User15787040559729892342! My name is Kaniz, and I'm the technical moderator here. Great to meet you, and thanks for your question! Let's see if your peers on the forum have an answer first; if not, I'll follow up shortly with a response.

dazfuller
Contributor III

Notebooks really aren't the best way to view large files. Two methods you could employ are:

  1. Save the file to DBFS and then use the Databricks CLI to download it
  2. Use the web terminal

With the web terminal you can run something like "cat my_large_file.txt | less" and then scroll through the entire content of the file.
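The two options above might look like this (a sketch; the file path is illustrative and option 1 assumes the Databricks CLI is installed and configured locally):

```shell
# Option 1: download the file to your machine with the Databricks CLI
databricks fs cp dbfs:/tmp/my_large_file.txt ./my_large_file.txt

# Option 2: in the cluster's web terminal, page through it in place
cat /dbfs/tmp/my_large_file.txt | less
```

Note the path difference: the CLI addresses the file as dbfs:/..., while the web terminal sees the same file mounted under /dbfs/.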
