
Can we get the actual query execution plan programmatically after a query is executed, apart from the UI?

Ak_0926
New Contributor

Let's say I have run a query and it showed me results. We can find the respective query execution plan in the UI. Is there any way to get that execution plan programmatically or through an API?

2 REPLIES

Walter_C
Databricks Employee

You can obtain the query execution plan programmatically using the EXPLAIN statement in SQL. EXPLAIN displays the plan that the query planner generates for the supplied statement: how the table(s) referenced by the statement will be scanned and, if multiple tables are referenced, which join strategies will be used to bring together the required rows from each input table.

Here is an example of how you can use it:

 

# PySpark: run EXPLAIN on the SQL text and show the resulting plan
query = "SELECT * FROM table"
plan = spark.sql(f"EXPLAIN {query}")
plan.show(truncate=False)

This will return a DataFrame with a single row and column that contains the execution plan as a string.
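If you want the plan as a plain Python string (for logging or parsing), you can collect that single row; a minimal sketch continuing the example above:

# Pull the plan text out of the single-row, single-column EXPLAIN result
plan_text = plan.collect()[0][0]
print(plan_text)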

The EXPLAIN command will only provide the logical and physical plans. It will not provide runtime details such as how much time each stage took or how much data was read. For that level of detail, you would need to look at the Spark UI or logs.
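If the query was run through the DataFrame API, one unsupported workaround is to reach into the internal Java handle behind the DataFrame, which exposes the analyzed, optimized, and physical plans of that specific execution. A rough sketch, relying on the private _jdf attribute (internal API, may change between Spark versions):

# Caution: _jdf and queryExecution() are internal PySpark/Spark APIs
df = spark.sql("SELECT * FROM table")
df.collect()  # execute the query
qe = df._jdf.queryExecution()
print(qe.executedPlan().toString())  # physical plan used for this execution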

Danny_Lee
Valued Contributor

The EXPLAIN docs show some extra functionality: https://docs.databricks.com/en/sql/language-manual/sql-ref-syntax-qry-explain.html

EXPLAIN [ EXTENDED | CODEGEN | COST | FORMATTED ] statement
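For reference, the DataFrame API exposes the same modes (Spark 3.0+), so if you already hold a DataFrame you can print its plan directly without building an EXPLAIN string; for example:

# mode can be "simple", "extended", "codegen", "cost", or "formatted"
df = spark.sql("SELECT * FROM table")
df.explain(mode="formatted")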
--
The heart that breaks open can contain the whole universe. - Joanna Macy
