Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Pivot in Databricks SQL

Gilg
Contributor II

Hi Team,

I have a table with a key column (holding a column name) and a value column (holding that column's value). These keys are generated dynamically, and I want to pivot the table.

Question 1: Is there a way to do this without specifying all the columns in the expression_list?

i.e.

select id, date, col1, col2, col3, col4
from table1
pivot (
  max(value) as a
  for key in ('col1', 'col2', 'col3', 'col4')  -- how to do this without specifying each column?
)
group by all
Question 2: When I tried specifying the columns in the expression_list (same code as above), I got these results:

(screenshot: Gilg_0-1695088239719.png)

I want to collapse the result so that there is only one row.

Cheers,

G
1 ACCEPTED SOLUTION


Kaniz_Fatma
Community Manager

Hi @Gilg, in Databricks SQL a PIVOT operation cannot be performed without specifying all the columns in the expression list, because the pivot needs to know at query-compile time exactly which output columns to produce. However, you can dynamically generate the column list if you first retrieve the distinct keys from your data.
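As a sketch of that dynamic approach (assuming the table is named table1 with key/value columns, as in the question; build_pivot_sql is a hypothetical helper, not a Databricks API): fetch the distinct keys first, then build the IN list as a string and run the generated query.

```python
# Hypothetical helper: build the PIVOT query's IN list at runtime,
# so no key has to be hard-coded. Assumes table1 with key/value columns.
def build_pivot_sql(keys):
    in_list = ", ".join(f"'{k}'" for k in keys)
    return (
        "SELECT * FROM table1 "
        f"PIVOT (max(value) FOR key IN ({in_list}))"
    )

# On Databricks, the keys would come from the data itself, e.g.:
#   keys = [r.key for r in spark.sql("SELECT DISTINCT key FROM table1").collect()]
#   spark.sql(build_pivot_sql(keys))
keys = ["col1", "col2", "col3", "col4"]
print(build_pivot_sql(keys))
# SELECT * FROM table1 PIVOT (max(value) FOR key IN ('col1', 'col2', 'col3', 'col4'))
```

Note that because the generated query embeds the key values, it is only safe when the keys come from a trusted source (your own table), not from user input.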

To collapse the result into one row per group, you can aggregate each pivoted column with the first function (ignoring nulls) in Spark SQL.
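A minimal sketch of that collapse, with hypothetical column names: in Spark SQL it would look roughly like SELECT id, first(col1, true) AS col1, first(col2, true) AS col2 FROM pivoted GROUP BY id. The pure-Python helper below (not a Spark API, just an illustration) mimics the "first non-null value per group" rule that first(col, true) applies.

```python
# Illustration of first(col, ignoreNulls=true) per group: sparse pivot
# rows sharing the same group key merge into one row, each column
# keeping the first non-null value seen.
from collections import OrderedDict

def collapse(rows, group_keys):
    """Collapse dict-rows that share group_keys, keeping the first
    non-null value seen in every other column."""
    groups = OrderedDict()
    for row in rows:
        key = tuple(row[k] for k in group_keys)
        merged = groups.setdefault(key, {})
        for col, val in row.items():
            if merged.get(col) is None:  # unset or still null: take this value
                merged[col] = val
    return list(groups.values())

# Two sparse pivot rows for id=1 collapse into one:
rows = [
    {"id": 1, "col1": "a", "col2": None},
    {"id": 1, "col1": None, "col2": "b"},
]
print(collapse(rows, ["id"]))  # [{'id': 1, 'col1': 'a', 'col2': 'b'}]
```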


