Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

SQL query with leads and lags

ejloh
New Contributor II

I'm trying to create a new column that fills in the nulls below. I tried using leads and lags, but it isn't turning out right. Basically, I'm trying to figure out who is in "possession" of the record, given the TransferFrom and TransferTo columns and the sequence of events. For instance, 35 was in possession of the record until they transferred it to 57. My current query only populates 35 in record 2, since it only leads by one record. I need the query to also populate all the records prior to that one, if 35 is the first value found in the TransferFrom column. Any ideas? (By the way, I also tried OUTER APPLY, which works in SQL Server but isn't supported by Spark SQL.)

Create table script:

  CREATE TABLE results (
    OrderID int
   ,TransferFrom string
   ,TransferTo string
   ,ActionTime timestamp);

  INSERT INTO results
  VALUES
   (1,null,null,'2020-01-01 00:00:00'),
   (1,null,null,'2020-01-02 00:00:00'),
   (1,null,null,'2020-01-03 00:00:00'),
   (1,'35','57','2020-01-04 00:00:00'),
   (1,null,null,'2020-01-05 00:00:00'),
   (1,null,null,'2020-01-06 00:00:00'),
   (1,'57','45','2020-01-07 00:00:00'),
   (1,null,null,'2020-01-08 00:00:00'),
   (1,null,null,'2020-01-09 00:00:00'),
   (1,null,null,'2020-01-10 00:00:00');

  

Current query that doesn't work:

  SELECT *
     ,coalesce(
      lead(TransferFrom) over (partition by OrderID order by ActionTime)
      ,TransferFrom
      ,lag(TransferTo) over (partition by OrderID order by ActionTime)) as NewColumn
  FROM results

Current query result that is incorrect:

[image: screenshot of the incorrect result, with NewColumn populated only on the rows adjacent to each transfer]

Desired query result:

[image: screenshot of the desired result, with NewColumn filled in on every row]

3 REPLIES

ejloh
New Contributor II

This is likely sub-optimal for Spark SQL, but it got me the right answer.

-- cte1: base data
with cte1 as (
    select *
    from results
)
-- cte2: only the rows that record an actual transfer
, cte2 as (
    select *
    from cte1
    where TransferFrom is not null
       or TransferTo is not null
)
-- cte3: distinct list of every team that ever held the record
, cte3 as (
    select distinct OrderID, TransferFrom as Team from cte2
    union
    select distinct OrderID, TransferTo as Team from cte2
)
-- cte4: possession window per team, from just after it received the
-- record (or a sentinel start date) until it transferred the record
-- away (or a sentinel end date)
, cte4 as (
    select a.*
    ,ifnull(timestampadd(MICROSECOND, 1, c.ActionTime),'2000-01-01T00:00:00.000+0000') as StartTime
    ,ifnull(b.ActionTime,'9999-12-31T00:00:00.000+0000') as EndTime
    from cte3 as a
    left join cte2 as b
      on a.OrderID = b.OrderID
      and a.Team = b.TransferFrom
    left join cte2 as c
      on a.OrderID = c.OrderID
      and a.Team = c.TransferTo
)
-- cte5: tag every event with the team whose window contains it
, cte5 as (
    select a.*, b.Team
    from cte1 as a
    join cte4 as b
      on a.OrderID = b.OrderID
      and a.ActionTime between b.StartTime and b.EndTime
)

select *
from cte5
order by 1, 4

Hubert-Dudek
Esteemed Contributor III

Please just replace OVER with

IGNORE NULLS OVER
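
For completeness, this is a sketch of how the suggestion applies to the original query (note that `IGNORE NULLS` on `lead`/`lag` requires a reasonably recent Spark version):

```sql
SELECT *
   ,coalesce(
    lead(TransferFrom) IGNORE NULLS OVER (partition by OrderID order by ActionTime)
    ,TransferFrom
    ,lag(TransferTo) IGNORE NULLS OVER (partition by OrderID order by ActionTime)) as NewColumn
FROM results
```

With `IGNORE NULLS`, `lead` scans forward to the next non-null TransferFrom (the team currently holding the record) instead of looking only one row ahead, and `lag` scans backward to the most recent non-null TransferTo, so the rows between transfers get filled in as well.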

Vidula
Honored Contributor

Hi there @Eric Lohbeck

Does @Hubert Dudek's response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly?

We'd love to hear from you.

Thanks!
