SQL query with leads and lags

ejloh
New Contributor II

I'm trying to create a new column that fills in the nulls below. I tried using leads and lags, but it isn't turning out right. Basically, I'm trying to figure out who is in "possession" of the record, given the TransferFrom and TransferTo columns and the sequence of events. For instance, 35 was in possession of the record until they transferred it to 57. My current query will only populate 35 in record 2, since it only leads by one record. I need the query to populate all prior records as well if 35 is the first value found in the TransferFrom column. Any ideas? (BTW, I also tried OUTER APPLY, which works in SQL Server but isn't supported by Spark SQL.)

Create table script:

  CREATE TABLE results (
    OrderID int
   ,TransferFrom string
   ,TransferTo string
   ,ActionTime timestamp);
     
  INSERT INTO results
  VALUES
   (1,null,null,'2020-01-01 00:00:00'),
   (1,null,null,'2020-01-02 00:00:00'),
   (1,null,null,'2020-01-03 00:00:00'),
   (1,'35','57','2020-01-04 00:00:00'),
   (1,null,null,'2020-01-05 00:00:00'),
   (1,null,null,'2020-01-06 00:00:00'),
   (1,'57','45','2020-01-07 00:00:00'),
   (1,null,null,'2020-01-08 00:00:00'),
   (1,null,null,'2020-01-09 00:00:00'),
   (1,null,null,'2020-01-10 00:00:00');

  

Current query that doesn't work:

  SELECT *
     ,coalesce(
      lead(TransferFrom) over (partition by OrderID order by ActionTime)
      ,TransferFrom
      ,lag(TransferTo) over (partition by OrderID order by ActionTime)) as NewColumn
  FROM results

Current query result that is incorrect:

[screenshot of query output]

Desired query result:

[screenshot of desired output]

3 REPLIES

ejloh
New Contributor II

This is likely sub-optimal for Spark SQL, but it got me the right answer.

-- cte1: base table
with cte1 as (
    select *
    from results
)
-- cte2: only the rows that actually record a transfer
, cte2 as (
    select *
    from cte1
    where TransferFrom is not null
    or TransferTo is not null
    order by ActionTime
)
-- cte3: every team that ever appears as a sender or receiver
, cte3 as (
    select distinct OrderID, TransferFrom as Team from cte2
    union
    select distinct OrderID, TransferTo as Team from cte2
)
-- cte4: possession window per team: from just after the transfer that gave them
-- the record (plus 1 microsecond) until the transfer where they passed it on,
-- with sentinel dates for the original owner and the final holder
, cte4 as (
    select a.*
    ,ifnull(timestampadd(MICROSECOND, 1, c.ActionTime),'2000-01-01T00:00:00.000+0000') as StartTime
    ,ifnull(b.ActionTime,'9999-12-31T00:00:00.000+0000') as EndTime
    from cte3 as a
    left join cte2 as b
      on a.OrderID = b.OrderID
      and a.Team = b.TransferFrom
    left join cte2 as c
      on a.OrderID = c.OrderID
      and a.Team = c.TransferTo
    order by OrderID, StartTime
)
-- cte5: tag every event with the team whose possession window contains its ActionTime
, cte5 as (
    select a.*, b.Team
    from cte1 as a
    join cte4 as b
      on a.OrderID = b.OrderID
      and a.ActionTime between b.StartTime and b.EndTime
)

select *
from cte5
order by 1, 4

Hubert-Dudek
Esteemed Contributor III

Please just replace OVER with

IGNORE NULLS OVER
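
For reference, here is a minimal sketch of that change applied to the original query (this assumes a runtime where lead/lag accept IGNORE NULLS, i.e. Spark 3.1+ / a recent Databricks Runtime):

  -- same query as the original, with IGNORE NULLS added to both window calls
  SELECT *
     ,coalesce(
      lead(TransferFrom) IGNORE NULLS over (partition by OrderID order by ActionTime)
     ,TransferFrom
     ,lag(TransferTo) IGNORE NULLS over (partition by OrderID order by ActionTime)) as NewColumn
  FROM results

With IGNORE NULLS, lead() skips ahead to the next non-null TransferFrom and lag() looks back to the most recent non-null TransferTo, so the nulls between transfers get filled. One caveat: on a transfer row the lead() now picks up the next sender (i.e. the current receiver), so that row gets attributed to the receiver; if it should stay with the sender, list TransferFrom before the lead() inside the coalesce.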

Vidula
Honored Contributor

Hi there @Eric Lohbeck

Does @Hubert Dudek's response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly?

We'd love to hear from you.

Thanks!
