Why is this CTE having issues in Databricks?

THIAM_HUATTAN
Valued Contributor

df = spark.createDataFrame(
    [(2018, 'Apple1', 45000), (2019, 'Apple1', 35000), (2020, 'Apple1', 75000),
     (2018, 'Samsung', 15000), (2019, 'Samsung', 20000), (2020, 'Samsung', 25000),
     (2018, 'Nokia', 21000), (2019, 'Nokia', 17000), (2020, 'Nokia', 14000)],
    ['Year', 'Brand', 'Sales'])

df.createOrReplaceTempView('PhoneBrandSales')
spark.sql('SELECT * FROM PhoneBrandSales').show()

%sql
SELECT *,
  Lead(Sales, 1, 0)
    OVER(PARTITION BY Brand ORDER BY Year desc) AS PreviousYearSales
FROM PhoneBrandSales;

%sql
-- https://www.sqlservertutorial.net/sql-server-window-functions/sql-server-lag-function/
WITH CTE AS
(
  SELECT *,
  Lead(Sales, 1, 0)
    OVER(PARTITION BY Brand ORDER BY Year desc) AS PreviousYearSales
  FROM PhoneBrandSales;
)
SELECT * FROM CTE

ParseException:
no viable alternative at input 'WITH CTE AS\n(\n SELECT *, \n Lead(Sales, 1, 0) \n OVER(PARTITION BY Brand ORDER BY Year desc) AS PreviousYearSales \n FROM PhoneBrandSales'(line 6, pos 22)

== SQL ==
WITH CTE AS
(
SELECT *,
Lead(Sales, 1, 0)
OVER(PARTITION BY Brand ORDER BY Year desc) AS PreviousYearSales
FROM PhoneBrandSales
----------------------^^^

1 ACCEPTED SOLUTION


Varshith
New Contributor III

The issue is coming from the semicolon after PhoneBrandSales inside the CTE: a ; terminates the statement, so it is not valid inside the parenthesized CTE body. Remove that ; and the issue will be resolved.

Please select this answer as the best answer if it resolved your issue.

Thanks,

Varshith
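
For anyone hitting the same error, the corrected cell with that inner semicolon removed should look roughly like this (same query as in the question, only the ; after PhoneBrandSales is gone):

%sql
WITH CTE AS
(
  SELECT *,
  Lead(Sales, 1, 0)
    OVER(PARTITION BY Brand ORDER BY Year desc) AS PreviousYearSales
  FROM PhoneBrandSales
)
SELECT * FROM CTE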



FANTASTIC, it is solved, thanks a lot for the prompt answer.
