Need help with PySpark code in Databricks to calculate a new measure column.
10-19-2022 04:33 AM
Details of the requirement are as below:
I have a table with the below structure:
I have to write code in PySpark to calculate a new column.
The logic for the new column is: the sum of Magnitude for each Category divided by the total Magnitude, multiplied by 100 to show it as a percentage.
For example, for Category OAE the new column should show (23.98 + 50.54 + 84.95) / Sum(Total Magnitude).
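For instance, if the grand total of Magnitude were, say, 500 (a made-up number), the new column would show (23.98 + 50.54 + 84.95) / 500 × 100 = 159.47 / 500 × 100 ≈ 31.89.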
So there should be one row for each Date and Category.
Please help me frame the code.
Let me know if you have any questions.
I have attached sample data in the Excel file.
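The data looks something like this (a minimal stand-in for the attachment; the column names Date, Category, Magnitude and all values apart from the OAE magnitudes above are made up for illustration):

# In Databricks, `spark` is already available; build a small stand-in for the attached data
data = [
    ("2022-10-01", "OAE", 23.98),
    ("2022-10-01", "OAE", 50.54),
    ("2022-10-01", "OAE", 84.95),
    ("2022-10-01", "ABC", 40.53),
]
df = spark.createDataFrame(data, ["Date", "Category", "Magnitude"])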
I am stuck at this code: basically, how to divide the sum for each Category by the total sum of Magnitude.
from pyspark.sql import Window, functions

# spark.sql is the current entry point in Databricks (sqlContext is legacy)
df = spark.sql("select * from Table")

# This puts the sum of Magnitude per Category on every row,
# but I don't know how to divide it by the total sum of Magnitude
df1 = df.withColumn("NewColumn", functions.sum("Magnitude").over(Window.partitionBy("Category")))
display(df1)
Thanks
Faizan
- Labels:
- Azure databricks
- Details
- Pyspark Code
10-31-2022 08:25 PM
# Per-Category sum divided by the grand total (one partition over the whole frame via lit("1")),
# multiplied by 100 to express it as a percentage, as the requirement asks
df1 = df.withColumn("NewColumn", functions.sum("Magnitude").over(Window.partitionBy("Category"))
                    / functions.sum("Magnitude").over(Window.partitionBy(functions.lit("1"))) * 100)
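If you also need exactly one row per Date and Category (as the question asks), something like the sketch below should work; the column names Date, Category, and Magnitude are assumed from the question:

from pyspark.sql import Window, functions

# One row per Date and Category, with that group's total Magnitude
agg = df.groupBy("Date", "Category").agg(functions.sum("Magnitude").alias("CategorySum"))

# Grand total over the whole result, then each group's percentage share
total_window = Window.partitionBy(functions.lit(1))
result = agg.withColumn("NewColumn",
                        functions.col("CategorySum") / functions.sum("CategorySum").over(total_window) * 100)
display(result)

If the percentage should instead be relative to each Date's total, partition the window by "Date" rather than by the constant.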

11-27-2022 05:36 AM
Hi @Faizan Arefin
Hope all is well! Just wanted to check in to see whether you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!

