Can someone please help me with how I should implement the SQL below on a PySpark DataFrame?
(SUM(Cash) / SUM(Cash + Credit)) * 100 AS Percentage
from pyspark.sql import functions as sf

df1 = df.withColumn("cash_credit", sf.col("cash") + sf.col("credit"))
df1.show(5)
+------+------+---+----+-----------+
|Credit|  Cash|MTH|  YR|cash_credit|
+------+------+---+----+-----------+
|100.00|400.00| 10|2019|     500.00|
|  0.00|500.00|  6|2019|     500.00|
|200.00|600.00| 12|2018|     800.00|
|  0.00|  0.00| 10|2019|       0.00|
|300.00|700.00|  7|2019|    1000.00|
+------+------+---+----+-----------+
I have tried the PySpark code below.
df2 = df1.groupBy('MTH', 'YR').agg(sf.sum("Cash").alias("sum_Cash"))\
.withColumn("final_column",sf.col("sum_Cash") + sf.col("cash_credit"))\
.withColumn("div",sf.col("sum_Cash")/sf.col("final_column"))\
.withColumn("Percentage",sf.col("div")*100)
But I am not able to execute it. It's showing the error below.
cannot resolve '`cash_credit`' given input columns: [MTH, YR, sum_Cash];;
After groupBy('MTH', 'YR').agg(sf.sum("Cash").alias("sum_Cash")) the resulting DataFrame has only MTH, YR and sum_Cash; the cash_credit column is dropped by the aggregation, which is why Spark cannot resolve it in the later withColumn calls.
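A minimal sketch of one way around it, assuming df has the Cash and Credit columns shown above: compute both sums inside the same agg call, so the aggregated DataFrame already contains every column the percentage needs.

from pyspark.sql import functions as sf

# Aggregate both sums per month/year in a single pass
df2 = (
    df.groupBy("MTH", "YR")
      .agg(
          sf.sum("Cash").alias("sum_Cash"),
          sf.sum(sf.col("Cash") + sf.col("Credit")).alias("sum_cash_credit"),
      )
      # (SUM(Cash) / SUM(Cash + Credit)) * 100
      .withColumn("Percentage", sf.col("sum_Cash") / sf.col("sum_cash_credit") * 100)
)

df2.show()

Note that groups where sum_cash_credit is 0 (like the 0.00 / 0.00 row above) will give a null Percentage, since Spark returns null on division by zero; wrap the division in sf.when(...) if you need 0 there instead.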