
I'm getting the error below while executing the 'join' statement. I'm using a PySpark setup. Is any change required in the join statement or the code?

TypeError: 'DataFrame' object is not callable

from pyspark.sql import functions as F  # used by F.isnull and F.lit in the join below

df11 = spark.read.option("header","true").option("delimiter", ",").csv("s3://mybucket/file1.csv")
df22 = spark.read.option("header","true").option("delimiter", ",").csv("s3://mybucket/file2.csv")
df11.createOrReplaceTempView("table1")
df22.createOrReplaceTempView("table2")
df1 = spark.sql( "select * from table1" )
df2 = spark.sql( "select * from table2" )

df_d = df1.join(df2, df1.NO == df2.NO, 'left').filter(F.isnull(df2.NO)).select(df1.NO,df1.NAME,df1.LAT,df1.LONG, F.lit('DELETE').alias('FLAG'))

Thanks

3 Answers


Use the column names as strings (bracket notation) like this; it should work:

df_d = (
    df1.join(df2, df1['NO'] == df2['NO'], 'left')
       .filter(F.isnull(df2['NO']))
       .select(df1['NO'], df1['NAME'], df1['LAT'], df1['LONG'],
               F.lit('DELETE').alias('FLAG'))
)



This shows how to add new columns derived from a list of existing columns:

# For each key, add a <col>_partition_by column copied from the matching <col>_tgt column
for col_name in partition_key_list:
    print(col_name)
    df_final_recs_i_u_n = df_final_recs_i_u_n.withColumn(
        f"{col_name}_partition_by", df_final_recs_i_u_n[f"{col_name}_tgt"]
    )



Once you have created the temporary views, you can use Spark SQL to create the final dataframe.


Relevant SQL:

spark.sql("""
    select table1.NO, table1.NAME, table1.LAT, table1.LONG, 'DELETE' as FLAG
    from table1
    left join table2 on table1.NO = table2.NO
    where table2.NO is null
""").show()

