
How do we pass a variable holding a floating-point value to Spark's expr()?

double myVariable = 0.50;


expr("data.mark = myVariable")

This is what I expect after substituting myVariable:

expr("data.mark = 0.50")

The value 0.50 should be used during execution.

2 Answers


String interpolation? Like so:

expr(String.format("data.mark = %s", myVariable))
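For reference, here is a minimal, self-contained Java sketch of this approach. The column name mark, the equality check, and the sample data are assumptions for illustration, not part of the original question:

import static org.apache.spark.sql.functions.expr;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class ExprInterpolation {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("expr-interpolation")
                .master("local[*]")
                .getOrCreate();

        // Sample data; "mark" is an assumed column name.
        Dataset<Row> data = spark.sql("SELECT 0.50 AS mark UNION ALL SELECT 0.75 AS mark");

        double myVariable = 0.50;

        // Build the SQL string first so expr() receives the literal 0.5,
        // not the Java identifier myVariable.
        Dataset<Row> result = data.withColumn(
                "matches",
                expr(String.format("mark = %s", myVariable)));

        result.show();
        spark.stop();
    }
}

The key point is that the substitution happens in Java before the string reaches expr(), so Spark only ever sees the literal value.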



3 Comments

Does Java support the s string-interpolation prefix? I get a compilation error.
My answer was for Scala. I have updated it for Java now.
No, it is Java, unfortunately.

This is one way I found to do it in PySpark, using an f-string to substitute two input variables into the expression.

from pyspark.sql.functions import explode, expr

beginDate = '2000-01-01'
endDate = '2050-12-31'
df = spark.createDataFrame([(1,)], ["id"])

# The f-string substitutes both dates before Spark parses the expression
df1 = df.withColumn(
    "date",
    explode(expr(f"sequence(to_date('{beginDate}'), to_date('{endDate}'), interval 1 day)"))
)
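After substitution, the string passed to expr is sequence(to_date('2000-01-01'), to_date('2050-12-31'), interval 1 day), so explode produces one row per day in that range.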

Comments
