
When executing the following query:

SELECT * FROM companies JOIN employments on companies.company_normalized LIKE CONCAT('%',
    replace(employments.displayname_normalized, '\\', ''), '%') or employments.displayname_normalized LIKE CONCAT(
    '%', replace(companies.company_normalized,'\\', ''), '%')

it works fine in a Databricks SQL notebook cell.

However, when running the same query through the Spark SQL executor, i.e. spark.sql(query), it errors out with: Extraneous input '' expecting {')', ','}, pointing at the replace clause.

Is there a fix for this?
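A likely explanation (an assumption here, since the exact behavior depends on the parser's escape settings): in a notebook SQL cell the text '\\' reaches Spark's parser verbatim, but inside a Python double-quoted string, Python first collapses \\ to a single \, which the SQL parser then consumes as an escape, leaving a malformed literal. A minimal sketch of the two layers, using a hypothetical column name col:

```python
# What each layer actually sees. In a %sql notebook cell the text below
# reaches Spark's parser unchanged; in a plain Python string it does not.
sql_cell_text  = r"replace(col, '\\', '')"   # raw string: backslashes preserved
python_dq_text = "replace(col, '\\', '')"    # Python collapses '\\' to one '\'

assert r"\\" in sql_cell_text        # two backslashes survive in the raw string
assert r"\\" not in python_dq_text   # only one backslash is left for Spark
assert len("\\") == 1                # in Python, "\\" is a single character
```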

  • Try replacing \\ with \\\\ when using spark.sql(). – Commented Feb 10, 2022 at 9:51

1 Answer


You either need a single backslash, if you are using a regex (note the raw triple-quoted string, so Python passes the backslash through verbatim and the multi-line string is valid):

spark.sql(r"""SELECT * FROM companies JOIN employments ON companies.company_normalized LIKE
    CONCAT('%', replace(employments.displayname_normalized, '\', ''), '%')
    OR employments.displayname_normalized LIKE
    CONCAT('%', replace(companies.company_normalized, '\', ''), '%')""")

or you need to escape the backslashes, in which case you must double-escape them as "\\\\" (Python collapses this to \\, and Spark's parser collapses that to a single \):

spark.sql("""SELECT * FROM companies JOIN employments ON companies.company_normalized LIKE
    CONCAT('%', replace(employments.displayname_normalized, '\\\\', ''), '%')
    OR employments.displayname_normalized LIKE
    CONCAT('%', replace(companies.company_normalized, '\\\\', ''), '%')""")
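To make the layering concrete, here is a sketch (assuming the goal is stripping literal backslashes) of how the escaping composes, and how a raw string removes one layer so the query reads exactly as it does in a notebook cell:

```python
# Two escaping layers sit in front of Spark: the Python string literal,
# then Spark's SQL parser. To leave ONE backslash for replace():
#   regular Python string: "\\\\"  (4 in source -> 2 chars -> 1 after SQL parsing)
#   raw Python string:     r"\\"   (2 in source -> 2 chars -> 1 after SQL parsing)
plain = "\\\\"
raw = r"\\"
assert plain == raw and len(raw) == 2  # both hand Spark exactly two backslashes

# The query built with a raw triple-quoted string, so Spark's parser sees
# '\\' and matches a literal backslash:
query = r"""
SELECT * FROM companies JOIN employments
ON companies.company_normalized LIKE
   CONCAT('%', replace(employments.displayname_normalized, '\\', ''), '%')
OR employments.displayname_normalized LIKE
   CONCAT('%', replace(companies.company_normalized, '\\', ''), '%')
"""
assert r"\\" in query
# spark.sql(query)  # run this once a SparkSession named `spark` exists
```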

1 Comment

Does this still apply if I want to replace the \ character?
