
We can use the following function to convert a single integer value:

val x=100
Integer.toString(x, 16).toUpperCase

But how can I apply it to an integer column to generate a new column of hex strings? Thanks!

The method below does not work:

testDF = testDF.withColumn("data_hex_string", Integer.toString(testDF("data"), 16).toUpperCase)

2 Answers


AFAIK, there isn't a Spark native function for this, so define a UDF:

import org.apache.spark.sql.functions.udf

// Use val so the UDF is created once, not on every call
val toHex = udf((int: Int) => java.lang.Integer.toString(int, 16).toUpperCase)

df.withColumn("hex", toHex($"int")).show()

+---+---+---+
| id|int|hex|
+---+---+---+
|  1|  1|  1|
|  2| 11|  B|
|  3| 23| 17|
+---+---+---+
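If you also need the conversion inside SQL queries, the same lambda can be registered as a SQL UDF. A minimal sketch, assuming `spark` is your `SparkSession` and `toHex` is an arbitrary name:

```scala
// Register the conversion under the (assumed) name "toHex" for use in SQL
spark.udf.register("toHex", (i: Int) => java.lang.Integer.toString(i, 16).toUpperCase)

df.createOrReplaceTempView("t")
spark.sql("SELECT id, int, toHex(int) AS hex FROM t").show()
```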



As @jxc already mentioned in a comment, use the conv function:

import org.apache.spark.sql.functions.{conv, lower}
df.withColumn("hex", conv($"int_col",10,16)).show

For those who want lowercase, wrap it with lower:

df.withColumn("hex", lower(conv($"int_col",10,16))).show
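Putting it together, a self-contained sketch of the `conv` approach (the column names and sample data are assumed; `spark.implicits._` provides `toDF` and `$`):

```scala
import org.apache.spark.sql.functions.conv
import spark.implicits._

val df = Seq((1, 1), (2, 11), (3, 23)).toDF("id", "int_col")

// conv reads the column as a base-10 string and returns its
// base-16 representation as a string, uppercase by default
df.withColumn("hex", conv($"int_col", 10, 16)).show()
```

Unlike the UDF approach, `conv` is a built-in Catalyst expression, so it avoids serialization overhead and stays eligible for query optimization.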


