When I execute

sparkSession.sql("SELECT to_timestamp('2018-08-04.11:18:29 AM', 'yyyy-MM-dd.hh:mm:ss a') as timestamp")

the AM/PM marker is missing from the result:

+-------------------+
|          timestamp|
+-------------------+
|2018-08-04 11:18:29|
+-------------------+

If AM/PM is not present in the input string, the answer is correct.

Using unix_timestamp,

sparkSession.sql("SELECT from_unixtime(unix_timestamp('08-04-2018.11:18:29 AM', 'dd-MM-yyyy.hh:mm:ss a'), 'dd-MM-yyyy.hh:mm:ss a') as timestamp")

gives the correct answer, but the datatype becomes string, whereas my requirement is to convert to the timestamp datatype without data loss.

Does anyone have suggestions?

Thanks in advance.

2 Answers


The AM/PM is not missing from the Timestamp datatype. The timestamp is simply displayed in 24-hour format; you don't lose any information.

For example,

scala> spark.sql("SELECT to_timestamp('2018-08-04.11:18:29 PM', 'yyyy-MM-dd.hh:mm:ss a') as timestamp").show(false)
+-------------------+
|timestamp          |
+-------------------+
|2018-08-04 23:18:29|
+-------------------+

Whenever you want your timestamp rendered with AM/PM, just use a date/time formatter function such as date_format.
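Since Spark 3.0, to_timestamp and date_format use java.time DateTimeFormatter patterns, so the round trip can be sketched in plain Scala without a Spark session (the object and method names here are illustrative, not Spark API):

```scala
import java.time.LocalDateTime
import java.time.format.DateTimeFormatter
import java.util.Locale

object AmPmDemo {
  // Same pattern as in to_timestamp: hh = 1-12 clock hour, a = AM/PM marker
  private val parser  = DateTimeFormatter.ofPattern("yyyy-MM-dd.hh:mm:ss a", Locale.US)
  // Display pattern that restores the AM/PM marker
  private val display = DateTimeFormatter.ofPattern("yyyy-MM-dd hh:mm:ss a", Locale.US)

  // Parse the string, then format it back for display
  def roundTrip(s: String): (LocalDateTime, String) = {
    val ts = LocalDateTime.parse(s, parser)
    (ts, ts.format(display))
  }

  def main(args: Array[String]): Unit = {
    val (ts, shown) = roundTrip("2018-08-04.11:18:29 PM")
    println(ts)    // 2018-08-04T23:18:29 -- 24-hour internal representation, no data lost
    println(shown) // 2018-08-04 11:18:29 PM -- AM/PM restored at display time
  }
}
```

The parsed value stores hour-of-day 23; the "PM" only reappears when you choose a display pattern containing `a`.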

0

The format of the printed representation is fixed (an ISO 8601-compliant string in the local timezone) and cannot be modified.

There is no conversion that can help you here, because any conversion that satisfied the desired output format would have to convert the data to a string.
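In practice that means keeping the column as a real timestamp for storage and arithmetic, and deriving the AM/PM string only where it is displayed. A sketch in Spark SQL (the table name `events` and column name `raw_ts` are hypothetical):

```sql
-- event_ts stays a timestamp (no data loss); event_disp is a string for display only
SELECT
  to_timestamp(raw_ts, 'yyyy-MM-dd.hh:mm:ss a')            AS event_ts,
  date_format(to_timestamp(raw_ts, 'yyyy-MM-dd.hh:mm:ss a'),
              'yyyy-MM-dd hh:mm:ss a')                     AS event_disp
FROM events
```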

