
I am using Spark SQL to import data from Oracle as shown below:

// Load the Oracle JDBC driver and set up the connection properties
Class.forName("oracle.jdbc.driver.OracleDriver")
val info = new java.util.Properties()
info.put("user", user)
info.put("password", password)
// Read the table through the JDBC data source
val jdbcDF = spark.read.jdbc(jdbcURL, tableFullName, info)

Table Schema:

SERVICE_DATE - DATE

While importing, Spark converts columns with the Oracle DATE data type to java.sql.Timestamp. During this conversion, I am seeing an issue for dates that fall within a daylight saving time transition.

For example:

oracle: SERVICE_DATE = 2008-03-09 02:49:00.0 [DATE]
spark: SERVICE_DATE = 2008-03-09 03:49:00.0 [TIMESTAMP]

Is this an issue with the conversion of the Oracle DATE to a JDBC timestamp?


1 Answer


From https://github.com/apache/spark/pull/18411/files/aefd028883bc27cd5929e80dff29d2b15aa114b2 you can see that the Oracle JDBC driver has a connection property called oracle.jdbc.mapDateToTimestamp, which is true by default. If you set it to false, DATE columns will no longer be mapped to Timestamp.
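As a minimal sketch, the property can be passed through the same Properties object used in the question (reusing the spark session and the jdbcURL, user, password, and tableFullName values defined there):

Class.forName("oracle.jdbc.driver.OracleDriver")
val info = new java.util.Properties()
info.put("user", user)
info.put("password", password)
// Keep Oracle DATE columns as java.sql.Date instead of
// mapping them to java.sql.Timestamp
info.put("oracle.jdbc.mapDateToTimestamp", "false")
val jdbcDF = spark.read.jdbc(jdbcURL, tableFullName, info)

With the mapping disabled, SERVICE_DATE should come through as a date-based column rather than a timestamp, avoiding the daylight-saving shift shown in the question.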
