I am using Spark SQL to import data from Oracle, as shown below:
// Register the Oracle JDBC driver
Class.forName("oracle.jdbc.driver.OracleDriver")

// Connection credentials passed to the JDBC reader
val info: java.util.Properties = new java.util.Properties()
info.put("user", user)
info.put("password", password)

val jdbcDF = spark.read.jdbc(jdbcURL, tableFullName, info)
Table Schema:
SERVICE_DATE - DATE
While importing, Spark converts columns of Oracle's DATE data type to java.sql.Timestamp. During this conversion, I am seeing an issue for values that fall within a daylight saving time transition.

For example:
oracle: SERVICE_DATE = 2008-03-09 02:49:00.0 [DATE]
spark: SERVICE_DATE = 2008-03-09 03:49:00.0 [TIMESTAMP]
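The shift seems reproducible outside of Spark entirely, which suggests it happens when the value is materialized as a java.sql.Timestamp in the JVM's default time zone. Below is a minimal sketch of what I believe is happening; it assumes the JVM runs in a zone that observes US daylight saving time (America/New_York is used here for illustration), where the wall-clock hour 02:00-03:00 on 2008-03-09 does not exist:

```scala
import java.sql.Timestamp
import java.util.TimeZone

object DstRepro {
  def main(args: Array[String]): Unit = {
    // Assumption: force a zone with a US DST gap on 2008-03-09 (02:00 -> 03:00)
    TimeZone.setDefault(TimeZone.getTimeZone("America/New_York"))

    // Timestamp.valueOf interprets the string as a wall-clock time in the
    // JVM default zone; 02:49 falls in the DST gap, so the lenient calendar
    // normalizes it forward by one hour.
    val ts = Timestamp.valueOf("2008-03-09 02:49:00")
    println(ts) // 2008-03-09 03:49:00.0
  }
}
```

If this is what Spark's JDBC reader hits internally, the Oracle DATE itself would be intact and the shift would be introduced by the Timestamp conversion on the Spark side.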
Is this an issue with the conversion of the Oracle DATE to a JDBC timestamp?