I am using Spark to get the schema of a table from a SQL Server database. When I create Hive tables from this schema, I run into datatype mismatches. How can I convert SQL Server datatypes to Hive datatypes in Spark (Scala)?
val df = sqlContext.read.format("jdbc")
.option("url", "jdbc:sqlserver://host:port;databaseName=DB")
.option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
.option("dbtable", "schema.tableName")
.option("user", "Userid").option("password", "pswd")
.load().schema
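For context, `.load().schema` returns a Spark `StructType`, so the JDBC driver has already translated the SQL Server types into Spark SQL types; what remains is mapping those Spark type names onto Hive DDL type names. Below is a minimal, self-contained sketch of that mapping (the helper names `toHiveType` and `hiveDDL` are my own, not from any library, and the type table is an assumption covering only the common cases):

```scala
// Hypothetical sketch: map Spark SQL catalog type strings (e.g. the value of
// field.dataType.catalogString for each field in df.schema) to Hive type names.
def toHiveType(sparkType: String): String = sparkType.toLowerCase match {
  case "string"                     => "STRING"
  case "int" | "integer"            => "INT"
  case "bigint" | "long"            => "BIGINT"
  case "smallint" | "short"         => "SMALLINT"
  case "tinyint" | "byte"           => "TINYINT"
  case "float"                      => "FLOAT"
  case "double"                     => "DOUBLE"
  case "boolean"                    => "BOOLEAN"
  case "binary"                     => "BINARY"
  case "date"                       => "DATE"
  case "timestamp"                  => "TIMESTAMP"
  case t if t.startsWith("decimal") => t.toUpperCase // decimal(10,2) -> DECIMAL(10,2)
  case _                            => "STRING"      // conservative fallback
}

// Build a CREATE TABLE statement from (columnName, sparkTypeName) pairs,
// e.g. df.schema.fields.map(f => (f.name, f.dataType.catalogString)).
def hiveDDL(table: String, cols: Seq[(String, String)]): String =
  cols.map { case (name, tpe) => s"`$name` ${toHiveType(tpe)}" }
      .mkString(s"CREATE TABLE IF NOT EXISTS $table (", ", ", ")")
```

For example, `hiveDDL("db.t", Seq(("id", "int"), ("name", "string")))` produces `CREATE TABLE IF NOT EXISTS db.t (`id` INT, `name` STRING)`, which could then be run via `sqlContext.sql(...)`. Types without a clean Hive equivalent fall back to `STRING` here; you may prefer to fail fast instead.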