We are using Spark 3.3.2 with Scala 2.12.15 and JDK 11.0.16.
We don't have any Scala code of our own; the job is pure Python (PySpark) using RDDs.
Still, we are getting the error below in Rancher:
java.io.InvalidClassException: org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages$RetrieveSparkAppConfig$; local class incompatible: stream classdesc serialVersionUID = .....
I have searched many topics on Stack Overflow, but everywhere the answer given is a version incompatibility. All our versions are up to date and we can't modify them.
Please suggest a proper solution. Thanks in advance.
Comments:

- Is `org.apaches` a typo? Should it be `org.apache`?
- "I don't have any scala code" — are you using PySpark? Then you do have Scala code. The class `org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RetrieveSparkAppConfig` is from spark-core.
- Just in case (hardly the reason, but): did you try running with JDK 8 rather than 11?
- `java.io.InvalidClassException: local class incompatible: stream classdesc` occurs because incompatible versions of a dependency are in use (with different `serialVersionUID`s).
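To make the last point concrete, here is a minimal, self-contained sketch of how this exception arises. It serializes a class, then flips one byte of the `serialVersionUID` embedded in the stream, simulating the receiving JVM (e.g. an executor) having loaded the same class from a different jar. The class and method names are illustrative, not from Spark itself.

```java
import java.io.*;
import java.nio.charset.StandardCharsets;

public class UidMismatchDemo {
    // Stand-in for a message class shipped between driver and executor.
    static class Msg implements Serializable {
        private static final long serialVersionUID = 1L;
        int value = 42;
    }

    /** Serializes Msg, corrupts the stream's serialVersionUID, and returns
     *  the message of the resulting InvalidClassException. */
    static String demo() throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(new Msg());
        }
        byte[] bytes = bos.toByteArray();

        // In the Java serialization stream, the 8-byte serialVersionUID
        // immediately follows the UTF-encoded class name in the class
        // descriptor. Flip one of its bytes to mimic a class compiled
        // from a different dependency version on the other side.
        byte[] name = Msg.class.getName().getBytes(StandardCharsets.UTF_8);
        int i = indexOf(bytes, name);
        bytes[i + name.length] ^= 0x01;

        try (ObjectInputStream ois =
                 new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            ois.readObject();
            return "no exception";
        } catch (InvalidClassException e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("Caught: " + demo());
    }

    // Naive byte-array substring search.
    static int indexOf(byte[] haystack, byte[] needle) {
        outer:
        for (int i = 0; i <= haystack.length - needle.length; i++) {
            for (int j = 0; j < needle.length; j++)
                if (haystack[i + j] != needle[j]) continue outer;
            return i;
        }
        return -1;
    }
}
```

The printed message has the same shape as the one in the question ("local class incompatible: stream classdesc serialVersionUID = ..."). In a real cluster the fix is to make sure every node (driver, executors, and any Spark jars baked into the container image) runs the exact same Spark build, since even same-version jars from different distributions can compute different UIDs.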