I am using the following classes to create a bean with Spark's Encoders:
public class OuterClass implements Serializable {
    int id;
    ArrayList<InnerClass> listofInner;

    public int getId() {
        return id;
    }
    public void setId(int num) {
        this.id = num;
    }
    public ArrayList<InnerClass> getListofInner() {
        return listofInner;
    }
    public void setListofInner(ArrayList<InnerClass> list) {
        this.listofInner = list;
    }

    public static class InnerClass implements Serializable {
        String streetno;

        public void setStreetno(String streetno) {
            this.streetno = streetno;
        }
        public String getStreetno() {
            return streetno;
        }
    }
}
Encoder<OuterClass> outerClassEncoder = Encoders.bean(OuterClass.class);
Dataset<OuterClass> ds = spark.createDataset(Collections.singletonList(outerclassList), outerClassEncoder);
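For reference, a stripped-down driver that shows how I am calling this looks roughly like the sketch below (the SparkSession setup, the class name BeanEncoderExample, and the sample values are illustrative placeholders, not my exact code):

import java.util.ArrayList;
import java.util.Collections;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoder;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.SparkSession;

public class BeanEncoderExample {
    public static void main(String[] args) {
        // Local SparkSession, only for reproducing the issue
        SparkSession spark = SparkSession.builder()
                .appName("BeanEncoderExample")
                .master("local[*]")
                .getOrCreate();

        // Build one OuterClass instance holding a single InnerClass
        OuterClass.InnerClass inner = new OuterClass.InnerClass();
        inner.setStreetno("42");

        ArrayList<OuterClass.InnerClass> inners = new ArrayList<>();
        inners.add(inner);

        OuterClass outer = new OuterClass();
        outer.setId(1);
        outer.setListofInner(inners);

        // Bean encoder for the outer class and dataset creation;
        // this is the point where the exception surfaces for me
        Encoder<OuterClass> outerClassEncoder = Encoders.bean(OuterClass.class);
        Dataset<OuterClass> ds = spark.createDataset(
                Collections.singletonList(outer), outerClassEncoder);

        ds.show();
        spark.stop();
    }
}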
And I am getting the following error:
Exception in thread "main" java.lang.UnsupportedOperationException: Cannot infer type for class OuterClass$InnerClass because it is not bean-compliant
How can I implement this kind of use case in Spark with Java? It works fine if I remove the inner class, but I need the inner class for my use case.