I am trying to convert my Java Spark code to Scala, but I am finding it complicated. Is it possible to convert the following Java code to Scala? Thanks!
JavaPairRDD<String, Tuple2<String, String>> newDataPair =
    newRecords.mapToPair(new PairFunction<String, String, Tuple2<String, String>>() {
        private static final long serialVersionUID = 1L;

        @Override
        public Tuple2<String, Tuple2<String, String>> call(String t) throws Exception {
            MyPerson p = new Gson().fromJson(t, MyPerson.class);
            String nameAgeKey = p.getName() + "_" + p.getAge();
            Tuple2<String, String> value = new Tuple2<String, String>(p.getNationality(), t);
            Tuple2<String, Tuple2<String, String>> kvp =
                new Tuple2<String, Tuple2<String, String>>(nameAgeKey.toLowerCase(), value);
            return kvp;
        }
    });
I tried the following, but I am sure I have missed many things. In particular, it is not clear to me how to override the call function in Scala. Please suggest or share some examples. Thank you!
val newDataPair = newRecords.mapToPair(new PairFunction<String, String, Tuple2<String, String>>() {
    @Override
    public val call(String t) throws Exception {
        val p = (new Gson()).fromJson(t, MyPerson.class);
        val nameAgeKey = p.getName() + "_" + p.getAge();
        val value = new Tuple2<String, String>(p.getNationality(), t);
        val kvp = new Tuple2<String, Tuple2<String, String>>(nameAgeKey.toLowerCase(), value);
        return kvp;
    }
});
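From what I have read, Scala should not need mapToPair or an anonymous PairFunction at all: calling map on an RDD[String] and returning a Scala tuple already yields an RDD of pairs, and key-based operations become available through Spark's implicits. Below is a minimal, self-contained sketch of that shape that I could run locally. It uses a plain List[String] in place of the RDD and a trivial hand-written parse function in place of Gson (both are stand-ins for this sketch, not the real API; real code would call new Gson().fromJson(t, classOf[MyPerson]) on an actual RDD), just to check that the tuple-building logic is right:

```scala
// Stand-in for the Gson-deserialized MyPerson bean.
case class MyPerson(name: String, age: Int, nationality: String)

// Trivial stand-in parser for this sketch only; real code would use
// new Gson().fromJson(t, classOf[MyPerson]) on the JSON string.
def parse(t: String): MyPerson = {
  val fields = t.split(",").map(_.trim)
  MyPerson(fields(0), fields(1).toInt, fields(2))
}

// Plain List stands in for the RDD; on a real RDD[String] the same
// `map` call compiles unchanged, because Spark's Scala API treats an
// RDD of 2-tuples as a pair RDD automatically.
val newRecords = List("Alice, 30, French", "Bob, 25, German")

val newDataPair: List[(String, (String, String))] = newRecords.map { t =>
  val p = parse(t)
  val nameAgeKey = p.name + "_" + p.age
  (nameAgeKey.toLowerCase, (p.nationality, t))
}

newDataPair.foreach(println)
```

If this is right, the whole Java anonymous class collapses to the body of the map closure, and there is nothing to override, but I would appreciate confirmation or corrections.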