I have the following RDD:
1:AAAAABAAAAABAAAAABAAAAAB
2:BBAAAAAAAAAABBAAAAAAAAAA
Every character is an event. My desired output is the number of occurrences per group of events. For this first example, the output should be:
{ "A" -> 6 , "B" -> 6 }
This is the code I have so far:
val rdd = sqlContext.sparkContext.makeRDD(Seq(
  "1:AAAAABAAAAABAAAAABAAAAAB", "2:BBAAAAAAAAAABBAAAAAAAAAA"))
val rddSplited = rdd.map(_.split(":")(1).toList)
val values = scala.collection.mutable.Map[String, Long]()
for (occurrences <- rddSplited) {
  var previousVal = "0"
  for (listValues <- occurrences) {
    if (listValues.toString != previousVal) {
      values.get(listValues.toString) match {
        case Some(e) => values.update(listValues.toString, e + 1)
        case None    => values.put(listValues.toString, 1)
      }
      previousVal = listValues.toString
    }
  }
  //println(values) // prints the accumulated values
}
println(values) // prints an empty Map
The problem is that the final println(values) prints an empty Map, while the commented-out println inside the loop does print the accumulated values.
How can I return the final values of the map after the main for loop?
Sorry if my implementation is not the best one; I'm new to this Scala/Spark world.
Thanks in advance.
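As I understand it (please correct me if I'm wrong), the for loop over rddSplited is really rddSplited.foreach(...), which Spark runs on the executors; each executor receives its own serialized copy of values, so the map on the driver is never updated. A rough local analogy of that behavior (runOnExecutor is a made-up helper for illustration, not part of the Spark API):

```scala
import scala.collection.mutable

val values = mutable.Map[String, Long]()

// Made-up stand-in for what Spark does with a closure: serialize it,
// ship it to an executor, and run it against the executor's own copy.
def runOnExecutor(body: mutable.Map[String, Long] => Unit): Unit = {
  val executorCopy = values.clone()
  body(executorCopy)
}

runOnExecutor { m => m.put("A", 1L) }
println(values) // values is still empty: only the copy was mutated
```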
I'm editing the question to explain better what I'm trying to achieve. The code provided in the answers (thanks for all your help) does not return the desired output. I'm not trying to count the total number of events; what I need is to count the number of occurrences where an event changes to another one, i.e.:
AAAAABAAAAABAAAAABAAAAAB => A -> 4, B -> 4
BBAAAAAAAAAABBAAAAAAAAAA => A -> 2, B -> 2
So the final output should be A -> 6, B -> 6
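In case it helps, the counting I'm after can be expressed as a pure per-line helper: a run starts at index 0 or wherever the character differs from the previous one. This is just a sketch of one possible approach (runStarts is my own name), not necessarily the best one:

```scala
// Count run starts per character: a run starts at index 0
// or wherever the character differs from its predecessor.
def runStarts(s: String): Map[String, Long] =
  s.zipWithIndex
    .collect { case (c, i) if i == 0 || c != s(i - 1) => c.toString }
    .groupBy(identity)
    .map { case (k, v) => k -> v.size.toLong }
```

On the RDD this could then be combined as rdd.map(_.split(":")(1)).flatMap(runStarts(_).toSeq).reduceByKey(_ + _).collectAsMap(), which sums the per-line counts on the executors and avoids mutating a driver-side map from inside a closure.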
I'm really sorry for the misunderstanding.