
Using the code below, I am getting tweets for a particular filter:

val topCounts60 = tweetMap.map((_, 1))
  .reduceByKeyAndWindow(_ + _, Seconds(60 * 60))

A sample element of topCounts60 looks like the following when I print it:

(Map(UserLang -> en, UserName -> Harmeet Singh, UserScreenName -> harmeetsingh060, HashTags -> , UserVerification -> false, Spam -> true, UserFollowersCount -> 44, UserLocation -> भारत, UserStatusCount -> 50, UserCreated -> 2016-07-04T06:32:49.000+0530, UserDescription -> Punjabi Music, TextLength -> 118, Text -> RT @PMOIndia: The Prime Minister is chairing a high level meeting on the situation in Kashmir, UserFollowersRatio -> 0.32116788625717163, UserFavouritesCount -> 67, UserFriendsCount -> 137, StatusCreatedAt -> 2016-07-12T21:07:30.000+0530, UserID -> 749770405867556865),1)

Now I am trying to print each key-value pair like below:

for ((k,v) <- topCounts60) printf("key: %s, value: %s\n", k, v)

I am getting the compile error below:

Error:(125, 10) constructor cannot be instantiated to expected type;
found   : (T1, T2)
required:     org.apache.spark.rdd.RDD[(scala.collection.immutable.Map[String,Any], Int)]
for ((k,v) <- topCounts60) printf("key: %s, value: %s\n", k, v)

How can I get output like below:

UserLang -> en,

UserName -> Harmeet Singh

I am a beginner in Scala and have no clue how to print all the values separately; please help me with this.

2 Answers


Use foreach and string interpolation:

rdd.collect().foreach { case (k, v) => println(s"key: $k, value: $v") }

1 Comment

@AkshayAnand ah, you used a for comprehension on an RDD type :) That's not supported.

Try

topCounts60.foreachRDD { rdd =>
  for ((k, v) <- rdd.collect) printf("key: %s, value: %s\n", k, v)
}
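The same loop can be extended with a second generator to unpack the inner Map as well, one key per line, which matches the output format the question asks for. A minimal sketch, using a plain Scala Seq in place of the collected RDD rows:

```scala
// Stand-in for rdd.collect: each row is a (Map of tweet fields, count) pair,
// the same shape as the elements of topCounts60.
val rows: Seq[(Map[String, Any], Int)] = Seq(
  (Map("UserLang" -> "en", "UserName" -> "Harmeet Singh"), 1)
)

// Nested generators: the outer one walks the rows,
// the inner one walks each Map's entries.
for ((fields, count) <- rows; (key, value) <- fields)
  println(s"$key -> $value")
```

Inside foreachRDD the same nested for comprehension works on rdd.collect directly, since the collected array is an ordinary Scala collection.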

1 Comment

It's coming in the format below: key: Map(UserLang -> en, UserName -> Deepesh Kumar, UserScreenName -> afg5859, HashTags -> , UserVerification -> false, Spam -> true, UserFollowersCount -> 49, UserLocation -> Planet Earth, UserStatusCount -> 640, UserCreated -> 2016-01-04T14:51:59.000+0530, UserDescription -> I Practice, TextLength -> 110, Text -> @AnupamPkher exceptions are not the rule, the support for terrorism in Kashmir is appalling., UserFollowersRatio -> 0.19521912932395935), value: 1. Can't I break it down further into individual fields? I want to use a case class and store the data with Spark SQL.
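To break the Map into individual fields for Spark SQL, one option is to convert each row into a case class first. A minimal sketch; the TweetRow class and the toRow helper are hypothetical names, and only three of the keys from the output above are covered:

```scala
// Hypothetical case class holding a few of the tweet fields; extend it with
// whichever keys from the Map you actually need.
case class TweetRow(userLang: String, userName: String, userFollowersCount: Int)

// Convert one Map of tweet fields into a TweetRow. The Map's values are typed
// Any, so each field is coerced explicitly.
def toRow(fields: Map[String, Any]): TweetRow = TweetRow(
  userLang = fields("UserLang").toString,
  userName = fields("UserName").toString,
  userFollowersCount = fields("UserFollowersCount").toString.toInt
)

println(toRow(Map(
  "UserLang" -> "en",
  "UserName" -> "Harmeet Singh",
  "UserFollowersCount" -> 44
)))
```

Inside foreachRDD you could then map each element with something like rdd.map { case (m, _) => toRow(m) } and, after importing the SQL implicits for your session, call toDF() on the result to query it with Spark SQL.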
