I'm trying to save the contents of a text file in HDFS with Spark:
import org.apache.spark.{SparkConf, SparkContext}

object FormatTlfHdfs {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Clean data")
      .setMaster("local").setSparkHome("/usr/lib/spark")
    val sc = new SparkContext(conf)

    // Read the file from HDFS and split it into distinct tokens
    val vertices = sc.textFile("hdfs:///user/cloudera/dstlf.txt")
      .flatMap(line => line.split("\\s+"))
      .distinct()
  }
}
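For context, the step I eventually want to reach, inside main after the code above, is writing the cleaned result back to HDFS, roughly like this (the output directory is just a placeholder I made up):

// Sketch of the intended save step; the output path is a placeholder
vertices.saveAsTextFile("hdfs:///user/cloudera/dstlf-clean")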
I'm getting this error:
Exception in thread "main" java.io.IOException: Incomplete HDFS URI, no host: hdfs:///user/cloudera/metadata-lookup-tlf
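From what I can tell, the message means the URI needs an explicit NameNode host. A fully qualified version would presumably look like the sketch below (quickstart.cloudera and port 8020 are just the Cloudera quickstart VM defaults, I have not confirmed them on my setup):

// Sketch assuming the quickstart VM's default NameNode host and port
val vertices = sc.textFile("hdfs://quickstart.cloudera:8020/user/cloudera/dstlf.txt")
  .flatMap(line => line.split("\\s+"))
  .distinct()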
Running hdfs dfs -ls shows that the file is there:
[cloudera@quickstart grafoTelefonos]$ hdfs dfs -ls /user/cloudera
Found 6 items
drwx------ - cloudera cloudera 0 2016-02-04 18:37 /user/cloudera/.Trash
drwxr-xr-x - cloudera cloudera 0 2016-05-02 13:38 /user/cloudera/.sparkStaging
-rw-r--r-- 1 cloudera cloudera 1294 2016-05-02 13:34 /user/cloudera/dstlf.txt
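To narrow this down I was also planning to print which default filesystem Spark actually resolves, just as a diagnostic sketch run inside the same SparkContext:

// Diagnostic: show the fs.defaultFS value seen by Spark's Hadoop configuration
println(sc.hadoopConfiguration.get("fs.defaultFS"))

Why does Spark complain about a missing host when the file clearly exists under /user/cloudera?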