
I'm using Spark 2.3.0 and Hadoop 2.9.1, and I'm trying to load a CSV file located in HDFS with Spark:

scala> val dataframe = spark.read.format("com.databricks.spark.csv").option("header","true").schema(schema).load("hdfs://127.0.0.1:50075/filesHDFS/data.csv")

But I get the following error:

2018-11-14 11:47:58 WARN  FileStreamSink:66 - Error while looking for metadata directory.
java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag.; Host Details : local host is: "Desktop-Presario-CQ42-Notebook-PC/127.0.0.1"; destination host is: "localhost":50070;

1 Answer

Instead of `127.0.0.1` with that port, use the default filesystem URI. You can find it in the core-site.xml file under the property `fs.defaultFS`. Ports 50070 and 50075 are the NameNode and DataNode HTTP (web UI) ports in Hadoop 2.x, not the NameNode RPC port, which is why the client gets a protobuf parsing error when it tries to speak the RPC protocol to them.

Pointing the path at the `fs.defaultFS` authority should solve your problem.
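For example, if core-site.xml contains `hdfs://localhost:9000` as the default filesystem (an assumed value here; check your own core-site.xml), the load call would look like this. Note that Spark 2.x has a built-in `csv` source, so `com.databricks.spark.csv` is no longer needed:

```scala
// Assumes fs.defaultFS in core-site.xml is hdfs://localhost:9000 --
// substitute whatever value your cluster actually uses.
val dataframe = spark.read
  .format("csv")                 // built-in CSV source in Spark 2.x
  .option("header", "true")
  .schema(schema)
  .load("hdfs://localhost:9000/filesHDFS/data.csv")
```

If `fs.defaultFS` is already set, you can even drop the scheme and authority entirely and load `"/filesHDFS/data.csv"`, since Hadoop resolves relative and bare paths against the default filesystem.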
