
I want to add the HBase classpath to my Spark setup, but I get an error when I run the hbase classpath command.

I have Hadoop 3.2.0 set up locally with Java 1.8 in the environment.

$ hbase classpath

/usr/lib/hadoop/libexec/hadoop-functions.sh: line 2364: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_USER: invalid variable name
/usr/lib/hadoop/libexec/hadoop-functions.sh: line 2459: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_OPTS: invalid variable name
Error: Could not find or load main class org.apache.hadoop.hbase.util.GetJavaProperty
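For context, the end goal is roughly something like the sketch below; the application class and jar name are just placeholders, and --driver-class-path / spark.executor.extraClassPath are the standard spark-submit options for adding jars to the classpath.

# Capture the HBase classpath (once `hbase classpath` succeeds)
HBASE_CP="$(hbase classpath)"

# Pass it to both the driver and the executors.
# com.example.MyApp and my-app.jar are placeholders for my application.
spark-submit \
  --class com.example.MyApp \
  --driver-class-path "$HBASE_CP" \
  --conf spark.executor.extraClassPath="$HBASE_CP" \
  my-app.jar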

1 Answer


Obviously an old question, but your configuration may be wrong.

This is potentially caused by insufficient privileges. You may want to prefix the command with sudo to troubleshoot. In my case this confirmed that I had a privilege issue when the command in hadoop-functions.sh was being executed.
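If you want to check whether privileges are really the problem, a quick sketch along these lines may help; the script path is the one from your error message, and the last two commands just confirm which Java and Hadoop are being picked up.

# Re-run with elevated privileges to see if the error goes away
sudo hbase classpath

# Check that the current user can read the script the error points at
ls -l /usr/lib/hadoop/libexec/hadoop-functions.sh

# Confirm which Java and Hadoop the hbase script will see
echo "$JAVA_HOME"
hadoop version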
