
I am using Hadoop 1.2.1 and Sqoop 1.4.6. I am using Sqoop to import the table test from the database meshtree into HDFS with this command:

`sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test`

But it shows this error:

17/06/17 18:15:21 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/06/17 18:15:21 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/06/17 18:15:21 INFO tool.CodeGenTool: Beginning code generation
17/06/17 18:15:22 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
17/06/17 18:15:22 INFO orm.CompilationManager: HADOOP_HOME is /home/student/Installations/hadoop-1.2.1/libexec/..
Note: /tmp/sqoop-student/compile/6bab6efaa3dc60e67a50885b26c1d14b/test.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/06/17 18:15:24 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-student/compile/6bab6efaa3dc60e67a50885b26c1d14b/test.java to /home/student/Installations/hadoop-1.2.1/./test.java
org.apache.commons.io.FileExistsException: Destination '/home/student/Installations/hadoop-1.2.1/./test.java' already exists
    at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
    at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:227)
    at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:367)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:453)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
    at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
17/06/17 18:15:24 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-student/compile/6bab6efaa3dc60e67a50885b26c1d14b/test.jar
17/06/17 18:15:24 WARN manager.MySQLManager: It looks like you are importing from mysql.
17/06/17 18:15:24 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
17/06/17 18:15:24 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
17/06/17 18:15:24 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
17/06/17 18:15:24 INFO mapreduce.ImportJobBase: Beginning import of test
17/06/17 18:15:27 INFO mapred.JobClient: Cleaning up the staging area hdfs://localhost:9000/home/student/Installations/hadoop-1.2.1/data/mapred/staging/student/.staging/job_201706171814_0001
17/06/17 18:15:27 ERROR security.UserGroupInformation: PriviledgedActionException as:student cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory test already exists
17/06/17 18:15:27 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory test already exists
    at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:137)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:973)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:550)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:580)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:141)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:201)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:413)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:97)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:380)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:453)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
    at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)

Is there any way to fix this problem?

3 Answers


It’s important that you do not use the URL localhost if you intend to use Sqoop with a distributed Hadoop cluster. The connect string you supply will be used on TaskTracker nodes throughout your MapReduce cluster; if you specify the literal name localhost, each node will connect to a different database (or more likely, no database at all). Instead, you should use the full hostname or IP address of the database host that can be seen by all your remote nodes.

Please see the "Connecting to a Database Server" section of the Sqoop documentation for more information.
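For example, a sketch of the same import with an explicit host (192.168.1.10 and port 3306 are placeholders; substitute the address of your MySQL server as seen from every node in the cluster):

`sqoop import --connect jdbc:mysql://192.168.1.10:3306/meshtree --username user --password password --table test`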


4 Comments

Thank you for your explanation. I read the Sqoop documentation; it was very helpful and the previous error has been resolved. But another error appears. Can you please help me resolve it? The error: 'Exception in thread "main" java.lang.IncompatibleClassChangeError: Found class org.apache.hadoop.mapreduce.JobContext, but interface was expected at org.apache.sqoop.config.ConfigurationHelper.getJobNumMaps(ConfigurationHelper.java:65)'
You are using Hadoop 1.x, but it looks like your Sqoop was compiled against Hadoop 2.x. Please compile Sqoop against Hadoop 1.x or download an older Sqoop build that is compatible with Hadoop 1.x (see the note after these comments).
Thank you for your reply. I tried downloading a lower version of Sqoop (sqoop-1.4.1), but it still doesn't work for me. It shows this error: 'ERROR security.UserGroupInformation: PriviledgedActionException: Connection refused ERROR tool.ImportTool: Encountered IOException running import job: Connection refused'
I edited my original question and reposted the full error. Can you please help me resolve it?
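A note on the version mismatch raised in these comments: you can print your Sqoop build information and then pick a binary built against your Hadoop line. A minimal sketch (the tarball name is an example from the Apache Sqoop 1.4.6 downloads; verify the exact artifact for the version you use):

# Print Sqoop's version and build information
sqoop version

# For Hadoop 1.x, download the Sqoop binary built against Hadoop 1.x, e.g.:
#   sqoop-1.4.6.bin__hadoop-1.0.0.tar.gz
# rather than the hadoop-2.0.4-alpha build.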

You don't have permissions, so contact your MySQL DBA to grant them to you. Or you may do it yourself if you have admin access to MySQL.

grant all privileges on databasename.* to 'username'@'%' identified by 'password';

* = all tables, % = allow connections from any host

The above syntax grants permissions to a user on the MySQL server. In your case it will be:

grant all privileges on meshtree.test to 'root'@'localhost' identified by 'yourpassword';
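To confirm the grant took effect, you can check from the mysql client; a quick sketch, assuming you can connect as an administrative user:

-- list the privileges currently granted to the user
SHOW GRANTS FOR 'root'@'localhost';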

4 Comments

Thank you for your explanation. When I try to import the data again using the sqoop import command, it shows me another error: 'Exception in thread "main" java.lang.IncompatibleClassChangeError: Found class org.apache.hadoop.mapreduce.JobContext, but interface was expected at org.apache.sqoop.config.ConfigurationHelper.getJobNumMaps(ConfigurationHelper.java:65)' Can you please help me resolve this error?
@Christine The IncompatibleClassChangeError indicates that your Sqoop version doesn't match your Hadoop release. Use a Sqoop version that matches the Hadoop release you are running.
@Christine If the problem you asked about in the question is resolved, then accept the answer and post the new issue as a separate question. That would be great!
Thank you for your reply. I used the command you sent, but it shows 'command not found'. So I created a user with GRANT privileges. Then I used a Sqoop version (sqoop-1.4.1) that is compatible with my Hadoop release. But it still doesn't work. It shows this error: 'ERROR security.UserGroupInformation: PriviledgedActionException: Connection refused' 'ERROR tool.ImportTool: Encountered IOException running import job: Connection refused'
  • You are importing without providing an HDFS target directory. When no target directory is provided, Sqoop creates a directory in HDFS named after your MySQL table, so the import can run only once.

So your command

sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test

creates a directory named test in HDFS, which is why rerunning the import fails with 'Output directory test already exists'.

  • Just add a target directory, as in the following command:

sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test --target-dir test1

Hopefully this works fine; see the Sqoop import documentation and the related Sqoop docs for the full set of options.
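If the output directory from the failed run is still in HDFS, remove it before retrying. A sketch, assuming the directory was created as test under your HDFS home directory (Hadoop 1.x shell syntax):

# Delete the leftover output directory from the failed import
hadoop fs -rmr test

Alternatively, recent Sqoop 1.4.x releases (e.g. 1.4.6) document a --delete-target-dir import option that removes the target directory before importing, which makes the job rerunnable:

sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test --target-dir test1 --delete-target-dir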

