In my system I have a MySQL database. I want to import it into the Hadoop file system (HDFS). I found something about Sqoop, but I can't figure out the command to do that.

4 Answers

sqoop import --connect jdbc:mysql://mysql-server-name/db_name --username user --password password --table table_name --target-dir target_directory_name -m 1
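Once the import finishes, a quick sanity check is to list the target directory in HDFS; with -m 1 Sqoop writes a single part file (the directory name below is the same placeholder used in the command above):

hadoop fs -ls target_directory_name
hadoop fs -cat target_directory_name/part-m-00000 | head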

Hope it helps.

You need to install the MySQL JDBC/Java connector and then run the Sqoop command.

sudo yum install mysql-connector-java
ln -s /usr/share/java/mysql-connector-java.jar /var/lib/sqoop/mysql-connector-java.jar
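
Once the connector is in place, one way to verify that Sqoop can actually reach MySQL is sqoop list-tables (the server, credentials, and database name below are placeholders):

sqoop list-tables --connect jdbc:mysql://mysql-server-name/db_name --username user --password password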

You can find example Sqoop commands in the related question "Save data into mysql from hive hadoop through sqoop?".

1) Install and configure MySQL first, and create a database in MySQL.

2) The command sqoop import --connect jdbc:mysql://localhost/databasename --username $USER_NAME --password $PASSWORD --table tablename -m 1 will import the data.

e.g.

sqoop import --connect jdbc:mysql://localhost/testDb --username root --password hadoop123 --table student -m 1

In the above command, the parameter values are: database 'testDb', username 'root', password 'hadoop123', and table 'student'.
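
Since no --target-dir is given, Sqoop writes under the current user's HDFS home directory by default, and with a single mapper the output is one part file. A quick way to inspect the result, assuming those defaults:

hadoop fs -ls /user/$USER/student
hadoop fs -cat /user/$USER/student/part-m-00000 | head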

Have a look at article 1 and article 2 for a step-by-step walkthrough.

There are multiple ways to achieve this:

  1. The old way is to use Sqoop.

  2. Another way is to use a shell script (a minimal sketch follows this list):

    a. Connect to MySQL (mysql -h<host> -u<username> -p<password>)

    b. Open connectivity with HDFS (hadoop fs)

    c. Run a SELECT on the table and put the result into HDFS

  3. The recommended way is to use Apache NiFi:

    a. Use the ExecuteSQL and PutHDFS processors
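
A minimal sketch of the shell-script approach from option 2. The host, credentials, database, and table names are all placeholders; mysql -B -N prints tab-separated rows without a header, which lands in HDFS as a plain TSV file.

#!/bin/bash
# All values below are placeholders -- substitute your own.
HOST=mysql-server-name
USER=user
PASS=password
DB=db_name
TABLE=table_name
HDFS_DIR=/user/$(whoami)/$TABLE

# Dump the table as tab-separated values (-B batch mode, -N skips the header row).
mysql -h"$HOST" -u"$USER" -p"$PASS" -B -N -e "SELECT * FROM $TABLE" "$DB" > "/tmp/$TABLE.tsv"

# Copy the dump into HDFS (-f overwrites any existing file).
hadoop fs -mkdir -p "$HDFS_DIR"
hadoop fs -put -f "/tmp/$TABLE.tsv" "$HDFS_DIR/"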
