In my system I have a MySQL database. I want to import it into the Hadoop file system (HDFS). I found something about Sqoop, but I can't figure out the command to do that.
4 Answers
You need to install the MySQL JDBC/Java connector and link it where Sqoop can find it, then run the sqoop command:
sudo yum install mysql-connector-java
ln -s /usr/share/java/mysql-connector-java.jar /var/lib/sqoop/mysql-connector-java.jar
You can find an example sqoop command in Save data into mysql from hive hadoop through sqoop?
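To check that Sqoop can actually see the driver before importing, a quick sanity test is to list the databases (the host and username below are placeholders for your setup; -P prompts for the password):
sqoop list-databases --connect jdbc:mysql://localhost --username root -P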
1) Install and configure MySQL first, then create a database in MySQL.
2) sqoop import --connect jdbc:mysql://localhost/databasename --username $USER_NAME --password $PASSWORD --table tablename -m 1 will import the data.
e.g.
sqoop import --connect jdbc:mysql://localhost/testDb --username root --password hadoop123 --table student -m 1
In the above command, the parameter values are database: ‘testDb’, username: ‘root’, password: ‘hadoop123’, and table: ‘student’.
Have a look at article 1 and article 2 for a better step-by-step understanding.
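By default Sqoop writes the imported rows under the current user's HDFS home directory, in a directory named after the table. Assuming the example above was run as the HDFS user hadoop, the result can be inspected like this (part-m-00000 is the output of the single map task requested with -m 1):
hadoop fs -ls /user/hadoop/student
hadoop fs -cat /user/hadoop/student/part-m-00000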
There are multiple ways to achieve this.
The old way is to use Sqoop.
Another way is to use a shell script:
a. Connect to MySQL (mysql -h<host> -u<username> -p<password>)
b. Open connectivity with HDFS (hadoop fs)
c. Run a SELECT on the table and put the result into HDFS (see the sketch after this list)
The recommended way is to use Apache NiFi:
a. Use the ExecuteSQL and PutHDFS processors
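For the shell-script route, a minimal sketch could look like the following (the host, credentials, database, table, and HDFS path are all placeholders; mysql --batch emits tab-separated rows, and hadoop fs -put - reads from stdin):
#!/bin/bash
# Dump the table as TSV and stream it straight into HDFS
mysql --batch --skip-column-names -h dbhost -u myuser -pmypass mydb \
  -e 'SELECT * FROM student' | hadoop fs -put - /user/hadoop/student.tsv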