Is there a way to import multiple tables into a single HDFS location using Sqoop? I have 5 tables in a DB2 schema (tab1, tab2, tab3, tab4, tab5) and one HDFS location, /user/hive/db2_tables. I want to store the DB2 data as below:
/user/hive/db2_tables/tab1
/user/hive/db2_tables/tab2
/user/hive/db2_tables/tab3
/user/hive/db2_tables/tab4
/user/hive/db2_tables/tab5
But after importing the first table, when I give the same target directory in the 2nd sqoop command, it fails saying the directory already exists. I'm using the sqoop command below:
sqoop import -Dmapreduce.output.basename="tab1" \
  --connect jdbc:db2://xx.x.x.xxx:xxxxx/db2 \
  --query "select * from schemaname.tablename where \$CONDITIONS" \
  --username username -P \
  --split-by id \
  --target-dir /user/hive/db2_tables/ \
  --fields-terminated-by '|'
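One approach is to give each table its own --target-dir under the common parent, since Sqoop refuses to write into a directory that already exists. The sketch below loops over the table names and builds one import command per table; the host, port, schema name, and credentials are the placeholders from the question, and build_cmd only prints the command (swap the echo for an eval, or call sqoop directly, to actually run it). Alternatively, when importing with --table rather than --query, Sqoop's --warehouse-dir option points at a parent directory and creates a subdirectory per table automatically.

```shell
#!/bin/sh
# Sketch: one sqoop import per table, each with its own --target-dir
# so imports into /user/hive/db2_tables don't collide.
TABLES="tab1 tab2 tab3 tab4 tab5"
BASE_DIR=/user/hive/db2_tables

build_cmd() {
  # $1 = table name; prints the sqoop command that would be run.
  # Host/port/schema/credentials are placeholders, not real values.
  echo "sqoop import --connect jdbc:db2://xx.x.x.xxx:xxxxx/db2" \
       "--username username -P --table schemaname.$1 --split-by id" \
       "--target-dir $BASE_DIR/$1 --fields-terminated-by '|'"
}

for tab in $TABLES; do
  build_cmd "$tab"
done
```

Each generated command writes to /user/hive/db2_tables/tab1, .../tab2, and so on, which matches the layout described above.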