I have 9 million records in SQL Server that I want to export to CSV files so I can load the data into MongoDB. I have written Java code for the SQL-to-CSV export, but I have two issues:
- If I read all the rows into a list and then write them to the CSV, I get an OutOfMemoryError.
- If I read and write one row at a time, the export takes a very long time.
My code looks something like this:
List<SubReportsBean> list = new ArrayList<>();
Connection conn = null;
try {
    Class.forName(driver).newInstance();
    conn = DriverManager.getConnection(url, databaseUserName, databasePassword);
    PreparedStatement stmt = conn.prepareStatement("select OptimisationId from SubReports");
    stmt.setFetchSize(1000); // set on the statement, before executing
    ResultSet result = stmt.executeQuery();
    while (result.next()) {
        SubReportsBean bean = new SubReportsBean();
        bean.setOptimisationId(result.getLong("OptimisationId"));
        list.add(bean);
        generateExcel(list); // writing on every row: very slow
    }
    // generateExcel(list); // writing once at the end: OutOfMemoryError
    conn.close();
} catch (Exception e) {
    e.printStackTrace();
}
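One direction I am considering is streaming each row straight to a buffered writer instead of accumulating beans in a list, so that memory stays flat and I/O is batched. A minimal sketch of that idea (class name `CsvExportSketch` and helper `writeRow` are my own placeholder names; the JDBC part is shown as comments, with a plain loop standing in for `result.next()`):

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.Writer;
import java.nio.file.Files;
import java.nio.file.Path;

public class CsvExportSketch {

    // Append one CSV row; fields containing commas, quotes, or newlines get quoted.
    static void writeRow(Writer out, String... fields) throws IOException {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < fields.length; i++) {
            if (i > 0) sb.append(',');
            String f = fields[i];
            if (f.contains(",") || f.contains("\"") || f.contains("\n")) {
                sb.append('"').append(f.replace("\"", "\"\"")).append('"');
            } else {
                sb.append(f);
            }
        }
        sb.append('\n');
        out.write(sb.toString());
    }

    public static void main(String[] args) throws Exception {
        Path out = Path.of("subreports.csv");
        // In the real export, the loop below would be the JDBC ResultSet loop:
        //   stmt.setFetchSize(1000);  // stream batches from SQL Server
        //   while (result.next()) {
        //       writeRow(writer, String.valueOf(result.getLong("OptimisationId")));
        //   }
        try (BufferedWriter writer = Files.newBufferedWriter(out)) {
            writeRow(writer, "OptimisationId");  // header line
            for (long id = 1; id <= 5; id++) {   // stand-in for result.next()
                writeRow(writer, String.valueOf(id));
            }
        }
    }
}
```

The point is that `BufferedWriter` batches the small per-row writes into large ones, so writing row-by-row no longer means one disk write per row, and no row is ever held in memory after it is written.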
Is there a faster approach to export all the data? Even better, can it be exported directly into MongoDB instead of going through CSV?
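I have also wondered whether this needs Java at all: SQL Server's `bcp` utility can dump a query to a flat file, and `mongoimport` can load that file into a collection. A rough sketch, assuming a local SQL Server instance with a database named `MyDb` and a target Mongo collection `subreports` (server, credentials, and names are all placeholders):

```shell
# Export the query result to a comma-separated file
# (-c = character mode, -t, = comma field terminator).
bcp "select OptimisationId from MyDb.dbo.SubReports" queryout subreports.csv -c -t, -S localhost -U user -P pass

# Load the CSV straight into MongoDB; bcp emits no header line,
# so the field names are given explicitly with --fields.
mongoimport --db MyDb --collection subreports --type csv --fields OptimisationId --file subreports.csv
```

Would something like this be expected to outperform a hand-written JDBC exporter for 9 million rows?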