How long should it take to insert about 500,000 records from a CSV file into a MySQL database using Java code? The database is hosted on localhost.
Table structure: auto-increment id | varchar(8) | datetime | int | varchar(2). My code currently needs about 40 minutes to insert 70,000 records. Is there any way to do it faster?
Here is the main part of my code:
CsvReader products = new CsvReader(path);
products.readHeaders();

String insertString = "INSERT INTO table (T_V1, date, T_V2, T_V3) VALUES (?,?,?,?)";
// Create the PreparedStatement once, outside the loop.
PreparedStatement preparedStatement = con.prepareStatement(insertString);

while (products.readRecord()) {
    v1 = products.get("V1");
    date = format.parse(products.get("Date") + " " + products.get("Hour"));
    // The column is a datetime, so use Timestamp; java.sql.Date would drop the time of day.
    java.sql.Timestamp dateDB = new java.sql.Timestamp(date.getTime());
    v2 = products.get("V2");
    v3 = products.get("V3");
    preparedStatement.setString(1, v1);
    preparedStatement.setTimestamp(2, dateDB);
    preparedStatement.setInt(3, Integer.parseInt(v2));
    preparedStatement.setString(4, v3);
    preparedStatement.executeUpdate();
}
Following your advice, I moved creation of the statement out of the loop. Now I get 33 records per second, where before I had 29.
I had a similar task, inserting data from pcap files into a DB. Partitioning the data into chunks and running the inserts in multiple threads was way faster. I don't know how well that translates to MySQL; I used an Oracle DB and had producer threads building table types (each representing a set of DB records) and consumer threads opening DB transactions and calling a stored procedure with those table-type parameters to insert the data. But generally, if you already have CSV files, the fastest way is to let MySQL load them directly, using its SQL*Loader equivalent, LOAD DATA INFILE: dev.mysql.com/doc/refman/5.1/en/load-data.html

Additionally: create the PreparedStatement only once, outside the loop, and consider using batch inserts (preparedStatement.addBatch()) as suggested in this post. Sketches of these approaches follow.
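For illustration, here is how the loop from the question could be batched. This is a minimal sketch, assuming the same con, format, and products objects as above; the batch size of 1000 and the rewriteBatchedStatements URL property (specific to MySQL Connector/J) are assumptions to tune and verify for your driver.

// Assumption: the connection URL enables batch rewriting, e.g.
// jdbc:mysql://localhost/mydb?rewriteBatchedStatements=true
String insertString = "INSERT INTO table (T_V1, date, T_V2, T_V3) VALUES (?,?,?,?)";
PreparedStatement preparedStatement = con.prepareStatement(insertString);

con.setAutoCommit(false);   // commit once per batch instead of once per row
int batchSize = 1000;       // hypothetical value; tune for your setup
int count = 0;

while (products.readRecord()) {
    java.util.Date parsed = format.parse(products.get("Date") + " " + products.get("Hour"));
    preparedStatement.setString(1, products.get("V1"));
    preparedStatement.setTimestamp(2, new java.sql.Timestamp(parsed.getTime()));
    preparedStatement.setInt(3, Integer.parseInt(products.get("V2")));
    preparedStatement.setString(4, products.get("V3"));
    preparedStatement.addBatch();               // queue the row instead of executing it

    if (++count % batchSize == 0) {
        preparedStatement.executeBatch();       // send a full batch to the server
        con.commit();
    }
}
preparedStatement.executeBatch();               // flush any remaining rows
con.commit();
con.setAutoCommit(true);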
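If the CSV can be handed to the server as-is, LOAD DATA can also be issued straight from JDBC. A minimal sketch, assuming the file's columns arrive in the order V1, Date, Hour, V2, V3; the file path, the allowLoadLocalInfile URL property (needed by MySQL Connector/J for the LOCAL variant), and the STR_TO_DATE format string are all assumptions to adapt.

// Hypothetical path and date format; the header row is skipped with IGNORE 1 LINES.
Statement stmt = con.createStatement();
stmt.execute(
    "LOAD DATA LOCAL INFILE '/path/to/products.csv' " +
    "INTO TABLE table " +
    "FIELDS TERMINATED BY ',' " +
    "IGNORE 1 LINES " +
    "(T_V1, @d, @h, T_V2, T_V3) " +                                  // map CSV columns, capture date parts
    "SET date = STR_TO_DATE(CONCAT(@d, ' ', @h), '%Y-%m-%d %H')");   // combine Date + Hour
stmt.close();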
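Finally, a rough sketch of the chunk-and-thread idea for JDBC, since a Connection must not be shared between threads: give each worker its own connection and its own slice of rows. The four-thread pool, the URL and credentials, and the partition and insertChunk helpers are all hypothetical; insertChunk would run the batched loop shown above over its chunk.

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

ExecutorService pool = Executors.newFixedThreadPool(4);        // hypothetical thread count
List<List<String[]>> chunks = partition(allRows, 10000);       // hypothetical helper: split parsed CSV rows

for (List<String[]> chunk : chunks) {
    pool.submit(() -> {
        // One connection per worker (URL and credentials are placeholders).
        try (Connection c = DriverManager.getConnection(
                "jdbc:mysql://localhost/mydb?rewriteBatchedStatements=true", "user", "pass")) {
            insertChunk(c, chunk);   // hypothetical: the batched insert loop from above
        } catch (Exception e) {
            e.printStackTrace();
        }
    });
}
pool.shutdown();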