

I am trying to read data from a CSV file and insert it into the DB using batch insertion, but it is throwing multiple exceptions:

    java.sql.BatchUpdateException: A statement attempted to return a result set in executeBatch().
    SEVERE: Servlet.service() for servlet oms threw exception
    java.lang.OutOfMemoryError: Java heap space …

The insert code is given below:

public long updateOpenOrdData(Connection conn, String[] paramStrObj) throws Exception {
    long updatedRow = 0;
    CallableStatement cstmt = null;
    // System.out.println("StrAtt length :" + strArr.length);
    try {
        for (int i = 0; i < paramStrObj.length; i++) {
            int count = 1;
            int index = 0;
            String[] dataArr = paramStrObj[i].split(",(?=([^\"]*\"[^\"]*\")*[^\"]*$)", -1);
            if (!dataArr[0].equals("SLC_Code_Desc") && dataArr.length >= 24) {
                cstmt = conn.prepareCall(PROC_INSERT_OPEN_ORD_TEMP);
                System.out.print(dataArr[index] + ", ");
                cstmt.setString(count++, dataArr[index++]);
                System.out.print(dataArr[index] + ", ");
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index++]);
                cstmt.setString(count++, dataArr[index]);
                cstmt.setString(count++, null);
                cstmt.setString(count++, null);
                cstmt.setString(count++, null);
                cstmt.addBatch();
                //cstmt.executeUpdate();
            }
        }
        int[] insertRow = cstmt.executeBatch();
        System.out.println("Inserted row are :: " + insertRow);
    } catch (Exception e) {
        //System.out.println("Wrong data at line " + i + " and column " + --index);
        e.printStackTrace();
    } finally {
        new DBService().releaseResources(null, cstmt);
    }

    return updatedRow;
}

I need to insert up to 100 GB of data. With this code, only one row is inserted each time.

I had a very similar issue a while back. To make a long story short, I found it much easier to perform some pre-processing on the file to ensure it was valid. Once I could confirm that, I found it MUCH quicker to import into a temporary table, by which I mean an actual SQL table I created for this express purpose in a different schema. I used .NET and SqlBulkCopy, but you could look at BULK INSERT or something similar. From memory, there are loads of CSV readers for Java.
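As an illustration of that pre-processing idea, here is a minimal sketch in Java that filters out invalid lines before anything touches the database. The file names are made up, and the column check simply mirrors the dataArr.length >= 24 test in the question:

    import java.io.BufferedReader;
    import java.io.BufferedWriter;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class CsvPreValidator {

        // Same quote-aware split pattern as in the question.
        private static final String SPLIT_REGEX = ",(?=([^\"]*\"[^\"]*\")*[^\"]*$)";
        private static final int EXPECTED_COLUMNS = 24; // mirrors the dataArr.length >= 24 check

        public static void main(String[] args) throws Exception {
            // Illustrative file names - adjust to your environment.
            try (BufferedReader in = Files.newBufferedReader(Paths.get("open_orders.csv"), StandardCharsets.UTF_8);
                 BufferedWriter out = Files.newBufferedWriter(Paths.get("open_orders_clean.csv"), StandardCharsets.UTF_8)) {
                String line;
                long lineNo = 0, rejected = 0;
                while ((line = in.readLine()) != null) {
                    lineNo++;
                    String[] cols = line.split(SPLIT_REGEX, -1);
                    // Skip the header row and any line with too few columns.
                    if (cols[0].equals("SLC_Code_Desc") || cols.length < EXPECTED_COLUMNS) {
                        rejected++;
                        continue;
                    }
                    out.write(line);
                    out.newLine();
                }
                System.out.println("Read " + lineNo + " lines, rejected " + rejected);
            }
            // The cleaned file can then be handed to BULK INSERT, LOAD DATA, mysqlimport, etc.
        }
    }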

3 Answers


You're facing an OutOfMemoryError - which is not surprising if you try to stuff all your INSERTs into one batch.

Set up a reasonable batch size (10,000 in this example) and do it like this:

    cstmt.addBatch();
    if (++batchCounter % 10000 == 0) {
        cstmt.executeBatch();
    }
}
cstmt.executeBatch(); // one final time for the remaining rows

Just the same, you should not call your method with the full array of lines read from the CSV - so maybe you already have the batch size implemented at that level?

Likewise, a batched statement is not supposed to return result sets, so instead of calling a procedure which returns a result, you're well advised to do this as "classical" INSERT operations.
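For illustration, a minimal sketch of that combination: a plain INSERT into a hypothetical OPEN_ORD_TEMP table (table and column names are made up, and only the first three CSV columns are shown) together with the chunked executeBatch() from above:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    public class OpenOrderBatchInsert {

        // Hypothetical table and columns - replace with the real OPEN_ORD_TEMP definition.
        private static final String INSERT_SQL =
                "INSERT INTO OPEN_ORD_TEMP (COL1, COL2, COL3) VALUES (?, ?, ?)";

        public long insertOpenOrdData(Connection conn, String[] csvLines) throws SQLException {
            long queued = 0;
            int batchCounter = 0;
            conn.setAutoCommit(false);
            try (PreparedStatement pstmt = conn.prepareStatement(INSERT_SQL)) {
                for (String line : csvLines) {
                    String[] dataArr = line.split(",(?=([^\"]*\"[^\"]*\")*[^\"]*$)", -1);
                    if (dataArr[0].equals("SLC_Code_Desc") || dataArr.length < 3) {
                        continue; // skip the header row and malformed lines
                    }
                    pstmt.setString(1, dataArr[0]);
                    pstmt.setString(2, dataArr[1]);
                    pstmt.setString(3, dataArr[2]);
                    pstmt.addBatch();
                    queued++;

                    if (++batchCounter % 10000 == 0) {
                        pstmt.executeBatch(); // flush a chunk so the batch never grows unbounded
                        conn.commit();
                    }
                }
                pstmt.executeBatch(); // one final time for the remaining rows
                conn.commit();
            }
            return queued;
        }
    }

With a plain INSERT there is nothing for executeBatch() to return as a result set, which is exactly what the BatchUpdateException above complained about.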


2 Comments

I am reading the data in chunks using: 'ByteBuffer buffer = ByteBuffer.allocate(512); StringBuilder sb = new StringBuilder();'
How do you get full lines that way? Can you share the code calling the method you provided?

For this kind of work you can use the MySQL tool mysqlimport. You'll get better performance and won't have to deal with Java limitations.

Edit: You can also use an ETL tool like Talend or SQL Server Integration Services to do this, using a bulk insert component.
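If the target database really is MySQL, the same bulk path can also be triggered from Java. A minimal sketch, assuming MySQL Connector/J with local-infile enabled on both the client (allowLoadLocalInfile=true) and the server, and with made-up file, table and credential names:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class LoadDataExample {
        public static void main(String[] args) throws Exception {
            // allowLoadLocalInfile=true is needed on the client; the server must also allow local_infile.
            String url = "jdbc:mysql://localhost:3306/oms?allowLoadLocalInfile=true";
            try (Connection conn = DriverManager.getConnection(url, "user", "password");
                 Statement stmt = conn.createStatement()) {
                // Illustrative table/file names - the CSV columns must line up with the table definition.
                stmt.execute("LOAD DATA LOCAL INFILE 'open_orders.csv' "
                           + "INTO TABLE OPEN_ORD_TEMP "
                           + "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' "
                           + "LINES TERMINATED BY '\\n' "
                           + "IGNORE 1 LINES");
            }
        }
    }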


public SettingsDto uploadInChunck(Connection conn, SettingsDto oSettingsDto) throws Exception {
    SettingsDto objSettingsDto = new SettingsDto();
    List<SettingsDto> lstObj = null;
    FileChannel fc = null;
    File file = null;
    file = new File(oSettingsDto.getFormFile().getOriginalFilename());
    oSettingsDto.getFormFile().transferTo(file);
    RandomAccessFile raf = new RandomAccessFile(file, "r");
    fc = raf.getChannel();
    long startTime = System.currentTimeMillis();
    try {
        ByteBuffer buffer = ByteBuffer.allocate(512);
        StringBuilder sb = new StringBuilder();
        while (fc.read(buffer) > 0) {
            buffer.flip();
            for (int i = 0; i < buffer.limit(); i++) {
                sb.append((char) buffer.get());
            }

            String[] strArr = sb.toString().split("\\n");

            if (null != oSettingsDto.getUploadFileTemplate() && oSettingsDto.getUploadFileTemplate().equals("Open Order Data")) {
                settingsDao.updateOpenOrdData(conn, strArr);
            } else if (null != oSettingsDto.getUploadFileTemplate() && oSettingsDto.getUploadFileTemplate().equals("Shipment Forecast Data")) {
                settingsDao.updateShipmentForcastData(conn, strArr);
            } else if (null != oSettingsDto.getUploadFileTemplate() && oSettingsDto.getUploadFileTemplate().equals("Inventory Target Data")) {
                settingsDao.updateInventoryTargetData(conn, strArr);
            } else if (null != oSettingsDto.getUploadFileTemplate() && oSettingsDto.getUploadFileTemplate().equals("Current Inventory Data")) {
                settingsDao.updateCurrentInventoryData(conn, strArr);
            }

            buffer.clear(); // do something with the data and clear/compact it.
            long endTime = System.currentTimeMillis();
            System.out.println("startTime :" + startTime);
            System.out.println("endTime :" + endTime);
        }
    } catch (Exception e) {
        //e.printStackTrace();
        fc.close();
        raf.close();
        file.deleteOnExit();
    }

    return objSettingsDto;
}

1 Comment

Please edit your question instead. And check out BufferedReader, which reads a file line by line. You should also keep the e.printStackTrace() - it might contain vital information. Closing the streams only in case of an exception seems off as well.
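A minimal sketch of the line-by-line approach suggested in this comment, reusing the updateOpenOrdData DAO call from the code above. The SettingsDao type name, the chunk size, and everything else not present in the original code are assumptions:

    import java.io.BufferedReader;
    import java.io.File;
    import java.io.FileReader;
    import java.sql.Connection;
    import java.util.ArrayList;
    import java.util.List;

    public class ChunkedCsvUploader {

        private static final int CHUNK_SIZE = 10000; // lines handed to the DAO per call

        private SettingsDao settingsDao; // assumed type name for the DAO used in the answer above

        public void uploadOpenOrders(Connection conn, File csvFile) throws Exception {
            List<String> chunk = new ArrayList<>(CHUNK_SIZE);
            // try-with-resources closes the reader whether or not an exception is thrown
            try (BufferedReader reader = new BufferedReader(new FileReader(csvFile))) {
                String line;
                while ((line = reader.readLine()) != null) { // always a complete line, never a cut-off 512-byte buffer
                    chunk.add(line);
                    if (chunk.size() == CHUNK_SIZE) {
                        settingsDao.updateOpenOrdData(conn, chunk.toArray(new String[0]));
                        chunk.clear();
                    }
                }
                if (!chunk.isEmpty()) { // last partial chunk
                    settingsDao.updateOpenOrdData(conn, chunk.toArray(new String[0]));
                }
            }
        }
    }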
