7

I am trying to import a 300 MB CSV file into a MySQL table. I am using this command:

LOAD DATA INFILE 'c:/csv/bigCSV.csv' IGNORE 
INTO TABLE table
FIELDS TERMINATED BY ',' 
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES

It works great with small files (1 MB and so on), but when I try to load a big file like the one mentioned, MySQL Workbench (which I use to execute my queries) runs the command, everything looks OK and green, but 0 rows are affected. No changes at all in the table.

I am absolutely sure that the table is fine, because when I take a portion of that file, e.g. 1 MB, and load it into the same table, it works fine.
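If it helps, here is the kind of sanity check I am thinking of running (the scratch table test_load is just a throwaway name): load the file into a one-column table with the same line terminator and count the rows. If the whole 300 MB file comes back as a single row, then LINES TERMINATED BY '\r\n' does not match the file's real line endings, and IGNORE 1 LINES would then skip everything:

CREATE TABLE test_load (line LONGTEXT);

LOAD DATA INFILE 'c:/csv/bigCSV.csv'
INTO TABLE test_load
LINES TERMINATED BY '\r\n';

-- A count of 1 here would mean the '\r\n' terminator never matches inside the file.
SELECT COUNT(*) FROM test_load;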

Has anyone had this kind of problem?

Thank you.

10
  • Anything from the logs? Server or client? It is possible that you are running into a timeout problem; to be sure, I would need to see the logs. Commented Nov 26, 2015 at 18:17
  • Try from the MySQL Command-Line. Commented Nov 26, 2015 at 18:17
  • I am using my own computer as the server. It doesn't look like a timeout because it reports as finished. Here is the log: 15:44:29 LOAD DATA INFILE 'c:/csv/bigSCV.csv' IGNORE INTO TABLE table FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n' IGNORE 1 LINES 0 row(s) affected Records: 0 Deleted: 0 Skipped: 0 Warnings: 0 1.529 sec Commented Nov 26, 2015 at 18:18
  • @wchiquito - Didn't try that. Good idea, I'll give it a try now. Commented Nov 26, 2015 at 18:20
  • @Mihai, yes, I just put that in as a placeholder; my table is called test1 on my server. Commented Nov 26, 2015 at 18:29

2 Answers

14

I have "solved" it. I don't know why, and I feel stupid for not playing with the statement earlier, but it works like this:

LOAD DATA INFILE 'c:/csv/eventsbig.csv' IGNORE 
INTO TABLE std9
FIELDS TERMINATED BY ',' 
LINES TERMINATED BY '\n'

Without the "IGNORE 1 LINES" at the end, it works with files of any size.
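The working statement also changes the line terminator from '\r\n' to '\n'. If the real cause is that the file uses Unix line endings, then a variant like the following sketch (just a guess, not tested against the original file) should still skip the header row while loading every data row:

LOAD DATA INFILE 'c:/csv/eventsbig.csv' IGNORE
INTO TABLE std9
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'   -- Unix line endings (assumed)
IGNORE 1 LINES;            -- still skips the header row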


4 Comments

Sadly this doesn't work well for more complex CSV data, where you have escaping that complies with the CSV RFC.
Thank you so much, I just imported 9,000,000 entries (4 GB) in 2 minutes <3
@DouglasGaskell, what's the solution in that case?
@GitHunter0 Write your own parser (not recommended), or use an existing one (easy, recommended), and then handle the insertions yourself. It's not as easy as this, and probably not as fast, but it would probably take an afternoon to set up and would run in reasonable time. I did this with SQLite a while back and was handling ~100k records/s, which is ~6 million/min, for >1 billion records (~350 GB). That took days to set up, though, because I wanted it to be performant. Only worth it for repeated use, not one-and-done. Or there might be a utility that does this for you by now (a rough sketch of the insert side follows below).
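For anyone going the external-parser route, here is a rough sketch of the insert side only (the table and column names are hypothetical, not from the question): the parser reads the CSV and emits multi-row INSERT statements wrapped in a transaction, a few thousand rows per batch.

START TRANSACTION;

-- One batch; the parser generates thousands of value tuples per statement.
INSERT INTO events_import (col_a, col_b, col_c) VALUES
  ('a1', 'b1', 'c1'),
  ('a2', 'b2', 'c2'),
  ('a3', 'b3', 'c3');

-- ... further batched INSERT statements ...

COMMIT;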
1
LOAD DATA LOW_PRIORITY LOCAL INFILE 'C:\\Learning\\App6_DBconnection\\CC.csv' 
INTO TABLE `test`.`ccd` 
CHARACTER SET armscii8 
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"' LINES TERMINATED BY '\r\n' 
IGNORE 1 LINES (`Cd_Type`, `Full_Name`, `Billing_Date`);

This will work even for large data sets of more than 1.5 million records.

