
I have a CSV file with more than one lakh (100,000) records. When I use this file to insert the records into my database, it takes more than half an hour. Is there any way to make this process faster and more efficient, or any framework that can help?

  • How often do you have to add these records, and how are you currently doing it? Commented Oct 30, 2015 at 7:24
  • How are you trying to insert the CSV? If you are using phpMyAdmin you can import a CSV into a table, and it wouldn't take more than 30 seconds. Commented Oct 30, 2015 at 7:26
  • Add some more info. Are you inserting the CSV data field by field, like firstname='firstname', lastname='lastname'? Commented Oct 30, 2015 at 7:27

2 Answers


Use "mysqlimport" command. It works fast and is suited for large CSV files.

mysqlimport --ignore-lines=1 \
            --fields-terminated-by=, \
            --local \
            -u root -p \
            Database TableName.csv
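
mysqlimport takes the table name from the file name (TableName.csv is loaded into the table TableName) and is essentially a command-line interface to MySQL's LOAD DATA INFILE statement. If you prefer to run the load from inside the mysql client instead, a roughly equivalent statement would look like the following; the file path here is a placeholder, and the table is assumed to have columns matching the CSV:

-- equivalent of the mysqlimport call above, run from the mysql client
LOAD DATA LOCAL INFILE '/path/to/TableName.csv'
INTO TABLE TableName
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES;

Note that LOCAL loads a client-side file and may require local_infile to be enabled on both the client and the server.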


Convert the CSV file into a file containing SQL INSERT statements, then execute that file from the console with:

mysql -u user -ppassword db_name < file_path.sql
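
Most of the speed-up comes from batching many rows into each INSERT and wrapping the load in a single transaction, instead of issuing 100,000 single-row statements. As a rough sketch, the generated file_path.sql could look like this; the customers table and its columns are made up purely for illustration:

-- illustrative contents of file_path.sql (table and columns are assumptions)
START TRANSACTION;

INSERT INTO customers (first_name, last_name, email) VALUES
('John', 'Doe', 'john@example.com'),
('Jane', 'Roe', 'jane@example.com'),
('Sam', 'Poe', 'sam@example.com');

COMMIT;

With the real data you would repeat the multi-row INSERT in chunks of a few thousand rows each, since a single statement covering every row can exceed max_allowed_packet.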

