
I have a large number of records (around 40,000) in a CSV file that I need to use to update a database table.

I know how to write the PHP code to update the records, but I'm concerned about the large amount of data.

How should I handle updating a large number of records at once?

  • I just inserted ~300,000 rows into a table. It doesn't take much time. Try using HeidiSQL's table/database manager. Commented Oct 3, 2012 at 5:12

3 Answers


You can use MySQL's LOAD DATA INFILE. It may be helpful to you:

-- Basic form (note: MySQL's default field terminator is a tab,
-- so a comma-separated file usually needs FIELDS TERMINATED BY ','):
LOAD DATA INFILE 'data.csv' INTO TABLE my_table;

-- Explicitly specifying the delimiter, here for a tab-separated file:
LOAD DATA INFILE 'data.txt' INTO TABLE table2
FIELDS TERMINATED BY '\t';
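
Note that LOAD DATA INFILE inserts rows rather than updating them. Since the question is about updating existing records, two common MySQL approaches are the REPLACE modifier (LOAD DATA INFILE ... REPLACE INTO TABLE ..., which deletes and re-inserts rows on key collisions) or loading into a staging table and updating with a join. Here is a sketch of the latter, assuming a hypothetical my_table keyed on id with columns col1 and col2; adjust names to your schema:

-- Staging table with the same structure as the target
CREATE TABLE my_table_staging LIKE my_table;

LOAD DATA INFILE 'data.csv' INTO TABLE my_table_staging
FIELDS TERMINATED BY ',';

-- Update existing rows by joining on the key
UPDATE my_table t
JOIN my_table_staging s ON t.id = s.id
SET t.col1 = s.col1,
    t.col2 = s.col2;

DROP TABLE my_table_staging;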



Loop through all the lines in the CSV file, parse each line into variables, and update your database with your normal SQL calls (a sketch follows below).

The only thing different is that you may need to add set_time_limit(0); at the very top of your script. This will prevent PHP from timing out.
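
A minimal sketch of that loop using PDO, assuming a hypothetical my_table keyed on id with a name column, and a CSV whose columns are id,name (adjust names and column order to your schema):

<?php
set_time_limit(0); // prevent PHP from timing out on the long run

$pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->prepare('UPDATE my_table SET name = :name WHERE id = :id');

$handle = fopen('data.csv', 'r');
$pdo->beginTransaction(); // one transaction is much faster than 40,000 autocommits
while (($row = fgetcsv($handle)) !== false) {
    // Assumed column order: id, name
    $stmt->execute([':id' => $row[0], ':name' => $row[1]]);
}
$pdo->commit();
fclose($handle);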



Follow some kind of distributed processing flow: have a distribution thread and a number of worker threads. The distribution thread reads the data from the CSV file and allocates work to a free worker in the pool. You can limit the pool size to balance performance against resource usage. You already said you know how to do the update, so the worker code is understood (a sketch follows below). :) Hope this helps.
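
Standard PHP has no built-in threads, but the same distributor/worker idea can be sketched with processes, assuming a POSIX system with the pcntl extension and the same hypothetical my_table schema as above; the worker count and file name are illustrative:

<?php
$workers = 4; // size of the worker pool; tune for your hardware
$rows    = array_map('str_getcsv', file('data.csv', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES));
$chunks  = array_chunk($rows, (int) ceil(count($rows) / $workers));

foreach ($chunks as $chunk) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die('fork failed');
    }
    if ($pid === 0) {
        // Worker (child): open its own DB connection and process its chunk
        $pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
        $stmt = $pdo->prepare('UPDATE my_table SET name = :name WHERE id = :id');
        foreach ($chunk as $row) {
            $stmt->execute([':id' => $row[0], ':name' => $row[1]]);
        }
        exit(0);
    }
}

// Distributor (parent): wait for every worker to finish
while (pcntl_waitpid(-1, $status) > 0);

Each child opens its own database connection, since a connection cannot safely be shared across forked processes.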

