I have a very big .csv file; it consists of 1000 rows. While importing it into my database it takes too much time and sometimes it displays an error like "Timeout".

To avoid this I want to read the first 100 rows and import them into the database, then the next 100, and so on. How can I do this in PHP? Is there an easy method available?

  • I have imported a CSV with over 100,000 rows using a LOAD DATA INFILE query; it's pretty fast. But if PHP times out, then increase your max_execution_time. Commented Jul 28, 2016 at 18:10
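As the comment notes, the timeout limit can be raised at runtime. A minimal sketch (300 seconds is an arbitrary example value, not a recommendation):

```php
<?php
// Raise PHP's execution time limit before a long-running import.
// 300 seconds is an arbitrary example; 0 removes the limit entirely.
ini_set('max_execution_time', '300');
// set_time_limit(300) is an equivalent alternative.
```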

2 Answers


You should probably consider using MySQL's LOAD DATA syntax. It allows you to import data from a CSV file into a table directly from a SQL query, and you can configure the delimiter, line endings, etc. It can be much faster than parsing the file in PHP and generating INSERTs. Note that you have two options: LOAD DATA INFILE (the file must be readable by the mysql daemon) and LOAD DATA LOCAL INFILE (the file must be readable by the mysql client).
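As a rough sketch, assuming a target table named `my_table` and a comma-delimited CSV with a header row (both the table name and the file path are made up for illustration):

```sql
LOAD DATA LOCAL INFILE '/path/to/file.csv'
INTO TABLE my_table
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
```

Adjust the field and line terminators to match your actual file; drop `IGNORE 1 LINES` if your CSV has no header row.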


    $handle = fopen('file.csv', 'r');   // open the CSV for reading
    if ($handle !== false) {
        while (($data = fgetcsv($handle, 1000, ',')) !== false) {
            $num = count($data);
            // copy each field of the current row
            for ($c = 0; $c < $num; $c++) {
                $col[$c] = $data[$c];
            }
            // insert $col into the database here
        }
        fclose($handle);
    }

This code should help you get started; note that it only reads each row's columns, so you still need to insert $col into your database inside the loop.
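Since the question asked about importing 100 rows at a time, here is a minimal sketch of that chunking pattern. It generates a small sample CSV for demonstration, and the actual database write is left as a hypothetical insertBatch() helper:

```php
<?php
// Sketch: read a CSV in batches of 100 rows; each full batch would be
// flushed to the database in one multi-row INSERT.
$batchSize = 100;

// Create a small sample CSV for demonstration (250 hypothetical rows).
$path = tempnam(sys_get_temp_dir(), 'csv');
$fh = fopen($path, 'w');
for ($i = 0; $i < 250; $i++) {
    fputcsv($fh, [$i, "name$i"]);
}
fclose($fh);

$handle = fopen($path, 'r');
$batch = [];
$batches = 0;
while (($row = fgetcsv($handle, 1000, ',')) !== false) {
    $batch[] = $row;
    if (count($batch) === $batchSize) {
        // insertBatch($pdo, $batch); // hypothetical DB flush goes here
        $batches++;
        $batch = [];
    }
}
if ($batch) {
    // insertBatch($pdo, $batch);    // flush the final partial batch
    $batches++;
}
fclose($handle);
unlink($path);
echo $batches; // 3 (two full batches of 100 plus a final batch of 50)
```

Committing each batch separately keeps any single query short, which avoids the timeout the question describes.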

