
I am making a module that uploads records from an Excel file into a database. The records are just phone numbers. Here's my code:

        $file = Input::file('file');
        Excel::load($file, function($reader) {
            // Get all rows as an array
            $results = $reader->get()->toArray();
            foreach ($results as $key => $value) {
                $phone = new Phone();
                $phone->msisdn = $value['msisdn'];
                $phone->save();
            }
        });

I'm using https://github.com/Maatwebsite/Laravel-Excel to read the Excel file. It works fine, but 20,000 records take about 20 minutes to upload. Is there a way to do it, or upload it, faster? I know it also depends on the server, but are there other factors? I'm using MySQL.

Thanks

  • 20,000 records in 20 minutes seems very slow. Can you log how long each operation takes to see where the bottleneck is? Commented Sep 8, 2015 at 14:21
  • Can you not save the (relevant bits of the) Excel file as a CSV and then use LOAD DATA or mysqlimport? Commented Sep 8, 2015 at 14:25
  • The main issue here is that you're inserting in a loop. Instead of using Eloquent, go for the Query Builder. Here is an example to get you going. Commented Sep 8, 2015 at 14:35
  • The problem is almost certainly on the Excel side of things. That package uses PHPExcel, which is an incredible package, but extremely resource intensive. This question asks about different packages for reading Excel files in PHP. I haven't researched these, so I can't speak for any of them. If you can save the Excel file as CSV, that's going to give you the best performance Commented Sep 8, 2015 at 15:01
  • I would think looking at bulk inserts or LOAD DATA FROM INFILE are your best bets here. I would also think you could save time by not reading entire into memory if going the bulk insert route. Just read each line of file and continue to build bulk insert query, triggering the actual insert when query gets to number of records you want to insert at a time. I definitely would not take extra step of instantiating an Phone object with each insert unless you need to do something like validate/transform the data before making the insert. Commented Sep 8, 2015 at 15:45
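The batched-insert idea from the comments above can be sketched in plain PHP: collect rows and fire one multi-row INSERT per batch instead of one query per record. This sketch uses an in-memory SQLite database via PDO purely for illustration; in Laravel you would get the same effect with `DB::table('phones')->insert($batch)`, and the batch size of 500 is an arbitrary choice (MySQL also caps query size via `max_allowed_packet`).

```php
<?php
// Batched inserts: one INSERT ... VALUES (?),(?),... per chunk of rows.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE phones (msisdn TEXT)');

// Pretend these rows came from the spreadsheet reader.
$rows = [];
for ($i = 0; $i < 1000; $i++) {
    $rows[] = ['msisdn' => sprintf('0917%07d', $i)];
}

$batchSize = 500; // hypothetical tuning knob, not from the question
foreach (array_chunk($rows, $batchSize) as $batch) {
    // Build one multi-row INSERT statement per batch.
    $placeholders = implode(',', array_fill(0, count($batch), '(?)'));
    $stmt = $pdo->prepare("INSERT INTO phones (msisdn) VALUES $placeholders");
    $stmt->execute(array_column($batch, 'msisdn'));
}

echo $pdo->query('SELECT COUNT(*) FROM phones')->fetchColumn(), "\n"; // 1000
```

Two queries instead of a thousand is where the time savings come from; each Eloquent `save()` is a separate round trip to MySQL.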

2 Answers


Per their documentation, you could chunk the results since you are working with such a large file. That way you read the file in chunks rather than loading the whole file into memory.

http://www.maatwebsite.nl/laravel-excel/docs/import#chunk
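Based on the linked 2.x docs, the import from the question could be rewritten roughly like this with the `chunk` filter; the chunk size of 250 is arbitrary, and the row/column access is assumed to match the question's `msisdn` column:

```php
// Sketch per the Maatwebsite 2.x chunk docs: process the sheet in
// slices of 250 rows instead of materializing all 20,000 at once.
Excel::filter('chunk')->load($file)->chunk(250, function($results) {
    foreach ($results as $row) {
        $phone = new Phone();
        $phone->msisdn = $row['msisdn'];
        $phone->save();
    }
});
```

Chunking fixes the memory problem; combining it with batched inserts (as suggested in the comments) would also address the per-row query overhead.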




You could try this faster alternative (https://github.com/rap2hpoutre/fast-excel):

(new FastExcel)->import($file, function ($line) {
    $phone = new Phone();
    $phone->msisdn = $line['msisdn'];
    $phone->save();
});
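Note that the snippet above still issues one Eloquent `save()` per row, which was the original bottleneck. FastExcel's `import()` also returns a collection when called without a callback, so a faster variant (a sketch; the `phones` table name and batch size of 500 are assumptions) would batch the inserts through the query builder:

```php
// Read everything with FastExcel, then insert in batches of 500
// via the query builder instead of one Eloquent save per row.
$lines = (new FastExcel)->import($file);
foreach ($lines->chunk(500) as $chunk) {
    DB::table('phones')->insert(
        $chunk->map(function ($line) {
            return ['msisdn' => $line['msisdn']];
        })->values()->all()
    );
}
```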

