I am making a module that uploads records to a database from an Excel file. The records are just phone numbers. Here's my code:
$file = Input::file('file');

Excel::load($file, function ($reader) {
    // Getting all results
    $results = $reader->get()->toArray();
    //var_dump($results);exit;

    // One Eloquent model and one INSERT per row
    foreach ($results as $key => $value) {
        $phone = new Phone();
        $phone->msisdn = $value['msisdn'];
        $phone->save();
    }
});
I'm using https://github.com/Maatwebsite/Laravel-Excel to read the Excel file. It works fine, but 20,000 records take roughly 20 minutes to upload. Is there a way to do it faster? I know it also depends on the server, but are there other factors? I'm using MySQL.
Thanks
LOAD DATA or mysqlimport? LOAD DATA INFILE and mysqlimport are your best bets here. I would also think you could save time by not reading the entire file into memory if going the bulk insert route: just read each line of the file and keep building a bulk INSERT query, triggering the actual insert whenever the query reaches the number of records you want to insert at a time. I definitely would not take the extra step of instantiating a Phone object for each insert unless you need to validate/transform the data before inserting it.
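On the Laravel side, here is a minimal sketch of that batching idea, assuming a phones table with an msisdn column and the same Maatwebsite reader as in the question; the batch size of 500 is an arbitrary starting point, not a tuned value:

$file = Input::file('file');

Excel::load($file, function ($reader) {
    $batchSize = 500; // rows per multi-row INSERT; tune for your server
    $batch = [];

    foreach ($reader->get()->toArray() as $row) {
        $batch[] = ['msisdn' => $row['msisdn']];

        // Flush the accumulated rows as a single multi-row INSERT
        if (count($batch) >= $batchSize) {
            DB::table('phones')->insert($batch);
            $batch = [];
        }
    }

    // Insert whatever is left over after the loop
    if (!empty($batch)) {
        DB::table('phones')->insert($batch);
    }
});

The speedup comes from replacing 20,000 single-row INSERTs with about 40 multi-row ones. Note that DB::table(...)->insert() bypasses Eloquent entirely, so created_at/updated_at stay NULL unless you add them to each row yourself.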
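And if you can get the data into CSV form first (either by having users upload CSV or by converting the Excel file), LOAD DATA LOCAL INFILE hands the parsing to MySQL itself. A hedged sketch, assuming the CSV has one msisdn per line, local_infile is enabled on the server, and PDO::MYSQL_ATTR_LOCAL_INFILE is enabled in the connection options; the file path is hypothetical:

$path = storage_path('app/phones.csv'); // assumed location of the converted CSV
$pdo = DB::connection()->getPdo();

// One statement loads the whole file, no round trip per row
$pdo->exec(
    "LOAD DATA LOCAL INFILE " . $pdo->quote($path) . "
     INTO TABLE phones
     FIELDS TERMINATED BY ','
     LINES TERMINATED BY '\\n'
     (msisdn)"
);

This is usually the fastest option by a wide margin, but it skips all application-level validation, so only use it if the file is trusted or pre-validated.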