I managed to build a project similar to the one in the video "Import CSV in Laravel 5.5 + Matching Fields", but using Excel import instead of CSV. It works fine with small Excel files (fewer than 1,000 rows), but with files of more than 13,000 rows the app keeps throwing the following error:
Maximum execution time of 30 seconds exceeded
Level
ERROR
Exception
{
"class": "Symfony\\Component\\Debug\\Exception\\FatalErrorException",
"message": "Maximum execution time of 30 seconds exceeded",
"code": 1,
.
.
.
I tried different approaches, and I read the Laravel Excel documentation (Imports section > Chunk Reading and Queued Reading), but that didn't work either, because I import the Excel file into a collection, then match the fields, and then create and save the new models.
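For reference, the chunk-reading approach I tried looked roughly like this (a sketch based on the Laravel Excel 3.x concerns from the docs; the class name, model, and column names are placeholders for my actual field-matching logic):

```php
<?php

use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithHeadingRow;

class CustomersImport implements ToModel, WithChunkReading, WithHeadingRow
{
    /**
     * Called once per row; $row is keyed by the spreadsheet's
     * heading row, so columns can be matched to database fields here.
     */
    public function model(array $row)
    {
        // Placeholder model and columns - my real code maps the
        // matched Excel headings to the corresponding DB fields.
        return new \App\Customer([
            'name'  => $row['name'],
            'email' => $row['email'],
        ]);
    }

    /**
     * Read the sheet in chunks so the whole file is never
     * loaded into memory at once.
     */
    public function chunkSize(): int
    {
        return 1000;
    }
}
```

My problem is that this per-row model approach doesn't fit my current flow, where the whole file is first imported into a collection so the user can match fields before any models are created.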
Please advise on anything that could help me import large Excel files and match database fields against the Excel file's column headings.