I have a total of 2000 rows in my CSV file, and I noticed that the upload stops around rows 280-300. I am using Laravel for my application.
I am reading the file with fgetcsv(), and my code looks like this:
$row = 1;
if (($handle = fopen($destinationPath . $filename, "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 3000, ",")) !== FALSE) {
        // Skip the header row
        if ($row == 1) { $row++; continue; }
        // ... conditional statements, insert and update queries ...
    }
    fclose($handle);
}
Inside the while loop there are a lot of conditional statements, plus insert and update queries, roughly as sketched below.
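Simplified, the loop body has this shape (the items table and its columns are stand-ins; the real names are different):

// uses Illuminate\Support\Facades\DB
$exists = DB::table('items')->where('code', $data[0])->exists();

if ($exists) {
    // Row already in the database: update it
    DB::table('items')->where('code', $data[0])->update(['name' => $data[1]]);
} else {
    // New row: insert it
    DB::table('items')->insert(['code' => $data[0], 'name' => $data[1]]);
}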
What causes this bug? It works properly on localhost, but not on my server. Here are the specs of my shared hosting:
What would the solution for this be? Thanks.
UPDATE: I tried adding set_time_limit(0); and then uploaded the CSV again. I timed it with a stopwatch and it stops at exactly 00:02:03.02, so it cuts off at around 2 minutes.

Put set_time_limit(0) at the top of your script for testing only. If that works, set it to some safe value equal to the time you would expect the CSV processing to take.
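For example (the 300-second value below is just a guess; set it to whatever your import realistically needs):

<?php
// For testing only: 0 removes PHP's max execution time entirely.
set_time_limit(0);

// Once the import is confirmed working, cap it at a safe value instead:
// set_time_limit(300); // assumed value; tune to your processing time

// ... CSV processing below ...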