So, I have a form that allows users to upload an Excel sheet, and the system then imports that data into MySQL.
However, every time I submit the form via AJAX, the process starts, saves roughly half of the data, and then fails with a 504 Gateway Timeout. I have already raised the PHP timeout to 300 seconds, but it still gives out halfway through. I don't think an Excel sheet with a little under 1,000 rows should be taking 5+ minutes to import.
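From what I've read, a 504 usually comes from the web server or proxy in front of PHP (e.g. Nginx's fastcgi_read_timeout), not from PHP itself, so I'm not even certain the PHP limit is the one that matters here. For reference, this is roughly what I changed (exact values are from memory):

    // Raised at the top of the import method; php.ini has the same value.
    ini_set('max_execution_time', '300'); // allow up to 5 minutes of script time
    set_time_limit(300);                  // same limit via the function form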
Here is my code:
public function postImportGroup(Request $request)
{
    // Bail out early if no file was uploaded at all ($file was previously
    // used below even when hasFile() returned false).
    if (!$request->hasFile('import_numbers')) {
        return response()->json([
            'status' => 'error',
            'msg' => 'No file was uploaded.',
        ]);
    }

    $file = $request->file('import_numbers');
    $file_extension = $file->getClientOriginalExtension();
    $supportedExt = array('csv', 'xls', 'xlsx');

    if (!in_array($file_extension, $supportedExt)) {
        return response()->json([
            'status' => 'error',
            'msg' => 'Please make sure that the uploaded file is a valid CSV, XLS, or XLSX sheet.',
        ]);
    }

    $results = Excel::load($file)->get();
    $results = json_decode($results[0], true);

    // Parent group record; each imported row references its id.
    $class = new DailyGroup();
    $class->title = $request->group_name;
    $class->user_id = Auth::guard('client')->user()->id;
    $class->entries = count($results);
    $class->save();

    foreach ($results as $value) {
        $group = new DailyGroupLocations();

        // Geocode the row's address with the Google Maps API.
        $address = $value["address"] . ',' . $value["city"] . ',' . $value["state"] . ',' . $value["zip"];
        $c = $value["country"];
        $file_contents = file_get_contents('https://maps.googleapis.com/maps/api/geocode/json?address=' . urlencode($address) . '&components=country:' . urlencode($c) . '&sensor=false&key=xxx');
        $json_decode = json_decode($file_contents);
        if (isset($json_decode->results[0])) {
            $group->lat = $json_decode->results[0]->geometry->location->lat;
            $group->lng = $json_decode->results[0]->geometry->location->lng;
        }

        // Strip non-digits from the phone number and prefix the country code.
        $phone = preg_replace('/\D+/', '', $value["ph"]);
        $phone = '1' . $phone;

        $group->user_id = Auth::guard('client')->user()->id;
        $group->group_id = $class->id;
        $group->employee_number = $value["employee_number"];
        $group->work_date = $value["work_date"]["date"];
        $group->first_name = $value["name"];
        $group->last_name = $value["lastname"];
        $group->phone = $phone;
        $group->email = $value["email"];
        $group->job_number = $value["job_number"];
        $group->address = $value["address"];
        $group->city = $value["city"];
        $group->state = $value["state"];
        $group->zip = $value["zip"];
        $group->country = $value["country"];
        $group->job_name = $value["job_name"];
        $group->location = $value["location"];
        $group->shift_description = $value["shift_description"];
        $group->shift_start = $value["shift_start_time"];
        $group->shift_end = $value["shift_end_time"];
        $group->post_hours = $value["post_hours"];
        $group->save(); // one INSERT per row
    }

    return response()->json([
        'status' => 'success',
        'msg' => 'All data uploaded successfully. Please wait for tables to refresh.',
        //'url' => '/user/location/location-areas'
    ]);
}
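My only real theory so far is the geocoding: the loop fires one synchronous Google Maps request per row, so with a little under 1,000 rows even ~300 ms per call would add up to roughly five minutes on its own. I'm planning to time a single call in isolation to confirm that, roughly like this (made-up address, key redacted):

    // Time one geocoding request on its own to see whether the per-row
    // HTTP calls are what eats the five minutes.
    $start = microtime(true);
    $resp = file_get_contents('https://maps.googleapis.com/maps/api/geocode/json?address=' . urlencode('123 Main St,Springfield,IL,62701') . '&key=xxx');
    $ms = round((microtime(true) - $start) * 1000);
    echo "one geocode call took {$ms} ms\n";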
Is there anything I can do to optimize this? Am I running too many things inside the foreach loop? Any tips or tricks I can use? One idea I've been considering is below.
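The main thing I'm considering is moving the whole import into a queued job, so the AJAX request returns immediately and the 504 disappears no matter how long the import takes. A rough, untested sketch of what I mean (ImportDailyGroup is a job class I would still have to write, and this assumes a queue driver other than sync):

    public function postImportGroup(Request $request)
    {
        // ... same file validation as above ...

        // Persist the upload so the queue worker can read it later.
        $path = $request->file('import_numbers')->store('imports');

        // Hand the slow work (Excel parsing, geocoding, inserts) to the queue.
        dispatch(new ImportDailyGroup(
            $path,
            $request->group_name,
            Auth::guard('client')->user()->id
        ));

        return response()->json([
            'status' => 'success',
            'msg' => 'Import queued. The tables will refresh when it finishes.',
        ]);
    }

Beyond that, I assume replacing the per-row save() with a single DailyGroupLocations::insert($rows) bulk insert would cut the query count from ~1,000 to 1 (minding that insert() skips Eloquent timestamps), though I suspect the geocoding calls, not MySQL, are the real bottleneck.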