I'm using Laravel Excel ("maatwebsite/excel": "^3.1") to import just 400 records with 10 fields each. Merely reading these records into a collection needs around 2GB of RAM, even though I'm reading them in chunks of 20 rows at a time (I tried increasing the chunk size to 50 and it consumed even more RAM).
try {
    $collection = Excel::toCollection(
        new UsersImport(
            $entity,
            $user,
            $unique_fields,
            $update_duplicates,
            new JobHistory,
            $this->userTypeRepository,
        ),
        storage_path() . '/uploads/' . $usersFileName
    );
} catch (\Throwable $th) {
    // Re-throw with the original throwable preserved as the previous exception
    throw new Exception($th->getMessage(), 0, $th);
}
UsersImport.php
class UsersImport implements ToCollection, WithHeadingRow, WithChunkReading, ShouldQueue, WithEvents
{
    use RemembersChunkOffset;

    private $entity;
    private $user;
    private $updateDuplicates;
    private $uniqueFields;
    private $jobHistory;
    private $userTypeRepository;

    public function __construct(Entity $entity, User $user, array $uniqueFields, string $updateDuplicates, JobHistory $jobHistory, UserTypeRepository $userTypeRepository)
    {
        $this->entity = $entity;
        $this->user = $user;
        $this->updateDuplicates = $updateDuplicates;
        $this->uniqueFields = $uniqueFields;
        $this->jobHistory = $jobHistory;
        $this->userTypeRepository = $userTypeRepository;
    }

    public function collection(Collection $rows)
    {
        /* foreach ($rows as $row) {
            $data = $row->toArray();
            array_walk($data, array($this, 'parseData'));
        } */
    }

    public function chunkSize(): int
    {
        return 20;
    }

    public function registerEvents(): array
    {
        return [];
    }

    public function headingRow(): int
    {
        return 15;
    }
}
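For reference, this is the chunk-based alternative I've been considering instead of `Excel::toCollection()`. It's only a minimal sketch: the `User` model, the column names, and the chunk size of 500 are placeholders, not my actual code. `ToModel`, `WithChunkReading`, and `Excel::import()` are the library's real contracts/API.

```php
<?php

namespace App\Imports;

use App\Models\User; // assumed model
use Illuminate\Contracts\Queue\ShouldQueue;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithHeadingRow;

// With ToModel, each chunk is read, mapped to models, and released,
// so the whole spreadsheet is never accumulated in one Collection
// the way Excel::toCollection() does.
class UsersChunkImport implements ToModel, WithHeadingRow, WithChunkReading, ShouldQueue
{
    public function model(array $row)
    {
        return new User([
            'name'  => $row['name'],  // assumed heading-row column names
            'email' => $row['email'],
        ]);
    }

    public function chunkSize(): int
    {
        return 500;
    }
}

// Usage: Excel::import(new UsersChunkImport, storage_path('uploads/users.xlsx'));
```

I'm unsure whether switching to this pattern is the intended fix, or whether something else in my setup is bloating memory.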
Why is this happening, and how can I make it consume less RAM? If I try to import 800 users I'll need 4GB of RAM to make it work!