I have a Laravel app backed by a database containing long sequences. I would like to let users download the database in JSON format. To do that, I first fetch the records, convert them to JSON, and write the result to a file. However, when I moved the app to the production server, it gave me
You don't have permission to access /saveJson on this server.
The error message from the logs is:
[2021-01-07 19:08:50] local.ERROR: Allowed memory size of 1887436800 bytes exhausted (tried to allocate 675803754 bytes) {"userId":1,"exception":"[object] (Symfony\\Component\\Debug\\Exception\\FatalErrorException(code: 1): Allowed memory size of 1887436800 bytes exhausted (tried to allocate 675803754 bytes) at /home/xxx/vendor/league/flysystem/src/Util/MimeType.php:205)
[stacktrace]
The memory_limit on the server is already set to -1, and I restarted the server after changing it.
The relevant parts of the code:
// Disable the query log so it doesn't grow while fetching
DB::connection()->disableQueryLog();

// Load all non-curated records with their relations
$large_db = Model::with('regions', 'genes.gene_models.aro_categories')
    ->where('curated', '<>', 4)
    ->get();

// Serialize the whole collection and write it out
$db_JSON = $large_db->toJson();
Storage::disk('public')->put("database_$date.json", $db_JSON);

DB::connection()->enableQueryLog();
According to MySQL, the whole database is around 170 MB, and according to meminfo the server has 1962 MB of memory, so I do not fully understand why it cannot load the records. In any case, since the database will keep being updated, I would like to implement this in a way that won't crash for similar reasons in the future.
Is there any smarter way to export the database without exceeding the memory limit? Some kind of buffering?
Thank you for any suggestions!
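One idea I had is to write the file incrementally with chunked queries instead of building one huge JSON string in memory. This is an untested sketch (the chunk size of 500 is an arbitrary guess, and `$date` is set as above); I am not sure it is the right approach:

```php
// Untested sketch: stream records to the file chunk by chunk,
// so the full JSON string is never held in memory at once.
$path = storage_path("app/public/database_$date.json");
$handle = fopen($path, 'w');
fwrite($handle, '[');

$first = true;
Model::with('regions', 'genes.gene_models.aro_categories')
    ->where('curated', '<>', 4)
    ->chunkById(500, function ($records) use ($handle, &$first) {
        foreach ($records as $record) {
            // Separate records with commas to keep the JSON array valid
            if (!$first) {
                fwrite($handle, ',');
            }
            fwrite($handle, $record->toJson());
            $first = false;
        }
    });

fwrite($handle, ']');
fclose($handle);
```

Writing with `fopen()`/`fwrite()` would also bypass `Storage::put()`, which (judging by the stack trace pointing at flysystem's MimeType.php) seems to be where the allocation failed. Is something like this reasonable, or is there a better pattern?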