My scenario is like this: I have a huge dataset fetched from a MySQL table.
$data = $somearray; // say the number of records in this array is 200000
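To give more context, this is roughly how $data gets filled before the loop (a simplified sketch; I am using mysqli, and the query/table names here are just placeholders):

$mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');
$result = $mysqli->query('SELECT * FROM some_table'); // buffered query
$data = $result->fetch_all(MYSQLI_ASSOC); // all ~200000 rows land in memory at once
$result->free();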
I am looping over this data, processing it with some functions, and writing it to a CSV file:
$my_file = 'somefile.csv';
$handle = fopen($my_file, 'w') or die('Cannot open file: ' . $my_file);

for ($i = 0; $i < count($data); $i++) {
    // do something with the data
    self::someOtherFunctionalities($data[$i]); // just some function
    fwrite($handle, $data[$i]['index']); // here I am writing the data to the file
}

fclose($handle);
My problem is that the loop runs out of memory: it fails with "Fatal error: Allowed memory size of ... exhausted". Is there any way to process this loop without exhausting memory?
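One idea I had is to skip building the $data array entirely and instead fetch the rows one at a time with an unbuffered query, processing and writing each row as it arrives. A rough, untested sketch of what I mean (same placeholder names as above):

$mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');
$result = $mysqli->query('SELECT * FROM some_table', MYSQLI_USE_RESULT); // unbuffered

$handle = fopen($my_file, 'w') or die('Cannot open file: ' . $my_file);
while ($row = $result->fetch_assoc()) {
    self::someOtherFunctionalities($row);
    fwrite($handle, $row['index']);
}
fclose($handle);
$result->free();

Is this the right direction, or is there another way to keep memory usage flat for a loop like this?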
Due to a server limitation, I am unable to increase the PHP memory limit like this:
ini_set("memory_limit","2048M");
I am not concerned about the time it takes, even if it takes hours, so I already set set_time_limit(0).